Overview

The vision community has put tremendous effort into 3D object modeling over the past decade and has achieved many impressive breakthroughs. However, there is still a large gap between the current state of the art and the needs of industrial 3D vision. One major reason is that existing 3D shape benchmarks provide only 3D shapes with unrealistic textures or no textures at all. Another shortcoming is the lack of a large-scale dataset in which the 3D models exactly match the objects in images. These limitations make existing benchmarks insufficient for comprehensive and subtle research in areas such as texture recovery and transfer, which is required to support industrial production.

The goal of the workshop is to review state-of-the-art 3D geometry and vision algorithms, to facilitate innovative research on high-quality 3D shape understanding and generation, and to build a bridge between academic research and 3D applications in industry. To support this goal, we will release 3D-FUTURE (3D FUrniture shape with TextURE), which contains 20,000+ clean and realistic synthetic scenes in 5,000+ diverse rooms. The scenes involve 10,000+ unique industrial 3D furniture instances with high-resolution, informative textures developed by professional designers. Building on 3D-FUTURE, we propose the Alibaba 3D Artificial Intelligence Challenge 2020 with three tracks: 1) cross-domain image-based 3D shape retrieval, 2) 3D object reconstruction, and 3) instance segmentation. In addition, the workshop includes a discussion panel to provide a forum for potentially valuable research topics, such as recovering both 3D shape and texture from 2D images.

Competition Tracks

The competition tracks are based on the 3D-FUTURE benchmark, which is provided by Alibaba Topping Homestyler and organized by the Alibaba Tao Technology Department. The competition platform is supported by Alibaba Tianchi.

Cross-domain Image-based 3D Shape Retrieval

In this challenge, participants are required to retrieve the corresponding 3D shape given a 2D query image. We expect to foster the development of shape retrieval methods that are robust to slight occlusions and to changes in diverse, complicated surroundings. Performance will be measured mainly by Top-K recall. Details can be found here.
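For concreteness, the minimal sketch below shows one common way to compute Top-K recall for retrieval: the fraction of queries whose ground-truth shape appears among the top-k ranked results. The function name, array formats, and the assumption of a single ground-truth shape per query are illustrative; the official evaluation script defines the authoritative protocol.

```python
import numpy as np

def top_k_recall(rankings, ground_truth, k=10):
    """Fraction of queries whose ground-truth shape appears in the top-k results.

    rankings:     (num_queries, num_shapes) array; each row lists shape indices
                  sorted from most to least similar (illustrative format).
    ground_truth: (num_queries,) array with the correct shape index per query.
    """
    hits = [gt in row[:k] for row, gt in zip(rankings, ground_truth)]
    return float(np.mean(hits))

# Toy usage: 3 queries over 5 shapes, ranked by a made-up similarity score.
rankings = np.array([[2, 0, 4, 1, 3],
                     [1, 3, 0, 2, 4],
                     [4, 2, 3, 1, 0]])
ground_truth = np.array([2, 1, 3])
print(top_k_recall(rankings, ground_truth, k=2))  # 2 of 3 queries hit -> ~0.667
```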

3D Object Reconstruction from a Single Image

In this challenge, participants will reconstruct a 3D shape from a single RGB image. The objects contained in the input images may be slightly occluded or partially incomplete. Chamfer Distance and F-score will be used to measure the quality of the reconstructions. Details can be found here.
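As a rough reference, the sketch below computes a symmetric Chamfer Distance and a distance-thresholded F-score between two point clouds. The exact formulation used for scoring (squared vs. unsquared distances, normalization, sampling density, and the threshold value) will follow the official evaluation script; this is only one common variant, and all names here are illustrative.

```python
import numpy as np

def pairwise_nn_dist(a, b):
    """For each point in cloud a, distance to its nearest neighbour in cloud b."""
    # Dense (|a|, |b|) matrix of Euclidean distances; fine for small clouds.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1)

def chamfer_distance(pred, gt):
    """Symmetric Chamfer Distance: mean nearest-neighbour distance both ways."""
    return pairwise_nn_dist(pred, gt).mean() + pairwise_nn_dist(gt, pred).mean()

def f_score(pred, gt, threshold=0.01):
    """F-score at a distance threshold: harmonic mean of precision and recall."""
    precision = (pairwise_nn_dist(pred, gt) < threshold).mean()
    recall = (pairwise_nn_dist(gt, pred) < threshold).mean()
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Toy usage with two random clouds of 1024 points in the unit cube.
rng = np.random.default_rng(0)
pred, gt = rng.random((1024, 3)), rng.random((1024, 3))
print(chamfer_distance(pred, gt), f_score(pred, gt, threshold=0.05))
```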

Instance Segmentation

In this challenge, participants are required to label each foreground pixel with the appropriate object category and instance label. We include this challenge because it will motivate vision-based image generation, thereby improving relevant industrial production chains, for example by partially reducing the need for expensive 3D rendering. Details can be found here.
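Instance segmentation entries are typically scored with COCO-style mask AP, which hinges on mask IoU and on matching predictions to ground truth at an IoU threshold. The hypothetical sketch below shows only that core matching step, not the full AP computation; consult the official toolkit for the actual metric.

```python
import numpy as np

def mask_iou(pred_mask, gt_mask):
    """Intersection-over-Union between two boolean instance masks."""
    inter = np.logical_and(pred_mask, gt_mask).sum()
    union = np.logical_or(pred_mask, gt_mask).sum()
    return inter / union if union else 0.0

def match_instances(pred_masks, gt_masks, iou_threshold=0.5):
    """Greedily match predicted instances (assumed pre-sorted by confidence)
    to ground-truth instances at a given IoU threshold."""
    matched, used = 0, set()
    for p in pred_masks:
        ious = [mask_iou(p, g) if i not in used else 0.0
                for i, g in enumerate(gt_masks)]
        best = int(np.argmax(ious)) if ious else -1
        if best >= 0 and ious[best] >= iou_threshold:
            matched += 1       # true positive at this threshold
            used.add(best)     # each ground-truth instance matches at most once
    return matched
```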

Prizes

1st Place: $1500

2nd Place: $1000

3rd Place: $500

Presentation at our IJCAI-PRICAI 2020 Workshop

Co-authorship of our IJCAI-PRICAI 2020 Workshop report

Leaderboard

You can track the individual leaderboards on each competition's site. We will report the final leaderboards here after the competition deadline.

Important Dates

  • 2020-03-30        Release of training and test samples.

  • 2020-04-03        Release of the entire training and validation data.

  • 2020-05-20        Release of evaluation scripts, toolbox, and baseline.

  • 2020-07-19        Registration deadline. Release of test set (23:59:59 PST).

  • 2020-07-24        Submission deadline (23:59:59 PST).

  • 2020-08-07        Technical report deadline.

  • 2020-08-21        Challenge award notification.

  • 2020-09-22        Release of the report for the 3D-FUTURE dataset.

Program

TBD.

Invited Speakers

TBD.

Workshop Organizers

Huan Fu

Alibaba-inc

Rongfei Jia

Alibaba-inc

Lin Gao

ICT.CAS

Mingming Gong

University of Melbourne

Binqiang Zhao

Alibaba-inc

Steve Maybank

Birkbeck College, University of London

Dacheng Tao

The University of Sydney

Partners