Dex1B: Learning with 1B Demonstrations for Dexterous Manipulation


Jianglong Ye, Keyi Wang, Chengjing Yuan, Ruihan Yang, Yiquan Li, Jiyue Zhu, Yuzhe Qin, Xueyan Zou, Xiaolong Wang

Paper ID 106

Session 11. Manipulation II

Poster Session (Day 3): Monday, June 23, 6:30-8:00 PM

Abstract: Generating large-scale demonstrations for dexterous manipulation remains a challenging problem, and various approaches have been proposed in recent years to address it. Among these, generative models have emerged as a promising paradigm, enabling the efficient generation of diverse and plausible demonstrations. In this paper, we introduce Dex1B, a large-scale, diverse, and high-quality demonstration dataset created using generative models. The dataset includes 1 billion demonstrations and focuses on two fundamental tasks: grasping and articulation. To achieve this, we propose a unified generative model that incorporates diverse conditions, such as contact points and hand orientation, to synthesize actions and other essential properties that can be utilized for both data generation and policy deployment. We validate the proposed model on both established and newly introduced simulation benchmarks, demonstrating significant improvements over previous state-of-the-art methods. Furthermore, we showcase the model’s effectiveness and robustness through real-world robot experiments.
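The abstract describes a unified generative model that conditions on properties such as contact points and hand orientation to synthesize actions, but does not spell out an architecture here. As a purely illustrative sketch under assumed choices (a small conditional VAE, made-up dimensions, and hypothetical names like `ConditionalActionVAE`), the following shows one common way such conditional action synthesis can be structured; it is not the paper's actual design.

```python
# Illustrative sketch only: a conditional VAE mapping contact points and a
# wrist orientation to a hand action (joint configuration). All names,
# dimensions, and the CVAE choice are assumptions for illustration.
import torch
import torch.nn as nn

class ConditionalActionVAE(nn.Module):
    def __init__(self, n_contacts=5, action_dim=22, latent_dim=32, hidden=256):
        super().__init__()
        cond_dim = n_contacts * 3 + 4          # contact xyz points + wrist quaternion
        self.encoder = nn.Sequential(          # q(z | action, condition)
            nn.Linear(action_dim + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * latent_dim),
        )
        self.decoder = nn.Sequential(          # p(action | z, condition)
            nn.Linear(latent_dim + cond_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, action_dim),
        )
        self.latent_dim = latent_dim

    def forward(self, action, cond):
        mu, logvar = self.encoder(torch.cat([action, cond], dim=-1)).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization
        recon = self.decoder(torch.cat([z, cond], dim=-1))
        return recon, mu, logvar

    @torch.no_grad()
    def sample(self, cond):
        # At generation time, draw z ~ N(0, I) and decode an action for the
        # given contact-point / orientation condition.
        z = torch.randn(cond.shape[0], self.latent_dim, device=cond.device)
        return self.decoder(torch.cat([z, cond], dim=-1))

# Usage: a batch of 8 conditions (5 contact points + wrist quaternion) -> actions.
cond = torch.randn(8, 5 * 3 + 4)
model = ConditionalActionVAE()
actions = model.sample(cond)   # shape: (8, 22)
```

Sampling many latents per condition is one way such a model could produce the diverse demonstrations the abstract refers to, with the same decoder reusable at policy-deployment time.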