Dexonomy: Synthesizing All Dexterous Grasp Types in a Grasp Taxonomy


Jiayi Chen, Yubin Ke, Lin Peng, He Wang

Paper ID 105

Session 11. Manipulation II

Poster Session (Day 3): Monday, June 23, 6:30-8:00 PM

Abstract: Generalizable dexterous grasping is a fundamental skill for intelligent robots. To develop such skills, a large-scale, high-quality, and diverse dataset of robotic dexterous grasps covering the GRASP taxonomy is essential but extremely challenging to collect. Previous dexterous grasp synthesis methods are often limited to specific grasp types or object categories and tend to suffer from issues such as penetration and unnatural poses. In this work, we address these challenges by proposing an efficient method capable of synthesizing physically plausible, contact-rich, and penetration-free dexterous grasps for any grasp type, object, and articulated robotic hand. Starting from only one human-annotated template per hand and grasp type, our pipeline first uses a lightweight global alignment stage to optimize the object pose, and then a simulation-based local refinement stage to adjust the hand pose. Next, to validate the synthesized grasps, we introduce a contact-aware control strategy that applies desired forces to each contact point on the object. The validated grasps can further enrich the grasp template library and facilitate future synthesis. Experimental results demonstrate the superiority of our pipeline over existing grasp synthesis approaches for both fingertip grasps and other grasp types. Furthermore, using our synthesized grasps, we show that a type-conditional generative model can successfully learn and perform the desired grasp type in both simulation and the real world.
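
The abstract describes a two-stage synthesis pipeline (a lightweight global alignment of the object pose, then simulation-based local refinement of the hand pose) followed by a contact-aware validation step. The sketch below illustrates that overall control flow under strong simplifying assumptions: the translation-only alignment, the penetration callback, the friction-cone check, and all function names and thresholds are illustrative placeholders, not the paper's implementation.

    # Minimal, illustrative sketch of the pipeline stages named in the abstract.
    # All energy terms and helpers here are assumptions for exposition only.
    import numpy as np

    def nearest_object_points(contacts, object_pts):
        """For each template contact point, return the closest object point."""
        d = np.linalg.norm(contacts[:, None, :] - object_pts[None, :, :], axis=-1)
        return object_pts[d.argmin(axis=1)]

    def global_alignment(contacts, object_pts, iters=100, lr=0.1):
        """Stage 1 (assumed form): translate the object so its surface moves
        toward the template's contacts; a full method would also fit rotation."""
        t = np.zeros(3)
        for _ in range(iters):
            nearest = nearest_object_points(contacts, object_pts + t)
            t += lr * (contacts - nearest).mean(axis=0)
        return t

    def local_refinement(hand_q, penetration_fn, step=1e-3, iters=50):
        """Stage 2 (assumed form): nudge hand joint angles q along a
        penetration gradient, standing in for simulation-based refinement.
        `penetration_fn` is a hypothetical callback returning (depth, grad)."""
        q = hand_q.copy()
        for _ in range(iters):
            depth, grad = penetration_fn(q)
            if depth <= 0.0:
                break
            q -= step * grad
        return q

    def contact_aware_validation(contact_normals, forces, friction_mu=0.5):
        """Assumed stand-in for contact-aware control: accept the grasp only
        if each applied force lies inside its contact's friction cone."""
        for n, f in zip(contact_normals, forces):
            n = n / np.linalg.norm(n)
            normal_mag = f @ n
            tangential = np.linalg.norm(f - normal_mag * n)
            if normal_mag <= 0 or tangential > friction_mu * normal_mag:
                return False
        return True

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        object_pts = rng.normal(size=(500, 3)) * 0.05   # toy object point cloud
        contacts = rng.normal(size=(4, 3)) * 0.05 + 0.1  # toy template contacts
        print("estimated object translation:", global_alignment(contacts, object_pts))

In this toy version, grasps that pass the friction-cone check would be added back to the template library, mirroring the abstract's point that validated grasps facilitate future synthesis.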