Demonstrating Learning from Humans on Open-Source Dexterous Robot Hands


Kenneth Shaw, Ananye Agarwal, Shikhar Bahl, Mohan Kumar Srirama, Alexandre Kirchmeyer, Aditya Kannan, Aravind Sivakumar, Deepak Pathak
Paper ID 14

Session 3. Manipulation

Poster Session day 1 (Tuesday, July 16)

Abstract: Emulating human-like dexterity with robotic hands has been a long-standing challenge in robotics. In recent years, machine learning has created demand for robot hands that are reliable, inexpensive, and easy to reproduce, and for the past few years we have been investigating how to meet these demands. We will demonstrate three of our robot hands that address this problem, ranging from a rigid, easy-to-simulate hand to soft but strong dexterous hands, performing three different machine learning tasks. The first task is teleoperation, using a new mobile arm and hand motion-capture system that we will bring to RSS 2024. Second, we will demonstrate how to teach robot hands from human video and human motion. Finally, we will show how to continually improve these policies with reinforcement learning in both simulation and the real world. This demo will be engaging, will help demystify dexterous manipulation, and will inspire researchers to bring robot hands into their own projects. Please see our website at https://leaphand.com/rss2024demo for more interactive information.
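
To make the hardware side of the demo concrete, below is a minimal sketch of how one might stream joint-position commands to a Dynamixel-based open-source hand such as LEAP Hand. This is not the authors' teleoperation code: the serial port, baud rate, motor IDs, and joint centering are illustrative assumptions, and only the standard dynamixel_sdk Python calls and X-series control-table addresses are used.

```python
# Hypothetical sketch: command joint positions on a Dynamixel X-series hand.
# Assumptions (not from the abstract): 16 motors with IDs 0..15 on /dev/ttyUSB0
# at 4 Mbps; standard X-series addresses (Torque Enable = 64, Goal Position = 116).
import numpy as np
from dynamixel_sdk import PortHandler, PacketHandler

PORT, BAUD, PROTOCOL = "/dev/ttyUSB0", 4_000_000, 2.0
ADDR_TORQUE_ENABLE, ADDR_GOAL_POSITION = 64, 116
MOTOR_IDS = list(range(16))  # illustrative: one servo per hand joint

port = PortHandler(PORT)
packet = PacketHandler(PROTOCOL)
assert port.openPort() and port.setBaudRate(BAUD)

# Enable torque on every joint before sending position targets.
for dxl_id in MOTOR_IDS:
    packet.write1ByteTxRx(port, dxl_id, ADDR_TORQUE_ENABLE, 1)

def radians_to_ticks(q):
    """Map joint angles in radians (zero-centered) to 12-bit position ticks."""
    return (np.asarray(q) / (2 * np.pi) * 4096 + 2048).astype(int)

# Send a single flat-hand pose; a teleoperation loop would instead stream
# retargeted human hand poses here at a fixed control rate.
for dxl_id, ticks in zip(MOTOR_IDS, radians_to_ticks(np.zeros(16))):
    packet.write4ByteTxRx(port, dxl_id, ADDR_GOAL_POSITION, int(ticks))

port.closePort()
```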