Articulated Object Tracking

Organizers: Tanner Schmidt, Dieter Fox, Jeannette Bohg, Roberto Martín-Martín


In recent years, the robotics and vision communities have provided many techniques for estimating and tracking the pose of articulated objects such as robot manipulators, doors, tools, human hands, and human bodies. There are model-based techniques, learning-based techniques, and hybrid techniques, all showing exciting progress towards solving this challenging problem and each having their own limitations and advantages. There are still many open challenges, however, especially when the scene of interest contains multiple interacting articulated objects. In these scenarios, occlusions, partial observability, and high-dimensional state spaces make it very difficult to maintain a real-time state estimate that is sufficiently accurate for safe planning and manipulation.

The goal of this workshop is to bring the robotics and vision communities together to discuss recent successes in articulated object tracking, its use in robotic manipulation, and the remaining limitations. The ultimate goal is to identify directions for further improvement of articulated object tracking systems that can be exploited in robotics, allowing robots and humans to work together in a shared space safely, robustly, and efficiently.