Siwei Chen (National University of Singapore), Xiao Ma (National University of Singapore), Yunfan Lu (National University of Singapore), David Hsu (National University of Singapore)

Paper #071
Interactive Poster Session V | Interactive Poster Session VIII |
This paper presents Particle-based Object Manipulation (PROMPT), a new approach to robot manipulation of novel objects ab initio, without prior object models or pre-training on a large object dataset. The key element of PROMPT is a particle-based object representation, in which each particle represents a point in the object; the local geometric, physical, and other features of that point; and its relations with other particles. Like model-based analytic approaches to manipulation, the particle representation enables the robot to reason about the object’s geometry and dynamics in order to choose suitable manipulation actions. Like data-driven approaches, the particle representation is inferred online in real time from visual sensor input, specifically multi-view RGB images. The particle representation thus connects visual perception with robot control. PROMPT combines the benefits of both model-based reasoning and data-driven learning. We show empirically that PROMPT successfully handles a variety of everyday objects, some of them transparent. It performs a range of manipulation tasks, including grasping and pushing. Our experiments also show that PROMPT outperforms a state-of-the-art data-driven grasping method on everyday objects, even though it uses no offline training data.
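The abstract describes each particle as carrying a point's position, its local geometric and physical features, and its relations to other particles. As a minimal sketch of what such a representation could look like (an illustration only, not the paper's implementation; all field and function names here are hypothetical), one might model it as follows:

```python
# Illustrative sketch only: a hypothetical particle structure mirroring the
# abstract's description, not the authors' code. Field names (position,
# mass, friction, neighbors) are assumptions.
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Particle:
    position: Tuple[float, float, float]                 # 3D point in the object
    mass: float = 1.0                                    # local physical feature
    friction: float = 0.5                                # local physical feature
    neighbors: List[int] = field(default_factory=list)   # relations to other particles

@dataclass
class ParticleObject:
    particles: List[Particle]

    def center_of_mass(self) -> Tuple[float, float, float]:
        """Aggregate particle state into an object-level quantity
        that a planner could reason about when choosing actions."""
        total = sum(p.mass for p in self.particles)
        cx = sum(p.mass * p.position[0] for p in self.particles) / total
        cy = sum(p.mass * p.position[1] for p in self.particles) / total
        cz = sum(p.mass * p.position[2] for p in self.particles) / total
        return (cx, cy, cz)

# Usage: two linked particles forming a tiny "object".
obj = ParticleObject([
    Particle(position=(0.0, 0.0, 0.0), neighbors=[1]),
    Particle(position=(0.1, 0.0, 0.0), neighbors=[0]),
])
print(obj.center_of_mass())  # (0.05, 0.0, 0.0)
```

In the paper's setting, such particle states would be inferred online from multi-view RGB images rather than specified by hand, and the aggregated quantities would feed model-based reasoning about grasping or pushing actions.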