Cable Manipulation with a Tactile-Reactive Gripper

Yu She, Siyuan Dong, Shaoxiong Wang, Neha Sunil, Alberto Rodriguez, Edward Adelson


Cables are complex, high-dimensional, and dynamic objects. Standard approaches to manipulating them often rely on conservative strategies that involve long series of very slow and incremental deformations, or on mechanical fixtures such as clamps, pins, or rings. We are interested in manipulating freely moving cables, in real time, with a pair of robotic grippers, and with no added mechanical constraints. The main contribution of this paper is a perception and control framework that moves in that direction, and uses real-time tactile feedback to accomplish the task of following a dangling cable. The approach relies on a vision-based tactile sensor, GelSight, that estimates the pose of the cable in the grip, and the friction forces during cable sliding. We achieve the behavior by combining two tactile-based controllers: 1) a cable grip controller, where a PD controller combined with a leaky integrator regulates the gripping force to maintain the frictional sliding forces close to a suitable value; and 2) a cable pose controller, where an LQR controller based on a learned linear model of the cable sliding dynamics keeps the cable centered and aligned on the fingertips, preventing it from falling out of the grip. This behavior is enabled by a reactive gripper fitted with GelSight-based high-resolution tactile sensors. The robot can follow one meter of cable in random configurations within 2-3 hand regrasps, adapting to cables of different materials and thicknesses. We demonstrate a robot grasping a headphone cable, sliding the fingers to the jack connector, and inserting it. To the best of our knowledge, this is the first implementation of real-time cable following without the aid of mechanical fixtures.
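The cable grip controller described above (a PD term plus a leaky integrator regulating grip force so that the frictional sliding force stays near a target value) can be sketched as follows. All gains, the target force, and the time step are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of the abstract's cable grip controller: PD + leaky
# integrator on the friction-force error. Gains and targets are
# placeholders, not the paper's tuned values.

class CableGripController:
    def __init__(self, f_target=1.0, kp=0.5, kd=0.05, ki=0.1, leak=0.95):
        self.f_target = f_target   # desired friction force (arbitrary units)
        self.kp, self.kd, self.ki = kp, kd, ki
        self.leak = leak           # per-step decay of the leaky integrator
        self.prev_error = 0.0
        self.integral = 0.0

    def update(self, f_measured, dt=0.02):
        """Return a grip command: positive tightens, negative loosens."""
        error = self.f_target - f_measured
        derivative = (error - self.prev_error) / dt
        # Leaky integrator: the old accumulated term decays each step,
        # which limits wind-up while sliding forces change.
        self.integral = self.leak * self.integral + self.ki * error * dt
        self.prev_error = error
        return self.kp * error + self.kd * derivative + self.integral
```

When the measured friction force is below the target the command is positive (tighten the grip); above the target it is negative (loosen), which keeps the cable sliding under a roughly constant frictional force.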

Live Paper Discussion Information

Start Time: 07/14 15:00 UTC
End Time: 07/14 17:00 UTC

Virtual Conference Presentation

Paper Reviews

Review 2

The paper is original in the sense that it offers a solution for an unsolved application. The task could be relevant because it is related to other deformable-object manipulation tasks, for instance, edge tracing in cloth manipulation. However, the approach is not generic enough and is not sufficiently motivated. The literature review also seemed very short, considering the large body of rope-manipulation work that exists. The paper is clear and well explained, and the video is also very clarifying. However, the overall work doesn't have a lot of significance for the community, unless you need to solve the exact same application and are willing to buy that same sensor. I think the controllers could be applied to other sensors as long as you can estimate the pose of the cable, but the authors have not made any effort to make the contribution more general. The authors should also clearly state the limitations of their approach: in the final paragraph they suggest that better learning of the dynamic model could increase accuracy, yet that did not seem necessary for the results shown. Also, the presented controllers perform only slightly better than the baseline solutions, which are based on very simple approaches. In conclusion, I think this is a nice work that solves the task very well, but the authors have to better convince me of the novelty of their solution and of why it could be significant for the deformable object manipulation community.

Review 3

The most impressive aspect of this paper is how little precedent there is for such a task. Cable manipulation in general is rarely attempted; to the best of my knowledge, a task of the complexity of the one shown here has not been previously demonstrated. The experimental performance is also remarkable. While none of the individual building blocks introduced here is particularly complex or novel, their combination is, and there is significant novelty in the complete system being able to accomplish such a task. Further showing how different this work is from previous literature, there is really no baseline in the literature for the authors to compare their results against. They thus do a very thorough comparison against ablated versions of their own system, as well as a naive open-loop execution. While open-loop execution fails utterly, some of these ablations do hold their own very well against the full system, and have the benefit of being much simpler. Still, the complete system, including the cable pose controller, performs best, especially in terms of requiring the fewest re-grasps and going furthest on one grasp (within error bounds, but still). The authors should be commended for these complete results. A few clarifications would be helpful. The authors state in the introduction that the gripper has force control capabilities, but that does not seem to be the case. If I understand correctly, there is a spring in series with the motor, which allows conversion between position displacements and output force, but the actual value of the output force is neither measured nor regulated. (The grip controller operates directly on the desired size of the tactile imprint.) It can be hard to figure out what exactly the output of the cable pose controller is, and also how these two controllers operate together (if I understand correctly, they are orthogonal, regulating completely different outputs).
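The reviewer's question about the pose controller's output can be made concrete with a generic LQR sketch: given a linear model x_{t+1} = A x_t + B u_t of the cable sliding dynamics, the LQR gain maps the estimated cable pose error to a single corrective command each control step. The model, costs, and dimensions below are illustrative assumptions, not the paper's identified dynamics:

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Infinite-horizon discrete LQR gain via Riccati iteration."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

# Illustrative 2-state model (e.g. cable offset and angle on the
# fingertip); these matrices are placeholders, NOT the learned model.
A = np.array([[1.0, 0.1], [0.0, 0.95]])
B = np.array([[0.0], [0.1]])
K = dlqr(A, B, Q=np.eye(2), R=np.array([[0.1]]))

# The controller's output is simply u = -K x: one corrective command
# computed from the estimated cable pose at each step.
x = np.array([0.5, -0.2])
u = -K @ x
```

Under this reading, the grip controller regulates gripping force while the pose controller regulates the cable's position/orientation on the fingertip, so the two act on orthogonal outputs, consistent with the reviewer's interpretation.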
Finally, it would be great to see how well the linear dynamic model fit to cable pose data works. A better characterization of this model could include training error, generalization error, and an analysis of how well it works over a wide range of situations (cable poses, distance from the fixed point), etc. Overall, however, the paper is clearly written and easy to follow. In conclusion, the paper introduces a complete system-and-method approach to a task not previously attempted, and obtains remarkable experimental results. It is also a valuable piece of work towards the introduction of manipulation algorithms that use tactile sensing, as opposed to tactile sensors in search of applications. It would be useful and interesting to the community and conference audience.
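The model characterization the reviewer asks for (training and generalization error of a fitted linear model) could be computed along these lines: estimate x_{t+1} = A x_t + B u_t by least squares on logged (state, action, next-state) triples and report held-out prediction error. The dimensions and data below are synthetic placeholders, not the paper's logs:

```python
import numpy as np

def fit_linear_dynamics(X, U, X_next):
    """Least-squares fit of [A B] from stacked regressors [x_t, u_t]."""
    Z = np.hstack([X, U])                       # (T, n + m)
    W, *_ = np.linalg.lstsq(Z, X_next, rcond=None)
    n = X.shape[1]
    A, B = W[:n].T, W[n:].T                     # A is (n, n), B is (n, m)
    return A, B

def prediction_rmse(A, B, X, U, X_next):
    """One-step prediction error of the fitted model."""
    pred = X @ A.T + U @ B.T
    return float(np.sqrt(np.mean((pred - X_next) ** 2)))

# Synthetic ground-truth system for illustration only.
rng = np.random.default_rng(0)
A_true = np.array([[1.0, 0.1], [0.0, 0.9]])
B_true = np.array([[0.0], [0.2]])
X = rng.normal(size=(200, 2))
U = rng.normal(size=(200, 1))
X_next = X @ A_true.T + U @ B_true.T + 0.01 * rng.normal(size=(200, 2))

# Train on the first 150 samples, hold out the rest for generalization.
A_hat, B_hat = fit_linear_dynamics(X[:150], U[:150], X_next[:150])
train_err = prediction_rmse(A_hat, B_hat, X[:150], U[:150], X_next[:150])
test_err = prediction_rmse(A_hat, B_hat, X[150:], U[150:], X_next[150:])
```

Reporting both errors, and repeating the evaluation across the conditions the reviewer lists (cable poses, distance from the fixed point), would directly address the requested characterization.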