Task-Informed Grasping (TIG-II): From Perception to Physical Interaction


Organizers: Amir Ghalamzan Esfahani, S. Hamidreza Kasaei, Gerhard Neumann

Website: https://lcas.lincoln.ac.uk/wp/tig-ii

Robots must physically interact with their environment to perform manipulative tasks and function effectively in our society. Such interactions, e.g. grasping, require complex cognitive processes: perceiving, planning, predicting and acting. In robotics research, each of these sub-problems of grasping is typically considered in isolation. This contrasts with findings from cognitive science research on primates, and as a result robots are still far from matching primates' grasping ability. For instance, current robotic approaches separately detect an object in an image, segment it, synthesize a grasp configuration from geometric features of the object's perceived point cloud, plan the manipulative movements, and execute the actions to deliver the object at the desired pose. This sequential, open-loop pipeline of grasping and manipulative movements is not robust: task constraints influence neither grasping, segmentation nor object detection, even though each of these components directly limits the solution space of the subsequent ones.

In this workshop, we would like to discuss task-informed perception, grasping and manipulation, and the critical role of cognition. Specifically, the robot performs active perception to gather the information needed to synthesize grasps that facilitate the desired task. In such scenarios, the robot actively perceives the state of the environment and evaluates its actions to guarantee success across all components of the manipulation task. This allows the robot to find, for each component of the manipulation pipeline, a solution that provides sufficient initial conditions for successfully performing the subsequent steps in the pipeline; otherwise, for example, a chosen grasp may preclude the required manipulative movements.

This workshop brings together researchers working on robotic grasping and manipulation, planning, robot learning, and cognitive robotics to discuss possible solutions to these problems.
Topics of interest include (but are not limited to):

  • Deep learning for task-informed grasping
  • Affordance-informed grasping
  • Grasping and manipulative movements in cluttered environments
  • Task-driven robotic perception
  • Active perception
  • Joint planning of grasp pose and manipulative movements
  • Benchmarks and datasets for grasping and manipulation
  • Challenges of soft manipulation