SAM-RL: Sensing-Aware Model-Based Reinforcement Learning via Differentiable Physics-Based Simulation and Rendering

Jun Lv
Shanghai Jiao Tong University
Yunhai Feng
University of California San Diego
Cheng Zhang
Meta Reality Labs Research
Shuang Zhao
University of California, Irvine
Lin Shao
National University of Singapore
Cewu Lu
Shanghai Jiao Tong University
Paper Website

Paper ID 40

Nominated for Best System Paper

Session 5. Simulation and Sim2Real

Poster Session Wednesday, July 12

Poster 8

Abstract: Model-based reinforcement learning (MBRL) is recognized as having the potential to be significantly more sample-efficient than model-free RL. However, developing an accurate model automatically and efficiently from raw sensory inputs (such as images), especially for complex environments and tasks, remains a challenging problem that hinders the broad application of MBRL in the real world. In this work, we propose a sensing-aware model-based reinforcement learning system called SAM-RL. Leveraging differentiable physics-based simulation and rendering, SAM-RL automatically updates the model by comparing rendered images with real raw images and learns the policy efficiently. With its sensing-aware learning pipeline, SAM-RL allows a robot to select an informative viewpoint for monitoring task progress. We apply our framework to real-world experiments on three manipulation tasks: robotic assembly, tool manipulation, and deformable object manipulation. We demonstrate the effectiveness of SAM-RL via extensive experiments. Videos are available on our project webpage at