Fast Traversability Estimation for Wild Visual Navigation


Jonas Frey, ETH Zürich
Matias Mattamala, University of Oxford
Nived Chebrolu, University of Oxford
Cesar Cadena, ETH Zürich
Maurice Fallon, University of Oxford
Marco Hutter, ETH Zürich
Paper ID 54

Session 7. Mobile Manipulation and Locomotion

Poster Session Wednesday, July 12

Poster 22

Abstract: Natural environments such as forests and grasslands are challenging for robotic navigation because tall grass, twigs, or bushes can be falsely perceived as rigid obstacles. In this work, we propose Wild Visual Navigation (WVN), an online self-supervised learning system for traversability estimation that uses only vision. The system continuously adapts from a short human demonstration in the field. It leverages high-dimensional features from self-supervised visual transformer models, together with an online supervision-generation scheme that runs in real time on the robot. We demonstrate the advantages of our approach with experiments and ablation studies in challenging environments in forests, parks, and grasslands. Our system bootstraps traversable-terrain segmentation in less than 5 minutes of in-field training time, enabling the robot to navigate complex outdoor terrain: negotiating obstacles in tall grass and following a 1.4 km footpath. While our experiments were conducted with the quadruped robot ANYmal, the presented approach can generalize to any ground robot. Project page: https://bit.ly/3M6nMHH
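
To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of mapping self-supervised visual transformer features to per-patch traversability scores with a small network trained online. The feature dimension, network size, and the supervision mask are assumptions chosen for illustration; the paper's actual supervision-generation scheme is only stubbed here.

# Minimal sketch: a small MLP regresses traversability from ViT patch features,
# updated online with self-supervised labels from the robot's own experience.
# feat_dim=384 (e.g. a DINO ViT-S backbone) and the network sizes are assumptions.
import torch
import torch.nn as nn

class TraversabilityHead(nn.Module):
    """Maps a per-patch visual feature to a traversability score in [0, 1]."""
    def __init__(self, feat_dim: int = 384, hidden: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (num_patches, feat_dim) -> (num_patches,) traversability scores
        return self.mlp(feats).squeeze(-1)

def online_update(head, optimizer, patch_feats, labels, mask):
    """One online training step.

    labels: self-supervised traversability targets (e.g. patches the robot
    actually walked over marked as traversable); mask selects the patches for
    which supervision is currently available. Both are placeholders for the
    paper's online supervision-generation scheme.
    """
    pred = head(patch_feats)
    loss = nn.functional.binary_cross_entropy(pred[mask], labels[mask])
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In such a setup, the frozen self-supervised backbone supplies rich features, so only the lightweight head needs to be fit in the field, which is what makes bootstrapping within a few minutes of demonstration plausible.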