This paper presents a method for building a vision system that predicts the physical properties of terrain from images, enabling efficient robotic locomotion. The system uses Active Sensing Motor Policies, which are trained to explore locomotion behaviors that improve the accuracy of physical parameter estimation; for instance, the robot learns to swipe its foot against the ground to estimate the friction coefficient accurately. The vision model is trained on a small amount of real-world traversal data, and it generalizes to overhead images captured by a drone, even though its training data came from cameras mounted on a robot walking on the ground.
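
The supervised step implied by this summary can be sketched as follows: a small vision model regresses a physical parameter (here, a friction coefficient) from terrain image patches, with labels that would, in the real pipeline, come from the robot's estimates during active-sensing traversals. This is a minimal illustrative sketch, not the paper's implementation; the model architecture, patch size, and synthetic data are assumptions.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class FrictionRegressor(nn.Module):
    """Small CNN mapping a terrain image patch to a scalar friction estimate."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, x):
        return self.head(self.features(x).flatten(1)).squeeze(-1)

# Placeholder data: image patches paired with friction labels. In the actual
# system, labels would be produced by the robot's parameter estimator while it
# executes active-sensing behaviors (e.g., swiping its foot on the ground).
patches = torch.rand(256, 3, 64, 64)      # hypothetical terrain crops
friction_labels = torch.rand(256)         # hypothetical friction estimates
loader = DataLoader(TensorDataset(patches, friction_labels),
                    batch_size=32, shuffle=True)

model = FrictionRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Standard supervised regression loop.
for epoch in range(5):
    for imgs, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(imgs), labels)
        loss.backward()
        optimizer.step()
```

Because the labels are physical quantities rather than semantic classes, the same regressor can in principle be applied to any viewpoint that shows the terrain, which is consistent with the drone-image generalization described above.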

Publication date: 2 Nov 2023
Project Page: https://gmargo11.github.io/active-sensing-loco
Paper: https://arxiv.org/pdf/2311.01405