Interface-level Intent Inference for Environment-agnostic Robot Teleoperation Assistance


Larisa Y.C. Loke, Brenna Argall

Paper ID 81

Session 9. HRI

Poster Session (Day 3): Monday, June 23, 12:30-2:00 PM

Abstract: In robot teleoperation, humans often issue control signals through an interface that requires physical actuation. This interface-level interaction largely goes unmodeled within the field, yet the robot’s interpretation of an interface-level command can differ from what the user intended, as a result of diminished human ability or inadequate mappings from raw interface signals to robot control signals. Interface-aware systems aim to address this limitation in robot teleoperation by explicitly considering the impact of a control interface on user input quality when interpreting interface signals for robot control. This work presents an interface-aware formulation that uses data-driven modeling to directly infer intended interface-level commands from the known interaction characteristics of a control interface, enabling teleoperation assistance without knowledge of the human’s policy. In our specific implementation, we tailor the formulation to model a user’s operation of a sip/puff interface using a network of Gated Recurrent Units, chosen for their ability to model temporal patterns and their suitability for data-scarce domains. The resulting model is agnostic to the robot being controlled, which allows for its use in task- and environment-agnostic robot teleoperation assistance. We deploy this model in two variations of an assisted teleoperation framework in which a sip/puff interface controls a 7-DoF robotic arm, and conduct a human subjects study with participants with spinal cord injuries to evaluate the efficacy of our method. Our proposed task- and environment-agnostic formulation is effective in reducing collisions during teleoperation, and is preferred by users over teleoperation baselines for ease and intuitiveness of robot operation.
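
The abstract gives no implementation details, but to illustrate the kind of model it describes, here is a minimal PyTorch sketch assuming a small GRU that classifies a fixed-length window of raw sip/puff pressure samples into an intended interface-level command. The class name, dimensions, and four-command vocabulary (hard/soft sip, hard/soft puff) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class IntentGRU(nn.Module):
    """Hypothetical sketch: map a window of raw interface signals to a
    distribution over intended interface-level commands."""

    def __init__(self, input_dim=1, hidden_dim=32, n_commands=4):
        super().__init__()
        # GRU over the raw pressure time series from the sip/puff interface
        self.gru = nn.GRU(input_dim, hidden_dim, batch_first=True)
        # Linear head producing logits over the command vocabulary,
        # e.g. {hard sip, soft sip, soft puff, hard puff} (assumed here)
        self.head = nn.Linear(hidden_dim, n_commands)

    def forward(self, x):
        # x: (batch, time, input_dim) raw pressure samples
        _, h = self.gru(x)        # h: (1, batch, hidden_dim), final hidden state
        return self.head(h[-1])   # logits over intended commands

model = IntentGRU()
window = torch.randn(8, 50, 1)          # batch of 8 signal windows, 50 samples each
probs = model(window).softmax(dim=-1)   # inferred intent distribution per window
```

Because the model operates purely on interface signals, its output distribution could be consumed by any downstream robot controller, which is consistent with the robot-, task-, and environment-agnostic claim in the abstract.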