Proximity Human-Robot Interaction Using Pointing Gestures and a Wrist-mounted IMU

We present a system for interaction between co-located humans and mobile robots, which uses pointing gestures sensed by a wrist-mounted IMU. The operator begins by pointing, for a short time, at a moving robot. The system thus simultaneously determines: that the operator wants to interact; the robot they want to interact with; and the relative pose between the two. The system can then reconstruct pointed locations in the robot's own reference frame and provide real-time feedback about them, so that the user can adapt to misalignments. We discuss the challenges to be solved to implement such a system and propose practical solutions, including variants for fast flying robots and slow ground robots. We report on experiments with real robots and untrained users, validating the individual components and the system as a whole.
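The abstract mentions reconstructing pointed locations in the robot's reference frame. As a minimal illustrative sketch (not the paper's actual method), one common model treats the pointed location as the intersection of the pointing ray with the ground plane; the function name and the flat-ground assumption here are our own:

```python
import numpy as np

def pointed_location(origin, direction, ground_z=0.0):
    """Intersect a pointing ray with the ground plane z = ground_z.

    Illustrative sketch only: assumes a flat ground plane and a
    pointing ray already expressed in the robot's reference frame.
    origin: 3-vector (e.g. wrist position); direction: 3-vector
    (pointing direction, need not be normalized).
    Returns the 3D intersection point, or None if the ray is
    parallel to the plane or points away from it.
    """
    o = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    if abs(d[2]) < 1e-9:
        return None  # ray is (nearly) parallel to the ground plane
    t = (ground_z - o[2]) / d[2]
    if t <= 0:
        return None  # intersection lies behind the operator
    return o + t * d
```

For example, a wrist at height 1.5 m pointing forward and downward at 45 degrees yields a pointed location 1.5 m ahead on the ground. Real-time feedback about this reconstructed point is what lets the user correct for misalignments.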

Presented at: 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, May 20-24, 2019
Published: Aug 12 2019

 Record created 2019-10-31, last modified 2019-11-06
