Embodied Flight with a Drone
Most human-robot interfaces, such as joysticks and keyboards, require training and constant cognitive effort, and provide only a limited awareness of the robot's state and its environment. Embodied interactions, that is, bidirectional links between the physical bodies and control systems of the robot and of the human, could not only enable more intuitive control of robots, even for novices, but also provide users with more immersive sensations. However, providing an embodied interaction by mapping human movements onto a non-anthropomorphic robot is particularly challenging. In this paper, we describe a natural and immersive embodied interaction that allows users to control and experience drone flight with their own bodies. The setup uses a commercial flight simulator that tracks hand movements and provides haptic and visual feedback. The paper discusses how to map body movement to drone motion, and how the resulting embodied interaction provides unskilled users with a more natural and immersive flight experience than a conventional RC remote controller.
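As a rough illustration of the kind of body-to-drone mapping the abstract mentions, one common pattern is to turn tracked hand or torso tilt angles into clamped attitude setpoints, with a deadzone so small involuntary motions do not move the drone. This is a minimal sketch under assumed parameters (the function name, gain, deadzone, and angle limit are all hypothetical), not the mapping actually used in the paper:

```python
import math

def map_hand_to_attitude(hand_roll, hand_pitch,
                         gain=1.0, deadzone=0.05, limit=0.44):
    """Map tracked hand tilt (radians) to drone roll/pitch setpoints.

    - Angles inside the deadzone are ignored (drone hovers level).
    - Remaining angle is scaled by `gain` and clamped to `limit`
      (here ~25 degrees), so extreme gestures stay flyable.
    All parameter values are illustrative assumptions.
    """
    def shape(angle):
        if abs(angle) < deadzone:
            return 0.0
        # Remove the deadzone offset so the command ramps up smoothly.
        cmd = gain * (angle - math.copysign(deadzone, angle))
        return max(-limit, min(limit, cmd))

    return shape(hand_roll), shape(hand_pitch)
```

For example, a slight hand tremor of 0.02 rad produces no command, while a strong 1.0 rad tilt saturates at the 0.44 rad limit rather than commanding an aggressive maneuver.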
Embodied flight.pdf — Open access, CC BY, 527.79 KB, Adobe PDF