Data-driven body–machine interface for the accurate control of drones
The teleoperation of nonhumanoid robots is often a demanding task, as most current control interfaces rely on mappings between the operator’s and the robot’s actions, which are determined by the design and characteristics of the interface, and may therefore be challenging to master. Here, we describe a structured methodology to identify common patterns in spontaneous interaction behaviors, to implement embodied user interfaces, and to select the appropriate sensor type and positioning. Using this method, we developed an intuitive, gesture-based control interface for real and simulated drones, which outperformed a standard joystick in terms of learning time and steering abilities. Implementing this procedure to identify body–machine patterns for specific applications could support the development of more intuitive and effective interfaces.
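The abstract describes the pattern-identification step only at a high level. As a rough illustration of what a data-driven body-to-drone mapping can look like, the sketch below applies principal-component analysis to recorded body-motion data and maps the leading components to steering commands. This is a minimal, hypothetical sketch, not the authors' implementation: the array shapes, gains, command ranges, and the choice of PCA itself are assumptions made for illustration.

    # Hypothetical sketch: learn a low-dimensional body-to-drone mapping
    # from recorded motion data via PCA. Shapes and gains are illustrative.
    import numpy as np
    from sklearn.decomposition import PCA

    # Recorded upper-body kinematics: one row per time sample, one column
    # per tracked coordinate (placeholder random data stands in for a
    # real motion-capture recording).
    body_motion = np.random.default_rng(0).normal(size=(5000, 12))

    # Identify the dominant spontaneous movement patterns.
    pca = PCA(n_components=2)
    scores = pca.fit_transform(body_motion)  # shape (5000, 2)

    # Scale the two leading components into the command range [-1, 1],
    # treating them as, e.g., roll and pitch steering commands.
    gain = 1.0 / np.abs(scores).max(axis=0)

    def body_to_command(sample: np.ndarray) -> np.ndarray:
        """Project one body-posture sample onto the learned mapping."""
        return np.clip(pca.transform(sample[None, :])[0] * gain, -1.0, 1.0)

    # Example: convert a single new posture sample to steering commands.
    print(body_to_command(body_motion[0]))

In such a scheme, the principal components play the role of the "common patterns in spontaneous interaction behaviors" mentioned above, and the projection plus gain defines the gesture-to-command mapping.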
Full text: 1718648115.full.pdf (open access, Adobe PDF, 1.79 MB)