Robotic teleoperation is a fundamental means of augmenting the resilience, precision, and strength of robots with the cognition of a human operator. However, current interfaces, such as joysticks and remote controllers, are often difficult to handle because they demand cognitive effort and learned skills. Wearable interfaces can enable more natural and intuitive interactions with robots, making robotic teleoperation accessible to a larger population of users for demanding tasks such as manipulation or search-and-rescue. The aim of this thesis is to explore solutions that simplify our interactions with robots and thereby open their use to a broader range of the population. To this end, we present a soft upper-body exoskeleton, called the FlyJacket, for the bidirectional control of drones. Drones can greatly benefit us because they extend our perception and range of action. The exoskeleton controls a drone by recording torso movements and, through embedded haptic devices, renders either kinesthetic guidance to improve flight performance or tactile feedback to convey the sensation of flying. We developed and tested the interface to control both a simulated and a real drone. The FlyJacket is a soft exoskeleton with arm support, conceived to adapt to different body morphologies and to support the user during flight to prevent fatigue. We demonstrated that this novel interface allowed more consistent performance than performing the same task with a remote controller, and that users felt more immersed in the flight. Interaction with a robot can be greatly enhanced by multiple channels of sensory feedback that increase the operator’s awareness. Information on the state of the drone can be intuitively rendered with haptic feedback. To create this bidirectional interaction with the drone, two types of haptic feedback - kinesthetic and tactile - were explored.
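The torso-to-drone mapping described above can be pictured as a simple proportional law with saturation. The following sketch is purely illustrative: the function names, gains, and angle limits are assumptions, not the FlyJacket’s actual control law.

```python
import math

# Hypothetical full-scale values; the thesis does not specify the real mapping.
MAX_TILT_RAD = math.radians(30)  # torso tilt treated as full-scale input
MAX_CMD = 1.0                    # normalized drone attitude command

def torso_to_drone_command(torso_pitch_rad, torso_roll_rad):
    """Map measured torso tilt to normalized drone pitch/roll commands.

    Illustrative proportional mapping with saturation - an assumption,
    not the control law identified in the thesis.
    """
    def scale(angle_rad):
        cmd = (angle_rad / MAX_TILT_RAD) * MAX_CMD
        return max(-MAX_CMD, min(MAX_CMD, cmd))  # clamp to full scale
    return scale(torso_pitch_rad), scale(torso_roll_rad)

# Leaning forward by half of the full-scale tilt yields a half-scale command.
pitch_cmd, roll_cmd = torso_to_drone_command(math.radians(15), 0.0)
```

Under this sketch, tilting 15 degrees forward produces a pitch command of 0.5, and tilts beyond the full-scale angle saturate at 1.0.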
Kinesthetic feedback was implemented with a cable-driven system that guides the user’s torso position. Through user studies, we determined that the embedded guidance improved flight performance and that a quadratically shaped force-feedback curve was the most suitable profile to guide the user. We also established the minimal perceivable force difference, characterized the perceived force magnitude of this system, and studied how users learn with it. Tactile feedback was investigated to render the sensation of flying by enhancing flight awareness, realism, and immersion. To this end, we developed and embedded a new type of soft actuator that is compliant and lightweight, so the interface remains wearable and portable. Four devices placed on the torso provide feedback by compressing closed air pouches against the skin, rendering the sensation of air pressure. A mechanical model and a simulation of the pouch device were developed to determine appropriate design parameters. We evaluated whether the device conveyed useful information to the user and whether it enhanced the experience of flying. We demonstrated that users were able to understand the direction of the cues without prompting and could distinguish the cues quickly and with high accuracy. The device was also used in a simulated flight task, and users indicated that it increased flight realism. We believe that the contributions of this thesis provide insights into the design of intuitive interfaces for human-robot interaction and increase their accessibility to a wider range of the population.
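The quadratically shaped guidance profile can be sketched as a force whose magnitude grows with the square of the torso’s angular error and then saturates, pulling the user back toward the target posture. The gains and saturation limit below are hypothetical placeholders, not the values identified in the user studies.

```python
import math

def guidance_force(angle_error_rad, k=2.0, f_max=10.0):
    """Quadratic guidance-force profile (illustrative sketch).

    Force magnitude grows as k * error^2, saturating at f_max.
    The sign opposes the error, so the cables pull the torso back
    toward the target attitude. k and f_max are assumed values.
    """
    magnitude = min(k * angle_error_rad ** 2, f_max)
    # copysign keeps the force directed against the sign of the error.
    return -math.copysign(magnitude, angle_error_rad)
```

For example, a small positive error of 0.1 rad yields a gentle restoring force of about -0.02 (in the sketch’s arbitrary units), while large errors are capped at the saturation force.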