Most human-robot interfaces, such as joysticks and keyboards, require training and constant cognitive effort, and provide only limited awareness of the robot’s state and environment. Embodied interaction, that is, a bidirectional link between the physical bodies and control systems of the robot and of the human, could not only enable more intuitive robot control, even for novices, but also give users more immersive sensations. However, providing embodied interaction by mapping human movements onto a non-anthropomorphic robot is particularly challenging. In this paper, we describe a natural and immersive embodied interaction that allows users to control and experience drone flight with their own bodies. The setup uses a commercial flight simulator that tracks hand movements and provides haptic and visual feedback. The paper discusses how to map body movements to drone motion, and how the resulting embodied interaction provides unskilled users with a more natural and immersive flight experience than a conventional RC remote controller.