Recent experiments have indicated that brain electrical activity can be used to directly control the movement of robotic or prosthetic devices. In this talk we report results with a portable, non-invasive brain-computer interface that enables continuous control of a mobile robot in a model house. The interface uses 8 surface electrodes to measure electroencephalogram (EEG) signals, from which a statistical classifier recognizes 3 different mental states. Until now, brain-actuated control of robots has relied on invasive approaches, requiring surgical implantation of electrodes, since EEG-based systems have been considered too slow for controlling rapid and complex sequences of movements. Recently, however, we have shown for the first time that online analysis of a few EEG channels, used in combination with advanced robotics and machine learning techniques, is sufficient for humans to continuously control a mobile robot. Two human subjects learned, within a few days, to drive the robot between rooms in a house-like environment by mental control alone. Furthermore, mental control was only one third slower than manual control on the same task. The key novel idea is that the user's mental states are associated with high-level commands (e.g., "turn right at the next opportunity") that the robot executes autonomously using the readings of its on-board sensors. Another critical feature is that a subject can issue high-level commands at any moment. This is possible because the operation of the brain interface is asynchronous and, unlike synchronous approaches, does not require waiting for external cues.
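As a minimal sketch of the control architecture described above, the loop below maps classified mental states to high-level commands that a robot controller would then execute autonomously. All names (the three states, the command strings, the `dispatch` helper) are illustrative assumptions, not the actual system's API; the real classifier and robot controller are not specified here.

```python
from enum import Enum

class MentalState(Enum):
    """Hypothetical labels for the 3 mental states the classifier recognizes."""
    LEFT = 0
    RIGHT = 1
    RELAX = 2

# Assumed mapping of classifier outputs to high-level robot commands;
# the robot carries each one out on its own using on-board sensors.
COMMANDS = {
    MentalState.LEFT: "turn left at the next opening",
    MentalState.RIGHT: "turn right at the next opening",
    MentalState.RELAX: "move forward along the corridor",
}

def dispatch(state_stream):
    """Translate a stream of classified mental states into high-level
    commands. Because operation is asynchronous, a new command is issued
    whenever the classified state changes, with no external cue needed;
    between changes the robot keeps executing the current command."""
    issued = []
    last = None
    for state in state_stream:
        if state != last:  # issue a command only on a state change
            issued.append(COMMANDS[state])
            last = state
    return issued
```

The design point this sketch illustrates is the division of labor: the user supplies only sparse, high-level intentions, while low-level navigation (obstacle avoidance, wall following) is delegated to the robot, which is what makes a slow EEG channel sufficient for continuous control.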