Abstract

Search and rescue, autonomous construction, and many other semi-autonomous multi-robot applications can benefit from proximal interaction between an operator and a swarm of robots. Most research on proximal interaction is based on explicit communication techniques such as gesture and speech. This study proposes a new implicit proximal communication technique to address the problem of robot selection. We use electroencephalography (EEG) signals to select the robot at which the operator is looking. This is achieved using the steady-state visually evoked potential (SSVEP), a repeatable neural response to a regularly blinking visual stimulus that varies predictably with the blinking frequency. In our experiments, each robot was equipped with LEDs blinking at a different frequency, and the operator’s SSVEP neural response was extracted from the EEG signal to detect and select the robot without requiring any conscious action by the user. This study systematically investigates several parameters affecting the SSVEP neural response: the blinking frequency of the LED, the distance between the robot and the operator, and the color of the LED. Based on these parameters, we study two signal processing approaches and critically analyze their performance on 10 subjects controlling a set of physical robots. Our results show that despite numerous artifacts, it is possible to achieve a recognition rate higher than 85% for some subjects, while the average over the ten subjects was 75%.
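The abstract does not name the two signal processing approaches evaluated in the paper. As an illustration only, the sketch below shows one common way SSVEP-based robot selection can be implemented: compare the spectral power of the occipital EEG channels at each robot's LED blinking frequency (and its harmonics) and select the robot whose frequency dominates. The frequencies, sampling rate, channel layout, and parameter values are assumptions for the example, not details taken from the study.

# Minimal sketch of power-spectrum-based SSVEP target detection.
# Assumptions: occipital EEG channels, illustrative LED frequencies of
# 8, 10, and 12 Hz, and a 256 Hz sampling rate.

import numpy as np
from scipy.signal import welch

def detect_ssvep_target(eeg, fs, stimulus_freqs, n_harmonics=2, band=0.3):
    """Return the index of the stimulus frequency with the highest
    EEG power, summed over its first n_harmonics harmonics.

    eeg            : array of shape (n_channels, n_samples)
    fs             : sampling rate in Hz
    stimulus_freqs : list of LED blinking frequencies, one per robot
    """
    # Average power spectral density over channels (Welch's method).
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2, axis=-1)
    psd = psd.mean(axis=0)

    scores = []
    for f0 in stimulus_freqs:
        score = 0.0
        for h in range(1, n_harmonics + 1):
            # Sum PSD in a narrow band around each harmonic of the stimulus.
            mask = np.abs(freqs - h * f0) <= band
            score += psd[mask].sum()
        scores.append(score)
    return int(np.argmax(scores))

# Example usage with synthetic data: three robots blinking at 8, 10, and 12 Hz,
# while the operator looks at the 10 Hz robot (index 1).
if __name__ == "__main__":
    fs, duration = 256, 4.0
    t = np.arange(0, duration, 1.0 / fs)
    eeg = 0.5 * np.sin(2 * np.pi * 10 * t) + np.random.randn(2, t.size)
    print(detect_ssvep_target(eeg, fs, [8.0, 10.0, 12.0]))  # expected: 1

In practice, canonical correlation analysis (CCA) is another widely used SSVEP detector and tends to be more robust to noise than raw power comparison; whether either of these corresponds to the approaches studied in the paper is not stated in the abstract.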
