On the use of brain decoded signals for online user adaptive gesture recognition systems
Activity and context recognition in pervasive and wearable computing must continuously adapt to the changes typical of open-ended scenarios, such as changing users, sensor characteristics, user expectations, or user motor patterns due to learning or aging. Because system performance is inherently tied to the user's perception of system behavior, the user should guide the adaptation process, and this guidance should be automatic, transparent, and unconscious. We capitalize on advances in electroencephalography (EEG) signal processing that allow the recognition of error-related potentials (ErrP), which are emitted when a human observes unexpected behavior in a system. We propose and evaluate a hand gesture recognition system based on wearable motion sensors that adapts online by taking advantage of ErrP. The gesture recognition system thereby becomes self-aware of its performance and can self-improve through the recurring detection of ErrP signals. Results show that our adaptation technique improves the accuracy of user-independent gesture recognition by 9.58% when ErrP recognition is perfect; when ErrP recognition errors are factored in, recognition accuracy increases by 3.29%. We characterize the boundary conditions on ErrP recognition that guarantee beneficial adaptation. The adaptive algorithms are applicable to other forms of activity recognition and can also use explicit user feedback rather than ErrP.