Robust Activity Recognition Combining Anomaly Detection and Classifier Retraining

Activity recognition systems based on body-worn motion sensors suffer a drop in performance at deployment and run time because the sensors are likely to change (e.g. through displacement or rotation), as happens in many real-life scenarios (e.g. a mobile phone in a pocket). Existing approaches to robustness either sacrifice information (e.g. by using rotation-invariant features) or down-weight the anomalous sensors at the classifier-fusion stage (adaptive fusion), discarding data that may still be perfectly meaningful even though it differs from the training data. We propose instead to use adaptation to rebuild the classifier models of the sensors that have changed position, in two steps: first, an anomaly detection algorithm automatically identifies which sensors are delivering unexpected data; then a system self-training process is triggered, in which the remaining classifiers retrain the "anomalous" sensors' models. We show the benefit of this approach on a real activity recognition dataset comprising data from 8 sensors for recognizing locomotion. The approach achieves accuracy similar to the upper baseline, obtained by retraining the anomalous classifiers on the new data.
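The two-step idea can be sketched in miniature. The abstract does not specify which anomaly detector or classifiers the paper uses, so the sketch below substitutes hypothetical stand-ins: nearest-centroid classifiers per sensor, feature-mean drift thresholding as the anomaly detector, and majority voting among the healthy sensors to produce the pseudo-labels for retraining. Sensor names, thresholds, and the displacement model (a constant feature offset) are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CLASSES, N_SENSORS, DIM = 2, 3, 2

def make_data(n_per_class, shift=None):
    """Synthetic per-sensor features: class c clusters around (3c, 3c)."""
    X, y = [], []
    for c in range(N_CLASSES):
        X.append(3.0 * c + rng.normal(0.0, 0.5, size=(n_per_class, DIM)))
        y += [c] * n_per_class
    X = np.vstack(X)
    return (X + shift if shift is not None else X), np.array(y)

class NearestCentroid:
    """Tiny stand-in for the per-sensor activity classifiers."""
    def fit(self, X, y):
        self.mu = np.array([X[y == c].mean(axis=0) for c in range(N_CLASSES)])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.mu[None, :, :], axis=2)
        return d.argmin(axis=1)

# Training phase: one classifier per sensor, plus each sensor's feature mean.
train = [make_data(100) for _ in range(N_SENSORS)]
models = [NearestCentroid().fit(X, y) for X, y in train]
train_means = [X.mean(axis=0) for X, _ in train]

# Run time: sensor 2 is displaced (modeled here as a constant feature offset).
runtime = [make_data(100, shift=np.array([4.0, 4.0]) if s == 2 else None)
           for s in range(N_SENSORS)]
y_true = runtime[2][1]          # ground truth, used only for evaluation

# Step 1: anomaly detection -- flag sensors whose feature mean has drifted.
anomalous = [s for s in range(N_SENSORS)
             if np.linalg.norm(runtime[s][0].mean(axis=0) - train_means[s]) > 2.0]
healthy = [s for s in range(N_SENSORS) if s not in anomalous]

acc_before = (models[2].predict(runtime[2][0]) == y_true).mean()

# Step 2: self-training -- the healthy classifiers vote on labels, and the
# anomalous sensor's model is retrained on those pseudo-labels.
votes = np.stack([models[s].predict(runtime[s][0]) for s in healthy])
pseudo = np.array([np.bincount(col, minlength=N_CLASSES).argmax()
                   for col in votes.T])
for s in anomalous:
    models[s].fit(runtime[s][0], pseudo)

acc_after = (models[2].predict(runtime[2][0]) == y_true).mean()
```

With these toy parameters the displaced sensor is the only one flagged, and retraining on the ensemble's pseudo-labels restores its accuracy, illustrating why discarding anomalous-but-meaningful data (as adaptive fusion does) leaves performance on the table.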

Published in:
2013 IEEE International Conference On Body Sensor Networks (BSN)
Presented at:
IEEE International Conference on Body Sensor Networks
New York: IEEE
ISBN: 978-1-4799-0330-6; 978-1-4799-0331-3

Record created 2014-06-02, last modified 2018-03-18

