Robust activity recognition for assistive technologies: Benchmarking machine learning techniques
An increasing need for healthcare provision and assistive technologies (AT) calls for the development of machine learning techniques able to cope with the variability inherent in real-world deployments. In the particular case of activity recognition applications, sensor networks may be prone to changes at different levels, ranging from sensor data variability to network reconfiguration. Robust methods are required to deal with these changes, providing either graceful degradation upon failure or self-configuration and adaptation capabilities that ensure proper operation over long periods of time. Currently there is a lack of common tools and datasets that allow for replicable and fair comparison of different recognition approaches. We introduce a large database of human daily activities recorded in a sensor-rich environment. The database provides a large number of instances of the recorded activities, captured by a significant number of sensors. In addition, we review some of the techniques that have been proposed to cope with changes in the system, including missing data and changes in sensor location/orientation, as well as the possibility of exploiting data from newly discovered sensors. These techniques have been tested on the aforementioned database, showing its suitability for emulating different sensor network configurations and recognition goals.
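As a minimal sketch of one of the robustness scenarios described above (graceful degradation under missing sensor data), the following illustrates the general idea with a nearest-centroid classifier on synthetic data: failed sensor channels are imputed with their training-set mean, so accuracy degrades gradually rather than collapsing. The data, activity labels, and classifier are hypothetical stand-ins, not the techniques or dataset from the paper.

```python
import random

random.seed(0)

N_SENSORS = 10
N_TRAIN, N_TEST = 200, 100

# Two synthetic "activities", each a noisy cluster in sensor space.
# (Hypothetical data; stands in for readings from a sensor-rich environment.)
CENTROIDS = {"walking": [1.0] * N_SENSORS, "sitting": [-1.0] * N_SENSORS}

def sample(label):
    return [c + random.gauss(0, 1.0) for c in CENTROIDS[label]]

def make_set(n):
    return [(sample(lbl), lbl)
            for lbl in (random.choice(list(CENTROIDS)) for _ in range(n))]

train, test = make_set(N_TRAIN), make_set(N_TEST)

# Nearest-centroid classifier: per-activity mean of the training data.
means = {}
for label in CENTROIDS:
    rows = [x for x, y in train if y == label]
    means[label] = [sum(col) / len(rows) for col in zip(*rows)]

# Global per-sensor mean, used to impute channels from failed sensors.
global_mean = [sum(col) / len(train) for col in zip(*(x for x, _ in train))]

def classify(x, missing):
    # Graceful degradation: replace missing channels by the training mean,
    # then assign the label of the closest activity centroid.
    filled = [global_mean[i] if i in missing else v for i, v in enumerate(x)]
    dist = lambda m: sum((a - b) ** 2 for a, b in zip(filled, m))
    return min(means, key=lambda lbl: dist(means[lbl]))

def accuracy(missing):
    return sum(classify(x, missing) == y for x, y in test) / len(test)

for k in (0, 3, 6, 9):
    missing = set(range(k))  # emulate k permanently failed sensors
    print(f"{k} sensors missing: accuracy = {accuracy(missing):.2f}")
```

Mean imputation is only one baseline; the point of the sketch is that a recognizer combined with any such fallback can be evaluated under progressively harsher sensor failures, which is the kind of emulation the benchmark database is intended to support.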
Record created on 2010-11-05, modified on 2016-08-08