Robust activity recognition for assistive technologies: Benchmarking machine learning techniques

An increasing need for healthcare provision and assistive technologies (AT) calls for machine learning techniques able to cope with the variability inherent to real-world deployments. In activity recognition applications in particular, sensor networks may be prone to changes at different levels, ranging from sensor data variability to network reconfiguration. Robust methods are required to deal with these changes, providing either graceful degradation upon failure or self-configuration and adaptation capabilities that ensure proper operation over long periods of time. Currently, there is a lack of common tools and datasets that allow a replicable and fair comparison of different recognition approaches. We introduce a large database of human daily activities recorded in a sensor-rich environment. The database provides a large number of instances of the recorded activities, captured by a significant number of sensors. In addition, we review some of the techniques that have been proposed to cope with changes in the system, including missing data and changes in sensor location or orientation, as well as the possibility of exploiting data from newly discovered sensors. These techniques were tested on the aforementioned database, demonstrating its suitability for emulating different sensor network configurations and recognition goals.

Presented at:
Workshop on Machine Learning for Assistive Technologies at the Twenty-fourth Annual Conference on Neural Information Processing Systems (NIPS), Whistler, Canada, December 10, 2010

Note: The status of this file is: Involved Laboratories Only

Record created 2010-11-05, last modified 2018-03-18
