Tree-structured Classifier for Acceleration-based Activity and Gesture Recognition on Smartwatches

This paper proposes a new method for recognizing both activities and gestures using acceleration data collected on a smartwatch. Although activity recognition and gesture recognition both rely on acceleration data, the two have been studied independently because the characteristics of activity sensor data and gesture sensor data differ substantially. In this study, we unify the two recognition tasks with a tree-structured classifier that integrates features widely used in activity recognition with dynamic time warping (DTW)-based k-nearest-neighbor classifiers. Our method recognizes both activities and gestures at low computational cost by executing only the minimal set of feature-extraction and classification processes required for a given input sensor-data segment. An experiment on 30 sessions of sensor data shows that our method recognizes both activities and gestures simultaneously with 95.8% accuracy while reducing computation costs by 97.3% compared with a baseline method.
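The core idea can be illustrated with a minimal sketch: a cheap feature stage routes each segment down the classifier tree, and the expensive DTW-based k-nearest-neighbor stage runs only when a segment reaches the gesture branch. The thresholds, labels, and the use of a single 1-D channel below are hypothetical illustrations, not the paper's actual configuration.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D acceleration sequences."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],
                                 cost[i, j - 1],
                                 cost[i - 1, j - 1])
    return cost[n, m]

def extract_features(segment):
    """Cheap time-domain features commonly used in activity recognition."""
    seg = np.asarray(segment, dtype=float)
    return seg.mean(), seg.std()

def classify(segment, gesture_templates, energy_threshold=1.0):
    """Tree-style dispatch: run the cheap feature stage first, and invoke the
    costly DTW 1-NN stage only for segments routed to the gesture branch.
    The threshold and labels here are illustrative placeholders."""
    _, std = extract_features(segment)
    if std < energy_threshold:
        # Activity branch: low-variance segment, cheap features suffice.
        return "stationary"
    # Gesture branch: DTW-based 1-NN over labeled template sequences.
    dists = [(dtw_distance(segment, seq), label)
             for seq, label in gesture_templates]
    return min(dists)[1]
```

Because the DTW stage is quadratic in segment length while the feature stage is linear, skipping DTW for segments resolved in the activity branch is where the bulk of the computational saving would come from.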

Published in:
2016 IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops)
Presented at:
IEEE International Conference on Pervasive Computing and Communication Workshops (PerCom Workshops), Sydney, Australia, March 14-19, 2016
New York: IEEE

