An energy-aware method for the joint recognition of activities and gestures using wearable sensors

This paper presents an energy-aware method for recognizing time-series acceleration data containing both activities and gestures, using a wearable device coupled with a smartphone. A small wrist-worn device collects accelerometer data, and each data segment is recognized using a minimal feature set chosen automatically for that segment. When our model determines that a segment requires high-cost features that the wearable device cannot extract, such as dynamic time warping for gesture recognition, the segment is transmitted to the smartphone, where the high-cost features are extracted and recognition is performed. Otherwise, only the minimum required set of low-cost features is extracted on the wearable device, and only the recognition result, i.e., the label, is transmitted to the smartphone in place of the raw data, reducing transmission costs. Our method constructs this adaptive processing pipeline automatically, solely from training data.
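The following is a minimal sketch of the per-segment decision flow described in the abstract: classify on the wearable with low-cost features when possible, otherwise offload the raw segment to the smartphone. All names (extract_low_cost_features, needs_high_cost_features, classify_low_cost) and the threshold rules are hypothetical placeholders; in the paper, the per-segment feature sets and the offloading decision are constructed automatically from training data.

```python
"""Illustrative sketch (not the paper's implementation) of the adaptive
per-segment pipeline: local low-cost recognition vs. offloading to the phone."""
import numpy as np

def extract_low_cost_features(segment):
    # Cheap statistics the wearable could compute on-device (assumed feature set).
    return np.array([segment.mean(), segment.std(), np.abs(np.diff(segment)).mean()])

def needs_high_cost_features(segment):
    # Placeholder decision rule; the paper learns this from training data.
    # Here: defer highly variable segments, e.g., gestures needing DTW.
    return segment.std() > 1.0

def classify_low_cost(features):
    # Placeholder activity classifier running on the wearable.
    return "walking" if features[1] > 0.5 else "sitting"

def wearable_process(segment):
    """Per-segment logic on the wrist device: classify locally or offload."""
    if needs_high_cost_features(segment):
        # High-cost features (e.g., DTW for gesture recognition) exceed the
        # wearable's capability, so the raw segment goes to the smartphone.
        return {"offload": True, "payload": segment}
    label = classify_low_cost(extract_low_cost_features(segment))
    # Only the recognition result (label) is transmitted, not the raw data.
    return {"offload": False, "payload": label}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    calm = rng.normal(0.0, 0.2, 128)     # low-variance segment -> local label
    gesture = rng.normal(0.0, 2.0, 128)  # high-variance segment -> offloaded
    print(wearable_process(calm)["payload"])
    print(wearable_process(gesture)["offload"])
```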


Published in:
Proceedings of the 2015 ACM International Symposium on Wearable Computers, 101-108
Presented at:
2015 ACM International Symposium on Wearable Computers (ISWC 2015), Osaka, Japan, September 7-11, 2015
Year:
2015
Publisher:
New York, ACM
ISBN:
978-1-4503-3578-2