Training Computationally Efficient Smartphone-based Human Activity Recognition Models
L. Oneto
2013-01-01
Abstract
The exploitation of smartphones for Human Activity Recognition (HAR) has been an active research area in which the development of fast and efficient Machine Learning approaches is crucial for preserving battery life and reducing computational requirements. In this work, we present a HAR system which exploits smartphone-embedded inertial sensors and uses Support Vector Machines (SVM) for the classification of Activities of Daily Living (ADL). Using a publicly available benchmark HAR dataset, we show the benefits of adding smartphone gyroscope signals to the recognition system over the traditional accelerometer-only approach, and explore two feature selection mechanisms that allow substantially faster recognition: the use of exclusively time-domain features, and the adoption of an L1-regularized SVM model, which performs comparably to non-linear approaches while discarding a large number of non-informative features.
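The paper's exact feature set and hyperparameters are not reproduced here; the following is a minimal sketch, assuming fixed-length windows of tri-axial accelerometer and gyroscope signals, an illustrative set of time-domain statistics, and scikit-learn's LinearSVC. It illustrates the second mechanism described in the abstract: an L1 penalty drives the weights of non-informative features to exactly zero, giving embedded feature selection within the SVM itself.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

def time_domain_features(window):
    """Simple time-domain statistics per axis for one fixed-length
    window of inertial signals (shape: samples x axes).
    The feature set is illustrative, not the paper's exact one."""
    return np.concatenate([
        window.mean(axis=0),                       # mean per axis
        window.std(axis=0),                        # standard deviation
        np.abs(window).mean(axis=0),               # mean absolute value
        window.max(axis=0) - window.min(axis=0),   # signal range
    ])

# Hypothetical data standing in for the benchmark dataset:
# 100 windows of 128 samples over 6 axes (3 accelerometer + 3 gyroscope),
# with 6 activity labels (e.g. walking, sitting, standing, ...).
rng = np.random.default_rng(0)
X_windows = rng.normal(size=(100, 128, 6))
y = rng.integers(0, 6, size=100)

X = np.array([time_domain_features(w) for w in X_windows])

# L1-penalized linear SVM: the L1 term zeroes out weights of
# non-informative features (requires dual=False in scikit-learn).
clf = make_pipeline(
    StandardScaler(),
    LinearSVC(penalty="l1", dual=False, C=0.1, max_iter=10000),
)
clf.fit(X, y)

# A feature is retained if any of the one-vs-rest classifiers uses it.
svc = clf.named_steps["linearsvc"]
selected = np.any(svc.coef_ != 0, axis=0)
print(f"{selected.sum()} of {X.shape[1]} features retained")

At prediction time only the retained features need to be computed, which is where the battery and runtime savings come from: fewer features to extract per window and a sparser linear model to evaluate on the device.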