Amrani, H., Micucci, D., Mobilio, M., & Napoletano, P. (2025). Leveraging dataset integration and continual learning for human activity recognition. International Journal of Machine Learning and Cybernetics. https://doi.org/10.1007/s13042-025-02569-1
Leveraging dataset integration and continual learning for human activity recognition
Amrani, H.; Micucci, D.; Mobilio, M.; Napoletano, P.
2025
Abstract
Machine learning techniques have proven to be effective in human activity recognition (HAR) from inertial signals. However, they often suffer from intra-class variability and inter-class similarity problems due to strong differences among individuals and in how they perform activities. Recently, data-centric approaches have demonstrated efficacy; however, they require extensive datasets encompassing numerous readings across multiple subjects, incurring significant costs during acquisition campaigns. This study introduces a novel homogenization procedure to address dataset heterogeneity in HAR, enabling the integration of diverse datasets into a unified framework. Using eight publicly available HAR datasets, we evaluated the performance of two neural network architectures: a simplified convolutional neural network (S-CNN) and a long short-term memory (LSTM) network. The proposed method reduces the F1-score gap with baseline models from 24.3% to 7.8% on average, reflecting a relative improvement of 16.5%. Additionally, fine-tuning improves model adaptability, achieving a 2.5% accuracy increase for new users. These findings highlight the feasibility of data-centric strategies for robust HAR systems. In particular, the merging procedure, combined with fine-tuning techniques, confirms that diverse data sources and appropriate adaptation methods can yield performance outcomes closely resembling those of the original datasets. Our methodology has been implemented in the continual learning platform (CLP), which has been made available to the scientific community to facilitate future research and applications.
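The record itself does not describe the homogenization procedure beyond this summary. Purely as an illustration of what merging heterogeneous inertial datasets can involve, the sketch below resamples signals to a common rate, maps dataset-specific activity labels onto a shared vocabulary, and segments fixed-length windows. The target rate (`TARGET_HZ`), window length (`WINDOW_SEC`), and `LABEL_MAP` are hypothetical values chosen for the example, not parameters taken from the paper.

```python
# Illustrative sketch only: every constant below is an assumption made
# for demonstration, not a value from the published method.
import numpy as np

TARGET_HZ = 50.0         # assumed common sampling rate (Hz)
WINDOW_SEC = 2.56        # assumed window length (seconds)
LABEL_MAP = {            # hypothetical mapping of dataset-specific labels
    "walk": "walking", "walking": "walking",
    "sit": "sitting", "sitting": "sitting",
}

def resample(signal: np.ndarray, src_hz: float, dst_hz: float = TARGET_HZ) -> np.ndarray:
    """Linearly resample a (T, channels) inertial signal to dst_hz."""
    t_src = np.arange(signal.shape[0]) / src_hz
    t_dst = np.arange(0.0, t_src[-1], 1.0 / dst_hz)
    return np.stack(
        [np.interp(t_dst, t_src, signal[:, c]) for c in range(signal.shape[1])],
        axis=1,
    )

def window(signal: np.ndarray, hz: float = TARGET_HZ, sec: float = WINDOW_SEC) -> np.ndarray:
    """Split a resampled signal into fixed-length, non-overlapping windows."""
    n = int(hz * sec)
    usable = (signal.shape[0] // n) * n
    return signal[:usable].reshape(-1, n, signal.shape[1])

def homogenize(recordings):
    """Merge (signal, src_hz, label) recordings from different datasets.

    Recordings whose label is not covered by LABEL_MAP are skipped.
    Returns windows X of shape (N, samples, channels) and labels y.
    """
    X, y = [], []
    for signal, src_hz, label in recordings:
        if label not in LABEL_MAP:
            continue
        for w in window(resample(signal, src_hz)):
            X.append(w)
            y.append(LABEL_MAP[label])
    return np.stack(X), np.array(y)
```

A merged set built this way, e.g. `X, y = homogenize([(acc_100hz, 100.0, "walk"), (acc_33hz, 33.0, "sitting")])`, could then feed a single classifier; any per-user fine-tuning on top of it would be a further, separate step.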
File: Amrani-2025-International Journal of Machine Learning and Cybernetics-VoR.pdf
Access: Open access
Description: This article is licensed under a Creative Commons Attribution 4.0 International License. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
Attachment type: Publisher's Version (Version of Record, VoR)
License: Creative Commons
Size: 3.17 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.