A Lightweight CNN-Based Solution for Inertial Gesture Recognition on Tiny Edge Devices

Napoletano P.;
2025

Abstract

The use of wearable devices with inertial sensors for gesture recognition is becoming increasingly common in extended reality and remote control applications. Fast and accurate gesture recognition is critical for real-time services such as industrial control, and this requirement drives the shift of computation toward edge computing. This work aims to optimize deep learning models for direct integration into real-world edge devices. In this context, several models based on convolutional neural networks with different configurations are developed and tested against standard performance benchmarks (accuracy and Macro F1 score). Moreover, to reduce model size with minimal accuracy loss, the models are quantized using full-integer quantization. Both non-quantized and quantized models are tested on different boards through the STM32 Edge AI Developer Cloud, as an additional benchmark to verify that the proposed models meet speed and accuracy requirements on low-cost, low-power edge nodes as well. The results show that the best lightweight configuration with post-training quantization achieves an inference time of 17.31 ms on the STM32L4R9I-DISCO board while maintaining an accuracy of 95.95% ± 0.31%, competitive with the state of the art and suitable for integration into industrial control frameworks.
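
The abstract refers to full-integer post-training quantization of lightweight CNNs prior to deployment through the STM32 Edge AI Developer Cloud. As a rough illustration of that step only, the sketch below builds a small 1D CNN for IMU windows in Keras and converts it with TensorFlow Lite's full-integer post-training quantization; the architecture, window length, channel count, class count, and file names are illustrative assumptions and not the configuration used in the paper.

```python
# Hypothetical sketch: full-integer post-training quantization of a small 1D-CNN
# gesture classifier with TensorFlow Lite. Shapes and names are assumptions.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 8     # assumed number of gesture classes
WINDOW_LEN = 128    # assumed samples per gesture window
NUM_CHANNELS = 6    # assumed IMU channels (3-axis accelerometer + 3-axis gyroscope)

# A lightweight 1D CNN: two convolutional blocks, global pooling, dense head.
inputs = tf.keras.Input(shape=(WINDOW_LEN, NUM_CHANNELS))
x = tf.keras.layers.Conv1D(16, 5, padding="same", activation="relu")(inputs)
x = tf.keras.layers.MaxPooling1D(2)(x)
x = tf.keras.layers.Conv1D(32, 5, padding="same", activation="relu")(x)
x = tf.keras.layers.GlobalAveragePooling1D()(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# ... model.fit(train_windows, train_labels, ...) would be called here ...

def representative_dataset():
    # Real training windows would calibrate the int8 ranges;
    # random data stands in for them in this sketch.
    for _ in range(100):
        yield [np.random.rand(1, WINDOW_LEN, NUM_CHANNELS).astype(np.float32)]

# Full-integer post-training quantization: weights and activations become int8,
# and conversion fails if any op lacks an integer kernel.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("gesture_cnn_int8.tflite", "wb") as f:
    f.write(tflite_model)
```

A quantized .tflite file of this kind is the sort of artifact a tool such as the STM32 Edge AI Developer Cloud would typically take as input to report per-board memory footprint and inference time.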
Type: paper
Keywords: edge computing; gesture recognition; human-machine interaction; Industry 5.0; Inertial Measurement Unit; wearable sensors
Language: English
Conference: 30th IEEE International Conference on Emerging Technologies and Factory Automation, ETFA 2025, 09-12 September 2025
Year: 2025
Proceedings: 2025 IEEE 30th International Conference on Emerging Technologies and Factory Automation (ETFA)
ISBN: 9798331553838
Citation: Esposito, M., Raggiunto, S., Napoletano, P., Belli, A., Sciarroni, M., Storti, E., et al. (2025). A Lightweight CNN-Based Solution for Inertial Gesture Recognition on Tiny Edge Devices. In 2025 IEEE 30th International Conference on Emerging Technologies and Factory Automation (ETFA). Institute of Electrical and Electronics Engineers Inc. [10.1109/ETFA65518.2025.11205794].
Files for this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/583042
Citations
  • Scopus: 1
  • Web of Science: not available