
Gozzi, N., Malandri, L., Mercorio, F., Pedrocchi, A. (2022). XAI for myo-controlled prosthesis: Explaining EMG data for hand gesture classification. Knowledge-Based Systems, 240, 108053. [10.1016/j.knosys.2021.108053]

XAI for myo-controlled prosthesis: Explaining EMG data for hand gesture classification

Gozzi, N.;
Malandri, L.;
Mercorio, F.;
Pedrocchi, A.
2022

Abstract

Machine learning has recently found fertile ground in EMG signal decoding for prosthesis control. However, its understanding and acceptance are strongly limited by the perception of AI models as black boxes. In critical fields such as medicine and neuroscience, understanding the neurophysiological phenomena underlying a model's outcomes is as relevant as its classification performance. In this work, we adapt state-of-the-art XAI algorithms to EMG hand gesture classification to understand the outcome of machine learning models with respect to physiological processes, evaluating the contribution of each input feature to the prediction and showing that AI models recognize hand gestures by efficiently mapping and fusing the high-amplitude activity of synergic muscles. This allows us to (i) drastically reduce the number of required electrodes without a significant loss in classification performance, making the system suitable for a larger population of amputees and simplifying the realization of near real-time applications; (ii) perform an efficient selection of features based on their classification relevance, as captured by the XAI algorithms, which improves classification robustness and computational time and outperforms correlation-based methods; and (iii) compare the physiological explanations produced by the XAI algorithms with the experimental setting, highlighting inconsistencies in electrode positioning across rounds or users and thereby improving the overall quality of the process.
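To make the abstract's feature-selection idea concrete, the sketch below illustrates how XAI attributions can rank EMG features and how a classifier can then be retrained on the top-ranked subset. It is a minimal illustration, assuming SHAP as the explainer and a random forest on synthetic EMG-like features; the paper's actual models, feature sets, and data differ.

```python
# Minimal sketch: rank EMG features by XAI attribution, keep the top k, retrain.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for an EMG feature matrix:
# windows x (electrodes * features-per-electrode), with gesture labels.
X = rng.normal(size=(600, 48))    # e.g., 12 electrodes x 4 time-domain features
y = rng.integers(0, 8, size=600)  # e.g., 8 hand gestures

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# SHAP attributions: contribution of each feature to each prediction.
sv = shap.TreeExplainer(clf).shap_values(X_te)
# Older shap versions return one (samples, features) array per class;
# newer ones return a (samples, features, classes) array. Normalize to the latter.
sv = np.stack(sv, axis=-1) if isinstance(sv, list) else np.asarray(sv)

# Global relevance: mean |SHAP| over samples and classes; rank and keep the top k.
relevance = np.abs(sv).mean(axis=(0, 2))
top_k = np.argsort(relevance)[::-1][:16]

# Retrain on the XAI-selected subset and compare with the full feature set.
clf_small = RandomForestClassifier(n_estimators=100, random_state=0)
clf_small.fit(X_tr[:, top_k], y_tr)
print("full feature set :", clf.score(X_te, y_te))
print("XAI top-16 subset:", clf_small.score(X_te[:, top_k], y_te))
```

Grouping features by the electrode they come from before ranking would give the electrode-reduction variant described in point (i) of the abstract.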
Journal article - Scientific article
EMG signal decoding; eXplainable AI; Myo-controlled prosthesis
English
4 Jan 2022
Year: 2022
Volume: 240
Issue: 15 March 2022
Article number: 108053
Files in this record:
There are no files associated with this record.


Use this identifier to cite or link to this record: https://hdl.handle.net/10281/434958
Citations
  • Scopus: 33
  • Web of Science: 26