
A multimodal wearable system for Human Activity Recognition during daily activities / Benvenuti P.; Rovini E.; Fiorini L.; Cavallo F.. - ELECTRONIC. - (2023), pp. 0-0. (Paper presented at the 8th National Congress of Bioengineering, held in Padova, 21-23 June 2023).

A multimodal wearable system for Human Activity Recognition during daily activities

Benvenuti P.; Rovini E.; Fiorini L.; Cavallo F.
2023

Abstract

This study proposes a wearable system consisting of inertial and electromyography (EMG) sensors for the recognition of activities of daily living in the context of neurodegenerative diseases. The value of Human Activity Recognition (HAR) lies in its ability to assess motor-cognitive abilities objectively and precisely at home and to keep track of the activities performed, for instance when the impact of a pharmacological therapy must be monitored. Current technologies for HAR primarily rely on two types of systems: camera-based systems and wearable sensor systems. In this study, a wearable multimodal system based on the fusion of EMG and inertial sensors was designed and developed. Data were collected from 21 young subjects performing 5 gestures and, after pre-processing, a multimodal dataset was created. Classification was then performed with two machine learning techniques, namely a multiclass Support Vector Machine (SVM) and a Random Forest (RF). The best combination of sensors was analyzed, considering both the classifiers' accuracy and wearability. With the proposed approach, an accuracy of 79% is reached using the EMG sensors and the inertial sensors on the dominant index finger and wrist, whereas using the EMG sensors and only the wrist inertial sensor achieves an accuracy of 78%, showing that an accurate yet non-invasive activity recognition system can be obtained with just a few sensors. This system can therefore be a reliable solution for use both in clinics and in home care.
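The abstract does not report implementation details, so the snippet below is only a minimal, hypothetical sketch of the classification step it describes: comparing a multiclass SVM and a Random Forest on a multimodal feature set using scikit-learn. The feature matrix `X`, label vector `y`, kernel choice, and hyperparameters are placeholders introduced here for illustration and are not taken from the paper.

```python
# Minimal sketch of the SVM vs. Random Forest comparison described in the
# abstract. X and y are random placeholders standing in for a pre-computed
# multimodal feature matrix (EMG + inertial features, one row per gesture
# repetition) and the corresponding labels for the 5 gestures; the printed
# accuracies are therefore chance-level and only illustrate the workflow.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(210, 40))      # placeholder feature matrix
y = rng.integers(0, 5, size=210)    # placeholder labels for 5 gesture classes

# Multiclass SVM with feature scaling (hypothetical kernel and C).
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

# Random Forest classifier (hypothetical number of trees).
rf = RandomForestClassifier(n_estimators=100, random_state=0)

# Compare the two classifiers with 5-fold cross-validated accuracy.
for name, clf in [("SVM", svm), ("RF", rf)]:
    scores = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.2f}")
```

In practice, the same comparison could be repeated for each candidate sensor combination (e.g. wrist only vs. wrist plus index finger) by selecting the corresponding feature columns before cross-validation.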
2023
8th National Congress of Bioengineering
Padova
21-23 June 2023
Benvenuti P.; Rovini E.; Fiorini L.; Cavallo F.
Files in this record:

2023_Benvenuti_GNB_paper_534.pdf
Access: Closed access
Type: Publisher's PDF (version of record)
License: Creative Commons
Size: 378.25 kB
Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1354373