Combined Vision and Wearable System for Daily Activity Recognition / Loizzo F.G.C.; Fiorini L.; Sorrentino A.; Di Nuovo A.; Rovini E.; Cavallo F. - Electronic. - 884 LNEE (2022), pp. 216-234. (Paper presented at Foritaal 2020: Forum Italiano di Ambient Assisted Living) [DOI: 10.1007/978-3-031-08838-4_16].

Combined Vision and Wearable System for Daily Activity Recognition

Fiorini L.; Sorrentino A.; Rovini E.; Cavallo F.
2022

Abstract

Social assistive robotics aims to improve the quality of life of elderly people and their caregivers. Human Activity Recognition (HAR) is one of the capabilities an assistive robot should be endowed with to allow older people to live independently in their homes. This work deals with the problem of performing HAR by employing two wearable inertial sensors and one RGB-D camera mounted on the social robot Pepper. Specifically, the main purpose is to prove that the Pepper robot is able to correctly recognize daily living activities by exploiting the information coming from the RGB-D camera and one inertial sensor placed on the subject's index finger. Ten users were asked to perform ten activities while wearing an inertial glove, the SensHand, and while being recorded by the camera. Two different perspectives of the robot were studied to understand whether good activity recognition could be obtained both when the robot is in front of the person and when it is at the person's side. The results show that, regardless of the chosen perspective, combining the visual sensor with the inertial sensor on the index finger alone (95%) yields almost the same recognition performance as fusing the same camera with the inertial sensors on both the index finger and the wrist (96%). This supports the conclusion that elderly people could simply wear a small ring on the index finger to allow the robot to recognize their activities, taking advantage of a system that is comfortable and easy to wear.
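As an illustration of the kind of sensor-fusion pipeline the abstract describes, the sketch below shows feature-level fusion: per-window features extracted from the RGB-D skeleton stream and from the index-finger inertial sensor are concatenated and fed to a generic classifier. The feature counts, window segmentation, and random-forest classifier are assumptions for illustration only; the paper's actual feature set and classifier are not reported in this record.

    # Hypothetical sketch of feature-level fusion of RGB-D skeleton features and
    # index-finger inertial features for daily-activity classification.
    # Feature dimensions, window count, and classifier are illustrative assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    n_windows = 200          # segmented time windows of the recordings
    n_skeleton_feats = 60    # e.g. statistics of 3D joint positions from the RGB-D camera
    n_inertial_feats = 24    # e.g. statistics of accelerometer/gyroscope signals from the finger sensor
    n_activities = 10        # ten daily-living activities

    # Placeholder features; in practice these would be extracted per window from both sensors.
    X_skeleton = rng.normal(size=(n_windows, n_skeleton_feats))
    X_inertial = rng.normal(size=(n_windows, n_inertial_feats))
    y = rng.integers(0, n_activities, size=n_windows)

    # Feature-level fusion: concatenate the two feature vectors for each window.
    X_fused = np.concatenate([X_skeleton, X_inertial], axis=1)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X_fused, y, cv=5)
    print(f"cross-validated accuracy: {scores.mean():.2f}")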
2022
Ambient Assisted Living: Italian Forum 2020
Foritaal 2020: Forum Italiano di Ambient Assisted Living
Loizzo F.G.C.; Fiorini L.; Sorrentino A.; Di Nuovo A.; Rovini E.; Cavallo F.
Files associated with this item:
No files are associated with this item.

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1281399
Citations
  • PMC: ND
  • Scopus: 1
  • ISI: ND