Enhancing Activity Recognition of Self-Localized Robot Through Depth Camera and Wearable Sensors / Manzi, Alessandro; Moschetti, Alessandra; Limosani, Raffaele; Fiorini, Laura; Cavallo, Filippo. - In: IEEE SENSORS JOURNAL. - ISSN 1530-437X. - Print. - vol. 18 (2018), pp. 9324-9331. [DOI: 10.1109/JSEN.2018.2869807]

Enhancing Activity Recognition of Self-Localized Robot Through Depth Camera and Wearable Sensors

Fiorini, Laura; Cavallo, Filippo
2018

Abstract

Robots will become part of our everyday life as helpers and companions, sharing the environment with us. Thus, robots should become social and able to interact naturally with users. Recognizing human activities and behaviors enhances a robot's ability to plan an appropriate action and to tailor its approach to what the user is doing. This paper therefore addresses the problem of providing mobile robots with the ability to recognize common daily activities. The fusion of heterogeneous data gathered by multiple sensing strategies, namely wearable inertial sensors, a depth camera, and location features, is proposed to improve the recognition of human activity. In particular, the proposed work aims to recognize 10 activities using data from a depth camera mounted on a mobile robot able to self-localize in the environment and from customized sensors worn on the hand. Twenty users were asked to perform the selected activities in two different positions relative to the robot while the robot was moving. The analysis considered different combinations of sensors to evaluate how fusing the different technologies improves recognition. The results show an improvement of 13% in the F-measure when multiple sensors are considered with respect to using the robot's sensors alone. In particular, the system is able to recognize not only the performed activity but also the relative position, enhancing the robot's capability to interact with users.
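The abstract describes fusing wearable inertial, depth-camera, and self-localization features and evaluating the result with the F-measure. The sketch below is a minimal illustration of that idea only: the feature sizes, the synthetic data, and the Random Forest classifier are assumptions made for demonstration, not the authors' actual pipeline, and the F-measure is computed here as the macro-averaged F1 (harmonic mean of precision and recall, averaged over the 10 activity classes).

```python
# Hedged illustration of feature-level sensor fusion for activity recognition.
# All feature dimensions, the synthetic data, and the classifier choice are
# assumptions for demonstration; they do not reproduce the paper's pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_classes = 400, 10                  # 10 activities, as in the abstract

# Synthetic stand-ins for the three sensing modalities (sizes are illustrative)
imu_feats = rng.normal(size=(n_samples, 24))    # wearable inertial (hand) features
skel_feats = rng.normal(size=(n_samples, 45))   # depth-camera skeleton features
loc_feats = rng.normal(size=(n_samples, 3))     # robot self-localization features
labels = rng.integers(0, n_classes, size=n_samples)

# Feature-level fusion: concatenate the per-sample vectors from all modalities
fused = np.hstack([imu_feats, skel_feats, loc_feats])

X_tr, X_te, y_tr, y_te = train_test_split(fused, labels,
                                          test_size=0.3, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)

# Macro-averaged F-measure, the metric the abstract reports improvements on
print("F-measure (macro):", f1_score(y_te, clf.predict(X_te), average="macro"))
```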
Year: 2018
Volume: 18
Pages: 9324-9331
Authors: Manzi, Alessandro; Moschetti, Alessandra; Limosani, Raffaele; Fiorini, Laura; Cavallo, Filippo
Files in this record:
File: IP041 - Enhancing activity recognition of self-localized robot through depth camera and wearable sensors.pdf
Access: Closed access
Size: 612.24 kB
Format: Adobe PDF

Documents in FLORE are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/2158/1210802
Citations
  • PMC: not available
  • Scopus: 18
  • Web of Science: 17