Can I Feel You? Recognizing Human's Emotions During Human-Robot Interaction / Fiorini, Laura; Loizzo, Federica G. C.; D'Onofrio, Grazia; Sorrentino, Alessandra; Ciccone, Filomena; Russo, Sergio; Giuliani, Francesco; Sancarlo, Daniele; Cavallo, Filippo. - ELECTRONIC. - 13817 LNAI:(2023), pp. 511-521. (Paper presented at the 14th International Conference on Social Robotics (ICSR 2022)) [10.1007/978-3-031-24667-8_45].

Can I Feel You? Recognizing Human’s Emotions During Human-Robot Interaction

Fiorini, Laura; Sorrentino, Alessandra; Cavallo, Filippo
2023

Abstract

Assistive social robots are becoming increasingly important in our daily life. In this context, humans expect to be able to interact with them using the same mental rules applied to human-human communication, including the use of non-verbal channels. Therefore, over the last years, research efforts have been devoted to developing behavioral models for robots that can perceive the user's state and properly plan a reaction. This paper presents a study in which 30 healthy subjects were asked to interact with the Pepper robot, which elicited three emotions (i.e., positive, negative, and neutral) using a set of 60 images retrieved from a standardized database. The paper aimed to assess the robot's performance in emotion recognition and to analyze the role of the robot's behavior (coherent and incoherent) using three supervised machine learning classifiers, namely Support Vector Machine, Random Forest, and K-Nearest Neighbor. The results show a good recognition rate (accuracy higher than 0.85 with the best classifiers), suggesting that the use of multimodal communication channels improves the recognition of the user's emotional state.
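The paper does not publish its feature set or code, so the following is only a minimal sketch of the kind of pipeline the abstract describes: training the three named supervised classifiers (Support Vector Machine, Random Forest, K-Nearest Neighbor) on labeled feature vectors and comparing their accuracy on a three-class emotion problem. Synthetic features stand in for the actual multimodal signals used in the study.

```python
# Sketch of a three-classifier comparison for a 3-class (positive /
# negative / neutral) emotion-recognition task. Features are synthetic
# stand-ins, NOT the study's actual data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Hypothetical dataset: 300 samples, 10 features, 3 emotion classes.
X, y = make_classification(n_samples=300, n_features=10, n_informative=6,
                           n_classes=3, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# The three classifier families named in the abstract.
classifiers = {
    "SVM": SVC(kernel="rbf"),
    "Random Forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=5),
}

accuracies = {}
for name, clf in classifiers.items():
    clf.fit(X_train, y_train)
    accuracies[name] = accuracy_score(y_test, clf.predict(X_test))
    print(f"{name}: accuracy = {accuracies[name]:.2f}")
```

In the study itself, the feature vectors would come from the subjects' responses during the interaction with Pepper, and accuracy would be evaluated separately for the coherent and incoherent robot-behavior conditions.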
Social Robotics. ICSR 2022. Lecture Notes in Computer Science
14th International Conference on Social Robotics (ICSR 2022)
Files in this item:
File: ICSR2022_Fiorini.pdf
Access: Closed (request a copy)
Type: Publisher's PDF (Version of record)
License: All rights reserved
Size: 595.32 kB
Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this item: https://hdl.handle.net/2158/1386359
Citations
  • PMC: ND
  • Scopus: 3
  • Web of Science: 2