
Exploring Human attitude during Human-Robot Interaction / Sorrentino A.; Fiorini L.; Fabbricotti I.; Sancarlo D.; Ciccone F.; Cavallo F. - ELECTRONIC. - (2020), pp. 195-200. (Paper presented at the 29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020, held in Italy in 2020) [10.1109/RO-MAN47096.2020.9223527].

Exploring Human attitude during Human-Robot Interaction

Fiorini L.; Cavallo F.
2020

Abstract

The aim of this work is to provide an automatic analysis to assess the user's attitude while interacting with a companion robot. Specifically, our work focuses on defining which combination of social cues the robot should recognize, and how, in order to stimulate the ongoing conversation. The analysis is performed on video recordings of 9 elderly users. From each video, low-level descriptors of the user's behavior are extracted using open-source automatic tools that provide information on the voice, the body posture, and the face landmarks. The assessment of 3 types of attitude (neutral, positive, and negative) is performed through 3 machine learning classification algorithms: k-nearest neighbors, random decision forest, and support vector regression. Since intra- and inter-subject variability could affect the results of the assessment, this work shows the robustness of the classification models in both scenarios. A further analysis is performed on the type of representation used to describe the attitude: a raw and an auto-encoded representation are applied to the descriptors. The results of the attitude assessment show high accuracy (>0.85) for both unimodal and multimodal data. The outcome of this work can be integrated into a robotic platform to automatically assess the quality of the interaction and to modify the robot's behavior accordingly.
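The classification step described in the abstract (per-user feature vectors labeled neutral/positive/negative and fed to three model families) can be sketched with scikit-learn. This is a minimal illustration, not the authors' implementation: the feature matrix below is random placeholder data standing in for the voice/posture/face descriptors, and `SVC` stands in for the support-vector model named in the abstract.

```python
# Hypothetical sketch of a 3-class attitude classifier comparison.
# X is placeholder data, NOT the paper's real multimodal descriptors.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 24))      # 90 samples x 24 stand-in descriptors
y = rng.integers(0, 3, size=90)    # 0 = neutral, 1 = positive, 2 = negative

models = {
    "k-nearest neighbors": KNeighborsClassifier(n_neighbors=5),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "support vector machine": SVC(kernel="rbf"),
}

# 5-fold cross-validated accuracy for each model family
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.2f}")
```

With the paper's real descriptors, per-subject splits (leave-one-subject-out) would be the natural way to probe the inter-subject robustness the abstract mentions.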
2020
29th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN 2020
Italy
2020
Sorrentino A.; Fiorini L.; Fabbricotti I.; Sancarlo D.; Ciccone F.; Cavallo F.
Files in this item:

File: 2020 - Exploring Human attitude during Human-Robot Interaction.pdf
Access: open access
Type: Refereed final version (Postprint, Accepted manuscript)
License: Open Access
Size: 458.01 kB
Format: Adobe PDF

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1255027
Citations
  • Scopus: 2
  • Web of Science: 1