
Unsupervised emotional state classification through physiological parameters for social robotics applications / Fiorini L.; Mancioppi G.; Semeraro F.; Fujita H.; Cavallo F.. - In: KNOWLEDGE-BASED SYSTEMS. - ISSN 0950-7051. - ELECTRONIC. - 190:(2020), pp. 105217-105227. [10.1016/j.knosys.2019.105217]

Unsupervised emotional state classification through physiological parameters for social robotics applications

Fiorini L.; Cavallo F.
2020

Abstract

Future social robots should personalize their behavior according to the user's emotional state, so as to fit better into users' everyday activities and to improve human–robot interaction. Several works in the literature use cameras to detect emotions. However, these approaches may not be effective in everyday life, due to camera occlusions and to the varied types of stimulation, which may also arise from interaction with other human beings. Therefore, this work investigates the electrocardiogram, the electrodermal activity, and the electrical brain activity as the main informative physiological channels. These signals were acquired through a wireless wearable sensor network. An experimental methodology was proposed to induce three different emotional states by means of social interaction. Two combinations of sensors were analyzed over three time-window lengths (180 s, 150 s, and 120 s) and classified with three unsupervised machine learning approaches (K-Means, K-Medoids, and Self-Organizing Maps). Finally, their classification performance was compared with that of commonly used supervised techniques (Support Vector Machine, Decision Tree, and k-Nearest Neighbor) to identify the optimal combination of sensors, time-window length, and unsupervised classifier. Fifteen healthy young participants were recruited, and more than 100 instances were analyzed. The proposed approaches achieved an accuracy of 77% in the best unsupervised case and 85% in the best supervised one.
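The pipeline the abstract describes — per-window physiological features clustered without labels, then scored against a supervised baseline — can be sketched roughly as follows. The synthetic feature values, window counts, and scikit-learn configuration here are illustrative assumptions, not the paper's actual setup:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Stand-in data: 120 windowed feature vectors (e.g. mean heart rate,
# skin-conductance level, an EEG band-power index), 40 per induced state.
centers = np.array([[60.0, 2.0, 0.3],
                    [75.0, 5.0, 0.6],
                    [95.0, 9.0, 0.9]])
X = np.vstack([c + rng.normal(scale=[2.0, 0.4, 0.05], size=(40, 3))
               for c in centers])
y = np.repeat([0, 1, 2], 40)

# Unsupervised route: cluster the windows into 3 groups, then map each
# cluster to its majority ground-truth label only to score the result.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
mapping = {c: np.bincount(y[km.labels_ == c]).argmax() for c in range(3)}
y_unsup = np.array([mapping[c] for c in km.labels_])
acc_unsup = accuracy_score(y, y_unsup)

# Supervised baseline on the same features, held-out split.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
acc_sup = accuracy_score(
    y_te, SVC(kernel="rbf").fit(X_tr, y_tr).predict(X_te))

print(f"unsupervised accuracy: {acc_unsup:.2f}")
print(f"supervised accuracy:   {acc_sup:.2f}")
```

Swapping `KMeans` for K-Medoids or a Self-Organizing Map, and `SVC` for a Decision Tree or k-Nearest Neighbor, changes only the estimator lines; the cluster-to-label majority mapping is the standard way to evaluate an unsupervised classifier against known stimuli.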
2020
190
105217
105227
Goal 3: Good health and well-being for people
Fiorini L.; Mancioppi G.; Semeraro F.; Fujita H.; Cavallo F.
Files in this record:
2020 - Unsupervised emotional state classification through physiological parameters for social robotics applications.pdf — closed access; publisher's PDF (Version of record); all rights reserved; Adobe PDF, 1.17 MB.

Documents in FLORE are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this resource: https://hdl.handle.net/2158/1213605
Citations
  • Scopus: 57
  • Web of Science: 52