Emotion recognition in the times of COVID19: Coping with face masks / Magherini R.; Mussi E.; Servi M.; Volpe Y. - In: INTELLIGENT SYSTEMS WITH APPLICATIONS. - ISSN 2667-3053. - Electronic. - 15:(2022), pp. 0-0. [10.1016/j.iswa.2022.200094]
Emotion recognition in the times of COVID19: Coping with face masks
Magherini R.; Mussi E.; Servi M.; Volpe Y.
2022
Abstract
Emotion recognition through machine learning techniques is a widely investigated research field; however, the recent obligation to wear a face mask, following the COVID-19 health emergency, precludes the application of the systems developed so far. Humans naturally communicate their emotions through the mouth; therefore, the intelligent systems developed to date for identifying the emotions of a subject primarily rely on this area, in addition to other anatomical features (eyes, forehead, etc.). However, if the subject is wearing a face mask, this region is no longer visible. For this reason, the goal of this work is to develop a tool able to compensate for this shortfall. The proposed tool uses the AffectNet dataset, which comprises eight classes of emotions. The iterative training strategy relies on well-known convolutional neural network architectures to identify five sub-classes of emotions: following a pre-processing phase, the architecture is first trained on the eight-class dataset, which is then recategorized into five classes, yielding 96.92% accuracy on the testing set. This strategy is compared with the most frequently used learning strategies and finally integrated into a real-time application that detects faces within a frame, determines whether the subjects are wearing a face mask, and recognizes the current emotion of each.
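The two-stage training strategy summarized in the abstract (train a standard CNN on the eight AffectNet classes, then recategorize the labels into five classes and continue training) can be outlined roughly as in the sketch below. This is a minimal illustration only, not the authors' implementation: the use of PyTorch/torchvision, the ResNet-18 backbone, and the particular 8-to-5 class grouping are all assumptions, since the abstract does not specify them.

```python
# Minimal sketch (not the paper's code) of the training strategy described
# in the abstract: (1) train a well-known CNN on the eight AffectNet classes,
# then (2) recategorize the annotations into five classes and keep training.
import torch
import torch.nn as nn
from torchvision import models

# AffectNet's eight annotated emotion classes.
AFFECTNET_8 = ["neutral", "happy", "sad", "surprise",
               "fear", "disgust", "anger", "contempt"]

# Hypothetical 8 -> 5 recategorization, used only for illustration;
# the paper's actual grouping is not given in the abstract.
REMAP_8_TO_5 = {0: 0, 1: 1, 2: 2, 3: 3, 4: 3, 5: 4, 6: 4, 7: 4}

def remap_labels(labels_8: torch.Tensor) -> torch.Tensor:
    """Map eight-class AffectNet labels onto five macro-classes."""
    lut = torch.tensor([REMAP_8_TO_5[i] for i in range(8)])
    return lut[labels_8]

# Step 1: a well-known CNN architecture trained on the eight-class problem.
model = models.resnet18(weights=None, num_classes=8)
# ... train on the pre-processed eight-class AffectNet split here ...

# Step 2: reuse the learned features, replace the classification head with a
# five-way output, and continue training on the remapped labels.
model.fc = nn.Linear(model.fc.in_features, 5)
# ... fine-tune with remap_labels() applied to the original annotations ...
```

In the real-time application described in the abstract, such a classifier would sit at the end of a per-frame pipeline that first detects faces and checks for the presence of a face mask before inferring the emotion of each detected subject.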