Affective facial expressions recognition for human-robot interaction


Affective facial expression is a key feature of nonverbal behaviour and is considered a symptom of an internal emotional state. Emotion recognition plays an important role in social communication, both human-to-human and human-to-robot. Taking this as inspiration, this work aims at developing a framework able to recognise human emotions through facial expressions for human-robot interaction. Features based on distances and angles between facial landmarks are extracted to feed a dynamic probabilistic classification framework. The public online dataset Karolinska Directed Emotional Faces (KDEF) [1] is used to learn seven different emotions (angry, fearful, disgusted, happy, sad, surprised, and neutral) performed by seventy subjects. A new dataset was also created to record stimulated affect: participants watched video sessions designed to awaken their emotions, unlike the KDEF dataset, in which participants are actors (i.e. performing expressions when asked to). Offline and on-the-fly tests were carried out: leave-one-out cross-validation on the datasets, and on-the-fly tests during human-robot interaction. Results show that the proposed framework can correctly recognise human facial expressions, with potential to be used in human-robot interaction scenarios.
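The distance- and angle-based features described above can be sketched as follows. This is a minimal illustration only, assuming 2-D landmark coordinates; the specific landmark pairs and triplets chosen here are hypothetical and not taken from the paper.

```python
import numpy as np

def landmark_features(landmarks, pairs, triplets):
    """Distance and angle features from 2-D facial landmarks.

    landmarks: (N, 2) array of (x, y) landmark coordinates.
    pairs:     (i, j) index pairs -> Euclidean distances.
    triplets:  (i, j, k) index triplets -> angle at vertex j, in radians.
    """
    pts = np.asarray(landmarks, dtype=float)
    dists = [np.linalg.norm(pts[i] - pts[j]) for i, j in pairs]
    angles = []
    for i, j, k in triplets:
        v1, v2 = pts[i] - pts[j], pts[k] - pts[j]
        cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
        angles.append(np.arccos(np.clip(cos, -1.0, 1.0)))
    return np.array(dists + angles)

# Toy example: three landmarks forming a right angle at the middle point.
pts = [(0, 0), (1, 0), (1, 1)]
feats = landmark_features(pts, pairs=[(0, 1), (1, 2)], triplets=[(0, 1, 2)])
# feats = [1.0, 1.0, pi/2]
```

A feature vector of this kind, computed per frame, would then be fed to a classifier; the paper's dynamic probabilistic classification framework itself is not reproduced here.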

Divisions: College of Engineering & Physical Sciences
Additional Information: Copyright: IEEE
Last Modified: 19 Jun 2024 17:05
Date Deposited: 26 Oct 2017 10:05
Full Text Link: http://ieeexplo ... r=8172395&tag=1
Related URLs:
PURE Output Type: Conference contribution
Published Date: 2017-09-01
Accepted Date: 2017-09-01
Authors: Faria, Diego R. (ORCID Profile 0000-0002-2771-1713)
Vieira, Mario
Faria, Fernanda C. C.
Premebida, Cristiano



Version: Accepted Version

