Towards the development of affective facial expression recognition for human-robot interaction


Affective facial expression is a key feature of non-verbal behaviour and is considered a symptom of an internal emotional state. Emotion recognition plays an important role in social communication, both in human-human and human-robot interaction. This work aims at the development of a framework able to recognise human emotions through facial expressions for human-robot interaction. Simple features based on facial landmark distances and angles are extracted to feed a dynamic probabilistic classification framework. The public online dataset Karolinska Directed Emotional Faces (KDEF) [12] is used to learn seven different emotions (angry, fearful, disgusted, happy, sad, surprised, and neutral) performed by seventy subjects. Offline and on-the-fly tests were carried out: leave-one-out cross-validation tests using the dataset, and on-the-fly tests during human-robot interactions. Preliminary results show that the proposed framework can correctly recognise human facial expressions, with potential to be used in human-robot interaction scenarios.
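The abstract's landmark distance and angle features can be sketched as follows. This is a minimal illustration only: the landmark indices and point pairs below are hypothetical placeholders, not the paper's actual feature set.

```python
import numpy as np

def landmark_features(landmarks):
    """Compute simple distance and angle features from 2D facial landmarks.

    `landmarks` is an (N, 2) array of (x, y) points. The landmark pairs
    and angle triplet used here are illustrative placeholders; a real
    system would pick anatomically meaningful points (mouth corners,
    eyebrow tips, eye centres, etc.).
    """
    lm = np.asarray(landmarks, dtype=float)

    # Euclidean distances between a few illustrative landmark pairs.
    pairs = [(0, 1), (2, 3), (4, 5)]
    dists = [np.linalg.norm(lm[i] - lm[j]) for i, j in pairs]

    # Angle (in degrees) at vertex landmark v formed by landmarks a and b.
    def angle(a, v, b):
        u, w = lm[a] - lm[v], lm[b] - lm[v]
        cos = np.dot(u, w) / (np.linalg.norm(u) * np.linalg.norm(w))
        return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

    angles = [angle(0, 2, 1)]
    return np.array(dists + angles)
```

Such a feature vector would then be passed, frame by frame, to the dynamic probabilistic classifier described in the paper; distances are typically normalised (e.g. by inter-ocular distance) to reduce sensitivity to face scale.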

Divisions: College of Engineering & Physical Sciences
Event Title: 10th ACM International Conference on PErvasive Technologies Related to Assistive Environments, PETRA 2017
Event Type: Other
Event Dates: 2017-06-21 - 2017-06-23
Uncontrolled Keywords: affective facial expressions,emotion recognition,human-robot interaction,Human-Computer Interaction,Computer Networks and Communications,Computer Vision and Pattern Recognition,Software
ISBN: 978-1-4503-5227-7
Last Modified: 10 Jun 2024 07:50
Date Deposited: 22 Aug 2017 12:05
Related URLs: http://www.scop ... tnerID=8YFLogxK (Scopus URL)
PURE Output Type: Conference contribution
Published Date: 2017-06-21
Accepted Date: 2017-06-01
Authors: Faria, Diego Resende (ORCID Profile 0000-0002-2771-1713)
Vieira, Mario
Faria, Fernanda C.C.



Version: Accepted Version


