research article

Asymmetric facial expressions: revealing richer emotions for embodied conversational agents

Ahn, Junghyun • Gobron, Stephane • Thalmann, Daniel • Boulic, Ronan
2013
Computer Animation And Virtual Worlds

In this paper, we propose a method to achieve effective facial emotional expressivity for embodied conversational agents by considering two types of asymmetry when exploiting the Valence-Arousal-Dominance (VAD) representation of emotions. Indeed, the asymmetry of facial expressions helps to convey complex emotional feelings such as conflicting and/or hidden emotions due to social conventions. To achieve such a higher degree of facial expression in a generic way, we propose a new model for mapping the Valence-Arousal-Dominance emotion model onto a set of twelve scalar Facial Part Actions (FPAs), built mostly by combining pairs of antagonist Action Units (AUs) from the Facial Action Coding System (FACS). The proposed linear model can automatically drive a large number of autonomous virtual humans or support the interactive design of complex facial expressions over time. By design, our approach produces symmetric facial expressions, as expected for most of the emotional spectrum. However, more complex ambivalent feelings can be produced when differing emotions are applied to the left and right sides of the face. We conducted an experiment on static images produced by our approach to compare the expressive power of symmetric and asymmetric facial expressions for a set of eight basic and complex emotions. Results confirm both the pertinence of our general mapping for expressing basic emotions and the significant improvement brought by asymmetry for expressing ambivalent feelings.
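
The abstract describes a linear mapping from the Valence-Arousal-Dominance (VAD) space onto twelve scalar Facial Part Actions, with ambivalent expressions obtained by driving the left and right halves of the face with different emotions. The short Python sketch below illustrates that idea only in outline: the coefficient matrix W, the clipping to [-1, 1], and the function names are illustrative assumptions, not the values or interface published in the paper.

import numpy as np

# Minimal sketch, assuming a 12x3 linear map from VAD to Facial Part Actions.
# The coefficients in W, the [-1, 1] ranges, and the function names are
# illustrative placeholders, not taken from the paper.
N_FPA = 12  # twelve scalar Facial Part Actions (FPAs)
rng = np.random.default_rng(0)
W = rng.uniform(-1.0, 1.0, size=(N_FPA, 3))  # placeholder coefficients

def vad_to_fpa(vad):
    # Map a (valence, arousal, dominance) point in [-1, 1]^3 to 12 FPA activations.
    vad = np.clip(np.asarray(vad, dtype=float), -1.0, 1.0)
    return np.clip(W @ vad, -1.0, 1.0)

def asymmetric_expression(vad_left, vad_right):
    # Drive the left and right face halves with two different emotions.
    return {"left": vad_to_fpa(vad_left), "right": vad_to_fpa(vad_right)}

# Example: a conflicted expression, e.g. a polite smile on one side of the face
# and suppressed irritation (negative valence, higher arousal) on the other.
expression = asymmetric_expression((0.7, 0.3, 0.2), (-0.5, 0.6, -0.2))

In this sketch, a symmetric expression corresponds to passing the same VAD point to both halves; asymmetry, and hence the ambivalent feelings discussed in the abstract, arises when the two points differ.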

Type
research article
DOI
10.1002/cav.1539
Author(s)
Ahn, Junghyun
Gobron, Stephane
Thalmann, Daniel  
Boulic, Ronan  
Date Issued

2013

Publisher

Wiley-Blackwell

Published in
Computer Animation And Virtual Worlds
Volume

24

Issue

6

Start page

539

End page

551

Subjects

Asymmetric facial expression • VAD emotional model • Evaluation study • Embodied agent

Note

To appear in 2013

Editorial or Peer reviewed

REVIEWED

Written at

EPFL

EPFL units
SCI-IC-RB  
VRLAB  
Available on Infoscience
July 24, 2013
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/93459