Emotions and facial expressions for conversational agents

T.D. Bui (1), D. Heylen (2), A. Nijholt (2) and M. Poel (2)

(1) Vietnam National University, Hanoi, Vietnam
(2) University of Twente, Enschede, The Netherlands

We present work on Obie, an embodied conversational agent framework. An embodied conversational agent, or talking head, consists of various components. For the graphical part, we have addressed the problem of creating a face model and a facial muscle model such that realistic facial expressions can be produced in real-time on a standard PC. In particular, we have defined a face model that allows high-quality, realistic facial expressions, yet remains simple enough to keep the animation real-time and to assist the muscle model in controlling the deformations.
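
As a rough illustration of how a muscle model can drive a face mesh, the sketch below implements a generic Waters-style linear muscle in Python. It is not the implementation described here: the falloff shape and the 0.3 displacement fraction are assumptions chosen for illustration.

# Minimal sketch of a Waters-style linear facial muscle (illustrative only,
# not the authors' implementation): the muscle pulls nearby mesh vertices
# toward its fixed attachment point, with radial and angular falloff.
import numpy as np

class LinearMuscle:
    def __init__(self, head, tail, influence_radius, influence_angle_deg):
        self.head = np.asarray(head, dtype=float)   # fixed end (attached to the skull)
        self.tail = np.asarray(tail, dtype=float)   # free end (inserted into the skin)
        self.radius = float(influence_radius)       # assumed zone-of-influence radius
        self.cos_max = np.cos(np.radians(influence_angle_deg))

    def displace(self, vertices, contraction):
        """Return displaced copies of `vertices` for a contraction level in [0, 1]."""
        v = np.asarray(vertices, dtype=float)
        to_v = v - self.head                          # vectors from muscle head to vertices
        dist = np.linalg.norm(to_v, axis=1) + 1e-9
        axis = self.tail - self.head
        axis /= np.linalg.norm(axis)
        cos_a = (to_v @ axis) / dist                  # angular position w.r.t. the muscle axis

        # Cosine falloff with distance; zero outside the influence cone and radius.
        radial = np.cos(np.clip(dist / self.radius, 0.0, 1.0) * np.pi / 2.0)
        angular = np.clip((cos_a - self.cos_max) / (1.0 - self.cos_max + 1e-9), 0.0, 1.0)
        weight = np.where(dist < self.radius, radial * angular, 0.0)

        # Pull each affected vertex a fraction of the way toward the muscle head.
        return v - contraction * 0.3 * weight[:, None] * to_v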

We have also implemented a muscle model that produces realistic deformation of the facial surface, handles the interaction of multiple muscles correctly, and produces bulges and wrinkles in real-time. Besides the graphical part, we need a system that accounts for the actions (dialogue) and emotions of the agent. We have implemented an emotion model and a mapping from emotions to facial expressions.
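
To give a concrete, if simplified, picture of such a mapping, the sketch below blends per-emotion muscle contraction profiles weighted by emotion intensity. The emotion set, muscle names, and profile values are hypothetical placeholders, not the actual Obie mapping.

# Illustrative sketch of mapping an emotion state to muscle contractions
# (hypothetical profiles and emotion set; not the actual Obie mapping).
from typing import Dict

# Each emotion contributes a contraction profile over named muscles.
EXPRESSION_PROFILES: Dict[str, Dict[str, float]] = {
    "joy":     {"zygomatic_major": 0.9, "orbicularis_oculi": 0.4},
    "sadness": {"frontalis_inner": 0.7, "depressor_anguli": 0.6},
    "anger":   {"corrugator": 0.8, "levator_palpebrae": 0.5},
}

def emotions_to_contractions(emotion_state: Dict[str, float]) -> Dict[str, float]:
    """Blend per-emotion profiles, weighted by emotion intensity in [0, 1]."""
    contractions: Dict[str, float] = {}
    for emotion, intensity in emotion_state.items():
        for muscle, level in EXPRESSION_PROFILES.get(emotion, {}).items():
            # Additive blend, clamped so combined expressions stay in range.
            contractions[muscle] = min(1.0, contractions.get(muscle, 0.0) + intensity * level)
    return contractions

# Example: mostly joyful with a trace of anger.
print(emotions_to_contractions({"joy": 0.8, "anger": 0.2}))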

For the animation, it is particularly important to combine different facial movements over time. We concentrate on the dynamic aspects of facial movements and on combining facial expressions across different channels, each responsible for a different task.
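
One way to picture this combination is as a set of channels that each produce time-varying contraction targets, blended frame by frame. The channel names and the maximum-based blending rule below are assumptions chosen for illustration, not the combination scheme of the framework itself.

# Sketch of combining facial movement channels over time (illustrative
# channel names and blending rule; not the framework's actual scheme).
from typing import Callable, Dict, List

Channel = Callable[[float], Dict[str, float]]  # time (s) -> muscle contractions

def blend_channels(channels: List[Channel], t: float) -> Dict[str, float]:
    """Combine per-channel contraction targets at time t, taking the maximum
    contribution per muscle so that concurrent movements do not cancel out."""
    combined: Dict[str, float] = {}
    for channel in channels:
        for muscle, level in channel(t).items():
            combined[muscle] = max(combined.get(muscle, 0.0), level)
    return combined

# Example channels: a slow emotional expression and a brief conversational signal.
def emotion_channel(t: float) -> Dict[str, float]:
    return {"zygomatic_major": min(1.0, 0.2 * t)}                 # smile ramping up

def signal_channel(t: float) -> Dict[str, float]:
    return {"frontalis_inner": 0.8 if 1.0 <= t <= 1.4 else 0.0}   # brief eyebrow raise

for frame_t in (0.5, 1.2, 3.0):
    print(frame_t, blend_channels([emotion_channel, signal_channel], frame_t))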


Paper presented at Measuring Behavior 2005, 5th International Conference on Methods and Techniques in Behavioral Research, 30 August - 2 September 2005, Wageningen, The Netherlands.

© 2005 Noldus Information Technology bv