Emotional Avatar Faces

In this area, we explore how people read avatar faces and how avatars can be driven to express emotion.

In the first work, we investigate whether people make personality judgments when viewing an avatar's face. Using Second Life avatars as stimuli, we employ paired-comparison tests to determine the implications of certain facial features. Our results suggest that people judge an avatar by its appearance, and that such judgments are sensitive to the eeriness and babyfacedness of the avatar.

In the second work, we explore the application of facial expression analysis and eye tracking to driving emotionally expressive avatars. We propose a system that transfers facial emotional signals, including facial expressions and eye movements, from the real world into a virtual world. The proposed system enables us to address two questions: How significant are eye movements in expressing emotion? Can facial emotional signals be transferred effectively from the real world into virtual worlds?
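
As a rough illustration of such a transfer pipeline (not the implementation described in the publications below), the sketch assumes per-frame expression weights and gaze angles coming from a face and eye tracker, and blends them into avatar animation parameters; the FacialSignal structure, the parameter names, and the smoothing step are hypothetical.

```python
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class FacialSignal:
    """One frame of tracked facial emotional signals (illustrative only)."""
    expression_weights: Dict[str, float]  # e.g. {"smile": 0.8, "brow_raise": 0.1}
    gaze_yaw: float = 0.0    # horizontal eye rotation, degrees
    gaze_pitch: float = 0.0  # vertical eye rotation, degrees


def drive_avatar(current: FacialSignal,
                 previous: Optional[FacialSignal] = None,
                 smoothing: float = 0.3) -> FacialSignal:
    """Map a tracked signal onto avatar animation parameters, blending with
    the previous frame so the avatar's face does not jitter."""
    if previous is None:
        return current

    def blend(new: float, old: float) -> float:
        return smoothing * old + (1.0 - smoothing) * new

    weights = {name: blend(value, previous.expression_weights.get(name, 0.0))
               for name, value in current.expression_weights.items()}
    return FacialSignal(weights,
                        blend(current.gaze_yaw, previous.gaze_yaw),
                        blend(current.gaze_pitch, previous.gaze_pitch))


# Example: a growing smile with a glance to the left, smoothed across frames.
frame1 = FacialSignal({"smile": 0.2})
frame2 = FacialSignal({"smile": 0.9}, gaze_yaw=-15.0, gaze_pitch=2.0)
print(drive_avatar(frame2, previous=frame1))
```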
Collaborators
  • Yuqiong Wang, Golisano College of Computing and Information Sciences, Rochester Institute of Technology
  • Andrew Herbert, Department of Psychology, Rochester Institute of Technology
Publications
  • Y. Wang, J. Geigel, and A. Herbert. "Reading Personality: Avatar vs. Human Faces." In Proceedings of the 5th International Conference on Affective Computing and Intelligent Interaction (ACII '13), 2013.

  • Y. Wang and J. Geigel. "Using Facial Emotional Signals for Communication between Emotionally Expressive Avatars in Virtual Worlds." In Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction (ACII '11), Volume Part II, pages 297–304. Springer-Verlag, Berlin, Heidelberg, 2011.