Facial Expression and Eye Gaze Analysis

Automated Blend Shape Creation for Facial Motion Capture
The purpose of this work is to provide animators with a streamlined tool for generating blend shapes from their face models for facial motion capture projects. Building animations by manipulating vertices directly, or sculpting blend shapes by hand, is tedious and often impractical, since direct vertex manipulation permanently alters the model. A tool that assists with this task and speeds up the process lets modelers focus on the actual creation of their characters.
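As a rough sketch of the non-destructive math involved, a blend shape pose is a weighted sum of per-vertex offsets from the neutral mesh. The NumPy snippet below illustrates that idea only; it is not the tool described here, and the toy mesh, target, and weight are invented.

```python
import numpy as np

def apply_blendshapes(base_vertices, targets, weights):
    """Deform a base mesh by a weighted sum of blend shape deltas.

    base_vertices: (N, 3) array of rest-pose vertex positions.
    targets: list of (N, 3) arrays, one sculpted target per blend shape.
    weights: list of floats in [0, 1], one weight per target.

    The base mesh is never modified in place, which is exactly why
    blend shapes avoid the permanent changes of direct vertex editing.
    """
    result = base_vertices.copy()
    for target, w in zip(targets, weights):
        result += w * (target - base_vertices)
    return result

# Example: blend a neutral face 60% toward a "smile" target.
neutral = np.zeros((4, 3))                   # toy 4-vertex mesh
smile = neutral + np.array([0.0, 0.1, 0.0])  # every vertex lifted
posed = apply_blendshapes(neutral, [smile], [0.6])
```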
Face as an Interface
In this work, we propose a framework that enables the use of facial motion capture data as a means of user interface. Advances in facial motion capture technology have enabled real-time, markerless facial tracking. Although originally designed to drive the motions of a virtual character, the data captured by these systems, which can be fairly extensive, can just as well drive other applications. By treating motion capture data as user interface signals, we provide a more general means of hands-free application control, allowing a wider variety of facial signals to serve as input.
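As a minimal sketch of what "motion capture data as interface signals" can look like, the snippet below routes per-frame facial coefficients to application commands via simple thresholds. The signal names, the bindings, and the 0.7 threshold are all hypothetical stand-ins, not part of the proposed framework.

```python
# Route facial capture signals to application commands.
# "jawOpen" / "browRaise" and the threshold value are illustrative
# assumptions about what a markerless tracker might emit per frame.

ON_THRESHOLD = 0.7   # coefficient level that counts as "activated"

BINDINGS = {
    "jawOpen": "select",      # open mouth -> select / click
    "browRaise": "scroll_up", # raised brows -> scroll
}

def frame_to_commands(coefficients):
    """Map one frame of facial coefficients (name -> value in [0, 1])
    to zero or more application commands."""
    return [cmd for signal, cmd in BINDINGS.items()
            if coefficients.get(signal, 0.0) >= ON_THRESHOLD]

# Example frame as a tracker might report it:
frame = {"jawOpen": 0.85, "browRaise": 0.2, "smile": 0.4}
print(frame_to_commands(frame))  # ['select']
```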
Art with the Eyes and Face
Recent work in eye tracking and facial expression analysis has enabled new forms of hands-free interaction with computer applications. Particular emphasis has been placed on using these mechanisms individually for drawing and other artwork-related applications. In this work, we explore the combination of the two modalities and describe a hands-free drawing system that integrates the eyes and the face as a means of user control. We present a general architecture for incorporating eye tracking and facial expression analysis into a computer application and use this architecture in the design and implementation of a drawing application.
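To make the division of labor between the two modalities concrete, here is a minimal sketch in which gaze supplies the brush position and a facial coefficient gates whether ink is applied. The GazeSample fields and the mouth-open threshold are assumptions for illustration, not the system's actual interface.

```python
from dataclasses import dataclass, field

@dataclass
class GazeSample:
    x: float           # gaze point in canvas coordinates
    y: float
    mouth_open: float  # facial coefficient in [0, 1] (assumed signal)

@dataclass
class DrawingCanvas:
    strokes: list = field(default_factory=list)
    _pen_down: bool = False

    def update(self, s: GazeSample, threshold: float = 0.7):
        # The expression acts as the "mouse button": crossing the
        # threshold starts a stroke, dropping below it ends the stroke.
        if s.mouth_open >= threshold:
            if not self._pen_down:
                self.strokes.append([])  # begin a new stroke
                self._pen_down = True
            self.strokes[-1].append((s.x, s.y))
        else:
            self._pen_down = False

canvas = DrawingCanvas()
for sample in [GazeSample(0, 0, 0.9), GazeSample(1, 1, 0.9),
               GazeSample(2, 2, 0.1)]:
    canvas.update(sample)
print(canvas.strokes)  # [[(0, 0), (1, 1)]]
```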
Avatar Faces
In this area, we explore the reading of avatar faces. In the first work, we investigate whether people make personality judgments when viewing an avatar's face. Using Second Life avatars as stimuli, we employ paired-comparison tests to determine the implications of certain facial features. Our results suggest that people judge an avatar by its looks, and that such judgments are sensitive to the eeriness and babyfacedness of the avatar. In the second work, we explore applications of facial expression analysis and eye tracking in driving emotionally expressive avatars. We propose a system that transfers facial emotional signals, including facial expressions and eye movements, from the real world into a virtual world. The proposed system lets us address two questions: How significant are eye movements in expressing emotion? Can facial emotional signals be transferred effectively from the real world into virtual worlds?
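As a hint of how paired-comparison data can be scored, the sketch below ranks faces by their win rate across trials. The trial data is invented for illustration, and the actual study may well use a different scoring model (e.g., Bradley-Terry).

```python
# Each trial records which of two avatar faces was judged higher on
# some trait (e.g., trustworthiness). Ranking by win rate is the
# simplest possible analysis of such data.

from collections import Counter
from itertools import chain

# (winner, loser) pairs from hypothetical judgments
trials = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "B"), ("C", "B")]

wins = Counter(winner for winner, _ in trials)
appearances = Counter(chain.from_iterable(trials))

scores = {face: wins[face] / appearances[face] for face in appearances}
ranking = sorted(scores, key=scores.get, reverse=True)
print(ranking)  # ['A', 'C', 'B']
```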

Videos

Reading personality: avatar vs. human faces (YouTube link)
Facial motion driven applications: hands-free control using the face (YouTube link)
Paint With the Face and Eyes (YouTube link)

Publications