Facial Expression and Eye Gaze Analysis
|Automated Blend Shape Creation for Facial Motion Capture
The purpose of this work is to create a streamlined program that lets animators create blend shapes of their face models for facial motion capture projects. Creating animations by direct vertex manipulation, or building blend shapes by hand, is a tedious process and an impractical one, since vertex manipulation permanently alters the model. A tool that assists with this task and speeds up the process frees modelers to focus on the actual creation of their characters.
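Blend shape deformation follows a standard additive model: each target shape stores a per-vertex offset from the neutral mesh, and the animated face is the neutral mesh plus a weighted sum of those offsets. A minimal NumPy sketch of this model (function and variable names are illustrative, not the tool's actual API):

```python
import numpy as np

def apply_blendshapes(neutral, targets, weights):
    """Deform a neutral mesh by a weighted sum of blend shape deltas.

    neutral : (V, 3) array of base vertex positions
    targets : list of (V, 3) arrays, one sculpted target per expression
    weights : per-target weights, typically in [0, 1]
    """
    neutral = np.asarray(neutral, dtype=float)
    result = neutral.copy()
    for target, w in zip(targets, weights):
        # Each target contributes its offset from the neutral pose,
        # scaled by its animation weight; contributions add linearly.
        result += w * (np.asarray(target, dtype=float) - neutral)
    return result
```

Because the targets are stored as deltas, several expressions (say, a smile and a brow raise) combine additively, with per-frame weights supplied by the capture stream.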
|Face as an Interface
In this work, we propose a framework that enables the use of facial motion capture data as a means of user interface. Advances in facial motion capture technology have enabled real-time, markerless facial tracking. Although originally designed to drive the motions of a virtual character, the data captured by these systems, which can be fairly extensive, can just as well drive other applications. By treating motion capture data as user interface signals, we provide a more general means of hands-free application control, allowing a wider variety of facial signals to serve as input.
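One way to picture this repurposing: per-frame tracking intensities are bound to application commands, with edge detection so a held expression fires a command once rather than every frame. A hypothetical sketch, assuming a tracker that reports named signals in [0, 1] (the signal names, thresholds, and class are illustrative, not the framework's actual interface):

```python
def make_trigger(threshold):
    """Return a detector that fires once when a signal rises past threshold."""
    state = {"active": False}
    def fired(value):
        was_active = state["active"]
        state["active"] = value >= threshold
        return state["active"] and not was_active  # rising edge only
    return fired

class FaceController:
    def __init__(self):
        self.bindings = []  # (signal name, edge detector, command callback)

    def bind(self, signal, threshold, command):
        self.bindings.append((signal, make_trigger(threshold), command))

    def update(self, frame):
        """frame: dict mapping tracked signal name -> intensity in [0, 1]."""
        for signal, fired, command in self.bindings:
            if fired(frame.get(signal, 0.0)):
                command()
```

For example, `controller.bind("brow_raise", 0.6, scroll_down)` would scroll once per brow raise; re-triggering requires relaxing the face below the threshold first.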
|Art with the Eyes and Face
Recent work in eye tracking and facial expression analysis has enabled new forms of hands-free user interaction with computer applications. Particular emphasis has been placed on using these mechanisms individually for drawing and other artwork-related applications. In this work, we explore the combination of the two modalities and describe a system for hands-free drawing that integrates the eyes and the face as means of user control. We present a general architecture for incorporating eye tracking and facial expression analysis into a computer application and use this architecture in the design and implementation of a drawing application.
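The division of labor in such a system can be sketched simply: gaze positions the brush (smoothed, since raw gaze is jittery), while a facial signal acts as the pen-up/pen-down switch. The sketch below assumes that split; the class, smoothing scheme, and signal choice are illustrative, not the system's actual design:

```python
class GazeBrush:
    def __init__(self, smoothing=0.8):
        self.smoothing = smoothing  # exponential smoothing of jittery gaze
        self.cursor = None          # current (x, y) brush position
        self.strokes = []           # completed strokes, each a list of points
        self._current = None        # stroke being drawn, if pen is down

    def sample(self, gaze_xy, pen_down):
        """Feed one gaze sample plus the facial 'pen down' state."""
        x, y = gaze_xy
        if self.cursor is None:
            self.cursor = (x, y)
        else:
            a = self.smoothing  # blend old cursor with new gaze point
            cx, cy = self.cursor
            self.cursor = (a * cx + (1 - a) * x, a * cy + (1 - a) * y)
        if pen_down:
            if self._current is None:
                self._current = []      # facial signal opened a new stroke
            self._current.append(self.cursor)
        elif self._current is not None:
            self.strokes.append(self._current)  # signal released: close stroke
            self._current = None
```

Keeping the pen switch on the face rather than on dwell time sidesteps the classic "Midas touch" problem of gaze-only drawing, where every fixation leaves a mark.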
|Avatar vs. Human Faces
In this area, we explore the reading of avatar faces. In the first work, we investigate whether we make personality judgments when viewing an avatar's face. Using SecondLife avatars as stimuli, we employ Paired-Comparison Tests to determine the implications of certain facial features. Our results suggest that people judge an avatar by its appearance, and that such judgments are sensitive to the eeriness and babyfacedness of the avatar. In the second work, we explore applications of facial expression analysis and eye tracking in driving emotionally expressive avatars. We propose a system that transfers facial emotional signals, including facial expressions and eye movements, from the real world into a virtual world. The proposed system enables us to address two questions: How significant are eye movements in emotion expression? Can facial emotional signals be transferred effectively from the real world into virtual worlds?
- McGowen, V., and Geigel, J. 2016. Automatic Blendshape Creation for Facial Motion Capture. In ACM SIGGRAPH 2016 Posters (SIGGRAPH '16). ACM, 2016.
- Wang, Y., Geigel, J., and Herbert, A. 2013. Reading Personality: Avatar vs. Human Faces. In Affective Computing and Intelligent Interaction (ACII), 2013 Humaine Association Conference on, pp. 479-484, 2-5 Sept. 2013.
- Bethamcherla, V., Bhoyar, N., D'Aprix, I., Doshi, A., Paul, R., Rangaishenvi, S., Verma, P., and Geigel, J. 2013. Use of facial motion capture for hands-free control of computer applications. In Proceedings of the ACM Symposium on Applied Perception (SAP '13). ACM, New York, NY, USA, 146-146.
- Sridharan, S., Wang, Y., Xu, S., Rangamannar, B., Tayrien, C., Ranger, S., Bailey, R., and Geigel, J. 2012. Drawing with the eyes and face. In Proceedings of the ACM Symposium on Applied Perception (SAP '12). ACM, New York, NY, USA, 126-126.
- Wang, Y., and Geigel, J. 2011. Using facial emotional signals for communication between emotionally expressive avatars in virtual worlds. In Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction - Volume Part II (ACII '11). Springer-Verlag, Berlin, Heidelberg, 297-304.