Generating and Describing Affective Eye Behaviors
Abstract
The manner of a person's eye movement conveys much nonverbal information and emotional intent beyond speech. This paper describes work on expressing emotion through eye behaviors in virtual agents, based on parameters selected from an AU-coded facial expression database and real-time eye movement data (pupil size, blink rate, and saccades). A rule-based approach that utilizes MPEG-4 FAPs (facial animation parameters) to generate primary emotions (joyful, sad, angry, afraid, disgusted, and surprised) and intermediate emotions (emotions that can be represented as a mixture of two primary emotions) is introduced. In addition, based on our research, a scripting tool named EEMML (Emotional Eye Movement Markup Language), which enables authors to describe and generate emotional eye movement of virtual agents, is proposed.
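The mixture idea behind intermediate emotions can be sketched as a weighted blend of FAP parameter vectors. The following is a minimal illustrative sketch, not the paper's actual method: the emotion names, the three-parameter vectors, and their values are all hypothetical placeholders.

```python
# Hypothetical sketch of blending two primary emotions into an
# intermediate one via their MPEG-4 FAP vectors. The FAP values and
# vector length below are illustrative, not taken from the paper.

PRIMARY_FAPS = {
    # toy 3-parameter vectors (e.g. brow raise, lid opening, pupil size)
    "joyful": [0.8, 0.6, 0.4],
    "surprised": [1.0, 0.9, 0.7],
}

def blend(emotion_a, emotion_b, weight=0.5):
    """Mix two primary-emotion FAP vectors into an intermediate one."""
    a = PRIMARY_FAPS[emotion_a]
    b = PRIMARY_FAPS[emotion_b]
    return [weight * x + (1 - weight) * y for x, y in zip(a, b)]

# Equal-weight mixture lies at the midpoint of the two vectors.
intermediate = blend("joyful", "surprised", 0.5)
print(intermediate)
```

With `weight=0.5` each FAP value is the average of the two primary vectors; sliding the weight toward 0 or 1 moves the expression toward one primary emotion or the other.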
Authors
-
MAO Xia
School of Electronic and Information Engineering, Beihang University
-
LI Zheng
School of Electronic and Information Engineering, Beihang University
Related Papers
- Generating and Describing Affective Eye Behaviors
- Nonlinear Shape-Texture Manifold Learning