Relevant content for the tag: emotionexpression

Publications


Perceptual Validity in Animation of Human Motion
Journal Article: Computer Animation and Virtual Worlds Journal, March 2016
27(1), 58-71. doi: 10.1002/cav.1631

A. Etemad, A. Arya, A. Parush, S. DiPaola

Wiley Online Library


Designing Socially Expressive Character Agents to Facilitate Learning
Book Chapter: Educational Gameplay and Simulation Environments, 2010
Editors: Kaufman D., Sauvé L., pp 213-230

S. DiPaola

IGI Global


A Case Study of Expression-based Creation within 3D Virtual Communities
Journal Article: International Journal of Web-Based Communities, 2010
In press

S. DiPaola, J. Turner

Inderscience Publishers



Perceptually Valid Facial Expressions for Character-based Applications
Journal Article: International Journal of Computer Games Technology, 2009
Vol 2009, Article ID 462315, pp 1-13

A. Arya, S. DiPaola, A. Parush

Research

AI Affective Virtual Human

Our affective, real-time 3D AI virtual human project, with facial emotion recognition, movement recognition, and full AI-driven talking, gesturing, and reasoning.


Painterly NPR Project

Portrait artists, and painters in general, have over centuries developed a little-understood, intuitive, and open methodology that exploits cognitive mechanisms in human perception and the visual system.


faceToolKit - A 3D Facial ToolKit

Our long-range research project is a visual development system for exploring face space, both in terms of facial types and animated expressions. The development toolkit is based on a hierarchical parametric approach, which gives us an additive language of hierarchical expressions, emotions, and lip-sync sequences.
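To make the additive, hierarchical parametric idea concrete, here is a minimal sketch in Python. The class name, parameter names, and region layout are illustrative assumptions, not faceToolKit's actual API; the point is only that each face region's parameters combine additively with those inherited from its parent region, so expressions, emotions, and lip-sync poses can be layered.

```python
# Illustrative sketch of an additive hierarchical parametric face.
# All names (FaceNode, "open", region names) are assumptions for the
# example, not the real faceToolKit interface.

from dataclasses import dataclass, field


@dataclass
class FaceNode:
    """A face region whose parameters add to those inherited from its parent."""
    name: str
    params: dict = field(default_factory=dict)   # e.g. {"open": 0.4}
    children: list = field(default_factory=list)

    def resolve(self, inherited=None):
        """Additively combine parameters down the hierarchy and return a
        flat {region.param: value} mapping for the whole subtree."""
        combined = dict(inherited or {})
        for key, value in self.params.items():
            combined[key] = combined.get(key, 0.0) + value
        flat = {f"{self.name}.{k}": v for k, v in combined.items()}
        for child in self.children:
            flat.update(child.resolve(combined))
        return flat


# Layering example: a whole-face emotion contribution (slight smile opens
# the mouth a little) adds to a lip-sync viseme on the mouth region.
mouth = FaceNode("mouth", {"open": 0.4})           # viseme for "ah"
face = FaceNode("face", {"open": 0.1}, [mouth])    # emotion layer
face_params = face.resolve()
# The mouth's final openness is the sum of both contributions.
```

Because contributions simply add, a lip-sync sequence and an emotion track can be authored independently and composed at playback time, which is the appeal of an additive parametric language.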


iFACE - A Comprehensive Environment for Interactive Face Animation

The rapid growth of visual communication systems, from video phones to virtual agents in games and web services, has brought a new generation of multimedia systems that we refer to as face-centric. Such systems are mainly concerned with the multimedia representation of facial activities.


A Social Metaphor-based Virtual Community (voiceAvatar)

The design goal of this project was to develop avatars and virtual communities where the participants sense a tele-presence – that they are really there in the virtual space with other people. This collective sense of “being-there” does not happen over the phone or with teleconferencing; it is a new and emerging phenomenon, unique to 3D virtual communities.


MusicFace

Can you extract the emotional aspects of a piece of music to animate a face? Music-driven Emotionally Expressive Face (MusicFace) is an early-stage project that creates “facial choreography” driven by musical input. In addition to its artistic uses, MusicFace can be used for creating visual effects in movies and animation, as well as realistic characters in computer games and virtual worlds.
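The music-to-face pipeline can be sketched in two stages: musical features map to an emotional state, which in turn maps to facial-animation parameters. The feature choices, thresholds, and mappings below are illustrative assumptions for the sketch, not the actual MusicFace algorithm.

```python
# Hypothetical two-stage MusicFace-style pipeline: music features ->
# emotion (valence/arousal) -> facial parameters. All numbers and
# mappings are made-up placeholders for illustration.

def music_to_emotion(tempo_bpm, mode):
    """Crude valence/arousal estimate: faster tempo raises arousal,
    major mode raises valence (an assumed, simplified mapping)."""
    arousal = min(max((tempo_bpm - 60) / 120.0, 0.0), 1.0)
    valence = 0.8 if mode == "major" else 0.3
    return valence, arousal


def emotion_to_face(valence, arousal):
    """Turn a valence/arousal pair into facial parameters in [0, 1]."""
    return {
        "smile": valence * (0.5 + 0.5 * arousal),
        "brow_raise": arousal,
        "eye_widen": min(0.5 * arousal + 0.2 * valence, 1.0),
    }


# A fast major-key passage yields a bright, energetic expression.
valence, arousal = music_to_emotion(tempo_bpm=140, mode="major")
face = emotion_to_face(valence, arousal)
```

Run per analysis window over a piece of music, a mapping like this produces a time-varying stream of facial parameters, i.e. a facial choreography driven by the score.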