Relevant content for the tag: faceresearch

Publications

Expressive Animated Character Sequences Using Knowledge-based Painterly Rendering
Journal Article: International Journal of Computer Games Technology, 2011
Vol. 2011, Article ID 164949, 7 pages

H. Seifi, S. DiPaola, A. Arya

Perceptual Validity in Animation of Human Motion
Journal Article: Computer Animation and Virtual Worlds Journal, March 2016
27(1), 58-71. doi: 10.1002/cav.1631

A. Etemad, A. Arya, A. Parush, S. DiPaola

Wiley Online Library


Exploring Persian Rug Design Using a Computational Evolutionary Approach
Conference Proceedings: Electronic Visualisation and the Arts, 2010
pp. 121-128

A. Dalvandi, P. Behbahani, S. DiPaola

London, UK

British Computer Society


Designing Socially Expressive Character Agents to Facilitate Learning
Book Chapter: Educational Gameplay and Simulation Environments, 2010
Editors: Kaufman D., Sauvé L., pp 213-230

S. DiPaola

IGI Global


Rembrandt’s textural agency: A shared perspective in visual art and science
Journal Article: Leonardo, 2010
Vol 43, No 3, pp 145-151

S. DiPaola, C. Riebe, J. Enns

MIT Press


Research

Rembrandt / Vision Science Work

Using new computer-based visual modelling techniques, we show that artists use vision-based devices (lost-and-found edges, centre-of-focus techniques) to guide the viewer's eye path through their paintings in significant ways.


faceToolKit - A 3D Facial ToolKit

Our long-range research project is a visual development system for exploring face space, both in terms of facial types and animated expressions. The development toolkit is based on a hierarchical parametric approach, which gives us an additive language of hierarchical expressions, emotions, and lip-sync sequences.
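To illustrate the idea of an "additive language" of expressions in a hierarchical parametric model, here is a minimal sketch; the parameter names, the FaceState class, and the additive blending rule are illustrative assumptions, not the toolkit's actual API.

```python
# Hypothetical sketch: a face is a set of named parameters (neutral = 0.0),
# and expressions are weighted parameter offsets that compose additively.
from dataclasses import dataclass, field

@dataclass
class FaceState:
    params: dict = field(default_factory=dict)  # parameter name -> value

    def apply(self, expression: dict, weight: float = 1.0) -> "FaceState":
        """Additively blend a weighted expression onto this face state."""
        out = dict(self.params)
        for name, offset in expression.items():
            out[name] = out.get(name, 0.0) + weight * offset
        return FaceState(out)

# Illustrative expressions as parameter offsets.
SMILE = {"mouth_corner_up": 1.0, "cheek_raise": 0.6}
SURPRISE = {"brow_raise": 1.0, "jaw_open": 0.8}

# A full smile blended with a half-strength surprise.
face = FaceState().apply(SMILE).apply(SURPRISE, weight=0.5)
print(face.params)
# → {'mouth_corner_up': 1.0, 'cheek_raise': 0.6, 'brow_raise': 0.5, 'jaw_open': 0.4}
```

Because blending is additive, expressions, emotions, and lip-sync poses can be layered in any order and scaled independently, which is the property the hierarchical parametric approach is meant to provide.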


iFACE - A Comprehensive Environment for Interactive Face Animation

The rapid growth of visual communication systems, from video phones to virtual agents in games and web services, has brought a new generation of multimedia systems that we refer to as face-centric. Such systems are mainly concerned with the multimedia representation of facial activities.


MusicFace

Can you extract the emotional aspects of a piece of music to animate a face? Music-driven Emotionally Expressive Face (MusicFace) is an early-stage project that creates “facial choreography” driven by musical input. In addition to its artistic uses, MusicFace can be used for creating visual effects in movies and animations, as well as realistic characters in computer games and virtual worlds.
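A mapping of this kind can be sketched as simple hand-tuned rules from coarse musical features to expression weights; the feature names, thresholds, and formulas below are illustrative assumptions, not the published MusicFace method.

```python
# Hypothetical sketch: map coarse musical features to facial-expression
# weights. Loudness is assumed normalized to [0, 1]; 140 BPM is an
# arbitrary "fast" reference tempo chosen for illustration.
def emotion_weights(tempo_bpm: float, loudness: float, mode_major: bool) -> dict:
    """Derive expression weights from tempo, loudness, and musical mode."""
    speed = min(tempo_bpm / 140.0, 1.0)          # 0 = very slow, 1 = fast
    happiness = (1.0 if mode_major else 0.3) * speed
    sadness = (0.0 if mode_major else 0.7) * (1.0 - speed)
    return {"happiness": round(happiness, 2),
            "sadness": round(sadness, 2),
            "intensity": round(loudness, 2)}

# A slow, quiet piece in a minor key reads as mildly sad.
print(emotion_weights(tempo_bpm=70, loudness=0.4, mode_major=False))
# → {'happiness': 0.15, 'sadness': 0.35, 'intensity': 0.4}
```

The resulting weights could then drive an additive facial model frame by frame, with features re-estimated over a sliding window of the audio.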


GenFace - Exploring FaceSpace with Genetic Algorithms

Imagine an n-dimensional space describing every conceivable humanoid face, where each dimension represents a different facial characteristic. Within this continuous space, it would be possible to traverse a path from any face to any other face, morphing through locally similar faces along that path. We describe and demonstrate a development system we have created to explore what it means to ‘surf’ through face space, and present our investigation of the relationships between facial types and how this understanding can be used to create new communication and expression systems.