Relevant content for the tag: vr-worlds
Publications

Lucid Loop: A Virtual Deep Learning Biofeedback System for Lucid Dreaming Practice
Conference Proceedings: 2019 CHI Conference on Human Factors in Computing Systems, Extended Abstracts, April 2019
DOI: https://doi.org/10.1145/3290607.3312952
ACM

BioFlockVR: Exploring Visual Entrainment through Amorphous Nature Phenomena in Bio-Responsive Multi-Immersant VR Interactives
Conference Proceedings: 2nd International Conference on Image and Graphics Processing (ICIGP '19), March 2019
pp. 150-154. DOI: https://doi.org/10.1145/3313950.3313978
ACM, New York, NY

Framework for a Bio-Responsive VR for Interactive Real-time Environments and Interactives
Conference Proceedings: Electronic Visualisation and the Arts, British Computer Society, July 2017
London, UK

Exploring Different Ways of Navigating Emotionally-responsive Artwork in Immersive Virtual Environments
Conference Proceedings: Electronic Visualisation and the Arts, British Computer Society, 2015
8 pages
London, UK
Research
BioSensing 2D / 3D / VR Systems
Our lab has extensive experience using different sensing technologies, including eye tracking and facial emotion recognition (DiPaola et al., 2013) as well as gesture tracking, heart rate, and EDA biosensing (Song & DiPaola, 2015), to drive generative computer graphics systems.
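As a minimal illustration of this kind of biosignal-to-graphics mapping (a sketch only, assuming NumPy; the signal ranges and parameter names such as particle_speed are invented for the example, not taken from our systems), the Python snippet below normalizes heart-rate and EDA readings into a single arousal estimate and maps it onto a few generative particle-system parameters:

# Minimal sketch: map live heart-rate and EDA readings onto parameters
# of a generative particle system. Signal ranges and parameter names are
# illustrative assumptions, not values from the lab's actual pipeline.
import numpy as np

def normalize(value, low, high):
    """Clamp a raw sensor reading into the 0..1 range for parameter mapping."""
    return float(np.clip((value - low) / (high - low), 0.0, 1.0))

def biosignal_to_params(heart_rate_bpm, eda_microsiemens):
    """Turn raw biosignals into generative-graphics parameters.

    Assumed mapping: calmer readings produce slower, cooler, more diffuse
    visuals; higher arousal produces faster, warmer, denser ones.
    """
    arousal = (0.5 * normalize(heart_rate_bpm, 50, 120)
               + 0.5 * normalize(eda_microsiemens, 0.5, 15.0))
    return {
        "particle_speed": 0.2 + 1.8 * arousal,     # world units per second
        "color_warmth":   arousal,                 # 0 = cool blue, 1 = warm red
        "spawn_rate":     int(10 + 90 * arousal),  # particles per frame
    }

if __name__ == "__main__":
    # Example frame: resting heart rate, moderate skin conductance.
    print(biosignal_to_params(heart_rate_bpm=64, eda_microsiemens=4.2))

In a real installation the mapping would be recomputed every frame from the streaming sensor values and smoothed over time to avoid visual jitter.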
Cognitive (AI) Based Abstraction
What is abstraction? Can you use AI techniques to model the semantics of an idea, object, or entity, where that understanding allows the meaning itself to be abstracted? We use several AI techniques, including genetic programming, neural networks, and deep learning, to explore abstraction in its many forms, mainly in the visual and narrative arts.
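As a toy illustration of the evolutionary side of this work (a sketch only, not our actual system; the target image here is random noise standing in for a real source), the snippet below runs a simple (1+1) hill climber that approximates a target image with a handful of flat colour rectangles, keeping the gist while discarding detail:

# Toy evolutionary abstraction: approximate a target image with a small
# set of flat colour rectangles via random mutation and selection.
import numpy as np

rng = np.random.default_rng(0)
H, W = 64, 64
target = rng.random((H, W, 3))  # stand-in for a real source image

def render(rects):
    """Paint rectangles (x, y, w, h, r, g, b) onto a grey canvas."""
    canvas = np.full((H, W, 3), 0.5)
    for x, y, w, h, r, g, b in rects:
        canvas[y:y + h, x:x + w] = (r, g, b)
    return canvas

def random_rect():
    x, y = rng.integers(0, W - 8), rng.integers(0, H - 8)
    w, h = rng.integers(4, 24), rng.integers(4, 24)
    return (x, y, w, h, *rng.random(3))

def loss(rects):
    return float(np.mean((render(rects) - target) ** 2))

rects = [random_rect() for _ in range(20)]
best = loss(rects)
for step in range(2000):
    candidate = list(rects)
    candidate[rng.integers(len(candidate))] = random_rect()  # mutate one "stroke"
    if (new := loss(candidate)) < best:
        rects, best = candidate, new
print(f"final reconstruction error: {best:.4f}")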
Social Metaphor-based Virtual Communities (voiceAvatar)
The design goal of this project was to develop avatars and virtual communities where the participants sense a tele-presence – that they are really there in the virtual space with other people. This collective sense of “being-there” does not happen over the phone or with teleconferencing; it is a new and emerging phenomenon, unique to 3D virtual communities.
Virtual Colab and the real CECM Colab located at SFU. Every smart-board computer display is matched by an in-world 3D browser that can display the same web-based information for distanced collaborators.
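The mirroring idea can be sketched as a simple publish/subscribe relationship (illustrative only; the class names, in-process event model, and URL below are assumptions, not the actual Colab software): whenever a physical smart-board navigates to a page, every registered in-world browser is told to load the same URL.

# Illustrative sketch of smart-board / in-world browser mirroring.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class SmartBoard:
    """Physical display that publishes its navigation events."""
    listeners: List[Callable[[str], None]] = field(default_factory=list)

    def attach(self, listener: Callable[[str], None]) -> None:
        self.listeners.append(listener)

    def navigate(self, url: str) -> None:
        print(f"[smart-board] showing {url}")
        for notify in self.listeners:
            notify(url)

class InWorldBrowser:
    """In-world 3D browser that mirrors whatever the smart-board shows."""
    def __init__(self, name: str):
        self.name = name

    def load(self, url: str) -> None:
        print(f"[{self.name}] mirroring {url}")

board = SmartBoard()
for i in range(2):
    board.attach(InWorldBrowser(f"in-world browser {i}").load)
board.navigate("https://example.org/shared-notes")  # hypothetical URL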
Using open source tools (Python / OpenGL / …), our research group is working to create an intuitive, interactive 3D knowledge visualization system that explores a more organic approach to knowledge and data visualization. The hope is that, by borrowing from our interests in alternative user interface design, AI and aLife systems, visual and interaction design, and intelligent systems, we can create a more living and intuitive system for exploring data spaces.
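As a rough sketch of such a system (assuming PyOpenGL with GLUT is installed; the graph data, camera, and colours are invented for the example), the snippet below renders a small random knowledge graph as points and links in a slowly rotating 3D space:

# Rough sketch: render a random "knowledge graph" as points and links
# in a rotating 3D view using PyOpenGL + GLUT.
import random
from OpenGL.GL import *
from OpenGL.GLU import *
from OpenGL.GLUT import *

nodes = [(random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
         for _ in range(30)]
edges = [(i, random.randrange(len(nodes))) for i in range(len(nodes))]
angle = 0.0

def display():
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
    glLoadIdentity()
    gluLookAt(0, 0, 4, 0, 0, 0, 0, 1, 0)
    glRotatef(angle, 0, 1, 0)
    glColor3f(0.4, 0.8, 1.0)           # links
    glBegin(GL_LINES)
    for a, b in edges:
        glVertex3fv(nodes[a])
        glVertex3fv(nodes[b])
    glEnd()
    glPointSize(6)
    glColor3f(1.0, 1.0, 0.3)           # nodes
    glBegin(GL_POINTS)
    for p in nodes:
        glVertex3fv(p)
    glEnd()
    glutSwapBuffers()

def idle():
    global angle
    angle = (angle + 0.2) % 360.0
    glutPostRedisplay()

def main():
    glutInit()
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH)
    glutInitWindowSize(800, 600)
    glutCreateWindow(b"knowledge space sketch")
    glEnable(GL_DEPTH_TEST)
    glMatrixMode(GL_PROJECTION)
    gluPerspective(45.0, 800.0 / 600.0, 0.1, 50.0)
    glMatrixMode(GL_MODELVIEW)
    glutDisplayFunc(display)
    glutIdleFunc(idle)
    glutMainLoop()

if __name__ == "__main__":
    main()

A fuller system would layer interaction, labels, and a more organic (for example force-directed or aLife-driven) layout on top of this minimal render loop.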
SFU/FIT Collaborative Design Project
Collaboratively created cyber-fashion show, where sketches (white background) from FIT fashion designers are turned into 3D avatar models (black background) by SFU students – all using distance collaborative tools between two coasts and countries.