BioSensing 2D / 3D / VR Systems
Our lab has extensive experience with a range of sensing technologies, including eye tracking and facial emotion recognition (DiPaola et al., 2013), as well as gesture tracking and biosensing of heart rate and electrodermal activity (EDA) (Song & DiPaola, 2015). These signals both drive the generative system and help us understand viewers' responses to the generated graphics (stills, video, VR).
We track facial emotion with a camera and AI software; motion, gesture, and body pose with overhead cameras and a Microsoft Kinect; hand movement with our own data gloves and a Leap Motion controller; gaze with our Pupil eye tracker; and biosignals (heart rate and EDA) with an Empatica E4 wristband.
Setup and Results
Some examples of our tracking systems. All our 2D, 3D, and VR systems share an abstraction layer with software modules supporting several advanced input technologies, such as emotion tracking, motion tracking, and bio-sensors.
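The abstraction-layer idea above can be sketched as a common module interface that each input technology implements, so the rendering side never depends on a specific device. This is a minimal illustrative sketch, not the lab's actual code; all class and field names (`SensorModule`, `InputAbstractionLayer`, the stubbed sample values) are hypothetical, and a real module would poll a device SDK instead of returning constants.

```python
from abc import ABC, abstractmethod


class SensorModule(ABC):
    """Common interface every input technology implements (hypothetical name)."""

    @abstractmethod
    def read(self) -> dict:
        """Return the latest sample as a flat feature dictionary."""


class BioSensor(SensorModule):
    """Sketch of a wrist-worn biosensor module (e.g. heart rate and EDA)."""

    def read(self) -> dict:
        # A real implementation would query the device SDK; this is a stub.
        return {"heart_rate": 72.0, "eda": 0.35}


class EyeTracker(SensorModule):
    """Sketch of an eye-tracking module reporting normalized gaze position."""

    def read(self) -> dict:
        return {"gaze_x": 0.5, "gaze_y": 0.5}


class InputAbstractionLayer:
    """Aggregates whichever modules are attached into one unified input frame,
    so the 2D/3D/VR system consumes named features, not device-specific data."""

    def __init__(self) -> None:
        self.modules: dict[str, SensorModule] = {}

    def attach(self, name: str, module: SensorModule) -> None:
        self.modules[name] = module

    def poll(self) -> dict:
        frame = {}
        for name, module in self.modules.items():
            for key, value in module.read().items():
                frame[f"{name}.{key}"] = value
        return frame


layer = InputAbstractionLayer()
layer.attach("bio", BioSensor())
layer.attach("eye", EyeTracker())
print(layer.poll())
```

With this shape, swapping the Kinect for another motion tracker, or adding a new biosensor, only means attaching a different `SensorModule` implementation; the generative system keeps reading the same unified frame.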
Downloads and Links
|PDF: ICVR Paper
|BioFlockVR: ICVR 2019 Conference: BioFlockVR: Exploring Visual Entrainment through Amorphous Nature Phenomena in Bio-Responsive Multi-Immersant VR Interactives
|PDF: CHI Extended Abstracts
|Lucid Loop: 2019 CHI Extended Abstracts: Lucid Loop: A Virtual Deep Learning Biofeedback System for Lucid Dreaming Practice
|PDF: Stanford Poster
|BioVR Interactives: 2017 poster from Stanford's "VR and Behavioral Change Conference"
|PDF: IVA 2017
|Framework for a Bio-Responsive VR for Interactive Real-time Environments and Interactives
|PDF: IVA 2015
|Exploring Different Ways of Navigating Emotionally-responsive Artwork in Immersive Virtual Environments
|PDF: (ALT) CHI ’15
|Eye Tracking: Does Observation Reflect Haptic Metaphors in Art Drawing?
|EVA ’16 Video