BioSensing 2D / 3D / VR Systems


Research Collaborators
Steve DiPaola, Meehae Song, Suk Kyoung Choi, Ulysses Bernardet, Maryam Saberi, Nilay Ozge Yalcin, Servet Ulas

About
Our lab has extensive experience with a range of sensing technologies, including eye tracking and facial emotion recognition (DiPaola et al., 2013), as well as gesture tracking and biosensing of heart rate and electrodermal activity (EDA) (Song & DiPaola, 2015). These signals both drive the generative system and help us understand audience reception of the generated graphics (still, video, VR).

The Research
We track emotional facial expression using a camera and AI software; motion, gesture, and body movement using overhead cameras and the Microsoft Kinect; hand movement using our own data gloves and the Leap Motion controller; gaze using our Pupil eye tracker; and biosignals (heart rate and EDA) using our Empatica E4 wristband.
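As a purely illustrative sketch (not our actual lab code), heart-rate and EDA samples like those from the E4 can be fused into a single rough arousal estimate; the class name, parameter names, and equal-weight fusion are assumptions made for this example:

```python
from dataclasses import dataclass, field

@dataclass
class ArousalEstimator:
    """Fuses heart-rate (BPM) and EDA (microsiemens) samples into a
    rough 0..1 arousal value via per-channel running min/max
    normalization. Hypothetical sketch, not the lab's actual module."""
    hr_range: list = field(default_factory=lambda: [float("inf"), float("-inf")])
    eda_range: list = field(default_factory=lambda: [float("inf"), float("-inf")])

    @staticmethod
    def _normalize(value, rng):
        # Widen the observed range, then scale the sample into 0..1.
        rng[0] = min(rng[0], value)
        rng[1] = max(rng[1], value)
        span = rng[1] - rng[0]
        return 0.5 if span == 0 else (value - rng[0]) / span

    def update(self, hr_bpm, eda_us):
        hr_n = self._normalize(hr_bpm, self.hr_range)
        eda_n = self._normalize(eda_us, self.eda_range)
        return 0.5 * (hr_n + eda_n)  # equal-weight fusion of the two channels
```

In a running system the generative graphics would poll `update()` each frame; a real pipeline would also filter and detrend the raw signals before fusion.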

Setup and Results
Below are some examples of our tracking systems. All our 2D, 3D, and VR systems include an abstraction layer with software modules that support several advanced input technologies, such as emotion tracking, motion tracking, and bio-sensors.
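The abstraction-layer idea described above can be sketched as a plug-in interface: each sensing technology is a module that emits named channels, and any 2D/3D/VR front end reads one merged state. All class and channel names here are hypothetical illustrations, not the lab's actual API:

```python
from abc import ABC, abstractmethod

class InputModule(ABC):
    """One plug-in per sensing technology (emotion, motion, biosensor).
    Hypothetical interface; names are illustrative only."""

    @abstractmethod
    def poll(self) -> dict:
        """Return the latest sample as a channel-name -> value dict."""

class AbstractionLayer:
    """Merges all registered modules into one flat input state that
    any 2D/3D/VR renderer can query by channel name."""

    def __init__(self):
        self.modules = []

    def register(self, module: InputModule):
        self.modules.append(module)

    def read_state(self) -> dict:
        state = {}
        for m in self.modules:
            state.update(m.poll())  # later modules override duplicate channels
        return state

# Stub standing in for a real tracker (e.g. facial emotion recognition)
class FakeEmotionTracker(InputModule):
    def poll(self):
        return {"emotion.valence": 0.7}
```

The benefit of this shape is that swapping hardware (say, a different eye tracker) only requires a new module, while the generative 2D/3D/VR code keeps reading the same named channels.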

Downloads and Links
Papers/Posters
PDF: Stanford Poster BioVR Interactives: 2017 poster from Stanford’s “VR and Behavioral Change” conference.
PDF: IVA 2017 Framework for a Bio-Responsive VR for Interactive Real-time Environments and Interactives
PDF: IVA 2015 Exploring Different Ways of Navigating Emotionally-responsive Artwork in Immersive Virtual Environments
PDF: alt.chi ’15 Eye Tracking: Does Observation Reflect Haptic Metaphors in Art Drawing?

Additional Media
Video: EVA ’16 Video