About

Before automobiles were invented and widely adopted, animals such as horses were the most common mode of transportation. While the shift to automobiles brought significant improvements in reliability and efficiency, it also removed a core component of travel: the emotional relationship between the person and the animal.

Although often overlooked, the emotional states of drivers are quite important, as they influence not only driving behavior but also the safety of all road users. For instance, driving can be an emotionally stressful experience, and while a moderate amount of stress helps the driver remain alert and attentive, too much or too little can impair driving performance and safety. Furthermore, chronic stress has been linked to a wide array of adverse health conditions, such as depression and various forms of cardiovascular disease.

The Emotion Navigation special interest group at the MIT Media Lab was established in 2018 with the goal of stimulating research efforts at the intersection of Affective Computing and the automotive domain. Some of the main questions that guide the research are the following:

What if current cars could sense relevant emotional states of drivers such as stress?

What if cars and drivers could modulate the interaction based on implicit emotional responses instead of explicit interactions?

What if cars could help drivers not only navigate to their final destination but also to their desired emotional state?

Research Areas

Measurement

Exploring novel sensing technologies to provide comfortable and passive monitoring of physiological, behavioral, and contextual information.

Understanding

Using intelligent data analysis to better understand the role of emotions while driving and to efficiently recognize them on both an individual and a group basis.

Intervention

Developing multimodal actuating technologies to provide just-in-time and just-in-place interventions that enhance the driving experience and better support emotional wellbeing.

Publications

- Cristina Bustos, Neska Elhaouij, Albert Sole-Ribalta, Javier Borge-Holthoefer, Agata Lapedriza, and Rosalind Picard. "Predicting Driver Self-Reported Stress by Analyzing the Road Scene." 9th International Conference on Affective Computing and Intelligent Interaction, ACII, 2021. Preprint.

- Jinmo Lee, Neska Elhaouij, and Rosalind Picard. "AmbientBreath: Unobtrusive Just-in-time Breathing Intervention Using Multi-sensory Stimulation and its Evaluation in a Car Simulator." Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 5, 2, Article 71, 30 pages, IMWUT '21, 2021. Link.

- Kyung Yun Choi, Jinmo Lee, Neska Elhaouij, Rosalind Picard, and Hiroshi Ishii. "ASpire: Clippable, Mobile Pneumatic-Haptic Device for Breathing Rate Regulation via Personalizable Tactile Feedback." In Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, CHI EA '21. Association for Computing Machinery, New York, NY, USA, Article 372, 1-8, 2021. Link.

- Sebastian Zepf, Javier Hernandez, Alexander Schmitt, Wolfgang Minker, and Rosalind W. Picard. "Driver Emotion Recognition for Intelligent Vehicles: A Survey." ACM Comput. Surv. 53, 3, Article 64, 30 pages, 2020. Link.

- Sebastian Zepf, Neska Elhaouij, Jinmo Lee, Asma Ghandeharioun, Javier Hernandez, and Rosalind W. Picard. "Studying Personalized Just-in-time Auditory Breathing Guides and Potential Safety Implications during Simulated Driving." In Proceedings of the 28th ACM Conference on User Modeling, Adaptation and Personalization, UMAP '20. Association for Computing Machinery, New York, NY, USA, 275-283, 2020. Link.

- Sebastian Zepf, Neska Elhaouij, Wolfgang Minker, Javier Hernandez, and Rosalind W. Picard. "EmpathicGPS: Exploring the Role of Voice Tonality in Navigation Systems during Simulated Driving." In Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, CHI EA '20. Association for Computing Machinery, New York, NY, USA, 1-7, 2020. Link.

- Sebastian Zepf, Monique Dittrich, Javier Hernandez, and Alexander Schmitt. "Towards Empathetic Car Interfaces: Emotional Triggers while Driving." In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems, CHI EA '19. Association for Computing Machinery, New York, NY, USA, Paper LBW0129, 1-6, 2019. Link.

- Daniel Lopez-Martinez, Neska Elhaouij, and Rosalind Picard. "Detection of Real-World Driving-Induced Affective State Using Physiological Signals and Multi-View Multi-Task Machine Learning." 8th International Conference on Affective Computing and Intelligent Interaction Workshops and Demos, ACIIW, pp. 356-361, 2019. Link.

- Nataliya Kosmyna, Caitlin Morris, Thanh Nguyen, Sebastian Zepf, Javier Hernandez, and Pattie Maes. "AttentivU: Designing EEG and EOG compatible glasses for physiological sensing and feedback in the car." In Proceedings of the 11th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI '19, pp. 355-368, 2019. Link.

Members

Neska Elhaouij

Principal Investigator

Rosalind W. Picard

Co-Principal Investigator


Javier Hernandez

Research Affiliate

Nataliya Kosmyna

Postdoctoral Fellow

Vincent Chen

Research Affiliate


Judith Amores

Research Affiliate

Kyung Yun Choi

Ph.D. Student

Craig Ferguson

Software Developer


Jinmo Lee

(Formerly) Hyundai

Sebastian Zepf

(Formerly) Daimler

Diego Muñoz

NTT Data Corporation

Collaborators

Current

Previous

Videos

AutoEmotive: Bringing Empathy to the Driving Experience

With recent developments in sensing technologies, it's becoming feasible to comfortably measure several aspects of emotions during challenging daily life situations. This work describes how the stress of drivers can be measured through different types of interactions, and how the information can enable several interactions in the car with the goal of helping to manage stress. These new interactions could help not only to bring empathy to the driving experience but also to improve driver safety and increase social awareness. Read more

"Stress-free" Multimodal Sensing while Driving

This work explores sensing opportunities to monitor drivers without creating additional stress. Building on our prior work, this demo showcases three areas for potential in-car sensing: 1) Dashboard camera: facial-expression estimation with the Affdex SDK and heart-rate estimation with software we have developed; 2) Steering wheel: instrumented with a biosensor that captures 3-axis accelerometer data to estimate steering-wheel angle and a biosensor that captures skin conductance; and 3) Smart seatbelt: captures subtle chest vibrations to estimate heart rate and breathing rate with the Global Vitals SDK.

R.E.A.D System

The R.E.A.D (Real-time Emotion Adaptive Driving) System was developed by Kia and researchers from the MIT Media Lab as part of the academic research collaboration between Hyundai Motor Company (HMC) and the Media Lab. It uses artificial intelligence and sensors to customize the in-cabin experience based on the passenger's emotional state. The system first captures the passenger's biosignals to determine their affective state, and then adjusts the in-car environment (such as seat vibration, smell, light, and sound) to help improve their navigation experience. Read more
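To make the sense-classify-actuate pattern above concrete, here is a minimal illustrative sketch in Python. This is not the actual R.E.A.D implementation; all function names, signals, thresholds, and actuation mappings are hypothetical and chosen only to show the shape of such a pipeline.

```python
# Illustrative sketch (not the R.E.A.D System's code): a toy
# sense -> classify -> actuate loop. Thresholds are hypothetical.

def classify_affective_state(heart_rate_bpm: float, skin_conductance_us: float) -> str:
    """Toy rule-based classifier over two biosignals."""
    if heart_rate_bpm > 100 or skin_conductance_us > 10.0:
        return "stressed"
    if heart_rate_bpm < 55:
        return "drowsy"
    return "calm"

def choose_interventions(state: str) -> dict:
    """Map the estimated state to hypothetical in-cabin actuations
    (light, sound, scent, seat vibration)."""
    return {
        "stressed": {"light": "soft blue", "sound": "slow-tempo", "scent": "lavender"},
        "drowsy":   {"light": "bright", "sound": "up-tempo", "seat": "vibration"},
        "calm":     {},  # no change needed
    }[state]

# One pass through the loop with example sensor readings:
state = classify_affective_state(heart_rate_bpm=108, skin_conductance_us=12.5)
plan = choose_interventions(state)
```

In a real system, the rule-based classifier would be replaced by a learned model over richer biosignals, and the actuation plan would be applied gradually and personalized per passenger.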

Emotion Navigation for Kids - Little Big e-Motion

As part of the academic research collaboration between Hyundai Motor Company (HMC) and the Media Lab, HMC prototyped a unique ride-on kid's car that uses some of the Emotion Navigation special interest group's technology to support young patients during their hospital experience. A user study is currently being conducted at the SJD Barcelona Children's Hospital in Spain to evaluate the efficacy of the vehicle and its technologies in reducing patients' stress and anxiety levels. Read more

Contact Us

We are always looking for opportunities to collaborate and for support for this research.
If you are interested in learning more or would like to collaborate, please send an e-mail to:
Neska Elhaouij


Copyright 2021 - Affective Computing Group at MIT Media Lab