Immersive Emotions

Human-Computer Interaction
Billy Kwok, Dongho Koo, Salih Berk Dinçer, Tee O'neill & Felicia Renelus
Background
As part of a Tangible User Interface course, my team and I created a product that helps users understand their emotions: body movements are translated into a visual representation of those emotions.
My Contributions
As someone with a background in Cognitive Science, I led research on body language and emotion, along with research into color theory and neuroaesthetics, to help map specific body movements to specific image displays.
Immersive Emotions is an interactive public installation that helps users understand their emotions and how their body language might be contributing to how they are feeling. Data from biosensors embedded in a glove (a temperature sensor, a heart rate sensor, and an accelerometer) is mapped to a display. A camera also tracks the user's movements and maps their body language to a visual representation of their emotions.
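To make the sensor-to-display pipeline concrete, here is a minimal sketch of how glove readings could drive display parameters. The function name, thresholds, and parameter names are illustrative assumptions, not the installation's actual implementation.

```python
def map_sensors_to_display(temperature_c, heart_rate_bpm, accel_magnitude):
    """Translate raw biosensor readings into simple display parameters.

    All ranges and mappings below are assumed for illustration:
    higher heart rate -> more saturated visuals, warmer skin -> warmer
    hues, faster movement -> faster animation.
    """
    clamp = lambda x: min(1.0, max(0.0, x))
    saturation = clamp((heart_rate_bpm - 60) / 60)   # 60-120 bpm -> 0..1
    hue_warmth = clamp((temperature_c - 30) / 7)     # 30-37 C -> 0..1
    animation_speed = clamp(accel_magnitude / 20.0)  # m/s^2, assumed cap
    return {
        "saturation": saturation,
        "hue_warmth": hue_warmth,
        "animation_speed": animation_speed,
    }
```

In this sketch the three sensor channels stay independent, so each reading can be tuned or swapped out without affecting the others.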
Context


Based on my research, wide and round movements correlated with feelings of happiness, while narrow and round movements correlated with feelings of sadness. Rigid but wide movements can signify anger, and rigid, narrow movements can signify fear or timidity. Using principles of color theory, these emotions were mapped to appropriate colors based on neuroaesthetic findings.
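The two movement qualities described above (width and rigidity) can be sketched as a simple lookup. The emotion labels follow the research summary in the text; the specific hex colors are assumptions standing in for the actual neuroaesthetic palette.

```python
def classify_movement(is_wide, is_rigid):
    """Map two movement qualities to an emotion, per the research summary."""
    if is_wide and not is_rigid:
        return "happiness"   # wide, round movements
    if not is_wide and not is_rigid:
        return "sadness"     # narrow, round movements
    if is_wide and is_rigid:
        return "anger"       # rigid but wide movements
    return "fear"            # rigid, narrow movements

# Hypothetical emotion-to-color palette; the real colors came from
# color theory and neuroaesthetic findings, these are placeholders.
EMOTION_COLORS = {
    "happiness": "#FFD34D",  # warm yellow
    "sadness": "#4D7EA8",    # muted blue
    "anger": "#C0392B",      # saturated red
    "fear": "#6C5B7B",       # desaturated purple
}
```

A display could then color its visuals with `EMOTION_COLORS[classify_movement(...)]`, keeping the movement classification and the palette separately tunable.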

As users interact with the installation, they see a live visual and can explore how different movements generate different images. The visuals are meant to be pleasing, but also informative.




Immersive Emotions was selected to participate in the Mobile HCI Conference 2022. Our paper can be read here.