Full citation
Cang XL, Guerra RR, Guta B, Bucci P, Rodgers L, Mah H, Feng Q, Agrawal A, MacLean K. "FEELing (key)Pressed: Implicit Touch Pressure Bests Brain Activity in Modelling Emotion Dynamics in the Space Between Stressed and Relaxed." IEEE Transactions on Haptics. 2023 Sep 4.
Abstract
In-body lived emotional experiences can be complex, with time-varying and dissonant emotions evolving simultaneously; devices responding in real-time to estimate personal human emotion should evolve accordingly. Models assuming generalized emotions exist as discrete states fail to operationalize valuable information inherent in the dynamic and individualistic nature of human emotions. Our multi-resolution motion self-reporting procedure allows the construction of emotion labels along the Stressed-Relaxed scale, differentiating not only what the emotions are, but how they are transitioning – e.g., “hopeful but getting stressed” vs. “hopeful and starting to relax”. We trained participant-dependent hierarchical models of contextualized individual experience to compare emotion classification by modality (brain activity and keypress force from a physical keyboard), then benchmarked classification performance at F1-scores = [0.44, 0.82] (chance F1 = 0.22, σ = 0.01) and examined high-performing features. Notably, when classifying emotion evolution in the context of an experience that realistically varies in stress, pressure-based features from keypress force proved to be the more informative modality, and the more convenient one when considering intrusiveness and ease of collection and processing. Finally, we present our FEEL (Force, EEG and Emotion-Labelled) dataset, a collection of brain activity and keypress force data, labelled with self-reported emotion collected during tense videogame play (N=16) and open-sourced for community exploration.
Year Published
2023

Projects