Full citation
Cang, X.L., Bucci, P., Rantala, J., & MacLean, K.E., "Discerning Affect from Touch and Gaze During Interaction with a Robot Pet," IEEE Transactions on Affective Computing, 2021, pp. 1-15.
Abstract
Practical affect recognition needs to be efficient and unobtrusive in interactive contexts. One approach to a robust real-time system is to sense and automatically integrate multiple nonverbal sources. We investigated how users' touch, and secondarily gaze, perform as affect-encoding modalities during physical interaction with a robot pet, in comparison to more-studied biometric channels. To elicit authentically experienced emotions, participants recounted two intense memories of opposing polarity in Stressed-Relaxed or Depressed-Excited conditions. We collected data (N=30) from a touch sensor embedded under robot fur (force magnitude and location), a robot-adjacent gaze tracker (location), and biometric sensors (skin conductance, blood volume pulse, respiration rate). Cross-validation of Random Forest classifiers achieved best-case accuracy for combined touch-with-gaze approaching that of biometric results: where training and test sets include adjacent temporal windows, subject-dependent prediction was 94% accurate. In contrast, subject-independent Leave-One-Participant-Out predictions resulted in 30% accuracy (chance 25%). Performance was best where participant information was available in both training and test sets. Addressing computational robustness for dynamic, adaptive real-time interactions, we analyzed subsets of our multimodal feature set, varying sample rates and window sizes. We summarize design directions based on these parameters for this touch-based, affective, hard real-time robot interaction application.
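The sketch below illustrates (it is not the authors' code or data) the subject-independent evaluation scheme the abstract describes: Leave-One-Participant-Out cross-validation of a Random Forest classifier over per-window multimodal features. The feature matrix, labels, and group IDs are synthetic placeholders standing in for the touch, gaze, and biometric features extracted per temporal window.

```python
# Minimal sketch (assumed, not the authors' pipeline) of Leave-One-Participant-Out
# cross-validation of a Random Forest classifier, as described in the abstract.
# All data below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_participants, windows_per_participant, n_features = 30, 40, 12

# Placeholder feature matrix: one row per temporal window
# (e.g., touch force/location, gaze location, biometric statistics).
X = rng.normal(size=(n_participants * windows_per_participant, n_features))
# Four emotion labels (Stressed, Relaxed, Depressed, Excited) -> chance = 25%.
y = rng.integers(0, 4, size=len(X))
# Participant ID per window; each fold holds out one participant entirely.
groups = np.repeat(np.arange(n_participants), windows_per_participant)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"Subject-independent accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

Grouping windows by participant is what distinguishes the subject-independent setting from the subject-dependent one: when adjacent windows from the same participant can appear in both training and test folds, accuracy rises sharply (94% here), whereas fully held-out participants yield near-chance performance (30% vs. 25% chance).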
SPIN Authors
Year Published
2021