Full citation
Cang, X. L., "Towards an Emotionally Communicative Robot: Feature Analysis for Multimodal Support of Affective Touch Recognition," M.Sc. Thesis, University of British Columbia, 2016.
Abstract
Extracting human affective state from touch interaction takes advantage of the natural communication of emotion through physical contact, enabling applications such as robot therapy, intelligent tutoring systems, and emotionally reactive smart technology. This work focused on the emotionally aware robot pet context and produced a custom, low-cost piezoresistive fabric touch sensor at 1-inch taxel resolution that accommodates the flex and stretch of the robot in motion. Using established machine learning techniques, we built classification models of social and emotional touch data. We present an iteration of the human-robot interaction loop for an emotionally aware robot through two distinct studies and demonstrate gesture recognition at roughly 85% accuracy (chance: 14%).
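The abstract does not specify the classification pipeline; as a rough illustration of the kind of gesture recognition described, the minimal Python sketch below extracts simple window statistics from synthetic taxel-pressure frames and trains a random forest. The seven-class setup is inferred from the reported 14% chance rate, and the feature set and classifier choice are assumptions, not the thesis's actual method.

```python
# Illustrative sketch only: the thesis reports "established machine
# learning techniques" on touch-sensor data. The window statistics and
# random-forest classifier here are assumed stand-ins for that pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

RNG = np.random.default_rng(0)

def window_features(frames):
    """Summarize a (time, rows, cols) window of taxel pressures."""
    return np.array([
        frames.mean(),                    # average pressure
        frames.max(),                     # peak pressure
        frames.var(),                     # pressure variability
        (frames > frames.mean()).mean(),  # fraction of high readings (crude contact proxy)
    ])

# Synthetic stand-in for labelled gesture windows: 7 gesture classes
# (a 14% chance rate suggests seven classes), 10x10 taxel grid, 50 frames.
X = np.array([window_features(RNG.random((50, 10, 10)) + label * 0.1)
              for label in range(7) for _ in range(40)])
y = np.repeat(np.arange(7), 40)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```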
The first study collected social touch gesture data (N=26) to assess data quality of our custom sensor under noisy conditions: mounted on a robot skeleton simulating regular breathing, obscured under fur casings, and placed over deformable surfaces.
Our second study targeted affect with the same sensor, wherein participants (N=30) relived emotionally intense memories while interacting with a smaller stationary robot, generating touch data imbued with one of four target emotions: Stressed, Excited, Relaxed, or Depressed. A feature space analysis triangulating touch, gaze, and physiological data highlighted the dimensions of touch that suggest affective state.
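As an illustration of what a feature-space analysis over touch dimensions might look like, the sketch below ranks hypothetical touch features by how well they separate the four affective states using an ANOVA F-score. The feature names and data are invented for the example, and the thesis's actual analysis additionally triangulates gaze and physiological channels.

```python
# Hypothetical sketch: rank touch features by class separation across the
# four affective states. The ANOVA F-score ranking is an assumed stand-in
# for the thesis's feature-space analysis, not its actual procedure.
import numpy as np
from sklearn.feature_selection import f_classif

feature_names = ["mean_pressure", "peak_pressure", "pressure_var",
                 "contact_area", "stroke_speed"]
states = ["Stressed", "Excited", "Relaxed", "Depressed"]

rng = np.random.default_rng(1)
# Synthetic feature matrix: 4 affective states x 30 samples x 5 features.
X = np.vstack([rng.normal(loc=i, size=(30, len(feature_names)))
               for i in range(len(states))])
y = np.repeat(np.arange(len(states)), 30)

f_scores, _ = f_classif(X, y)
for name, score in sorted(zip(feature_names, f_scores),
                          key=lambda pair: -pair[1]):
    print(f"{name:>15}: F = {score:.1f}")
```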
To close the interactive loop, we had participants (N=20) evaluate researcher-designed breathing behaviours on 1-DOF robots for emotional content. Results demonstrate that these behaviours can display human-recognizable emotion as perceptual affective qualities across the valence-arousal emotion model. Finally, we discuss the potential impact of a system capable of emotional "conversation" with human users, referencing specific applications.
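For reference, the four target emotions correspond one-to-one to the quadrants of the valence-arousal (circumplex) model; the coordinates in this small sketch are schematic placements, not values measured in the thesis.

```python
# Schematic quadrant placement of the four target emotions on the
# valence-arousal model; signs follow the standard circumplex layout.
VALENCE_AROUSAL = {
    "Excited":   (+1, +1),  # positive valence, high arousal
    "Stressed":  (-1, +1),  # negative valence, high arousal
    "Relaxed":   (+1, -1),  # positive valence, low arousal
    "Depressed": (-1, -1),  # negative valence, low arousal
}

def quadrant(emotion):
    """Return the schematic (valence, arousal) quadrant for a target emotion."""
    return VALENCE_AROUSAL[emotion]

print(quadrant("Relaxed"))  # (1, -1)
```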
SPIN Authors
Cang, X. L.
Year Published
2016