UBC Theses and Dissertations
Towards an emotionally communicative robot : feature analysis for multimodal support of affective touch recognition
Cang, Xi Laura
Abstract
Human affective state extracted from touch interaction takes advantage of the natural communication of emotion through physical contact, enabling applications such as robot therapy, intelligent tutoring systems, emotionally reactive smart technology, and more. This work focused on the emotionally aware robot pet context and produced a custom, low-cost piezoresistive fabric touch sensor at 1-inch taxel resolution that accommodates the flex and stretch of the robot in motion. Using established machine learning techniques, we built classification models of social and emotional touch data. We present an iteration of the human-robot interaction loop for an emotionally aware robot through two distinct studies and demonstrate gesture recognition at roughly 85% accuracy (chance: 14%). The first study collected social touch gesture data (N=26) to assess the data quality of our custom sensor under noisy conditions: mounted on a robot skeleton simulating regular breathing, obscured under fur casings, and placed over deformable surfaces. Our second study targeted affect with the same sensor, wherein participants (N=30) relived emotionally intense memories while interacting with a smaller stationary robot, generating touch data labelled with one of four emotional states: Stressed, Excited, Relaxed, or Depressed. A feature-space analysis triangulating touch, gaze, and physiological data highlighted the dimensions of touch that suggest affective state. To close the interactive loop, we had participants (N=20) evaluate researcher-designed breathing behaviours on 1-DOF robots for emotional content. Results demonstrate that these behaviours can display human-recognizable emotion as perceptual affective qualities across the valence-arousal emotion model. Finally, we discuss the potential impact of a system capable of emotional “conversation” with human users, referencing specific applications.
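The gesture-recognition result above (roughly 85% accuracy against a 14% chance rate, implying about seven gesture classes) comes from classification models the abstract describes only as "established machine learning techniques." The sketch below is a rough illustration of that kind of pipeline, not the thesis's actual method: the 10x10 taxel grid, the summary features, the gesture labels, and the use of scikit-learn's random forest are all assumptions made for the example.

```python
# Minimal sketch of a touch-gesture classifier over fabric-sensor pressure frames.
# Assumptions (not from the thesis): a 10x10 taxel grid, simple per-frame summary
# features, hypothetical gesture labels, and a random forest as the model.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

GESTURES = ["stroke", "pat", "scratch", "rub", "tickle", "poke", "no-touch"]  # 7 classes ~ 14% chance

def frame_features(frames):
    """Collapse a (time, rows, cols) pressure sequence into a fixed-length feature vector."""
    per_frame = frames.reshape(len(frames), -1)   # flatten each 10x10 frame to 100 taxels
    return np.concatenate([
        per_frame.mean(axis=0),                   # mean pressure per taxel over the capture
        per_frame.max(axis=0),                    # peak pressure per taxel
        [per_frame.sum(axis=1).std()],            # temporal variation of total contact
    ])

# Synthetic stand-in data: 200 gesture captures, 50 frames each, on a 10x10 grid.
rng = np.random.default_rng(0)
X = np.stack([frame_features(rng.random((50, 10, 10))) for _ in range(200)])
y = rng.integers(len(GESTURES), size=200)         # random labels, so accuracy stays near chance

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

With real labelled captures in place of the synthetic data, the same cross-validation loop is how a chance-versus-model comparison like the 85%-versus-14% figure would typically be reported.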
Item Metadata

Title | Towards an emotionally communicative robot : feature analysis for multimodal support of affective touch recognition
Creator | Cang, Xi Laura
Publisher | University of British Columbia
Date Issued | 2016
Genre |
Type |
Language | eng
Date Available | 2016-09-02
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-NoDerivatives 4.0 International
DOI | 10.14288/1.0314100
URI |
Degree |
Program |
Affiliation |
Degree Grantor | University of British Columbia
Graduation Date | 2016-11
Campus |
Scholarly Level | Graduate
Rights URI |
Aggregated Source Repository | DSpace