UBC Theses and Dissertations
From devices to data and back again : a tale of computationally modelling affective touch
Cang, Xi Laura
Abstract
Emotionally responsive Human-Robot Interaction (HRI) has captured our curiosity and imagination throughout much of modern media. Touch is a valuable emotion communication channel, yet one sorely missed when in-person interaction is impractical, and machine-mediated touch could bridge that distance. In this thesis, we investigate in two parts how we might enable machines to recognize natural and spontaneous emotional touch expression.

First, we take a close look at how machines engage with human emotion by examining them in three emotionally communicative roles: as a passive witness receiving and logging the emotional state of their (N=30) human counterparts, as an influential actor whose own breathing behaviour alters human fear response (N=103), and as a conduit for the transmission of emotion expression between human users (N=10 dyads and N=21 individuals).

Next, we argue that for devices to be truly emotionally reactive, they must address the time-varying, dynamic nature of lived emotional experience. Any emotion recognition engine intended for realistic conditions should acknowledge that emotions evolve over time: machine responses may change with changing ‘emotion direction’, for instance acting encouragingly when the user is ‘happy and getting happier’ but presenting calming behaviours when they are ‘happy but getting anxious’. To that end, we develop a multi-stage emotion self-reporting procedure for collecting N=16 users’ dynamic emotion expression during videogame play. From the force of the keypresses controlling their in-game character, we benchmark individualized recognition of emotion direction, finding that it can even exceed recognition from brain activity (as measured by continuous electroencephalography (EEG)). As a proof of concept of a training process that generates models of true, spontaneous emotion expression which evolve with the user, we then revise our protocol to better accommodate naturalistic emotion expression, building a custom tool for collecting and labelling personal storytelling sessions and evaluating user impressions of it (N=5, up to 3 stories each, 10 sessions in total).

Finally, we conclude with actionable recommendations for advancing the training and machine recognition of naturalistic, dynamic emotion expression.
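The record does not include the thesis's modelling code, but the core idea of recognizing ‘emotion direction’ from keypress force can be illustrated. Below is a minimal, hypothetical Python sketch, assuming continuous valence self-reports aligned with windows of keypress-force samples; the feature set, the slope threshold, and the random-forest classifier are illustrative assumptions, not the pipeline used in the thesis.

```python
# Hypothetical sketch: derive 'emotion direction' labels from continuous
# valence self-reports, then train a per-user classifier on keypress-force
# windows. All names, thresholds, and model choices are assumptions for
# illustration, not the thesis's actual method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def direction_labels(valence, threshold=0.05):
    """Label each report by its trend: rising, falling, or steady valence."""
    slopes = np.gradient(valence)
    return np.where(slopes > threshold, "rising",
                    np.where(slopes < -threshold, "falling", "steady"))

def force_features(window):
    """Summarize one window of keypress-force samples."""
    return [window.mean(), window.std(), window.max(),
            np.percentile(window, 90), np.abs(np.diff(window)).mean()]

# Stand-in for one user's data: force windows aligned with self-reports.
rng = np.random.default_rng(0)
force_windows = rng.normal(1.0, 0.3, (200, 100))  # 200 windows x 100 samples
valence = np.cumsum(rng.normal(0, 0.1, 200))      # continuous self-report

X = np.array([force_features(w) for w in force_windows])
y = direction_labels(valence)

# Individualized model: trained and scored within a single user's data.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())
```

The within-user cross-validation mirrors the individualized benchmarking described in the abstract: each model is trained and evaluated on one user's own data rather than pooled across users.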
Item Metadata
Title | From devices to data and back again : a tale of computationally modelling affective touch
Creator | Cang, Xi Laura
Publisher | University of British Columbia
Date Issued | 2024
Language | eng
Date Available | 2024-04-29
Provider | Vancouver : University of British Columbia Library
Rights | Attribution-NonCommercial-NoDerivatives 4.0 International
DOI | 10.14288/1.0442005
Degree Grantor | University of British Columbia
Graduation Date | 2024-05
Scholarly Level | Graduate
Aggregated Source Repository | DSpace