Affecting Affect Effectively
Investigating a haptic-affect platform for guiding physiological responses

by
Joseph P. Hall
B.Sc., Columbia University in the City of New York, 2008

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF
MASTER OF APPLIED SCIENCE
in
The Faculty of Graduate Studies
(Mechanical Engineering)

THE UNIVERSITY OF BRITISH COLUMBIA
(Vancouver)

May 2011

© Joseph P. Hall 2011

Abstract

This thesis describes the development of a platform for touch-guided anxiety management via engagement with a robot pet. An existing physiological sensor suite and "Haptic Creature" robot pet are modified to influence user physiological responses through real-time interaction guided by physiological data. Participant reaction to and perception of the platform is then investigated in several experiments, with the results from these experiments used to refine the platform design. Finally, an experiment is conducted with elementary school children to investigate the ability of the platform to serve as a comforting presence during a stressful task.

It is found that participants were not able to recognize the Creature mimicking their breathing and heart rates. However, once informed of their physiological link to the Creature they were able to use the motion of this device to gain a better awareness of their own physiological state. In addition, the presence of the Creature and its activities are correlated with changes in heart rate, breathing rate, skin conductance, and heart rate variability. These changes are suggestive of a reduction in anxiety. Overall, participant response to the platform was positive, with many participants reporting that they felt the Creature to be comforting and calming. Children in particular were receptive to the Creature, and eager to use it in their stressful environment of school testing. It is found that care must be taken, however, to ensure the platform is presented in an age-appropriate manner, as sudden changes in Creature state can be alarming to the user.

The combination of physiological assessment of user affect with a small, physically comforting robot results in a unique system with the potential to serve as a companion or training aide for children or adults with anxiety disorder, especially in clinical and educational settings.

Preface

Experiments 1, 2, and the pilot experiment in this thesis were performed under UBC BREB certificate no. H01-80470. Experiment 3 was performed under UBC CREB certificate no. H09-02860.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Figures
Glossary
1 Introduction
  1.1 Motivation
  1.2 Research Objectives
  1.3 Thesis Outline
2 Literature Review
  2.1 Robotic Companions
  2.2 Robotic Therapy with Children
  2.3 Biofeedback and Anxiety Therapy
  2.4 Haptics and Affect
  2.5 Physiological Assessment of Emotional State
  2.6 Physiological Interaction with Robots
  2.7 Summary
3 Methods and System Design
  3.1 General Approach and Methods
    3.1.1 Creature Introduction
    3.1.2 Creature Development
  3.2 Hardware Additions and Modifications
    3.2.1 Design Considerations and Challenges
    3.2.2 Additional Display Mechanisms
    3.2.3 Creature Electronics Board
  3.3 Communications: Command and Control
    3.3.1 Design and Construction of Radio System
    3.3.2 Creature User Interface
    3.3.3 Creature Modes
  3.4 Feedback from Testing
    3.4.1 Vibration and Noise
    3.4.2 Temperature / Cooling
    3.4.3 Comfort
  3.5 Online Physiological Assessment
    3.5.1 Physiological Signals
    3.5.2 Physiological Sensors Used
    3.5.3 Sensor Application Notes
4 Experiments
  4.1 Pilot Experiment: Response to Disturbing Images
    4.1.1 Introduction and Motivation
    4.1.2 Experimental Design Considerations
    4.1.3 Research Questions
    4.1.4 Experiment Procedure
    4.1.5 Results
    4.1.6 Discussion
    4.1.7 Conclusions
    4.1.8 Feedback for Iterated Design
  4.2 Experiment 1: Recognition of Mirroring and Initial Reactions to Creature
    4.2.1 Research Questions
    4.2.2 Experiment Procedure
    4.2.3 Results
    4.2.4 Discussion
    4.2.5 Conclusions
    4.2.6 Feedback for Iterated Design
  4.3 Experiment 2: Creature Entraining and Reactions During a Task
    4.3.1 Research Questions
    4.3.2 Procedure
    4.3.3 Results
    4.3.4 Discussion
    4.3.5 Conclusions
    4.3.6 Feedback for Iterated Design
  4.4 Experiment 3: Experiment with Children
    4.4.1 Research Questions
    4.4.2 Experimental Procedure
    4.4.3 Results
    4.4.4 Additional Investigation with the Creature
    4.4.5 Discussion
    4.4.6 Conclusions
    4.4.7 Feedback for Iterated Design
  4.5 Reflections on Results
5 Conclusions and Recommendations
  5.1 Experimental Outcomes
  5.2 Methodological Critique and Recommendations
    5.2.1 Platform Presentation
    5.2.2 Platform Interaction
  5.3 Platform Design
    5.3.1 Outcomes
    5.3.2 Recommendations
  5.4 Conclusion
Bibliography
Appendices
A Derivations
  A.1 Creature Physiological Mirroring Derivations
    A.1.1 Derivation of Ramped Breathing Motion Commands
    A.1.2 Derivation of Ramped Pulse Rate
    A.1.3 Derivation of Ramped Breathing Motion Commands
    A.1.4 Derivation of Ramped Pulse Rate [Simplified Motion]
  A.2 Physiological Sensor Data Analysis Methods
B Experiment Documents
  B.1 Preliminary Experiment
    B.1.1 Pre-Experiment Questionnaire
    B.1.2 Post-Experiment Questionnaire
    B.1.3 Sample Data
    B.1.4 Sample Comparisons
    B.1.5 Participant Consent Form
  B.2 Experiment 1
    B.2.1 Post-Experiment Questionnaire
    B.2.2 Data Tables
    B.2.3 Sample Data
    B.2.4 Sample Comparisons
    B.2.5 Participant Consent Form
  B.3 Experiment 2
    B.3.1 Post-Experiment Questionnaire
    B.3.2 Data Tables
    B.3.3 Sample Data
    B.3.4 Sample Comparisons
    B.3.5 Participant Consent Form
  B.4 Experiment 3
    B.4.1 Sample Data
    B.4.2 Sample Comparisons
    B.4.3 Participant Consent Form
    B.4.4 Participant Assent Form
  B.5 Experiment Equipment
C Schematics
  C.1 Radio Base Station Schematics
  C.2 Creature Board Schematics
D Code

List of Tables

3.1 Heart rate variability frequencies.
4.1 Pilot Experiment: Self-reported Likert-scale responses to anxiety, agitation, and surprise.
4.2 Summary of significant results from Pilot Experiment.
4.3 Table of results from Experiment 1 questionnaire.
4.4 Summary of results from Experiment 1. Significant results are in bold.
4.5 Summary of significant results from Experiment 1.
4.6 Questionnaire results from Experiment 2 post-experiment survey.
4.7 Summary of results from Experiment 2.
4.8 Summary of significant results from Experiment 2.
4.9 Summary of significant results from Experiment 3.
B.1 Table of results from Experiment 1 questionnaire.
B.2 Results for two-tailed unequal variance t-test between breath lengths for each subject between all stages.
B.3 Results for two-tailed unequal variance t-test between series of interbeat intervals for each subject between all stages.
B.4 Table of results from Experiment 2 questionnaire.
B.5 Results for two-tailed unequal variance t-test between series of interbeat intervals for each subject between all stages.
B.6 Questionnaire results from Experiment 2 post-experiment survey.
B.7 Summary of results from Experiment 3. Investigated columns in green, significant results are in bold. See Figure B.81 for comparisons.
D.1 Haptic Creature communications protocol.

List of Figures

1.1 Proposed Creature–user interaction model.
1.2 User-centered diagram of TAMER model.
3.1 Simplified schematic of the Haptic Creature interaction loop; an example of a haptic-affect loop.
3.2 TAMER command and control scheme.
3.3 Diagram showing development of Haptic Creatures.
3.4 The Haptic Creature.
3.5 The Haptic Creature, upside-down, with fur removed, showing silicone skin.
3.6 Haptic Creature pulse mechanism.
3.7 Overview of main functions of Creature electronics board.
3.8 The Haptic Creature electronics board.
3.9 Creature force sensitive resistor circuit.
3.10 Simplified diagram of TAMER command and control scheme.
3.11 The radio base station for the Creature.
3.12 GUI for the Haptic Creature, providing motor, servo, and temperature status.
3.13 Graph of Haptic Creature internal temperature during normal use.
3.14 Overview of measured physiological signals and the physiological metrics derived from them.
3.15 User holding the Haptic Creature and wearing physiological sensors.
3.16 Thought Technology FlexComp™ Infiniti Encoder.
3.17 Thought Technology EKG™ Sensor T9306M, attached to triode electrodes for placement on chest.
3.18 Thought Technology EMG MyoScan-Pro™ Sensor T9401M-60.
3.19 Thought Technology Skin Conductance Sensor SA9309M.
3.20 Thought Technology Blood Volume Pulse (BVP) Sensor SA9308M, front and rear.
3.21 A sample unfiltered blood volume pulse signal, showing four heartbeats.
3.22 Thought Technology Respiration Sensor SA9311M.
3.23 A sample filtered respiration signal.
3.24 Calculation of respiration rate.
3.25 Thought Technology Temperature Sensor SA9310M, showing sensor and connector to encoder.
3.26 A sample skin temperature signal.
4.1 "Wizard of Oz" Haptic Creature Prototype used in pilot experiment, showing bellows used to simulate breathing and heating pad.
4.2 Diagram of Pilot Experiment procedure.
4.3 Preliminary experiment participant responses to statement "Haptic Creature was comforting while viewing the images."
4.4 Preliminary experiment participant responses to statement "Haptic Creature's actions were distracting while viewing the images."
4.5 Preliminary experiment participant responses to statement "Haptic Creature would help reduce my anxiety in other situations."
4.6 Average normalized skin conductance response for disturbing image slideshow with and without Haptic Creature prototype for each participant.
4.7 Typical normalized skin conductance response for a participant during calming image set, the baseline.
4.8 Typical normalized skin conductance response for a participant during disturbing image slideshow.
4.9 Diagram of Experiment 1 procedure.
4.10 Experiment 1 participant responses to statement "I found the creature comfortable on my lap."
4.11 Experiment 1 participant responses to question of whether "creature's actions made them more aware of their own."
4.12 Experiment 1 participant responses to statement "It was easy to recognize creature mirroring my. . . ".
4.13 Breath lengths of a participant during Experiment 1.
4.14 Diagram of Experiment 2 procedure.
4.15 Ramped Creature motion, as used during experiments.
4.16 Breath lengths and heart rate for a participant during stage 2 of Experiment 2.
4.17 Experiment 2 participant responses to survey statement "I was aware of the creature's breathing."
4.18 Experiment 2 participant responses to survey statement "I was aware of the creature's pulse."
4.19 Experiment 2 participant responses to survey statement "The creature's breathing made me more aware of my own breathing."
4.20 Experiment 2 participant responses to survey statement "The creature's pulse made me more aware of my own heart rate."
4.21 Experiment 2 participant responses to survey statement "I found the creature's motion distracting during the reading assignment."
4.22 The Experiment 3 procedure diagram.
4.23 Experiment 3 participant during experiment.
B.1 Heart rate acceleration for a participant during the Pilot Experiment.
B.2 Heart rate for a participant during the Pilot Experiment.
B.3 Normalized skin conductance for a participant during the Pilot Experiment.
B.4 Skin conductance derivative for a participant during the Pilot Experiment.
B.5 Normalized EMG for a participant during the Pilot Experiment.
B.6 Mean heart rate for participants during Pilot Experiment.
B.7 Standard deviation of normalized heart rates for participants during Pilot Experiment.
B.8 Mean normalized heart rate acceleration for participants during Pilot Experiment.
B.9 Mean normalized skin conductance for participants during Pilot Experiment.
B.10 Mean normalized derivative of skin conductance for participants during Pilot Experiment.
B.11 Mean normalized EMG for participants during Pilot Experiment.
B.12 Mean estimated arousal for participants during Pilot Experiment.
B.13 Heart rate acceleration for a participant during Experiment 1.
B.14 Normalized heart rate acceleration for a participant during Experiment 1.
B.15 Heart rate for a participant during Experiment 1.
B.16 Normalized heart rate for a participant during Experiment 1.
B.17 Normalized heart rate standard deviation for a participant during Experiment 1.
B.18 Skin conductance response for a participant during Experiment 1.
B.19 Normalized skin conductance for a participant during Experiment 1.
B.20 Skin conductance derivative for a participant during Experiment 1.
B.21 Normalized skin conductance derivative for a participant during Experiment 1.
B.22 EMG for a participant during Experiment 1.
B.23 Normalized EMG for a participant during Experiment 1.
B.24 Breath lengths for a participant during Experiment 1.
B.25 Mean breath lengths for participants during Experiment 1.
B.26 Breath length standard deviation for participants during Experiment 1.
B.27 Mean heart rate acceleration for participants during Experiment 1.
B.28 Heart rate acceleration standard deviation for participants during Experiment 1.
B.29 Mean skin conductance for participants during Experiment 1.
B.30 Skin conductance standard deviation for participants during Experiment 1.
B.31 Mean and standard deviation of breath lengths of participants during each stage of Experiment 1.
B.32 Mean and standard deviation of heart rate for participants during Experiment 1.
B.33 Heart rate acceleration for a participant during Experiment 2.
B.34 Normalized heart rate acceleration for a participant during Experiment 2.
B.35 Heart rate for a participant during Experiment 2.
B.36 Normalized heart rate for a participant during Experiment 2.
B.37 Normalized heart rate standard deviation for a participant during Experiment 2.
B.38 Skin conductance response for a participant during Experiment 2.
B.39 Normalized skin conductance for a participant during Experiment 2.
B.40 Skin conductance derivative for a participant during Experiment 2.
B.41 Normalized skin conductance derivative for a participant during Experiment 2.
B.42 EMG for a participant during Experiment 2.
B.43 Normalized EMG for a participant during Experiment 2.
B.44 Skin temperature for a participant during Experiment 2.
B.45 Breath lengths for a participant during Experiment 2.
B.46 Standard deviation of breath lengths for all participants during Experiment 2.
B.47 Mean breath length for all participants during Experiment 2.
B.48 Mean heart rate for participants during Experiment 2.
B.49 Mean heart rate standard deviation for participants during Experiment 2.
B.50 High frequency component of heart rate variability during Experiment 2 for all participants for all stages.
B.51 Mean skin conductance for participants during Experiment 2.
B.52 Mean derivative of skin conductance for participants during Experiment 2.
B.53 Mean EMG for participants during Experiment 2.
B.54 Mean skin temperature for participants during Experiment 2.
B.55 Heart rate acceleration for a participant during Experiment 3.
B.56 Normalized heart rate acceleration for a participant during Experiment 3.
B.57 Heart rate for a participant during Experiment 3.
B.58 Normalized heart rate for a participant during Experiment 3.
B.59 Normalized heart rate standard deviation for a participant during Experiment 3.
B.60 Skin conductance response for a participant during Experiment 3.
B.61 Normalized skin conductance for a participant during Experiment 3.
B.62 Skin conductance derivative for a participant during Experiment 3.
B.63 Normalized skin conductance derivative for a participant during Experiment 3.
B.64 EMG for a participant during Experiment 3.
B.65 Normalized EMG for a participant during Experiment 3.
B.66 Skin temperature for a participant during Experiment 3.
B.67 Breath lengths for a participant during Experiment 3.
B.68 Mean heart rate standard deviation for participants during Experiment 3.
B.69 Mean heart rate pnn50 for participants during Experiment 3.
B.70 Mean skin conductance for participants during Experiment 3.
B.71 Mean skin conductance standard deviation for participants during Experiment 3.
B.72 Mean skin temperature for participants during Experiment 3.
B.73 Mean skin temperature standard deviation for participants during Experiment 3.
B.74 Mean breath length for participants during Experiment 3.
B.75 Heart rate vlf% for participants during Experiment 3.
B.76 Mean derivative of skin conductance standard deviation for participants during Experiment 3.
B.77 Mean breath length standard deviation for participants during Experiment 3.
B.78 Mean heart rate for participants during Experiment 3.
B.79 Mean heart rate rms standard deviation for participants during Experiment 3.
B.80 Mean derivative of skin conductance for participants during Experiment 3.
B.81 Summary of comparisons made during Experiment 3.
B.82 Diagram of TAMER command and control scheme.
C.1 The radio base station board.
C.2 The radio base station schematic.
C.3 The Creature board board.
C.4 The Creature board schematic.

Glossary

affect (n): emotion or desire, esp. as influencing behavior or action
affect (v): produce an effect on, influence
alpha (α): The probability of falsely rejecting a true hypothesis.
BVP: blood volume pulse
Creature: The Haptic Creature
ECG: Electrocardiogram
effect (n): a result, consequence, impression
effect (v): bring about
EKG: Electrocardiogram
EMG: electromyogram
ELF: Extremely Low Frequency
HALO: Haptic-Affect Loop
Haptic Creature: A zoomorphic robotic companion for exploring haptic (touch-based) interaction.
HF: High Frequency
HR: heart rate
HRV: heart rate variability
ibi: interbeat interval
ICICS: Institute for Computing, Information and Cognitive Systems
LF: Low Frequency
p-value: The probability of obtaining a statistical result as extreme as the result obtained, assuming the null hypothesis is true.
pnn50: The sum of the number of successive interbeat intervals that differ by more than 50 ms, divided by the total number of interbeat intervals counted.
QRS complex: The series of deflections seen on an EKG during a heartbeat.
RMSSSD: The root mean squared standard deviation of heart rate.
TAMER: Touch-guided Anxiety Management via Engagement with a Robot Pet
TAMER Platform: The platform designed and constructed in this thesis.
SCR: skin conductance response, also known as galvanic skin response (GSR)
ST: skin temperature
USB: Universal Serial Bus
VLF: Very Low Frequency
zoomorphic: having or representing animal forms or gods of animal form
Chapter 1
Introduction

As robots become better able to infer and assess human affective state, there will be a range of opportunities for robots to assist us with specific physical tasks and with more general social needs, such as playing, entertaining, and skill training. Robots will be able to adapt their behavior based upon that of their operator [1]; in doing so they may also be able to influence his or her emotional state. A gentle touch or hug from a robot that determines you are sad could cheer you up, or decreased interruptions from a robot assistant that detects your happiness with its progress could improve task performance. With physiological sensing, there is even the potential that a robot could become aware of your feelings before you are [2]. A robot detecting an increase in your heart and breathing rate could intervene before you were consciously aware that you were becoming afraid or angry, allowing for reaction times faster than a human alone could achieve.

The platform constructed and experimentally verified in this thesis is designed to begin investigation into this link between robot behavior and user emotion. A small personal robot [3] is utilized as part of an interactive feedback loop incorporating integrated biofeedback from a user wearing physiological sensors. By investigating how the behaviors of a small personal robot can change a user's physiological state, the platform will be capable of reacting to and even guiding user physiological signals in order to produce an effect in the user.

1.1 Motivation

In this work, the proposed end-use application for this platform is as a companion robot for children with anxiety or emotion-related disorders, following the interaction model in Figure 1.1. Using biosensors to assess user emotional state, the platform would intervene when appropriate to encourage anxiety-therapy training and coping behaviors [4, 5]. Immune to fatigue, it would provide an untiring, uncompromising tool for a therapist, parent, or teacher, by reinforcing existing therapy techniques [6] and helping to enable their application in the non-clinical world. The haptic interaction channel allows the interruptions by the platform to be confidential, nonintrusive, and discreet [7] — a small stuffed animal would not seem out of place in a supportive classroom environment or home, nor unusual for a child to possess, and the sense of touch can provoke comforting reactions. This interaction is a variant on the haptic-affect loop principle proposed by MacLean, Croft, and McGrenere, in which a combination of haptic stimuli and physiological sensing is used to manipulate user affective state.

Figure 1.1: Proposed Creature–user interaction model.

The overall system block diagram is shown in Figure 1.1, which along with Figure 1.2 represents the concept described by MacLean, Garland, Croft, Van der Loos and O'Brien in earlier proposals. It describes the platform as envisioned in its eventual use. Input from the user is gathered both by touch sensing on the companion robot and from physiological sensors worn by the user. An anxiety assessment is derived from this data: touch sensors are interpreted by a gesture recognition engine that identifies how the user is holding or stroking the creature (e.g., light petting, hard squeezing, ...), and physiological data by an inference engine. This estimate of anxiety is then used to drive the robot's response model. A response rendering engine enforces transitions between commanded response states and coördinates the companion robot's mechanisms so as to depict a coherent presentation of its biomimetic behaviors. Offline interaction is provided by the therapist to download both user physiological data and interaction information to assess therapy performance and to upload new therapy protocols as the user progresses. The overall platform is called TAMER, for Touch-guided Anxiety Management via Engagement with a Robot pet.

Figure 1.2: User-centered diagram of TAMER model.
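To make the division of labor in Figure 1.2 concrete, the following minimal sketch restates the loop in Python. It is illustrative only: the class, function, and field names are hypothetical rather than taken from the platform code in Appendix D, and the numeric constants are arbitrary placeholders, not values used by the platform.

```python
# Hypothetical sketch of the Figure 1.2 data flow (sense -> assess -> respond).
# Names and constants are illustrative; they do not come from the thesis code.
from dataclasses import dataclass


@dataclass
class SensorFrame:
    heart_rate: float        # beats per minute, from the BVP or EKG sensor
    breath_length: float     # seconds per breath, from the respiration sensor
    skin_conductance: float  # microsiemens, from the skin conductance sensor
    touch_pressure: float    # normalized force-sensitive-resistor reading


def recognize_gesture(frame: SensorFrame) -> str:
    # Stand-in for the touch/gesture recognition engine.
    return "hard_squeeze" if frame.touch_pressure > 0.8 else "light_petting"


def estimate_anxiety(frame: SensorFrame) -> float:
    # Stand-in for the physiological inference engine: a toy 0..1 score.
    arousal = ((frame.heart_rate - 60.0) / 60.0
               + frame.skin_conductance / 20.0
               + max(0.0, 4.0 - frame.breath_length) / 4.0)
    return max(0.0, min(1.0, arousal))


def render_response(anxiety: float, gesture: str) -> dict:
    # Stand-in for the response rendering engine: choose coherent breathing
    # and pulse targets for the Creature's biomimetic displays.
    if anxiety > 0.6 or gesture == "hard_squeeze":
        return {"breaths_per_min": 8, "pulse_bpm": 55}   # slow, calming display
    return {"breaths_per_min": 12, "pulse_bpm": 70}      # neutral mirroring


def haptic_affect_loop_step(read_sensors, send_to_creature):
    # One pass of the loop; in practice this would run continuously.
    frame = read_sensors()
    command = render_response(estimate_anxiety(frame), recognize_gesture(frame))
    send_to_creature(command)
```

In the platform itself, commanded breathing and pulse displays are ramped between states rather than switched abruptly (see the derivations in Appendix A.1); the sketch captures only the sense-assess-respond structure.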
Before the TAMER platform can be used for therapy purposes, it is necessary to determine what effect the presence of the platform has on user physiology, and if this effect can be measured or influenced. Only after such an effect has been determined can an investigation into the guidance of user affect begin. This thesis, therefore, describes the construction and testing of a platform that can support the entire model as shown in Figures 1.1 and 1.2, beginning with previously existing robot and physiological sensing platforms, and takes a significant step forward in verifying the platform through experimental observation of the platform's effect on participant physiological signals. Specifically, an existing robot, the Haptic Creature [8], and physiological sensing platform [9] were modified and improved to function as part of the TAMER platform. The loop, excepting the therapist and touch sensing boxes from Figure 1.2, and with physiological measures related to anxiety simply measured in lieu of a full anxiety assessment engine, is then tested and verified. The platform construction is guided by feedback from pilot studies, design iterations, and the verification experiments.

1.2 Research Objectives

The overall research objectives are as follows:

1. Determine user physiological reactions to both the presence of and motion of a companion robot, particularly when the companion robot is imitating the user's physiological state.

2. Determine whether physiological reactions can be provoked in a user through manipulation of the companion robot's heart rate and breathing mechanisms.

To achieve these objectives, the following contributions were required:

1. Construction of a platform consisting of a small companion robot and physiological sensors that is capable of measuring physiological data from a user and providing haptic feedback that could evoke a physiological response. This platform is based on existing robot and physiological sensing platforms.

2. Verification and improvement of the functionality of this platform through iterated design and participant feedback.

1.3 Thesis Outline

Following a literature review, this thesis presents the design process of the TAMER platform followed by the experimental design, protocol, and outcome of testing, and finally the conclusions and recommendations toward realizing the vision of a therapeutic robot tool for anxiety therapy.

Chapter 2: An outline of recent literature in the subject areas relevant to the TAMER platform. Research related to robotic companions, robotic therapy, biofeedback and anxiety therapy, haptics and affect, physiological assessment of emotional state, and physiological interaction with robots is discussed.

Chapter 3: A description of the design and construction of the TAMER platform and details of its components: the companion robot, the "Haptic Creature," and integration of the physiological sensing system.

Chapter 4: The experiments performed with the TAMER platform and their results. Four main experiments were performed. The first was a pilot experiment to determine the feasibility of the platform. The second and third investigated the initial user reactions to the Haptic Creature and attempted to determine whether users could recognize it mirroring their physiological state, as well as the reactions to the Creature when the user was performing a task. The final experiment began to investigate reactions to the Creature during a task with child participants.
Chapter 5: The conclusions from the construction and experimental testing of the TAMER platform and recommendations for improvements and future work.

Chapter 2
Literature Review

As the TAMER platform incorporates research from a number of different areas, this literature review will serve as a broad overview of the motivations for the TAMER platform and a summary of the existing work and technology that have been incorporated in it, including basic science and engineering research in psychology and haptics. First, robotic companions and the uses of robots in therapy are discussed, followed by some of the technology and techniques used in their applications. Techniques specific to biofeedback and anxiety are discussed, as well as the general link between haptics and affect. Finally, work in physiological assessment of emotional state is presented, as well as the use of this assessment or other physiological data in interaction with robots.

2.1 Robotic Companions

Around the turn of the twenty-first century, personal robots were introduced to the North American consumer market. From Japan came the Sony AIBO [10], a robotic pet dog capable of learning and responding to verbal commands. Tyco Inc. in the United States designed the Furby [11], a small furry electronic creature with the ability to move its eyes, ears, and even itself in response to human interaction. Through prolonged conversation and interaction with their owners, Furbies would appear almost as children in developing a growing command of the language around them, gradually mumbling less and less in their own gibberish language and more in their owner's. Now, for the first time, advances in computer and artificial learning technology hold the potential for robots to refine their interactions with us in a way resembling the development of a friendship or companionship.

The AIBO, as a commercial robotic pet, was the focus of a large amount of research investigating whether owners would react to this device like a loyal fireside-accompanying plastic pet or more as they would a television. Friedman et al. investigated how several hundred AIBO owners described their devices on an online web forum. Although 75 percent of the owners attributed "technological essences" to the creature, 60 percent still described it as having some sort of "Mental States," in particular believing it capable of having intentions and feelings [12]. Owners spoke of their AIBO "wanting" to do things, and of feeling "sad" or "happy" based upon both actions the owners had taken and the reactions of AIBO. Of particular importance to the concept of companionship was that 60 percent of owners attributed "social rapport" to their robotic canine, with 28 percent of posts describing an emotional connection that they had to their AIBO, and 26 percent expressing a sense of companionship with their plastic pup.

It is surprising that humans were able to direct towards a robot many of the same feelings they would have towards a living animal, especially a robot with such limited expressive and interactive ability. Although the AIBO, like the Furby, could respond to commands, there was little verbal communication other than barking, and no software algorithms attempting to emulate a greater emotional connection. Melson et al. took on the human-robot and human-animal comparison more directly, and investigated children's responses to both an AIBO and an Australian shepherd [13].
They found that while children were able to recognize that the AIBO was a robot and not an actual dog, they still treated it in dog-like ways, and "affirmed that it had mental states. . . sociality. . . and moral standing." About half of the students even thought AIBO was more like a dog than a computer. Later research revealed similar results in adults: "even while the person recognizes that AIBO is a technology, the person still affirms AIBO as a companion, and as a friend" [14]. It appears that although humans may recognize AIBO and other robotic pets as technological devices and not living creatures, they are still able to form the bonds of companionship with their robot, and with these can come health and social benefits.

Banks et al. showed that elder adults in a nursing home were again able to recognize that AIBO was a robot and not a real dog. Interaction with AIBO produced the same reduction in loneliness that interacting with an actual animal dog provided [15]. The nursing home residents showed a high level of attachment to both the living dog and the robotic impersonator, yet the level of attachment was not sufficient to explain the decrease in loneliness for either interaction, suggesting some additional attachment. Tamura et al., in research with adults with dementia who might not be able to distinguish the robot from an actual pet, found that residents would often look at, communicate with, and care for AIBO, and that this resulted in increased communication from the patients and improved well-being [16]. This finding supports the goal of the TAMER platform to capitalize upon these social links. It aims not to blindly reproduce some benefits of human-animal interaction, but rather to deliver targeted emotional and behavioral therapy through this medium.

The intriguing research prospects of these commercial devices led to the development of several robotic platforms strictly for research use. Instead of modifying commercial platforms designed for entertainment, these were engineered specifically to investigate the behavioral effects that robot animals could have on humans. The most prominent of these was Paro, a robotic baby harp seal developed by Shibata et al. [17]. This robot has the ability to move its eyelids, flippers, and neck, and displays sophisticated animal behaviors, such as responding to noises and sleeping. Paro is also equipped with reinforcement learning, responding to positive interactions such as gentle petting as well as negative interactions such as slapping. It can recognize and grow accustomed to its owner. During long-term interaction at a Japanese nursing home, Paro was shown not only to increase the amount of social interaction engaged in by residents with their peers, but also to reduce their stress levels and improve health, as measured by stress hormone levels [18]. Additional studies by Kidd et al. had American nursing home residents interact with Paro in a group, rather than in one-on-one settings, and saw improvements in community building among the members [19]. Robotic therapy is particularly appealing to nursing homes and hospitals where real animals may be banned for both health and hygienic reasons. The widespread use and exhibition of Paro have also allowed for cross-cultural comparisons of user impressions of robotic animals. In a recent study, Shibata et al.
found that westerners were more likely to attribute to Paro a "comfortable feeling like interacting with real animals," while users from Japan and Korea tended to attribute to Paro a "favorable impression to encourage interaction." They attribute this difference to cultural differences in relationships with animals [20]. The success of this robot in therapy has been so great that Paro is now being manufactured for commercial use, targeted to the elderly and those with dementia [21].

A concern noted in these experiments was confounding impressions of specific species. Not only may a user have different expectations from a robotic dog than a robotic seal, but two users from different backgrounds may have different expectations of proper "dog" behavior. To help avoid this, the robot companion in the TAMER platform is a zoomorphic creature, with animal-like characteristics but not resembling a specific animal. Unlike Paro, it communicates solely through the haptic channel, investigating user reactions to a robot designed to be a non-specific species.

Another robot specifically designed for therapy is the Huggable, a robotic stuffed teddy bear. This robot was designed by Stiehl et al. specifically to investigate touch interactions. It features the ability to move its neck, eyebrows, ears, and shoulders (in order to hug, hence the name) with fully compliant voice coil actuators. Its unique feature is a creature-wide sensitive skin to distinguish between various touching behaviors [22]. Much like Paro, the Huggable also contains a behavior system designed to increase companion behaviors. It has the ability to look into a person's face and recognize its owner [23]. More recent work proposes the use of the Huggable in pediatric care, either as a proxy to allow distant friends and family to interact with a child, or with the hope that a child will use the robotic bear as an "emotional mirror" of themselves, allowing doctors and nurses to receive valuable feedback that the child may be unwilling or unable to provide [24].

These robots both serve as an inspiration for and feed the design iteration of the robot companion for the TAMER platform, the Haptic Creature. It was designed by Yohanan et al. to investigate the emotional aspects of our touch interaction with animals [3]. The Haptic Creature has the ability to purr, breathe, heat up, and adjust its ears, and uses these behaviors in an attempt to determine which common behaviors typical of human-animal interaction we find pleasurable. Work is currently underway to determine the emotional states users attribute to various Creature behaviors [8], and how to utilize these to affect the emotional state of the user.

In addition to robots designed for full-body touch interaction, there are several robots designed to investigate the potentials of human-robot companionship through primarily visual or audio means. In keeping with the theme of hugs, the Huggable Robot Probo was designed "as a tele-interface for entertainment, communication, and medical assistance" [25]. The robot has the appearance of something like a robotic green anteater, with a long nose, and is capable of actuating most of its face, neck, and trunk to display various emotional states. Research is currently ongoing to map its facial expressions to emotional states [26]. The iCat is a small yellow robot consisting primarily of a large face that is capable of displaying a wide range of emotions and coherently changing between emotional states [27].
The iCat is intended to investigate what emotional states, as displayed through facial expressions, are to be expected from such a robot in long-term engagement, and which would best be able to maintain engagement with the creature, thereby building a relationship. Work is currently ongoing in measuring participant engagement with the creature [28]. The Kismet robot by Breazeal is another expressive robotic face; it is designed to investigate the use of facial expressions in interacting vocally with users [29].

Unlike an animal, a robot has the potential to communicate with humans in their own language, and there are many potential uses for a robot that essentially acts as the embodiment of a voice to develop a relationship with users. Heerink et al. investigated presenting the iCat, equipped with conversation skills, to elderly nursing home residents [30]. They found that while the residents were generally excited and interested to interact with the robot, the conversation activity might have been too oriented towards assisting and not sufficiently enjoyable. They hypothesized that viewing the iCat as an assistant rather than a companion would lead to less than expected utilization of the robot. Kanda et al. experienced similar results when using robots to help teach English in Japanese elementary schools [31]. After the initial excitement from introduction of the robot faded, interaction with the robot fell off markedly. However, those who continued to interact with the robot showed signs of improved English skills. While a robot designed for entertainment may not be successfully adapted to a role in therapy, a therapeutic robot must maintain a degree of entertainment and companionship in order to attract repeated, long-term engagement and use.

The TAMER platform aims to build upon these examples of robotic companions to create an engaging device with therapeutic benefits. People are capable of developing a pet-like attachment towards their robots, and the technology exists to construct small personal robots with various interaction devices. In this system we endeavor to leverage this emotional connection to manipulate user affect and feelings.

2.2 Robotic Therapy with Children

Much research into robotic companions and robotic therapy has involved the use of children, the target audience of the TAMER platform. Children, like the elderly, often have a variety of special needs that require care, and they are generally receptive to robots under the guise of a new toy. Several robots and applications of robots to therapy are described here that, like the TAMER platform, target children and attempt to influence child behavior.

Through play, robots may have the ability to elicit emotions more reliably and repeatedly than a human caregiver. Kozima et al. developed Keepon, a small, bright yellow robot that resembles a snowman, with the ability to orient its eyes on a target and bob or rock around to display emotion [32]. Keepon was designed primarily for toddlers and babies, and its creators found that children were able to develop a steady emotional reaction to it. They hypothesized that by appearing so different from a human but "perceiving and acting" as we do, Keepon "motivates children to explore and communicate with it," a necessity for human-robot interaction. Their intended use for this robot is to promote interaction and engagement in children with autism spectrum disorder. Plaisant et al.
developed a robot that, rather than communicating with the child, attempts to enable the child to better communicate with others [33]. The child controls the robot via arm-bands and a headset, and the robot attempts to promote "therapeutic play," either by having the child make the robot display emotions appropriate to a story, to gain awareness of those emotions, or by having the robot perform rewarding tasks when certain motion goals that are part of physical therapy are met. They note the advantage of a robot controlled by a child in education and therapy: giving a child the ability to have control over part of his or her environment prevents frustration and encourages success. Kronreif et al. developed the PlayROB, a robot that assists disabled children in assembling LEGO™ structures by handling the bricks. They found that children were able to quickly adapt to using the system, and to build LEGO™ structures that they may not have otherwise had the physical ability to assemble [34].

Many applications from the use of robotics in rehabilitation have also been found applicable to children. Out of many: Cook et al. used a robotic arm to assist children with motor development disabilities in gaining motor control [35], and Krebs et al. successfully used a robot exoskeleton to assist children with cerebral palsy in developing proper walking motion as well as muscle strength [36].

Robotic therapy is particularly well-suited to provide benefits to children who are developmentally disabled, particularly those with autism spectrum disorder. These children are often unable to express their emotions and can have difficulties communicating; this often leads to rapid frustration in a social environment. A robot can adapt its behavior to a child's emotional state through the use of physiological sensing to access a communications channel unavailable to humans, and can use machine learning techniques to correlate the child's activity and signals with emotional states.

Dautenhahn et al. developed Robota, a humanoid robotic doll capable of moving its legs, arms, and head [37]. This was part of the Aurora project, the goal of which was to study the role of robots in autism therapy. Robota was used in an attempt to develop interaction skills in children with autism. The robot was programmed both to dance to music and to react to the pressing of controls on a control pad by moving its limbs. They found that the robot was able to become a source of interaction for the children, a device about which they could communicate with their teacher. More importantly, after they had become comfortable interacting with the robot, they were then interested in communicating with the creator of the robot, who was a stranger to them [38]. Salter et al. developed a small spherical robot called Roball for interacting with young children with autism [39]. The robot is designed to resemble a ball to facilitate ease of play, and research is ongoing into how best to adapt the ball's behavior to children's actions in order to maximize engagement and attention. Liu et al. mounted a basketball hoop on a typical industrial pick-and-place robot to develop an engaging video game for children with autism [40]. They developed a basketball-shooting game with three levels of difficulty by varying the motions of the robot arm. They then attempted to maximize user engagement and liking through the use of physiological sensors to detect emotional state. They found that they were effectively able to increase child liking of their game session through the use of physiological feedback.
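Systems of this kind generally reduce the raw sensor streams to a handful of summary features per time window and hand them to a standard classifier. The snippet below is a generic, hypothetical illustration of that pattern using scikit-learn; it is not the method used by Liu et al. or elsewhere in this thesis, and every feature value and label in it is invented for the example. It is broadly in the spirit of the support-vector-machine approach to child emotion recognition discussed in Section 2.5.

```python
# Hypothetical illustration of physiology-based affect classification.
# Feature rows: [mean heart rate (bpm), mean skin conductance (uS),
#                mean breath length (s)]; values and labels are invented.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X_train = np.array([
    [62.0, 2.1, 4.8],   # example "relaxed" window
    [64.0, 2.4, 4.5],   # example "relaxed" window
    [88.0, 7.9, 2.6],   # example "anxious" window
    [92.0, 8.4, 2.3],   # example "anxious" window
])
y_train = ["relaxed", "relaxed", "anxious", "anxious"]

# Standardize the features, then fit a support vector classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
model.fit(X_train, y_train)

new_window = np.array([[85.0, 6.8, 2.9]])   # features from a new time window
print(model.predict(new_window))            # e.g. ['anxious']
```

In practice the features would be computed over sliding windows of the physiological recordings, and the labels would come from a calibration or training phase rather than being supplied by hand.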
Robins et al. caution, however, that robots designed for the target audience of children with autism in particular must be careful not simply to encourage interaction with the robot, but instead to use the robot as a tool to eventually encourage interaction with other humans [41].

Leveraging this finding of receptivity to robotic therapy and interaction by young children, the primary user group of the TAMER platform will be children. Guidance into the interaction loop from therapists and child educators will help ensure that while time with the Creature and platform is playful and fun, it also provides important therapeutic benefits.

2.3 Biofeedback and Anxiety Therapy

Physiological training exercises such as yoga and other meditation have long been used for calming purposes. These approaches, however, require careful and repeated training under the supervision of an instructor to be effective. For a novice practitioner, often the concentration needed to achieve anxiety reduction cannot be established in the very anxiety-inducing situations in which they would wish these techniques to be effective.
Patient reported outcomes included reductions in anxiety and anger and improved sleep; participants found the feedback device to be more helpful than meditation and yoga [6]. Reiner also found that patients who were most compliant with the monitoring and training reported the greatest benefits. These results suggest the prospect that a biofeedback enabled robot, such as the Haptic Creature, could take on the role of either tutor, by training users to reduce their anxiety using its breathing mechanism, or alerter, by making users aware of their current state of heightened anxiety. 2.4 Haptics and Affect Herteinstein states that “touch is capable of communicating valenced and discrete emotions as well as specific information” [7]. Touch can be used both to communicate and to elicit emotions. The simple act of touching has been shown to be capable of influencing user ac- tions and opinion. Although an often under-considered element of human-robot interaction, recent attention has been focused on how to reproduce communicative touch in robots. 11 2.4. Haptics and Affect Various studies have confirmed that touch, even unnoticed, can have a profound impact over our behavior. Fisher et al. found that the brief touch from a librarian handing back a library card produced an increase in positive opinion of the librarian and library [47]. This effect held even when the participant was consciously unaware of the touch. Willis et al. asked passers-by on a campus and in a mall to sign a petition or complete a survey, respectively — they found that combining the request with a casual touch almost dou- bled participant compliance [48]. It is not only the touch by a human that can induce these effects: Vormbrock et al. found that touching a dog is correlated with changes in blood pressure [49], and Shiloh et al. found that touching rabbits and turtles reduced state- anxiety [50]. Touching the toy versions of the animals, however, did not have a similar effect on anxiety. Several more recent studies, however, have confirmed that artificial, active touch can provoke positive reactions. Haans et al. investigated whether an armband with vibrotactile actuators could produce the same increase in altruism and compliance associated with human touch; they found that both man and machine had similar success rates [51]. Touching can even make a robot seem more humanlike: Cramer et al. found that proactive robots seemed less machine-like when they touched users [52], but also that a user’s opinion of touch was influenced by robot behavior: touching reactive robots made them seem less dependable. Tactile pleasure should be of concern in designing interaction devices. Salminen et al. investigated the responses to stimulation by a fingerprint friction stimulator: stimuli rated as unpleasant, arousing, dominating, and less approachable produced faster reaction times than those considered more pleasant [53]. Swindells et al. observed physiological reactions to operation of a haptic knob along with emotional reports, concluding that “analyzing both affective and performance measures together is crucial for good design” [54]. Both the effect of haptics on affect and the effect of affect on haptic use have been investigated through several haptic devices that attempt to communicate or influence user emotions. The intimate nature of touch in relationships has inspired several researches to see if mechanical devices can substitute for interpersonal touch. Smith et al. 
concluded that users were able to communicate emotion through knobs during various tasks [55]. Chang et al. developed the Lumitouch, a pair of linked picture frames designed to provide a sense of presence across distance. A frame would light up when a user was in front of the partner frame: by touching the frame a user could cause colors to light up on the other frame; the colors varied depending upon the location, intensity, and duration of touch [56]. Couples found this generally appealing: several developed their own “haptic language” for remote communication. Mueller et al. invented the “Hug over a Distance,” in which couples could wirelessly activate an inflatable vest in their partner, simulating a hug [57]. This was generally well received, although thought impractical for every-day use [58]. The 12 2.5. Physiological Assessment of Emotional State TapTap was a similar device, essentially a haptic scarf designed to record and display touch interactions [59]. This device was proposed to enable a single user to provide therapeutic touch asynchronously to several people without the necessity of their presence. As touch is an important link to emotion, the TAMER platform aims to use this intimate channel to affect user’s emotional state. The haptic channel seems uniquely suited for this sort of task. Through it, the Haptic Creature will be able to unobtrusively display information and even communicate discreetly. 2.5 Physiological Assessment of Emotional State While biofeedback therapy may have used physiological measurement in order to adjust and moderate emotional responses in patients, these physiological metrics could also be used to assess emotional state. Humans are sophisticated enough that single-sensor metrics are not particularly generalizable to all emotional states, but with the advent of improved com- puter pattern recognition and machine learning techniques, it became possible to develop the online recognition of emotional state through physiological measurements. When even humans may have trouble reading the verbal and visual clues of their fellow humans, the use of non-conscious channels for emotional communication with robots appears ideal. Humans are not typically accustomed to openly and consciously assessing and sharing their feelings, and whereas body language, posture, and gaze may be difficult for a robot to assess directly, requiring sophisticated cameras and visual processing techniques, much work has been done in using small, simple physiological sensors for the assessment of emotional state. Picard et al. were among the first to apply the machine learning techniques that had been originally used for vocal and facial emotional analysis to physiological data [2]. They used psychological techniques to instill in participants 8 different emotions, and they achieved a success rate of 81 percent in recognizing these from blood volume pulse, skin conduc- tance, and respiration rate sensors. They state that at the time “there were doubts in the literature that physiological information shows any differentiation other than arousal level,” making this the first proof of concept of machine emotional recognition through physiological signals. Kim et al. attempted similar emotion recognition in children, using a support vector machine to classify emotional state based upon blood volume pulse, skin conductance, and skin temperature sensors. They were able to achieve a success rate of 78 percent in recognizing sadness, anger, and stress in users [60]. Wagner et al. 
attempted a more robust emotion classification system, using feature reduction to improve valence and arousal recognition in users strapped to electrocardiogram, skin conductance, respiration rate, and electromyography sensors and subject to emotion-inducing music. They were able to achieve 92 percent accuracy in identifying emotional state [61]. Kulić et al. attempted not to estimate discrete emotional state, but rather to develop 13 2.6. Physiological Interaction with Robots online recognition of a user’s valence and affect levels. They utilized a fuzzy-logic based inference engine to assess user arousal base upon electrocardiogram, skin conductance, and electromyography sensors, and use this as the basis for human-robot interaction [62]. They later refined their results using a Hidden Markov Model [63] to achieve an average recognition rate of 72 percent [9]. Liu et al. applied support vector machines to identify emotional state in children with autism [64], using this as the input for the robot basketball game mentioned previously. Theirs is unique in that they trained their system not by progressing the user through various emotional states, but by using both therapists and parents to assess emotional state of the children, who would not themselves be able to communicate this effectively. They were able to achieve a success rate of approximately 83 percent recognition, and improve the child’s liking of the game. Rani et al., in a recent summary of applying several machine learning techniques to a large data set, achieved an overall emotional classification success rate of 86 percent using support vector machines [65]. A major limitation with all these physiological assessment engines developed is that they are often not generalizable to every-day practical use, having been calibrated in specific, often sterile environments for specific uses. Bethel et al. caution that “research should focus on developing a diverse set of complimentary [sic] measures that capture the full range of human-robot interactions” [66]. Although the TAMER hardware and software support the ability to provide online assessment of physiological state through computer learning methods, the training of an assessment engine specifically for anxiety is beyond the scope of this thesis. 2.6 Physiological Interaction with Robots Despite the limitations of these physiological assessment engines, they have already been used to some success in human-robot interactions. Takahashi et al. used skin conductance sensors in an eating assistance robot for people with disabilities [67]. By measuring skin conductance response they were able to distinguish between erratic behavior that was under user control and that which was robot generated, and use this input to fine-tune their control algorithms. Itoh et al. utilized physiological sensing to reduce user stress when interacting with a large personal robot during interaction tasks [68]. If user stress raised above a certain value the robot would stop the activity and shake hands with the user. They found that subject stress was significantly reduced by the robot’s motion. Rani et al. demonstrated affect- based control of a robot: upon sensing an increase in anxiety from its user the robot would interrupt its own task and return to assist the user [69]. Hanajima et al. programmed a 14 2.7. Summary robot to reduce its speed of approach towards a user based upon skin conductance response and found that this improved subjective response to the robot [70]. Kulić et al. 
used their previously defined mentioned algorithms to assess user emotional response to slow, medium, and fast robot trajectories with various behaviors of approach towards the user [62]. They then analyzed this information to estimate user arousal during interaction with the robot, reducing the velocity of the robot when sensed arousal is high, as this could be a dangerous condition [71]. Such a reaction has promising applications in situations where humans must work in close proximity to a robot: a robot reacting to a user’s physiological indicators of danger could potentially stop much more quickly than if the user had to find and hit an emergency stop button. Thus, while a generalized emotion system is still some distance away, physiology-based input to a robot system has been shown a successful input to a robot control system to reduce stress related to human-robot interaction. The eventual goal of the TAMER platform is to have a robot that reacts to valence changes in the assessed level of a user’s emotional state of anxiety. Although at present, platform behavior and effectiveness are not based on aggregated sensor data, but rather simpler, single-sensor readings, that capability exists and can be implemented once the appropriate inference engines have been researched. 2.7 Summary While recent works have begun to apply physiological monitoring to human-robot interac- tion, the TAMER platform uniquely attempts to integrate this broad background of tech- nologies in order to guide physiological responses. Chapter 3 will describe how an existing companion robot and physiological sensing suite were combined to produce this platform. Incorporating biofeedback training techniques into a robotic companion has the potential to provide users with an untiring, consistent trainer for developing important coping tech- niques to deal with stress and anxiety. Results from Chapter 4 will show that users are able to successfully use the TAMER platform to mimic the breathing of the Haptic Creature. Experimental results will show that activation of the TAMER platform has a statistically significant relationship to a user’s physiological measures, even when the user is performing a separate task. Physiological sensing allows for a robot companion to be not simply an alerting mechanism, informing the user of his or her undesired physical and emotional re- actions to conditions, but also a teacher, targeting these conditions for reinforcement of the previously learned coping skills. The platform can potentially act as a proxy for a therapist who cannot always be present with the user, and at the same time gather physiological data for further analysis and feedback. For this children are an ideal target population, as they require constant and consistent reinforcement, but must receive such therapy without 15 2.7. Summary belittling and disparagement from their peers. Orienting cognitive based therapy through the haptic sense allows for non-intrusive and inconspicuous communication even in a social environment, while also using a channel that has been shown to have great effect on behav- ior and affect. Results from Section 4.3 will show that school-aged children are amenable to working with the Creature, and find interacting with it comforting and pleasurable. 
In addition, when the TAMER platform is used during computerized cognitive activities in school, it will be found to have a statistically significant relationship to physiological changes in the children, who typically enjoy having the companionship of the Haptic Creature during this stressful activity.

Chapter 3 Methods and System Design

The TAMER platform expands upon two existing technologies: a “Haptic Creature” and a physiological sensing suite. These are combined into a unified platform aimed at guiding user affect through haptic interaction, particularly for anxiety reduction. The design of this system proceeded in two main parts: the construction of a new Haptic Creature, with design modifications made to support this particular use, and modification of the sensor suite both to interface with the Creature and to record physiological data related to anxiety that could be used to drive the Creature. This chapter outlines the overall platform approach and methods in Section 3.1, the hardware modifications developed in Section 3.2, the TAMER command and control framework in Section 3.3, and the modifications made due to feedback from user testing in Section 3.4. Finally, in Section 3.5, the modified physiological sensing suite for use in the platform is presented.

3.1 General Approach and Methods

The TAMER platform pairs a robotic creature designed for haptic interactions with physiological sensors in order to guide physiological responses related to affect. Three main components are needed in order to effectively manipulate affect: a sensing suite to measure physiological signals, a response engine using this information, and an interaction device. These components function in a haptic-affect loop, of which the block diagram in Figure 3.1 is an instance. The sensing suite serves to provide online feedback for the platform. Physiological data from all available sensors are analyzed in real time by the response engine. When used to assess user affect, the response engine is trained using questionnaires or surveys from previous interactions. In the initial studies for this thesis the primary goal is not to manipulate affect directly, but rather to take the more preliminary step of attempting to manipulate user physiological indicators, such as breathing rate or heart rate. In this case the response engine is not trained to recognize specific emotional states, but rather commands actions directly from sensor output, and adapts based upon user response to these actions. The interaction device for this platform is the “Haptic Creature” described in Section 3.1.1; in this platform it serves as a robotic companion. It is desired that, through the coupling between user affect and robot actions, a genuine affection and sense of connection with the robot will be engendered in the user.

Figure 3.1: Simplified schematic of the Haptic Creature interaction loop; an example of a haptic-affect loop.

Figure 3.2 describes the overall TAMER platform. Physiological sensors attached to the user collect and transmit physiological data to the physiological sensor software. The sensor software, based on these data and the desired Creature behavior, sends motion commands to the Creature over radio, USB, Bluetooth, or an actual wire. Having received these commands, the Creature’s microcontroller activates the breathing, pulse, or heating mechanisms to perform the desired motions.
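To make this command path concrete, the following is a minimal Arduino-style sketch of how incoming motion commands might be dispatched on the Creature's microcontroller. The one-letter framing ('B' for a breathing-servo position, 'P' for a pulse, 'H' for heater on or off), the pin numbers, and the baud rate are illustrative assumptions, not the platform's actual protocol.

```cpp
#include <Arduino.h>
#include <Servo.h>

// Hypothetical firmware-side command dispatch. Framing (assumed): a command
// letter followed by an optional integer argument, e.g. "B 112\n" sets the
// breathing servo position, "P\n" triggers one pulse, "H 1\n" enables heating.
Servo breathingServo;
const int BREATHING_SERVO_PIN = 9;   // assumed pin
const int HEATER_PIN = 7;            // assumed pin (MOSFET gate driver)

void triggerPulse() {
  // Placeholder for the stepper-driven "tap": rotate forward, then back.
}

void setup() {
  Serial.begin(57600);               // illustrative rate for the radio-emulated serial link
  breathingServo.attach(BREATHING_SERVO_PIN);
  pinMode(HEATER_PIN, OUTPUT);
}

void loop() {
  if (Serial.available() == 0) return;
  char cmd = Serial.read();
  switch (cmd) {
    case 'B': {                      // breathing servo position, 0-180 degrees
      int pos = Serial.parseInt();
      breathingServo.write(constrain(pos, 0, 180));
      break;
    }
    case 'P':                        // single heartbeat "tap"
      triggerPulse();
      break;
    case 'H':                        // heater on/off
      digitalWrite(HEATER_PIN, Serial.parseInt() ? HIGH : LOW);
      break;
    default:                         // ignore whitespace and unknown bytes
      break;
  }
}
```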
In typical use these commands are breathing servo position data or a pulse command. At the same time, sensor data are sent by the microcontroller through radio, USB, Bluetooth or an actual wire to the Creature display software, which displays the sensor information. These components are described in the following sections. PHYSIO SOFTWARE CREATURE DISPLAY RADIO or USB or BLUETOOTH or WIRE PULSE or BREATHING or HEAT or EARS or PURR TEMPERATURE SENSORS MICROCONTROLLER TOUCH SENSORS USER CREATURE COMMANDS SENSOR DATA SENSORS USBCONVERTER SENSOR ENCODER CREATURE PHYSIOLOGICAL SENSORS HOST COMPUTER(S) PHYSIO DATA Figure 3.2: TAMER command and control scheme. 3.1.1 Creature Introduction The concept of the Haptic Creature was initially created by Yohanan and MacLean [72], who constructed a manually-actuated prototype version of the Haptic Creature followed by 18 3.1. General Approach and Methods a robotic version. The Haptic Creature used in this thesis is a second robotic version. It is similar to the original, but was intended not for fundamental research into the haptic ex- pression of emotion, but for the TAMER platform, and thus minor modifications were made for this application, under the supervision of Yohanan et al. and the author. Development of an entirely unique companion robot for the TAMER platform would have been outside the scope of this thesis. The robot used in this thesis incorporates a pulse mechanism and heating pads as additional display mechanisms, which the original did not have, as well as electronics systems designed for integration into the TAMER control loop. Some of the de- sign improvements made on this thesis’s robot have been brought back to the original. For the Creature in this thesis, physical construction of the shell, creature heating pad, pulse, breathing mechanism, ears, and fur was in collaboration with Yohanan et al., with several undergraduate student design groups. However, all electronics and communication proto- cols presented in this thesis are unique to this thesis and wholly the work of the author. A summary of this is shown in Figure 3.3. A description of the Haptic Creature development process follows.  “wizard of oz” prototype original creature second creature tim e revisions shell, mechanism concepts testing & developm ent concept verification + pulse + heating pads + TAMER    connectivity design improvements Figure 3.3: Diagram showing development of Haptic Creatures. The second Creature (green) is used as part of this thesis. 19 3.1. General Approach and Methods 3.1.2 Creature Development The robotic companion for this platform is the “Haptic Creature” (see Figures 3.4 and 3.5), developed by Yohanan and MacLean [72] to investigate affective touch in human- robot interaction (see Section 2.1). While much research has focused on the effect of robot appearance in interactions with humans, the Creature is innovative in that it is among the first to explore in depth our touch-based interactions with robots, and how the tactile qualities of a robot influence our perceptions of it. Such investigation is necessary: robots are no longer constructs of cold metal and motors in factories, where human contact would be dangerous, but have become smaller, more personal devices that interact with their users in more intimate ways. This research both draws from, and can serve as an aid to, the domains of human-animal and human-human touch interaction. 
In those fields it is typically difficult to eliminate the many confounding variables that are present in touch studies: touching from or being touched by other humans is almost always emotionally loaded, and perceptions of animal touch can be positively or negatively altered by previous experiences with them. The Creature allows for the individual components of human-animal interaction to be Figure 3.4: The Haptic Creature. 20 3.1. General Approach and Methods Figure 3.5: The Haptic Creature, upside-down, with fur removed, showing silicone skin. studied separately. Actions such as breathing, warmth, and purring can be emulated in isolation as well as combined in both natural and abnormal ways — this is much easier and more practical than, for example, training a cat to purr repeatedly, but not to move or breathe! By manipulating these individual components, a more complete model of how each contributes to our perception of the emotional “state” of the Creature can be developed. The relations between these actions and perceived emotional states should help to develop a more fundamental understanding of the affective nature of our touch interactions. As originally conceived by Yohanan et al. [72], the Creature had several main mech- anisms to interact with users: purring, breathing, ear display, and warmth. These were drawn from the actions typical of small domestic mammals, but designed to be zoomor- phic: resembling a generic animal more than any one species to avoid confounding effects. Care was also taken to ensure that the Creature’s display mechanisms were purely haptic, with minimal aural or visual components, to again reduce confounding effects and narrow investigative scope. The first version constructed by Yohanan et al. was a “Wizard of Oz” prototype (this version was utilized for use in the pilot experiment of the Haptic Affect Platform, described in Section 4.1) with all mechanisms present but with the breathing and ear display manually actuated by a human operator. Initial studies by Yohanan et al. [72] investigated how these mechanisms could be combined into coherent emotional states. They 21 3.2. Hardware Additions and Modifications found that participants were successful in identifying and distinguishing between the device asleep, content, happy, upset, and playing dead [72]. Following that testing a robotic version of the Creature was then constructed by Yohanan et al.. Several form factor and mechanism iterations followed leading to the version depicted in Figure 3.4, the first robotic model of the Haptic Creature, and that used in subsequent experiments by Yohanan et al. [3]. This version consists of a hard fiberglass shell with force sensitive resistors encompassing the structure to detect touch (separate research has been ongoing to classify these sensor inputs as common gestures, such as petting or striking [73]). The shell is covered with soft synthetic fur on all sides except the bottom, where a softer, felt-like fabric, like the abdomen of a dog or cat, is present. A servo mechanism moves the upper rear part of the abdomen to simulate a breathing motion; the mechanism is attached through springs to improve its compliance. Inside the Creature, a motor with an uneven weight attached to its shaft spins to emulate the vibrations characteristic of purring, with also a slight purring sound. The ears are constructed from the rubber bulb of a blood pressure cuff. 
A servo adjusts a valve connected to the outlet of the bulb to increase or decrease the rate of airflow from the cuff when squeezed, adjusting the perceived ear stiffness when squeezed. From this “base” design of the Creature, and following the preliminary studies reported in Section 4.1, an additional creature was constructed for use in the TAMER platform by the author in collaboration with Yohanan et al. [8] and undergraduate student teams. It includes the shell and mechanisms described above, and incorporates additional mechanisms and modifications necessary for use in the TAMER platform. Unless mentioned otherwise, all references to the Creature henceforth refer to this newer, second version, the robot integrated into the TAMER platform and used during the experiments in this thesis. While the original Creature and those elements mentioned above were developed by Yohanan et al., the modifications made to the additional Creature, in particular the electronics and the applications thereof, are unique contributions of this thesis. 3.2 Hardware Additions and Modifications The use of the Haptic Creature in this platform results from an important characteristic of the device revealed in initial prototypes: its calming potential. In casual interaction the warmth and gentle breathing sensation from the Creature were often perceived as com- forting. However, in order to fully investigate this behavior there were a number of chal- lenges and concerns to be addressed in modifying the platform from its original intended purpose of investigating affective touch. The Creature required additional robustness for longer-term operation in a less laboratory-like environment. Additional mechanical actu- ators were needed that, while staying within the solely haptic mode of interaction, could 22 3.2. Hardware Additions and Modifications better represent physiological states. As part of this thesis, electronics for motor power and control, sensor input, and communication were constructed, and a communications protocol to incorporate the Creature into the TAMER interaction loop, shown in Figure 3.1, was developed. Feedback from user testing was incorporated into the design process to both test and refine these hardware changes. In this section the design considerations and challenges inherent in the TAMER are described, followed by the details of the modifications made to the Creature’s display mechanisms, and electronics. The following sections describe the TAMER platform’s communications and control systems, and finally the refinements to these modifications based on feedback from user interactions. 3.2.1 Design Considerations and Challenges The two primary considerations in designing the Creature element of the TAMER platform were robustness and engagement. Robustness was a paramount design goal: the eventual usage environment for the TAMER platform includes home and school environments, where the Creature will be subject to the not-gentle handling of children. In these environments, it is expected that the Creature will be dropped, struck, and generally played with. It is necessary that the Creature be rugged enough to withstand this treatment, as well as to degrade gracefully in the event of failure, in a way that should not cause harm to the user. Compliance was necessary in Creature mechanisms — a child hugging the Creature could obstruct the motion of the breathing or pulse mechanisms, potentially causing too high a load on the servo or motor driving the mechanism. 
The Creature also had to be capable of surviving longer-term experiments of several hours or an entire school-day. In addition, the nature of this ultimate user group demanded consideration of the Creature’s engagement ability. While acting through channels of limited expressiveness, the Creature had to be initially intriguing to the user, inducing a desire for contact and interaction, and had to maintain this desire during long-term encounters, while not being so engaging as to distract the user from his or her ordinary tasks. To help foster this engagement a command and control framework that can be readily adapted to changing environments was necessary for the TAMER platform. The platform must be able to react quickly to short-term changes in physiological state, as well as subtly to longer-term user responses. It must also be capable of rapidly communicating interaction data, such as touch patterns, which are applicable to its present operation. The platform must be able to generate and store performance and interaction data for later analysis. As it is anticipated that experimental time with the ultimate user group may be limited, it was imperative that the experimenter be able to modify engagement parameters and Creature behavior quickly; therefore, the Creature hardware and software also had to be adjustable and reprogrammable. All of these parameters had to be fulfilled within the small size of the 23 3.2. Hardware Additions and Modifications present shell, and, for time and budget purposes, without a whole-scale revamping of the previously existing Creature mechanisms. All modifications had to support a robust and reliable device capable of withstanding repeating long-endurance experimental trials. The modifications made to achieve these goals are described in the following sections. 3.2.2 Additional Display Mechanisms In consideration of the primarily haptic nature of the device, the Creature’s expressive channels were limited to those which produced effects discernible by touch. In order to increase the Creature’s expressiveness, two additional display mechanisms were added to the Creature under the supervision of the author: a pulse mechanism to replicate the presence of a heartbeat, and heating pads to generate warmth. Pulse A pulse mechanism, designed and constructed by an Undergraduate Mechatronics Capstone Design Project Course team under the supervision of Yohanan et al. [8] and the author, was added to the Creature (see Figure 3.6). This expressive channel was well-suited to the TAMER platform for several reasons. As heart rate and heart beats are directly measured by the physiological sensor suite, this mechanism permits representation of a user’s heart beat in the Creature. Heart rate and heart rate variability are also linked to human affective state, in particular anxiety, therefore display or manipulation of this activity could potentially affect the user’s physiological state. Having a pulse also increases the “life-like” nature of the Creature in a way that maintains its zoomorphic behavior. Incorporating heart-rate into the Creature’s affect presentation allows the Creature to present its own emotional states with greater fidelity and higher accuracy; these more expressive emotional states could potentially allow for increased growth of user companionship with the Creature. Figure 3.6: Haptic Creature pulse mechanism. 24 3.2. Hardware Additions and Modifications The pulse mechanism consists of a bipolar stepper motor attached to a pulley. 
Two rods, one on each side, with a cork on the end, are attached to the pulley via a revolute joint. The rods pass through a support bracket near the sides of the Creature. As the stepper motor rotates the pulley, these brackets force the rods to move linearly outwards and inwards. A limit switch mounted near the pulley prevents over-rotation. The net effect of a rapid clockwise then counterclockwise motion (or vice-versa) of the stepper motor is to create a brief tap or “pulse” on the point impacted by the corks. The pulse mechanism is mounted transversely near the front of the Creature, approximately where its “neck” would be if it had one, and with fur on the Creature this mechanism produces a pulse locatable in the immediate area of the mechanism. It does not, however, produce a discernible tactile effect in any other area of the Creature. A maximum heart rate of approximately 160 beats per minute was achieved in bench testing. Heating Pads Many users responded positively to the warmth produced by a heating pad in the “Wizard of Oz” prototype Haptic Creature [72]. Therefore, three heating pads were added to the bottom of the Creature to reproduce this warmth. The heating pads are large, flat resistors that dissipate heat when voltage is supplied. They are not noticeably felt through the fur. When operated on 500 mA of current, heat from the pads is able to be felt through the Creature’s fur in approximately one minute. Feedback from DS18B20 1-wire digital thermometers mounted around the Creature’s shell can be used to monitor temperature levels, and deactivate the heating pads when the desired temperature is achieved. 3.2.3 Creature Electronics Board The Creature’s main electronic board was designed by the author to support the basic functionality necessary for the TAMER platform, while allowing for easy maintainability and upgradability. The board was designed to attach to the Arduino Mega, an “open-source electronics prototyping platform based on flexible, easy-to-use hardware and software” [74]. The mating of a custom board with an off the shelf component served to provide increased functionality, improved reliability, and easier maintainability of the control system. The use of the Arduino helped to reduce potential assembly and design errors in the microcontroller and its supporting hardware, which were among the most complex parts of the electronics. It also allowed for the system to be programmed in a free, open-source developer environment and programming language based on the common C programming language: this allows for future programmers without knowledge of assembly language to maintain the codebase. The Arduino is also able to be reprogrammed without the need to remove the chip or use special programmers. Full schematics of the electronics board, as well as sample code and 25 3.2. Hardware Additions and Modifications a parts list, can be found in Appendices C.1, C.2, and D. The Creature’s electronics board and its components are shown in Figures 3.7 and 3.8. Power Supply The power supply for the Creature comprises several components for delivering power at the voltages and currents necessary for its mechanisms. Power for the Arduino and control board components is supplied by 5 V and 3.3 V linear voltage regulators. Filter capacitors are placed as close as possible to all integrated circuit chips to reduce line noise. Power for the motors, servos, and heaters is supplied by three Dimension Engineering 25 W step down adjustable switching regulators [75]. 
Adjustable voltage regulators were required to allow for motors or servos to be replaced, as well as for overall current regulation of the heaters. Switching regulators were required both for their efficiency gains: they waste less power than traditional linear regulators, reducing overall current draw, and their reduced heat production: heat buildup is of concern in the small enclosed space of the Creature. Typical operation of the heaters at 10 V, the servos at 7.2 V, and the motors at 5 V allows for maximum current draws of 2.5 A, 3.47 A, and 5 A respectively, although in typical usage this total current draw is not reached. Power input to the Creature was provided by a 12 V wall power supply capable of supplying 5 A. The linear voltage regulators are low drop-out, allowing for microcontroller power and therefore radio communications to be maintained when input power is as low as 5.7 V. This is of particular importance when the provisions for internal powering of the Creature with a battery are utilized. Connectors are present to allow the Creature to be controlled by a 12 V NiMH battery, similar to that used in remote control cars, eliminating the need for a “tail” wire to the Creature. A Maxim MAX712CPE-ND battery charging chip and supporting circuitry allow for the battery to be charged from a wall outlet without disassembling the shell. Battery voltage can be monitored via the onboard microprocessor. Motor Controls The control board is capable of controlling one bipolar stepper motor and two bidirectional or four unidirectional DC motors. Motor control is provided by two Texas Instruments L293DNE dual H-bridges, each capable of supplying 1.2 A of continuous current, and uti- lizing integrated clamping diodes to prevent back emf. DC motor speed control is provided by PWM output at 64 kHz, 32 kHz, 8 kHz, 1 kHz, or 500 Hz. Heater control is provided by four p-channel MOSFETs capable of providing 9A of current each; typical control is on-off with hysteresis. 26 3.2. Hardware Additions and Modifications power motor controls motors sensor input temperature sensors touch sensors communications, command and control bluetooth, xbee radios Figure 3.7: Overview of main functions of Creature electronics board. Figure 3.8: Creature Board with power (orange), motor controls (purple), sensor input (yel- low), communications and command and control (blue), and temperature sensing (white) areas highlighted. 27 3.2. Hardware Additions and Modifications Sensor Inputs The control board supports acquisition of touch sensor data from the sixty-four force sen- sitive resistors (FSRs) arrayed around the Creature. There are four individual sensing circuits: each comprises sixteen FSRs attached connected to a single sixteen to one signal multiplexer. The output of that multiplexer is connected to a circuit as shown in Figure 3.9. Two of the multiplexers share an operational amplifier and digital potentiometer to reduce hardware requirements. Sensor output runs from 0 V to 2.5 V. For greater fidelity and to aid in sensor calibration a digital potentiometer, the 100 kΩ Maxim MAX5479EUD+-ND, is used as the resistor in the sensor circuit; in general, larger resistor values cause the FSRs to saturate less quickly but lose fidelity. Sensor operation was not addressed in this thesis, but resistor values were chosen so as to gain two to three amplitude levels of touch sensing per resistor. 
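To illustrate how one of these multiplexed banks might be scanned, the sketch below steps a sixteen-to-one multiplexer through its channels with four select lines and samples the amplified output on a single analog input. The pin assignments and settling delay are assumptions and do not reflect the board's actual wiring.

```cpp
#include <Arduino.h>

// Hypothetical scan of one bank of sixteen FSRs behind a 16:1 analog
// multiplexer. Four digital select lines choose the channel; the multiplexer
// (via the op-amp) feeds one analog input, spanning roughly 0-2.5 V.
const int SELECT_PINS[4] = {22, 23, 24, 25};  // assumed address lines S0-S3
const int MUX_OUT_PIN    = A0;                // assumed analog input

int fsrReadings[16];

void setup() {
  Serial.begin(57600);
  for (int i = 0; i < 4; i++) pinMode(SELECT_PINS[i], OUTPUT);
}

void loop() {
  for (int channel = 0; channel < 16; channel++) {
    for (int bit = 0; bit < 4; bit++) {
      digitalWrite(SELECT_PINS[bit], (channel >> bit) & 1);
    }
    delayMicroseconds(10);                    // let the multiplexer output settle
    fsrReadings[channel] = analogRead(MUX_OUT_PIN);  // 0-1023 counts at a 5 V reference
  }
  // Readings could now be thresholded into the two or three touch levels
  // mentioned above, or streamed to the host for gesture classification.
  delay(20);
}
```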
The sensor outputs are connected to the analog input pins of the microcontroller, while the digital lines controlling the multiplexers and digital potentiometers are connected to the digital input and output pins of the microcontroller. Control of the digital potentiometer is via the 3-wire SPI protocol. A triple-axis accelerometer with analog output, capable of sensing up to ±3 g, is also present on the control board.

Figure 3.9: Creature force sensitive resistor circuit; the amplifier output is $V_{out} = 2.5\,(1 - R_{digipot}/R_{FSR})$.

Microcontroller

The control board uses the Arduino Mega as its main controller. The Arduino Mega is a standalone microcontroller board containing an Atmel ATmega1280 AVR microcontroller running the Arduino bootloader. The microcontroller operates at 16 MHz, with 128 kB of Flash memory. It has 54 digital input and output pins, 14 of which can provide pulse width modulation (PWM) output, and 16 analog input pins, as well as 4 UART serial communications channels. The use of the Arduino board allowed for the microcontroller and its supporting devices to be connected with smaller solder traces than would be possible on a non-mass-manufactured board, allowing for a smaller overall footprint. The Mega also supports the SPI (Serial Peripheral Interface Bus) and I2C (Inter-Integrated Circuit) communications protocols.

Communications Equipment

There are several digital input and output methods provided by the control board. Primary input to the Creature is via the universal serial bus (USB) port on the Arduino Mega board; this is also the channel through which the microcontroller code and firmware are loaded and updated. Access to this port is somewhat difficult without disassembling the Creature’s shell; therefore, a “tail” consisting of a short USB cable and power cord surrounded by fur is typically attached to the Creature. A Digi XBee® 802.15.4 RF module allows for wireless radio communication between the radio base station (see Section 3.3.1) and the Creature. The range of the XBee has been experimentally measured at greater than twenty meters, line of sight, which is more than sufficient for typical operations. A Bluegiga Bluetooth communications module (WRL-08771) allows for Bluetooth communication between the Creature and a Bluetooth-enabled computer. Both radio communication devices emulate serial ports on the Creature and the host computer; typical communications speed is 57,600 bps, bidirectional. Headers on the control board allow for a wired tail to be attached for additional serial communication with the microcontroller or other devices.

3.3 Communications: Command and Control

The control board was designed to support communication through several different methods and media, in order to support communication and monitoring in diverse environments. The Creature, as part of the TAMER platform, must be capable of both receiving data from and providing data to the physiological sensing suite, and it must be able to do this reliably and effectively. It must also be able to report Creature hardware status, in particular internal temperatures and battery voltages. In typical operation the Creature is controlled by a host computer, communicating wirelessly via the XBee radios. In locations with high electromagnetic interference, or for testing purposes, a wired serial connection from the radio base station may be used.
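As an illustration of the hardware-status reporting mentioned above, the sketch below periodically sends an internal temperature and a battery-voltage estimate over the serial link, using the commonly available OneWire and DallasTemperature Arduino libraries for the DS18B20 thermometers. The message format, pin assignments, and voltage-divider ratio are assumptions, not the Creature's actual firmware.

```cpp
#include <Arduino.h>
#include <OneWire.h>
#include <DallasTemperature.h>

// Hypothetical periodic status report: one DS18B20 temperature reading plus
// battery voltage, sent every five seconds over the radio-emulated serial port.
const int   ONE_WIRE_PIN    = 30;    // assumed data line shared by the DS18B20 sensors
const int   BATTERY_ADC_PIN = A8;    // assumed battery sense input via a resistive divider
const float DIVIDER_RATIO   = 3.0;   // assumed divider: 12 V battery scaled below 5 V

OneWire oneWire(ONE_WIRE_PIN);
DallasTemperature thermometers(&oneWire);
unsigned long lastReport = 0;

void setup() {
  Serial.begin(57600);
  thermometers.begin();
}

void loop() {
  if (millis() - lastReport < 5000) return;   // report every five seconds
  lastReport = millis();

  thermometers.requestTemperatures();
  float servoTempC = thermometers.getTempCByIndex(0);
  float batteryV   = analogRead(BATTERY_ADC_PIN) * (5.0 / 1023.0) * DIVIDER_RATIO;

  Serial.print("T ");  Serial.print(servoTempC, 1);   // degrees Celsius
  Serial.print(" V "); Serial.println(batteryV, 2);   // volts
}
```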
The radio base station, Creature status interface, and several typical usage cases of the Creature communications systems are described in this subsection. A diagram of the command and control scheme is shown in Figure 3.10. The computer hardware used during the experiments is described in Appendix B.5.

Figure 3.10: Simplified diagram of the TAMER command and control scheme. Arrows represent communications links between system components; dashed arrows identify the connections that are typically wireless.

3.3.1 Design and Construction of Radio System

Wireless communication with the Creature necessarily requires two radios: one inside the Creature and another to send commands to it. A radio base station was designed and constructed to contain this second radio and allow for a secure connection with the host computer. The radio base station (see Figure 3.11) consists of a Digi XBee® 802.15.4 radio, as well as supporting components. The radio base station supports input from several sources: USB communication with a host computer, as well as two-wire serial input from any other serial device. In addition, several digital and analog input and output pins on the front cover of the unit allow for switches or potentiometers to be used to control the Creature. For future applications, an Atmel ATmega328 chip can be attached to the radio base station board to allow for operation of the base station without a host computer. This chip is typically programmed with the Arduino bootloader to allow for use of the Arduino programming environment, and can make use of the several LEDs and the 4-digit, 7-segment display on the unit’s control board for user feedback. Schematics and parts lists for the radio base station can be found in Appendix C.1.

Figure 3.11: The radio base station for the Creature.

3.3.2 Creature User Interface

Use of the Creature in experiments revealed a need for additional monitoring and feedback of the Haptic Creature. A computer graphical user interface (GUI) in the Processing environment was developed to receive feedback from the Creature, and is shown in Figure 3.12. This GUI can utilize whichever communication methods are not currently being used to send commands to the Creature; during typical operation this is the Bluetooth transceiver. It provides the status of the breathing servo and pulse motor, as well as the internal temperature of the Creature. Data received are logged to a text file for use in after-experiment performance analysis.

3.3.3 Creature Modes

The Creature is capable of operating as part of the TAMER loop, under direct control by other software programs, under radio control via the radio base station, or autonomously via onboard firmware. These methods are described in the following sections.

Physiological Sensor Suite Input (e.g., mirroring)

In typical operation the Creature mechanisms are directly controlled by the physiological sensor suite. The physiological sensors are connected to a computer running the physiological sensing software, which is connected to a radio for command transmission to the Creature.
Creature mechanisms are controlled by the physiological software according to programmed algorithms — in the simple but common usage case of the Creature mirroring the user’s heart rate and pulse, the physiological software commands the position of the breathing servo to match that of the respiration sensor on the user, and triggers a pulse in the Creature when it detects one in the user. Input from the software does not have to be 31 3.3. Communications: Command and Control Figure 3.12: GUI for the Haptic Creature, providing motor, servo, and temperature status. A display of commanded respiration rate is shown in the upper left hand corner. To the right of that graph temperature readings from internal temperature sensors are display. On the far right is a timer system for experiments. direct motor or servo commands; the software can also control the Creature hardware at a more general level, such as commanding a transition between pre-programmed emotional “states” on the Creature. Direct Software Input The Creature can also be controlled by any other software program that has access to the serial port, such as those written in the Processing language [76]. Radio Stand-Alone The radio base station as mentioned previously can act as a standalone device by installing the ATMEGA328 microcontroller into the unit, and programming it using the Arduino enviroment. The digital and analog input pins on the radio base station can be connected to, for example, potentiometers or switches to drive the Creature. This is useful when operating the Creature for demonstrations or testing, where a host computer is not available. 32 3.4. Feedback from Testing Autonomous Operation The Creature can also be programmed to act autonomously, running a preset program with- out external input. As this does not incorporate the functionality gained by incorporation in the TAMER loop, it is typically used only for testing or demonstration. 3.4 Feedback from Testing After construction of the Creature for the TAMER platform, testing and informal pilot studies revealed several design concerns and suggestions for improvement that were im- plemented into the device. The three primary areas of redesign were related to unwanted vibrations in the Creature, temperature and cooling related issues, and user comfort. 3.4.1 Vibration and Noise During operation of the Creature as part of this thesis a slight vibration and noise were present from the breathing and pulse mechanisms. The noise from the breathing servo was predominantly from the servo attempting to maintain a constant position against gravity pushing down the abdomen shell. The shell would fall a small amount and then be raised by the servo, creating sound from the action of the servo and vibration from the motion of the shell. The refresh rate of the servo was increased to give the shell less time in which to fall before the servo would react: this had the result of eliminating the vibration, and changed the sound emitted from a choppy one to a lower-volume purr. These changes both increased the rate at which breathing servo commands are sent from the physio software and increased the smoothness of the breathing mechanism when mirroring a user’s respiration. The noise from the pulse mechanism was reduced by placing vibration dampers at the mechanism mounting points. This somewhat muffled the sound, but there is still a fairly audible click when the pulse mechanism is operated. 
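To make the mirroring mode of Section 3.3.3 and the refresh-rate change described above concrete, the following host-side sketch maps a normalized respiration sample onto a breathing-servo position and is intended to be called at a fixed, relatively high rate. The 50 Hz rate, the 0 to 1 respiration normalization, and the 40 to 140 degree servo range are illustrative assumptions rather than the platform's actual values.

```cpp
#include <algorithm>

// Illustrative host-side mirroring update (assumed ranges and rate).
constexpr int    kServoMinDeg = 40;    // assumed fully exhaled position
constexpr int    kServoMaxDeg = 140;   // assumed fully inhaled position
constexpr double kRefreshHz   = 50.0;  // a higher rate leaves the shell less time to sag

int respirationToServoDeg(double respiration01) {
  double r = std::clamp(respiration01, 0.0, 1.0);
  return kServoMinDeg + static_cast<int>(r * (kServoMaxDeg - kServoMinDeg) + 0.5);
}

// Called every 1/kRefreshHz seconds by the physiological software;
// sendServoCommand stands in for writing the framed position to the radio link.
template <typename SendFn>
void mirrorBreathingTick(double latestRespiration01, SendFn sendServoCommand) {
  sendServoCommand(respirationToServoDeg(latestRespiration01));
}
```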
3.4.2 Temperature / Cooling

A similar version of the Haptic Creature experienced a servo failure due to overheating. To prevent this, the Creature was equipped with a temperature monitoring system by the author. Up to eight DS18B20 1-wire digital thermometers can be located throughout the Creature; in typical operation one is placed on the breathing mechanism servo and another near the voltage regulators on the control board, the two primary heat-generating components inside the shell. Temperature readings are taken every five seconds during Creature operation and passed to the control computer, if present. The Creature is shut down when its internal temperature rises above 48.8 °C, above which damage to the internal components may occur. Figure 3.13 shows the Creature's internal temperature during an experiment. Temperature is monitored in two places: on the breathing servo, and at the anterior of the Creature, which is farthest from the breathing servo and expected to be the coolest part of the Creature. Breathing servo temperature increased by eight degrees during fifty minutes of use. Although overheating is therefore not a problem during typical operation, some wires were rerouted and neatened to increase the available space around the primary heat-generating mechanisms in the Creature, namely the servos and the voltage regulators on the control board.

Figure 3.13: Graph of Haptic Creature internal temperature (anterior and breathing servo) during normal use.

3.4.3 Comfort

While the fur was generally found to be soft and comfortable, the fiberglass shell could still be easily felt underneath. This was particularly evident when the Creature was resting on the lap: the Creature's bottom seemed particularly hard and bony. To alleviate this, a silicone skin was developed for the Creature by the author in collaboration with Yohanan et al. [8] and constructed by the author. It attaches to the fiberglass shell underneath the fur (see Figure 3.5). This skin consists of an approximately 0.25 inch thick piece of silicone in the shape of the Creature's shell. Part of the skin stretches over the ends of the shell to secure it in place. The skin improved the comfort of the device markedly, and had the added benefit of increasing the zoomorphic characteristics of the Creature. The feel of the silicone under the fur is also intended to replicate the feel of a dog or a cat, where a harder skeletal layer lies beneath the fur coat. The silicone, combined with the fur, also helps spread out the force from any touch of a skinned surface, resulting in registration of a touch by more force sensing resistors on the shell.

3.5 Online Physiological Assessment

The second major component of the TAMER platform is the physiological sensor suite, allowing for user feedback via physical channels. Figure 3.14 summarizes the physiological signals collected in this platform and the physiological metrics derived from them. Six sensors, EKG (Electrocardiogram), EMG (Electromyogram), BVP (Blood Volume Pulse), Skin Conductance, Respiration, and Skin Temperature, are currently used within this platform, both to derive the physical state of the user and to drive the actuators of the Haptic Creature. The capability exists for additional sensors to be integrated into this platform as they become available.
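As a concrete example of the metric derivation summarized in Figure 3.14, the following computes two of the heart rate variability measures referred to later in this chapter, the standard deviation of interbeat intervals and pNN50 (the proportion of successive interbeat intervals differing by more than 50 ms), from a list of detected beat-to-beat intervals in milliseconds. The function names are illustrative, and the sensing software's own implementation is not reproduced here.

```cpp
#include <cmath>
#include <vector>

// Heart rate variability metrics from interbeat (RR) intervals in milliseconds.
// sdnn: standard deviation of the intervals.
// pnn50: fraction of successive interval differences exceeding 50 ms.
double sdnn(const std::vector<double>& rrMs) {
  if (rrMs.size() < 2) return 0.0;
  double mean = 0.0;
  for (double rr : rrMs) mean += rr;
  mean /= rrMs.size();
  double sumSq = 0.0;
  for (double rr : rrMs) sumSq += (rr - mean) * (rr - mean);
  return std::sqrt(sumSq / (rrMs.size() - 1));
}

double pnn50(const std::vector<double>& rrMs) {
  if (rrMs.size() < 2) return 0.0;
  int over50 = 0;
  for (std::size_t i = 1; i < rrMs.size(); ++i) {
    if (std::fabs(rrMs[i] - rrMs[i - 1]) > 50.0) over50++;
  }
  return static_cast<double>(over50) / (rrMs.size() - 1);
}
```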
Section 3.5.1 describes the key physiological signals used for this platform, Section 3.5.2 describes the sensors and encoder used, and Section 3.5.3 describes the reactions to and limitations of these sensors. ACCELERATION NORMALIZED ECG ECG HEART RATE DERIVATIVE NORMALIZED BEAT DETECT BVP BVP AMPLITUDE HEART RATE BEAT DETECT SCR SCR NORMALIZED SLOPE NORMALIZED RESP RESP RATE AMPLITUDE EMG EMG NORMALIZED TEMP (FILTERED)(RAW) VARIABILITY STANDARD DEV. PNN50 FREQUENCY Figure 3.14: Overview of measured physiological signals and the physiological metrics de- rived from them. 35 3.5. Online Physiological Assessment 3.5.1 Physiological Signals The bio-sensor suite generates a large number of physiological metrics of which several are particularly important due to their relation to participant anxiety. Literature specifically regarding the links between the physiological sensors used within this thesis and anxiety is discussed here in order that the use of these specific sensors may be justified. Anxiety can be thought of as the “fight or flight” response of the autonomic nervous system: the body prepares to respond to a threat by optimizing performance of critical systems. Heart rate, body temperature, and blood flow to muscles are increased; while activities such as those of the digestive system or blood flow to the extremities are reduced until after the danger. When properly triggered due to an external stimulus this is considered “fear,” but without such a trigger it is considered “anxiety.” When this stress response is activated improperly in a person, such as in social situations, the effects can be crippling as well as unhealthy; and in cases of long-term anxiety disorders, this response may be chronic and debilitating. Physiological sensors can be used to detect the physiological changes char- acteristic of anxiety, and are particularly useful in situations where a person is unable to consciously detect the stress response beginning. Current research investigates both the short-term responses to stimuli as can be easily gathered in a laboratory environment and the longer-term physiological differences between those suffering from chronic anxiety and control subjects. Both are difficult due to inherent between-subject variations in common physiological metrics; the latter also due to the within-subject fluctuations over the longer time periods. Indeed, in persons with chronic anxiety disorder it may be impossible to gather a non-anxious baseline for physiological comparison; Craske states that: “the auto- nomic system may reestablish a balance over long periods of stress, such that dysfunction is no longer apparent except during acute panic attacks” [77]. Although it may be difficult to assess anxiety quantitatively, it is the eventual goal of the TAMER project to incorpo- rate into the TAMER platform advanced machine learning algorithms for analyzing data from the physiological sensors. By training the system based on anxiety assessments from medical professionals it should be able to identify various levels of anxiety in a user, and might even be able to eventually distinguish levels of anxiety in sufferers of chronic anxiety, which would otherwise be difficult. For the experiments in this thesis it is assumed that participants did not suffer from a chronic anxiety disorder in which their physiological re- sponses would be reduced, and therefore comparisons are made to a physiological baseline gathered during the experiment. 
Disturbing images, intensive tasks, and timed and scored cognitive training exercises are used in an attempt to induce physiological changes in the participants that would be similar to anxiety, as they have been both self-reported to in- duce anxiety and used in other studies that purport to induce anxiety in their subjects. 36 3.5. Online Physiological Assessment Eventual comparison of the physiological data from these experiments to physiological data from patients undergoing anxiety as determined by a medical professional would confirm whether or not anxiety was actually induced. Here, changes in both long-term and short- term heart rate, skin conductance, heart rate variability, respiration rate, skin temperature, and corrugator muscle activity are discussed. These are the primary sensors and metrics used in the physiological sensing suite of the TAMER platform. Heart Rate and Skin Conductance Heart rate and skin conductance response are perhaps the two physiological metrics most highly and often correlated to anxiety. Both are primarily associated with short-term, in- duced anxiety: indeed the increase in skin conductance is often called the “startle response” due to its quick onset and rapid disappearance. Bankart et al. induced anxiety by informing subjects that they were likely to receive an electric shock after a countdown period [78]. The probability that they would be shocked was varied. They found that during the countdown period both heart rate and skin conductance increased linearly. After the first shock, heart rate quickly ceased to increase and stabilized, but maintained an elevated rate compared to baseline throughout the experiment. Skin conductance continued to increase throughout the experiment. Telling subjects that their shock would be mild reduced this effect, but it was still present. Öhman et al. investigated whether this response was driven by conscious activity [79]. They showed pictures of snakes and spiders to users for 30 milliseconds, too short a duration to consciously perceive the image, and found that skin conductance re- sponse was similar to those groups that had been shown the pictures for long enough to consciously perceive them. They also found that those who had previously expressed fear of spiders and snakes had more elevated skin conductance responses than those who had not, and that they felt more negative, more aroused, and less dominant after their exposure to the images. These, among other studies in the literature, suggest that elevated skin conductance and heart rate are correlated with experimentally induced anxiety, and that the level of such elevation is increased by an increased perception of anxiety. Assessment of non-experimental anxiety confirms these results. Caprara et al. measured the skin conduc- tance levels of patients about to undergo a dental procedure [80]. They found that increased skin conductance was an objective and reliable test for identifying anxiety in patients. Hoehn-Saric et al., in several studies, investigated the effects of a clinical anxiety di- agnosis on skin conductance and heart rate response to stressful tasks. They found that when given a stress-inducing task, subjects tended to show reduced skin conductance and heart rate variability (heart rate variability as standard deviation), and that this reduction in variability was greater in those who had been diagnosed with chronic anxiety [81]. They further examined this lack of variability to conclude that chronic anxiety patients typically 37 3.5. 
Online Physiological Assessment react with less physiological flexibility to every-day stress, but have an increased reaction to anxiety-provoking stimuli [82] than control subjects. Heart Rate Variability Various changes in heart rate variability have been correlated with an increase in either gen- eral or specific anxiety responses, and a reduction in heart rate variability is now commonly associated with anxiety. Dishman et al. measured heart rate variability in gym patrons for five minutes while resting [83]. They found a correlation between a reduced normalized high frequency component of heart rate variability and the patron having perceived emo- tional stress during the previous week. They did not find a correlation between self-reported susceptibility towards anxiety and heart rate variability. Generalized anxiety disorder is also associated with resting variations in heart rate vari- ability. Blom et al. investigated heart rate variability in subjects with generalized anxiety disorder or major depressive disorder, and found them to have lower high frequency and low frequency components of heart rate variability, as well as a reduced standard deviation of heart rate interbeat intervals compared to controls [84]. Thayer et al. also investigated this subject pool [85]. They found that subjects with generalized anxiety disorder had shorter interbeat intervals and lower high frequency component of heart rate variability even while resting, and that when instructed to worry they had even shorter interbeat intervals, lower high frequency component of heart rate variability, and a reduction in successive interbeat intervals that differed by more than 50 milliseconds. Friedman et al. subjected subjects susceptible to severe panic attacks, severely afraid of blood, and controls to stressful tasks in a lab [86]. They found that the control subjects had longer heart rate inter beat intervals, higher variance in heart rate inter beat intervals, greater high frequency component of heart rate variability, and lower low frequency to high frequency ratios than those susceptible to panic attacks. Respiration While hyperventilation is the most obvious respiration-related indicator of anxiety, several studies have investigated whether more subtle variations in respiration rate could be an indicator of increased anxiety. Several results were not promising: Suess et al. induced anxiety by threat of electric shock, and while they saw an increase in heart rate during the task, this was not correlated with a change in respiratory activity [87]. However, more recent work does suggest a link between respiratory variability and anxiety. Martinez et al. found that patients diagnosed with panic disorder had greater respiratory variability in both rate and amplitude than controls, even after receiving medication for the disorder [88]. Niccolai et al. in a recent meta-analysis of the literature, confirm that increased respiratory 38 3.5. Online Physiological Assessment variability is well-correlated with panic disorder [89]. Skin Temperature Skin temperature has often been used to help identify emotions, and has been associated with both anxiety and relaxation. In general, a decrease in skin temperature is correlated with an increase in anxiety. Rimm-Kaufman et al. found that hand skin temperature in- creased when participants were exposed to a video designed to generate happiness, but decreased when asked threatening personal questions [90]. Mittelmann et al. 
induced anx- iety in subjects by questioning them during psychoanalysis: they found that a decrease in finger skin temperature was associated with anxiety [91]. Boudewyns et al. again subjected subjects to electric shock in order to induce anxiety; they found that finger skin temperature decreased during the stressful condition and increased during relaxation, and was correlated with participant self-reports of arousal [92]. Electromyogram Surface electromyography of various muscle groups has been used to assist in the classi- fication of various emotional states. Increased muscle tension has been associated with anxiety disorders and stress, and brief muscle responses can be associated with startle events. Smith et al. investigated corrugator or eyebrow muscle response to disturbing im- ages designed to induce anxiety, and found that these images were correlated with increased EMG activity [93]. They also found that baseline images of increased anxiety before the disturbing images were associated with a larger response over neutral photos. Cacioppo et al. found that EMG corrugator muscle activity could be used to distinguish between posi- tive and negative emotion inducing pictures, even when there were visible changes in facial expression [94]. Dimberg concluded that “facial EMG technique may be a sensitive tool for measuring emotional reactions” [95], and found that anger-inducing photos increased corrugator muscle activity as opposed to neutral photos [96]. It is important to note that while the above data show correlations in various physiolog- ical metrics to anxiety, the actual inference of anxiety from physiological data, especially in an online modality, is challenging. The human body is a complex organism, and the physi- ological metrics measured are affected by the activities of numerous bodily systems, all of which can have different short and long-term reactions to stimuli. Responses are often not consistent across the population, and are in some cases not even present at all. Laboratory induction of anxiety in a controlled environment can help in identifying these effects, but in a real-world environment they are often obscured by the noise from every-day interactions. While the various low-frequency signals are useful in the classification of anxiety disorders, and have been used to judge the effects of various robotic interventions, their utility for 39 3.5. Online Physiological Assessment short-term human-robot interaction is limited. In recognition of these limitations, the initial use of the TAMER platform has been either in controlled laboratory settings or in scenarios in the outside world with limited reaction to external stimuli — participants were at a computer in a classroom, but not interacting with their classmates. Additional sensor platform training in recognition of anxiety will be necessary before the platform can used in every day activity. 3.5.2 Physiological Sensors Used In this section the primary sensors used for collecting physiological information are de- scribed. The hardware and software platform for physiological sensing is a later version of that used by Kulić et al. in their human-robot interaction research in the CARIS lab. The initial usage of the sensor platform by Kulić et al. was to detect anxiety in human-robot in- teraction: see Section 2.5. 
Additions made by this author include the porting of the software to a more recent operating system, as well as the capability for the software to communicate with the most recent Thought Technology hardware and the Haptic Creature. Only where changes have been made in the processing of physiological signals are they described in this section, otherwise see reference [97] for signal processing and filtering details. An image of a user wearing the physiological sensors typically used with the TAMER platform is shown in Figure 3.15. Encoder The data acquisition device used for this platform is the Thought Technology [98] Flex- Comp™ Infiniti (pictured in Figure 3.16). This encoder is designed for clinical physiological measurement and biofeedback training. This encoder has ten channels capable of recording at 2048 samples per second, although data are sampled at 256 Hz within the platform. Data are transferred from the encoder via a fiber optic cable to a converter located near the host computer, and then converted to USB to connect to the host computer. The encoder is powered by four AA batteries. EKG (Electrocardiogram) EKG (Electrocardiogram) or heart electrical activity is measured by the EKG Sensor T9306M (see Figure 3.17), a 3-lead electrocardiography sensor. The sensor is connected either to a 3-terminal electrode as in Figure 3.18(c) and attached to the center of the chest, or to an extender cable as shown in Figure 3.17. In the latter case, three electrodes are attached to the participant’s chest: a negative electrode on the right shoulder, a positive electrode to the left of the navel, and a ground electrode on the upper left portion of the chest. This was the method that was typically used during experiments. Although the 40 3.5. Online Physiological Assessment Figure 3.15: User holding the Haptic Creature and wearing physiological sensors: respiration rate (a), blood volume pulse (b), skin conductance (c), and EKG (d). The EMG sensor is not shown, it would have been placed on the forehead. extender cable required the use of additional cabling, the use of three smaller electrodes attached to the periphery of the chest instead of one large electrode in the center of the chest reduced the amount of body hair contacted by the electrode glue, resulting in greatly improved participant comfort (particularly male) when removing the sensors. It also pro- vided better signal quality, as there was less susceptibility to noise from fidgeting of the body core, and the single electrodes proved less susceptible to losing their connection due to perspiration. Similar electrodes are used for the EMG sensor, and in all cases the electrodes used are single-use and disposable. Participants were typically asked to attach the sensors themselves. A QRS detection algorithm [99] is then applied to the signal data to detect the occurrence of a heart beat. From this data heart rate, heart acceleration, and heart rate variability are calculated, as are normalized versions of the same. EMG (Electromyogram) Electromyogram or muscle activity is measured by the EMG MyoScan-Pro™ Sensor T9401M- 60 (see Figure 3.18): a pre-amplified surface electromyography sensor. This sensor is typi- cally connected to the forehead to measure the activity of the corrugator supercilii muscle; 41 3.5. Online Physiological Assessment Figure 3.16: Thought Technology FlexComp™Infiniti Encoder. Figure 3.17: Thought Technology EKG™ Sensor T9306M, attached to triode electrodes for placement on chest. 
for this location care must be taken to ensure the sensor cable does not interfere with the user’s vision. It can also be attached to other muscles to measure their electrical activity. This signal is filtered and then normalized as in Kulić et al. [97]. 42 3.5. Online Physiological Assessment (a) Back of Sensor (b) Front of Sensor (c) Side of Sensor Figure 3.18: Thought Technology EMG MyoScan-Pro™ Sensor T9401M-60. (c) shows sensor with electrode attached. Skin Conductance Skin Conductance Response (SCR) (or Galvanic Skin Response (GSR)) is measured by the Skin Conductance Sensor SA9309M, as shown in Figure 3.19. The sensor measures the electrical resistance of the skin, and is the same type of sensor used in lie detector tests. Skin conductance is affected by the amount of moisture present in the skin, as released by glands when sweating or in response to stress or fear. During experiments the Skin Conductance Sensor is worn on the index and middle finger of the participant’s non-dominant hand. The sensor electrodes must be cleaned with alcohol after each use, and are replaced after fifty uses. This signal is then filtered, and the derivative taken to produce a skin conductance derivative measurement. Both are normalized as in Kulić et al. [97]. 43 3.5. Online Physiological Assessment Figure 3.19: Thought Technology Skin Conductance Sensor SA9309M. BVP (Blood Volume Pulse) Sensor The Blood Volume Pulse Sensor SA9308M (as shown in Figure 3.20) is a photo- plethysmography sensor. It measures the reflectivity of the skin to infrared light, a property dependent upon the amount of blood present in the underlying tissues. A heartbeat causes a sudden increase in the amount of blood present; therefore, this sensor is able to measure the occurrence of a pulse. This sensor is typically attached to distal end of the thumb of the participant’s non-dominant hand, and secured in place by a velcro strap. It is used when the more-invasive EKG Sensor is not desired or appropriate; however, care must be taken to ensure the sensor is attached tightly enough to the finger to record a signal, but not so tight as to impede circulation and cause discomfort to the participant. The sensor does not measure in absolute units, but rather percentage change in blood volume. Processing An example sensor signal is shown in Figure 3.21. The raw blood volume pulse signal is passed through a 7th-order low pass Butterworth filter with a 3 Hz cutoff, as a user’s heart rate should not exceed about 120 beats per minute during experiments. The filtered signal is then passed through a peak-detection algorithm, which looks for a change in first derivative to determine the occurrence of a heartbeat. From this time series, heart rate and heart rate variability can be extracted. 44 3.5. Online Physiological Assessment Figure 3.20: Thought Technology Blood Volume Pulse (BVP) Sensor SA9308M, front and rear. time [s] bl oo d vo lu m e pu ls e [% ] 0 2 4 6 8 10 12 14 16 18 35.2 35.4 35.6 35.8 36 36.2 36.4 36.6 36.8 37 37.2 Figure 3.21: A sample unfiltered blood volume pulse signal, showing four heartbeats. 45 3.5. Online Physiological Assessment Heart Rate Variability Heart rate variability is a general term describing several met- rics derived from heart rate that describe activity of the autonomic nervous system. It is computed from a series of interbeat intervals (IBIs), or time between heart beats, and can therefore be calculated from either the electrocardiogram or blood volume pulse sensor. 
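As a concrete illustration of the processing just described, the sketch below takes a raw blood volume pulse trace through the 7th-order, 3 Hz low-pass Butterworth filter and the first-derivative peak detection to produce beat times, interbeat intervals, and a mean heart rate. The filter parameters and the 256 Hz sampling rate follow the text; everything else (names, the absence of an amplitude threshold) is illustrative rather than the platform's actual implementation. The respiration signal described later is processed analogously, with a 5th-order filter and a 1 Hz cutoff.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256.0  # platform sampling rate [Hz]

def detect_beats(bvp, fs=FS):
    """Illustrative beat detection on a raw blood volume pulse trace.

    Returns beat times [s], interbeat intervals [ms], and mean heart rate [bpm].
    """
    bvp = np.asarray(bvp, dtype=float)

    # 7th-order low-pass Butterworth filter with a 3 Hz cutoff, as in the text
    # (heart rate is assumed not to exceed roughly 120 bpm, i.e. 2 Hz).
    b, a = butter(7, 3.0 / (fs / 2.0), btype="low")
    filtered = filtfilt(b, a, bvp)

    # Peak detection via a change in the first derivative: a beat is marked
    # where the slope goes from positive to non-positive (a local maximum).
    # A real implementation would also apply an amplitude threshold or a
    # refractory period to reject small ripples.
    slope = np.diff(filtered)
    peaks = np.where((slope[:-1] > 0) & (slope[1:] <= 0))[0] + 1

    beat_times = peaks / fs                    # seconds
    ibi_ms = np.diff(beat_times) * 1000.0      # interbeat intervals [ms]
    heart_rate = 60000.0 / ibi_ms.mean() if len(ibi_ms) else float("nan")
    return beat_times, ibi_ms, heart_rate
```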
Use of the Electrocardiogram (EKG) sensor theoretically gives better performance, as the EKG directly measures the electrical activity of the heart; the blood volume pulse sensor measures a more distant effect of the heart beat, the increase in blood in a distal digit. In practice this difference is minimal, and often one sensor will offer higher reliability than the other due to the physical characteristics of the particular user: the EKG sensor can be difficult to attach on a subject with a large amount of chest hair, and the blood volume pulse (BVP) sensor can shift and become detached if the subject's thumb moves too often. Several variability metrics are calculated, as defined in the following paragraphs.

Root Mean Squared Standard Deviation

The root mean squared standard deviation of heart rate is calculated as follows, where n is the number of observations:

SD_{RMS} = \sqrt{\frac{\sum_{i=1}^{n} (ibi_{i+1} - ibi_i)^2}{n}}    (3.1)

A running 10-second average of the root mean squared standard deviation is generated by the physio software; this value can be computed for longer time periods as well.

PNN50

PNN50 is calculated as the number of successive interbeat intervals that differ by more than 50 ms, divided by the total number of interbeat intervals counted:

\mathrm{PNN50} = \frac{\#\{\, (ibi_{i+1} - ibi_i) > 50\ \mathrm{ms} \,\}}{n}    (3.2)

Frequency Analysis

Frequency variation in the interbeat interval series is calculated for the extremely low, very low, low, and high frequency bands using commonly accepted ranges [100], as shown in Table 3.1. The integral of the power spectral density function of the signal over each band is used to calculate the power of that frequency band. For samples less than five minutes in length, extremely low frequency and very low frequency data are typically unreliable [101]. Also calculated is the LF/HF ratio:

\mathrm{LF/HF\ ratio} = \frac{\text{low frequency power}}{\text{high frequency power}}    (3.3)

(A short computational sketch of these metrics appears below, following the respiration processing description.)

Table 3.1: Heart rate variability frequencies.

band | lower limit [Hz] | upper limit [Hz]
extremely low frequency (ELF) | 0 | 0.0033
very low frequency (VLF) | 0.0033 | 0.04
low frequency (LF) | 0.04 | 0.15
high frequency (HF) | 0.15 | 0.4

Respiration Sensor

Respiration rate, amplitude, and waveform are measured by the Respiration Sensor SA9311M, as shown in Figure 3.22. This sensor consists of a strain gauge connected to a large velcro strap. The strap is worn around the upper abdomen over the participant's clothing, and tightened so that the strain gauge sits on the front of the abdomen. The strain gauge expands and contracts with the user's breathing. The sensor does not measure expansion in absolute units, but rather percentage expansion.

Figure 3.22: Thought Technology Respiration Sensor SA9311M.

Processing

An example respiration signal is shown in Figure 3.23; the processed signal is shown in Figure 3.24. The raw respiration signal is passed through a 5th-order low pass Butterworth filter with a 1 Hz cutoff. The filtered signal is then passed through a peak-detection algorithm, which looks for a change in the first derivative to determine the peak of a breath (peaks are identified by blue triangles in the figure, troughs by purple triangles), similar to how peaks in the blood volume pulse signal are detected. The peak-to-peak distance between breaths (L1 or L2 in Figure 3.24) is then used to calculate the participant's breathing rate.
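Returning briefly to the heart rate variability metrics defined in Equations (3.1) through (3.3), the following is a minimal computational sketch of how they might be derived from an interbeat-interval series. It is illustrative only, not the platform's physio software; the 4 Hz resampling of the IBI series before the spectral estimate is an assumed (though common) choice, and all function and variable names are placeholders.

```python
import numpy as np
from scipy.signal import welch

def hrv_metrics(ibi_ms, beat_times_s):
    """Illustrative HRV metrics from an interbeat-interval series.

    ibi_ms       -- interbeat intervals in milliseconds
    beat_times_s -- times of the beats ending each interval, in seconds
    """
    ibi_ms = np.asarray(ibi_ms, dtype=float)
    beat_times_s = np.asarray(beat_times_s, dtype=float)
    diffs = np.diff(ibi_ms)

    # Equation (3.1): root mean squared standard deviation of successive IBIs.
    sd_rms = np.sqrt(np.sum(diffs ** 2) / len(ibi_ms))

    # Equation (3.2): fraction of successive intervals differing by more than 50 ms.
    pnn50 = np.sum(np.abs(diffs) > 50.0) / len(ibi_ms)

    # Frequency analysis: resample the irregularly sampled IBI series onto an
    # even 4 Hz grid (an assumed rate), then integrate the power spectral
    # density over the bands of Table 3.1.
    fs = 4.0
    t_even = np.arange(beat_times_s[0], beat_times_s[-1], 1.0 / fs)
    ibi_even = np.interp(t_even, beat_times_s, ibi_ms)
    freqs, psd = welch(ibi_even - ibi_even.mean(), fs=fs,
                       nperseg=min(256, len(ibi_even)))

    def band_power(lo, hi):
        mask = (freqs >= lo) & (freqs < hi)
        return np.trapz(psd[mask], freqs[mask])

    lf = band_power(0.04, 0.15)
    hf = band_power(0.15, 0.4)

    # Equation (3.3): ratio of low- to high-frequency power.
    return {"SD_RMS": sd_rms, "pNN50": pnn50, "LF": lf, "HF": hf,
            "LF/HF": lf / hf if hf > 0 else float("nan")}
```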
The normalized current respiration amplitude is calculated as follows:

\frac{\text{Respiration Amplitude}_{\mathrm{current}} - \text{Respiration Amplitude}_{\min}}{\text{Respiration Amplitude}_{\max} - \text{Respiration Amplitude}_{\min}}    (3.4)

Figure 3.23: A sample filtered respiration signal.

Normalization from smallest to largest expansion is necessary for these calculations, as the percentage compression and expansion for a single breath can vary widely between users; a deep breath is typically about five to six percent of the full sensor range. On some subjects, particularly very small ones, the upper abdomen may not give a large enough range of motion, and the sensor may have to be placed lower on the abdomen, around the belly. Such a placement is undesirable, as belly motion can be affected heavily by speech. Although the algorithm has generally proved robust to short phrases or questions, longer periods of speech can result in erroneous data. Respiration rate and breath length are both used to describe user breathing; they are the inverses of each other.

Figure 3.24: Calculation of respiration rate from the peak-to-peak breath lengths L1 and L2.

Skin Temperature

The Temperature Sensor SA9310M (shown in Figure 3.25) measures the skin temperature of a peripheral digit. This sensor was worn on the ring finger of the participant's non-dominant hand and attached with a piece of medical tape. No filter is used, as the signal is relatively noise-free and slow moving; a sample signal is shown in Figure 3.26. Data are recorded in degrees Celsius. As with all sensors placed on the fingers, care must be taken to ensure that the sensor does not become detached during use, and that it does not slip around to the underside of the finger, which may be in contact with a warmer surface.

Figure 3.25: Thought Technology Temperature Sensor SA9310M, showing sensor and connector to encoder.

Figure 3.26: A sample skin temperature signal.

3.5.3 Sensor Application Notes

Use of the sensors in brainstorming sessions, pilot studies, and experiments for this thesis provided valuable feedback on their behaviour in experimental environments. In particular, maintaining proper sensor functioning is a challenge inherent to physiological experiments: most physiological experiments have at least a few participants whose data are unusable. The sensors attached to the fingers were particularly prone to coming loose during experiments, as subjects typically made contact with the Creature with the hand wearing the sensors. The most motion-sensitive is the blood volume pulse sensor, which requires a tight fit on the thumb to record a proper signal. Motion of the body core could also affect sensor readings: fidgeting could introduce noise into the EKG signal, and talking disrupted the respiration signal. In general, however, the large number of wires required to physiologically monitor a subject is a greater hindrance to non-experimental use of the sensors than these motion concerns.
50 Chapter 4 Experiments A series of four experimental trials was performed to evaluate the functionality of the TAMER platform and to investigate its efficacy in guiding physiological responses. The first, a pilot experiment, was a preliminary examination into the feasibility of using the Haptic Creature as an anxiety-reducing device: participants were asked to view disturb- ing images with a proof-of-concept version of the Haptic Creature. Overall results were encouraging enough to support construction of the TAMER platform. Experiment 1 and Experiment 2 investigated the ability of the Haptic Creature to influence physiological re- sponses. In Experiment 1, the initial physiological and subjective responses to the Creature were observed, and participants were exposed to the Creature mimicking their breathing and pulse. In Experiment 2 participants were asked to use the Creature as a training tool, breathing with it, and then had the Creature in their lap as they performed a task. In the final experiment, Experiment 3, the Creature was tested in a target environment, namely an elementary school that supports children with learning challenges, many of them anxiety related. Children were introduced to the Creature and then given the Creature to have during a stressful activity. In this chapter the experiments and experimental results are described. 4.1 Pilot Experiment: Response to Disturbing Images 4.1.1 Introduction and Motivation Initial reactions to the first Haptic Creature prototype revealed the potential for the device to provoke a comforting and calming response in its users [72]. The Creature’s similarity to both a stuffed animal and an actual animal suggested the potential for the Creature to produce similar comforting effects. Here, a pilot study was undertaken to investigate this general hypothesis. Information obtained from this experiment was also desired to assist in the design of the second Haptic Creature prototype. 4.1.2 Experimental Design Considerations The first step in developing such an experiment was to determine both how to best in- duce anxiety in adult participants, and whether the physiological sensing would be able to 51 4.1. Pilot Experiment: Response to Disturbing Images recognize such anxiety. Inducing anxiety in experimental participants is difficult both prac- tically and ethically: participants can have widely differing responses to the same stimuli, anxiety-inducing scenarios are limited, and threats of harming or actual physical harm to the participant are not permitted under ethics regulations. It is also necessary to have a non-anxious baseline from the participants to help determine the physiological indicators of anxiety. Therefore, although long-term anxiety or general stressful situations such as the middle of exam week could be ideal scenarios in which relaxation therapy would be effective, the determination of this more chronic and persistent anxiety would be beyond the time- scale of the preliminary experiments and the clinical capabilities of the researchers, thus necessitating an investigation of short-term anxiety induction and response. In addition, in order to produce a measurable effect during the limited time-span of an experiment, the anxiety stimulus must be able to quickly induce anxiety in the participant. Typical psychological methods of inducing anxiety in experiments are such procedures as rapid-fire yelling of math questions to be answered, or playing a stressful puzzle or video game. These were deemed impractical for two reasons. 
First, they were viewed as too distracting from the Haptic Creature prototype, and second, they required the use of the participant’s hands — participants would need to keep their hands, which would also be encumbered with the physiological sensors, on the Creature during any experiment, as the hands are the primary channel through which the Creature communicates. It is important, eventually, to have their hands available for other activities while using the Creature. Potentially hand-reliant tasks could raise the questions of how much physical interaction with the Creature would be required for it to be effective, and how inhibiting the hand sensors would be. These are discussed below in Sections 4.3.4, 4.3.6, 4.4.5, and 5.3.2. Therefore, additional anxiety inducing methods were investigated. In a pilot study, six participants were asked to watch a two-minute video clip of a movie picked for its believed ability to induce anxiety [102] while physiological data, skin conduc- tance, EKG, and EMG, were collected. Analysis showed an increase in skin conductance and heart rate during the movie. This response, however, was inconsistent across trials, highly transient, and dependent upon an individual’s engagement with the video. In most cases, this response peaked for only part of the scene, remaining at a lower state for the ma- jority of the film. While clearly real, these responses were neither sustained nor controllable enough for use during an experiment. A more stable visual source of anxiety was therefore sought. The International Affective Picture System [103] is a set of images designed to provoke either positive or negative reactions in subjects, and correlated with physiological effects in both skin conductance and corrugator muscle activity [104], both of which are directly measured by the physiological sensor suite. Images such as mutilations, snakes, insects, and dead bodies, corresponding to high anxiety induction, were selected. By using a variety of images, it was expected that participants would be more likely to experience at 52 4.1. Pilot Experiment: Response to Disturbing Images least one anxiety inducing stimulus. Since this pilot experiment was done prior to the construction of the Haptic Creature version described in Section 3.1.1, the “Wizard of Oz” prototype constructed by Yohanan et al. [72] and shown in Figure 4.1 was used during the experiment. This prototype is a manually actuated predecessor of the present Haptic Creature. It consists of a warming element, a purring mechanism, inflatable ear-like appendages, and a pneumatically acti- vated breathing mechanism. In operation during the experiment the breathing and purring mechanisms are activated at a constant, moderate rate by a facilitator. Figure 4.1: “Wizard of Oz” Haptic Creature Prototype used in pilot experiment, showing bellows used to simulate breathing and heating pad. 4.1.3 Research Questions There were two main research questions for this preliminary experiment. • Would the prototype Haptic Creature be effective in reducing the level of anxiety experienced by a participant during the viewing of disturbing images, as measured by physiological sensors and surveyed self-assessments? • What changes would be measurable or captured by the physiological sensors during the experiment, and could they be correlated with anxiety? 
Physiological data were investigated both for an EMG reaction to the disturbing images, due to their visual nature, and for changes in average heart rate and skin conductance, which are two commonly accepted methods of measuring anxiety [105, 106]. A description of the calculations performed for this and the following experiments is included in Appendix A.2. 53 4.1. Pilot Experiment: Response to Disturbing Images 4.1.4 Experiment Procedure This experiment took place in an ICICS experiment room that had been cleared of equip- ment. Participants sat in an office chair facing an HDTV television screen affixed to the wall. The encoder for the physiological sensors was placed on a small table beside the participant. Wiring from both the biosensors and the “Wizard of Oz” prototype Haptic Creature ran from the participant to a fake wall placed to the participant’s right. The wall served to hide the prototype’s actuators, computer equipment, and the experiment facil- itators. During the experiment participants were viewed through cameras present in the room; unusual interactions with the prototype were noted. There were three main parts to this experiment: a preliminary questionnaire, two separate slideshow viewings, and a post-experiment questionnaire. The overall experiment procedure is outlined in Figure 4.2. baseline, calm images disturbing images without creature baseline, calm images disturbing images with creature (i) (ii) (iii) (iv) ra nd om  o rd er Figure 4.2: Diagram of Pilot Experiment procedure. Preliminary Questionnaire After signing consent forms, participants were given a written survey asking for general demographic information as well as the participant’s experience and comfort with touch- based interaction. A copy of the questionnaire is included in Appendix B.1.1. Sensor Attachment and Baseline The participants were then fitted with three physiological sensors: skin conductance (SCR) on their non-dominant hand, three-lead electrocardiogram (EKG) on their chest, and surface electromyogram (EMG) on the corrugator muscle of their forehead. Sensor functionality was tested and confirmed before the facilitators retreated behind the wall. The participants then viewed a slideshow of calming nature scenes for two minutes whilst baseline data were gathered. Disturbing Images Slideshows The participants were given the “Wizard of Oz” prototype for either the first or second slideshow — the order was determined randomly. Once given the prototype, the participant 54 4.1. Pilot Experiment: Response to Disturbing Images was asked to sit with it for two minutes to gain familiarity with the device. While in the participant’s lap, the prototype was manually actuated by an experiment facilitator to generate a breathing and purring sensation. Participants were instructed to focus their eyes on the screen and not on the Haptic Creature during slideshow viewing. When the prototype was taken from the participant, it was removed to behind the fake wall. Each slideshow consisted of twelve disturbing images, and each was shown for ten seconds for a total of 120 seconds of disturbing images. The order of images shown was randomly determined for each participant from the total set of 24 images. After the first slideshow, the prototype was then given or taken away, and the participant was again shown two minutes of calming nature scenes while another baseline was gathered — giving the participant time to recover from the influence of the previous slideshow. The second slideshow then followed. 
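The ordering logic described above, in which the prototype is randomly assigned to the first or second slideshow and each participant sees a randomized sequence of images drawn from the pool of 24, is straightforward to script. The sketch below shows one possible implementation; it assumes the two twelve-image slideshows are disjoint halves of the pool, which the text implies but does not state, and it is not the script actually used in the experiment.

```python
import random

def plan_session(image_pool, seed=None):
    """Illustrative per-participant plan: creature order and image sequences.

    image_pool -- list of 24 disturbing-image identifiers
    """
    rng = random.Random(seed)

    # Counterbalance whether the prototype is present for slideshow 1 or 2.
    creature_on_first_slideshow = rng.choice([True, False])

    # Each slideshow shows 12 images in random order (10 s per image,
    # 120 s total), drawn without replacement from the pool of 24.
    shuffled = rng.sample(image_pool, len(image_pool))
    slideshow_1, slideshow_2 = shuffled[:12], shuffled[12:]

    return {"creature_on_first_slideshow": creature_on_first_slideshow,
            "slideshow_1": slideshow_1,
            "slideshow_2": slideshow_2}
```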
Concluding Questionnaire After the second slideshow was completed, the physiological sensors were removed from the participant, and they were asked to rate their responses to both the images and the haptic device via survey. A copy of the questionnaire is included in Appendix B.1.2. Before beginning the“Overall Response” section of the questionnaire, participants were informed of the two operating modes of the Creature during the experiment. 4.1.5 Results Ten participants, seven male and three female, between the ages of 20 and 30 took part in the experiment. All were undergraduate and graduate computer science and engineer- ing students, and were compensated for their time (approximately 30 minutes). Due to an equipment malfunction one participant’s physiological data were not useable, but his questionnaire data were included. Self Reported Results Participants were surveyed as to their states of anxiety, agitation, and surprise during the disturbing image slideshows, both with and without the “Wizard of Oz” Haptic Creature prototype, on a 5-point Likert Scale, with adjectives used previously for reporting affective state during human-robot interaction experiments [107]. Descriptions of quantitative survey results refer to general trends, not statistical analyses. Results are shown in Table 4.1. Participants had lower self-reported mean anxiety, agitation, and surprise with the prototype than without. Participants were also surveyed as to their levels of comfort with the prototype during the experiment; these results are shown in Figure 4.3. Nine out of ten participants found the Creature comforting. 55 4.1. Pilot Experiment: Response to Disturbing Images Table 4.1: Pilot Experiment: Self-reported Likert-scale responses to anxiety, agitation, and surprise (1 = strongly felt, 5 = weakly felt). Prototype Present No Prototype State Mean Std. Dev. Mean Std. Dev. Anxious 2.3 1.2 1.7 0.6 Agitated 2.0 1.1 1.7 0.7 Surprised 2.8 1.2 1.7 0.7 1 2 3 4 5 1 2 3 4 5 6 7 8 9 10 ranking (1 = strongly disagree, 5 = strongly agree) nu m be r o f r es po ns es haptic creature was comforting while viewing the images Figure 4.3: Preliminary experiment participant responses to statement “Haptic Creature was comforting while viewing the images.” Participants were also surveyed as to whether they felt that the motions of the prototype were distracting while viewing the images; these results are shown in Figure 4.4. Participants generally expressed agreement with this statement; only 2 mildly disagreed. Participants were also surveyed as to whether they felt that the Creature would help them reduce their anxiety in other situations; these results are shown in Figure 4.5. As a group, participants did not express any conclusive general opinion. There were no particular patterns identified from within-individual data, likely due to the small sample size. Participants were not given detailed interviews about their sur- vey responses; they were, however, asked to provide comments on the Creature and the experiment. 56 4.1. 
Figure 4.4: Preliminary experiment participant responses to the statement "Haptic Creature's actions were distracting while viewing the images" (number of responses per ranking, 1 = strongly disagree, 5 = strongly agree).

Figure 4.5: Preliminary experiment participant responses to the statement "Haptic Creature would help reduce my anxiety in other situations" (number of responses per ranking, 1 = strongly disagree, 5 = strongly agree).

Physiological Results

Counting and visual inspection revealed that all subjects had a skin conductance response to at least six disturbing images in each slideshow, as marked by an increase in skin conductance when the image was presented. Statistical comparisons were therefore made using the five images with the highest initial skin conductance responses for each subject. An example of a subject's skin conductance response during the calming images is shown in Figure 4.7, and the same subject's response during the disturbing images, showing the initial response to each image, is shown in Figure 4.8. Note the large transients that occur at the onset of several new images; these indicate an orienting response. Regardless of whether they were holding the prototype, all subjects responded to at least six images with a jump in skin conductance of more than 20%. None had a significant response to all twelve images, and there was no order-related trend in these responses. The mean normalized skin conductance response with the Creature was significantly lower (M = −0.261, SD = 0.143, p < 0.05) than the mean normalized skin conductance response without the Creature. Mean skin conductance responses are graphed in Figure 4.6.

Figure 4.6: Average normalized skin conductance response to the disturbing image slideshow, with and without the Haptic Creature prototype, for each participant.

Figure 4.7: Typical normalized skin conductance response for a participant during the calming image set (the baseline). Each vertical line represents the start of a new image. Baseline is typically less than five percent of maximum response.

Figure 4.8: Typical normalized skin conductance response for a participant during the disturbing image slideshow. Each vertical line represents the start of a new image.

As summarized in Table 4.2, the disturbing images induced significant changes in mean heart rate, mean normalized EMG, mean normalized heart rate acceleration, mean normalized derivative of skin conductance, heart rate standard deviation, and arousal as compared to the calming images. Arousal (as per Kulić et al.
[108]), normalized skin conductance, and the normalized derivative of skin conductance were significantly less during the disturbing images with the Creature than without the Creature.

Table 4.2: Summary of significant results from the Pilot Experiment.

physiological metric | comparison | mean | SD | p
mean heart rate [bpm] | calming images to disturbing images without creature | 1.86 | 3.59 | < 0.001
mean normalized EMG | calming images to disturbing images without creature | 0.032 | 0.081 | 0.05
mean normalized heart rate acceleration | calming images to disturbing images without creature | -0.00630 | 0.00290 | 0.002
mean normalized skin conductance derivative | calming images to disturbing images without creature | 0.0204 | 0.037 | 0.007
heart rate standard deviation [bpm] | calming images to disturbing images without creature | -3.32 | 5.19 | 0.046
arousal | calming images to disturbing images without creature | 0.0522 | 0.0802 | 0.016
mean normalized skin conductance | images without creature to images with creature | -0.261 | 0.143 | 0.007
mean normalized skin conductance derivative | images without creature to images with creature | -0.0154 | 0.0130 | 0.047
arousal | images without creature to images with creature | -0.0602 | 0.0769 | 0.001

4.1.6 Discussion

Responses from the surveys revealed many useful comments and several general trends. Participants reported feelings or strong feelings of anxiety, agitation, and surprise, and all responded to at least six of the disturbing images in each set with the peak in skin conductance typically associated with a startle response [109]. There was no order-related trend in which images produced this response, suggesting that the participants did not become acclimatized to the disturbing images during the session. No participant had a physiological response to all of the images. Mean heart rate, EMG, heart rate acceleration, and heart rate standard deviation were also affected by the disturbing images. The EMG reaction to the disturbing images was likely due to their visual component, and the heart rate changes are consistent with a more anxious or aroused state. After the experiment, many subjects also reported to the facilitators that they found some of the images disturbing. It is therefore likely that the disturbing images were successful in inducing anxiety in the participants.

In general, participants reported lower anxiety, agitation, and surprise with the Haptic Creature prototype than without. In addition, skin conductance response and inferred arousal (as per Kulić et al. [9]) during the disturbing images were significantly less with the Creature than without. Even with such a small sample size, the physiological results were encouraging, indicating that this approach was worthy of further research. Survey data indicated that subjects generally found the Haptic Creature prototype a comfort while viewing the disturbing images: this was encouraging feedback both for the form-factor of the Creature and for the idea that a small robotic creature could help reduce a subject's anxiety. In comments, many subjects specifically described the Creature's warmth as comforting, and several mentioned finding its simulated breathing prominent. Some indicated that they found the gentle breathing of the Creature pleasant; interestingly, a few volunteered that this made them more aware of their own breathing. It is therefore also likely that the Haptic Creature prototype had an effect on the participants.
Participants did, however, report that the prototype caused moderate to high levels of distraction during the image viewing. A device that purely distracts from sources of anxiety would be of limited utility, as this distraction would be of short duration and would preclude the accomplishment of other tasks. It is, however, possible that some subjects found the entire experience of the Haptic Creature unusual and hence distracting, and that their subjective reporting of distraction would be decreased after spending additional time with the Creature. Although some participants may have found the Creature distracting, most subjects did not seem to find the prototype so distracting as to be annoying. There was also net-positive but varied response to the proposition that the Haptic Creature prototype might reduce anxiety in stressful situations other than that of viewing disturbing images. There is also an experimental concern in that the Creature was never presented to the user in its inactive state, to determine whether Haptic Creature prototype presence alone was sufficient to induce these seen effects. 4.1.7 Conclusions Not all participants reacted to every disturbing image, but all had a skin conductance (SCR) response to at least six of the disturbing images in each set with a peak in skin conduc- tance. A change in mean EMG, heart rate, heart rate acceleration, and heart rate standard deviation was also correlated with the images. The presence of the Haptic Creature pro- totype was correlated with reduced levels of both mean and normalized skin conductance response values, as well as inferred arousal, during the anxiety-inducing disturbing video task. Participants generally reported the Haptic Creature as comforting during the experi- ment, particularly liking its warmth and gentle breathing. 62 4.1. Pilot Experiment: Response to Disturbing Images 4.1.8 Feedback for Iterated Design The overall positive feedback to the prototype device encouraged future investigation, and provided valuable guidance as to Creature and TAMER platform experiment design, as well as experimental methods. Many of the lessons learned from this experiment were incorpo- rated into the design of the Haptic Creature used for future experiments. Participants’ favorable opinion of the warmth that the prototype was able to produce through its heating pad led to the installation of additional heating pads into the Creature. Due to partici- pants’ high comfort rating attributed to the plushness of the Creature, additional padding was added to the new Creature. In designing the control system of the new prototype, par- ticular attention was paid to ensuring that the Creature would be able to interface with the physiological sensor suite directly, without requiring an additional experimenter to operate the Creature. This also reduced the complexity of using the system. Deficiencies in the physiological sensing platform were also recognized and addressed. This preliminary experiment revealed that the existing physiological sensor software was insufficient for longer-term affect based experiments. In particular, it was difficult to cor- relate the sensor data logs with specific experimental conditions: the various stages of the experiment had to be identified by carefully timing the start of the experiment and noting at what time various events occurred relative to this — a potentially error-prone measure- ment when dealing with shorter-term physiological events. 
Participants remarked upon the breathing activity of the prototype, and many felt that the Creature’s breathing increased their awareness of their own breathing. As breathing exercises and training are an impor- tant aspect of current anxiety training, it was necessary to add the respiration rate sensor to the physiological sensors. As a result of rewriting the sensor software to support the res- piration rate sensor, the ability to use both the skin temperature and blood volume pulse sensors was gained. This experiment also formed the basis for several methodological changes in the follow- ing experiments. Inducing anxiety ethically was always a challenging task. While the IAPS picture set seemed effective at inducing anxiety, they provoked an emotionally loaded en- counter – many participants remarked upon the gruesomeness of the images, and expressed displeasure at having to view them. Longer-term studies along this vein would involve the viewing of many more images, which would not only be extremely displeasurable to par- ticipants, making recruitment difficult, but was highly unlikely to be approved (and would indeed be inappropriate) for the targeted platform age group of children. There were also limitations on the sensor suite’s ability to recognize anxiety: the existing inference engine proved unable to adequately measure anxiety and, more importantly, levels of anxiety in participants. The engine had been trained primarily on visual stimuli, and may not have been able to recognize the more subtle human reactions to changes in emotional state. As 63 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature work to improve the emotional state recognition engine was already ongoing in a separate process, it was decided to focus the trial experiments of the platform on what the physi- ological sensors were capable of doing well: measuring effects of raw physiological metrics such as breathing rate, heart rate, and skin conductance. While ongoing work was inves- tigating self-reported emotional responses to the Haptic Creature, there had not yet been research investigation of physiological reactions to interaction with the Haptic Creature. If physiological reactions occurred from the Haptic Creature, there could be the potential to command these reactions through particular motions and activity state of the Creature to reduce the physiological metrics related to anxiety. 4.2 Experiment 1: Recognition of Mirroring and Initial Reactions to Creature Following the preliminary experiment, the TAMER platform, as described in Chapter 3, was constructed. The following experiments, Experiment 1 and Experiment 2, describe small- scale studies that were intended as much for obtaining feedback and verification of the platform systems as beginning to explore the potential physiological effects of the Creature, and possible roles for the Creature in anxiety reduction. The first experiment performed had two primary motivations: to begin the investigation of human physiological response to interaction with the Haptic Creature, and to determine whether participants could rec- ognize the Haptic Creature mirroring their breathing rate and heart rate. By linking the Creature’s pulse and breathing mechanisms to those of the participant, as recorded by the physiological sensors, the Creature has the ability to “mirror” a user’s breathing rate and pulse. 
This ability has several possible applications, some of which are particularly applica- ble for use within the TAMER platform, such as an alerting scenario, in which the Creature attempts to inform its user of his or her own breathing rate and heart rate by mirroring. In a stressful or anxiety inducing situation, participants may not recognize that they are becoming more stressed and anxious, or the degree to which that is the case. By seeing their own breathing and heart rate in the Creature, users could gain increased awareness of their own physiological state and take appropriate coping actions. Therefore, the primary goal of this first experiment was to determine user reaction to this mirroring: both their subjective responses and whether they could recognize it in the Creature. A second goal of this first experiment was to determine whether the programmed ac- tions of the Creature’s mechanisms were recognized as both lifelike and appropriate to the Creature. Pilot studies and informal initial interactions suggested that users were able to distinguish between various “states” of the active Creature through the application of be- havioral state terms typically associated with a living animal: e.g., the Haptic Creature, 64 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature when its breathing mechanism displayed fast breathing, would be perceived as “breathing heavily;” whereas a slower breathing rate and lower intensity in the Creature would be per- ceived as “resting.” It was not evident, however, whether a human participant’s breathing rate and heart rate imposed on the Creature would be perceived in the same way. The small creatures that humans are generally familiar with, such as dogs or cats, typically have a higher heart rate and breathing rate than their owners. Consequently, the expected “normal” baseline activity of the Creature could in fact be at this level, which would be around the level of an excited human; normal human resting breathing rates and heart rates could appear lethargic in the Creature. This would impact both user recognition of mirroring as well as user determination of the Creature’s emotional state. Accordingly, participant subjective responses as to their perceptions of Creature motion were collected and discussed. Physiological manipulation of the user was approached indirectly in this experiment. Interacting with a pet has been associated with physiological reactions such as decreased heart rate [49] and breathing rate, as well as reduced levels of anxiety [110]. There was, therefore, the potential that the zoomorphic appearance and behavior of the Creature would allow it to provoke similar results. In order to have such effects, it was necessary to confirm that the Haptic Creature was, in fact, able to convey a sensation of both breathing and heart rate to the user, and that this could be recognized. At the very least, however, the Creature’s similarity to a stuffed animal could also potentially give comfort. To investigate this, user physiological data were collected both for initial reaction to the Creature as well as during the entire interaction session. 4.2.1 Research Questions These motivations led to three primary research questions and goals: • Examine participants’ qualitative opinions of overall Creature feel and their reaction to medium-term interaction with the Creature. Are participants able to identify breathing and pulse mechanisms, and do they find these mechanisms appropriate to the Creature? 
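Conceptually, mirroring is a thin control loop between the sensor suite and the Creature: each detected heartbeat triggers a pulse on the Creature, and the measured respiration waveform drives its breathing mechanism. The sketch below illustrates such a loop; the sensor and Creature interfaces (read_normalized_respiration, new_beat_detected, set_breath_position, pulse) are hypothetical placeholders, as the actual TAMER platform API is not reproduced here, and the 50 Hz update rate is an assumption.

```python
import time

def mirror_loop(sensors, creature, stop_event):
    """Illustrative mirroring loop: drive the Creature from live sensor data.

    `sensors` and `creature` are hypothetical interfaces standing in for the
    physiological sensing suite and the Haptic Creature controller.
    `stop_event` is a threading.Event used to end the loop, e.g. when the
    Creature should fall back to its constant-motion behavior.
    """
    while not stop_event.is_set():
        # Breathing: command the Creature's breathing mechanism to follow the
        # participant's normalized respiration amplitude (0..1).
        amplitude = sensors.read_normalized_respiration()
        creature.set_breath_position(amplitude)

        # Pulse: if the beat detector has flagged a new heartbeat since the
        # last iteration, trigger a single pulse on the Creature.
        if sensors.new_beat_detected():
            creature.pulse()

        time.sleep(1.0 / 50.0)  # illustrative ~50 Hz update rate
```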
• Determine if participants are able to identify the Creature mirroring their breathing and heart rate, and if so, what are their reported reactions to it? • Examine initial physiological reaction to the Creature. Does the Creature’s state, either motionless, breathing steadily, or mirroring the user, have an effect on physio- logical metrics of the participant? In order for participants to recognize the Creature mirroring their physiological state, they would have to be able to distinguish the motions of their own breathing and heart rate 65 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature in the Creature from those of the Creature operating at a constant breathing and heart rate. Therefore, physiological responses in skin conductance, blood volume pulse, EKG, and respiration rate were measured while the Creature was inactive, actively breathing at a constant rate, and then mirroring the participant’s respiration and heart rate for ninety second periods. This length was chosen to allow the experiment to be completed within a half-hour time period to encourage participant participation: differentiation between stages was seen in pilot studies of this length. Participants were surveyed as to their impressions of the Creature’s mechanisms and their reactions to the physiological mirroring. 4.2.2 Experiment Procedure Experiments took place in an experiment room that had been emptied of all equipment except for a table placed against the wall. During the experiment participants remained seated, facing the wall, at the large table. The physiological sensor encoder was placed on the table, to the right of the participant. The wire from the sensors, the experiment facilitator, the Haptic Creature support equipment, and computers were located behind a fake wall to the right of the participant. A web camera affixed to the top of the wall was used to observe the participant during the experiment. Participants wore noise-canceling headphones during the experiment. The experiment consisted of the four phases shown in Figure 4.9, and described here. no creature creature still creature mirroring subject (i) (ii) (iii) (iv)creature constant rate motionRa nd om ize d Figure 4.9: Diagram of Experiment 1 procedure. After signing consent forms, participants were fitted with skin conductance (SCR), blood volume pulse (BVP), and skin temperature (ST) sensors on their non-dominant hand, as well as three-lead electrocardiogram (EKG) and respiration rate (RR) sensors. The sensors were then activated and tested. If necessary, adjustments were made to sensor fit to ensure that they were properly functioning. 66 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature (i) No Creature Participants were then asked to sit calmly for ninety seconds while a baseline was gathered, which began when the facilitator had returned behind the wall. As this stage was the initial baseline gathered for the participant, it was necessarily always performed first. (ii) Creature still (CS) The participants were then introduced to the Haptic Creature, and given it to be placed in their lap. They were instructed to sit quietly with the Creature on their lap, and to feel free to pet and interact with the Creature. They were requested to try to maintain at least one hand on the Creature at all times during their interaction. After the facilitator had returned behind the wall, physiological data were gathered for ninety seconds. 
As this stage incorporated a combination of initial reaction to the Creature and reaction to the still Creature, it was always performed second. (iii) Creature mirroring subject (CM) The facilitator then returned to the participant and informed him or her that the mecha- nisms of the Creature would now be activated. After the facilitator returned behind the screen the Creature was then turned on. It began mirroring the participant’s breathing and heart rate: a detected pulse from the EKG sensor triggered a pulse on the Creature, and the output of the respiration rate sensor was commanded on the Creature’s breathing mechanism. This continued for ninety seconds, during which time physiological data were continued to be gathered. The order of this stage and of the “Creature constant motion,” stage iv, was counterbalanced. (iv) Creature constant motion (CCM) The Creature’s constant motion stage was then begun. In this mode, the Creature has a respiration rate and intensity of twelve breaths per minute, as well as a pulse rate of seventy beats per minute, typical of a resting human adult. The transition from the previous stage to this mode occurred without comment from the facilitator. This stage continued for ninety seconds, during which time physiological data were continued to be gathered. The order of this stage and the “Creature mirroring subject” stage were counterbalanced; in both cases transitions occurred smoothly and without comment. (v) Experiment Ending and Questionnaire The physiological data collection was then ended, and the Creature removed from the subject. The participant then removed the sensors, and a post-experiment questionnaire 67 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature was administered; a copy is included in Appendix B.2.1. 4.2.3 Results Ten subjects, three female and seven male, took part in this experiment. None had partici- pated in previous experiments. All were graduate or undergraduate engineering or computer science students between the ages of eighteen and thirty. Self-Reported Results Descriptions of quantitative survey results refer to general trends, not statistical analy- ses. Only two subjects were able to recognize the Creature behavior during the mirroring stage as mirroring their breathing and heart rate. The responses from the post-experiment questionnaire are shown in Table 4.3. Table 4.3: Table of results from Experiment 1 questionnaire (1 = strongly disagree, 5 = strongly agree), n = 10. Responses Statement 1 2 4 53 It was easy to recognize the creature mirroring my breathing. I found the creature mirroring my breathing comforting (if noticed). I found the creature mirroring my breathing disturbing (if noticed). The creature’s breathing made me more aware of my own breathing. It was easy to recognize the creature mirroring my pulse. I found the creature mirroring my pulse comforting. I found the creature mirroring my pulse disturbing. The creature’s pulse made me more aware of my own heart rate. I found the creature comfortable on my lap. I was startled by the activation of the creature. I found the creature’s motion disturbing. I found the noise of the creature distracting Physiological Responses Group-wise and within-subjects comparisons were performed for several physiological met- rics. Pool-wise comparisons are summarized in Table 4.4, based on two-tailed dependent sample t-tests (α = 0.05). Within-subjects comparisons were performed where more than 68 4.2. 
one data point existed for each participant for each stage, namely for their series of individual breath lengths and heart rate interbeat intervals.

Table 4.4: Summary of results from Experiment 1. Significant results (p < 0.05) are marked with an asterisk. Comparisons are between the Creature still (CS), Creature constant motion (CCM), and Creature mirroring (CM) stages.

    physiological metric     value    CS–CM      CS–CCM     CM–CCM     units
    breath length mean       mean      0.263      0.115     −0.148     s
                             sd        0.801      0.928      0.439
                             p         0.351      0.718      0.338
    breath length sd         mean     −0.358     −0.104      0.254     s
                             sd        0.458      0.649      0.420
                             p         0.043*     0.643      0.102
    heart rate mean          mean     −3.90      +2.00      −1.54      bpm
                             sd        4.64       4.17       2.23
                             p         0.045*     0.212      0.075
    heart rate variability   mean     −0.023     −0.012      0.010     s
                             sd        0.024      0.025      0.023
                             p         0.022*     0.186      0.215
    skin temperature mean    mean      0.759      0.741     −0.008     °C
                             sd        0.585      0.582      0.003
                             p         0.040*     0.047*     0.956
    skin temperature sd      mean      0.017      0.052      0.034     °C
                             sd        0.199      0.069      0.155
                             p         0.808      0.066      0.548
    skin conductance mean    mean      2.18       1.96      −0.515     µS
                             sd        2.01       1.49       0.404
                             p         0.022*     0.047*     0.104
    skin conductance sd      mean     −0.127      0.007      0.134     µS
                             sd        0.364      0.202      0.310
                             p         0.322      0.925      0.227

Breath lengths

The series of breath lengths for each subject was compared between the Creature still, Creature constant motion, and Creature mirroring stages using a two-tailed within-subjects unequal variance t-test (α = 0.05). Six of ten participants were found to have a significant difference (p < 0.05) between breath lengths with the Creature still and the Creature in constant motion, and seven between breath lengths with the Creature still and the Creature mirroring the subject. Of those seven, three also had a significant difference (p < 0.05) between breath lengths with the Creature in constant motion and the Creature mirroring the subject.

Breath length series were similarly compared with the breath length of 2.5 seconds, the commanded breath length for the Creature during the Creature constant motion stage. Comparisons were made with actual participant breathing rates during the Creature still, Creature constant motion, and Creature mirroring stages. These comparisons found no significant difference (p < 0.05) between the commanded breath length and participant breath lengths for five participants during the Creature still phase, and for five others during the Creature constant motion and Creature mirroring stages. The mean and standard deviation of breath lengths are graphed in Figure B.31.

4.2.4 Discussion

Qualitative Results

Overall initial reactions to the Creature were investigated through survey questions and interview responses to determine participants’ qualitative opinions of the overall Creature feel and whether they found the Creature’s actions appropriate to it. These responses were typically positive, with no overtly negative opinions of the Creature’s feel or behavior, nor of interaction with it. Most participants, upon their introduction to the Creature, expressed a desire to touch and feel it.

Participants generally agreed with the statement “I found the Creature comfortable on my lap” (see Figure 4.10). This comfort level with the Creature was important both in that participants were able to tolerate the placement of a new device on their lap, and in that they were comfortable with such a device moving and being “active” in such a personal and private part of the body.
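As an aside on the analysis behind Table 4.4: each pool-wise comparison reduces a metric to one value per participant per stage and applies a two-tailed dependent-sample t-test. The short sketch below illustrates the idea; the array values and variable names are placeholders, not data from the experiment.

    # Illustrative sketch of a pool-wise comparison from Table 4.4: one value per
    # participant per stage, compared with a paired (dependent-sample) t-test.
    # The numbers below are made up for the example, not experimental data.
    import numpy as np
    from scipy import stats

    # e.g. mean skin temperature per participant (n = 10) in the "Creature still"
    # (CS) and "Creature constant motion" (CCM) stages.
    cs  = np.array([31.2, 30.8, 32.1, 31.5, 30.9, 31.7, 32.0, 31.1, 30.6, 31.4])
    ccm = np.array([31.9, 31.5, 32.6, 32.3, 31.2, 32.4, 32.8, 31.7, 31.0, 32.1])

    diff = ccm - cs
    result = stats.ttest_rel(ccm, cs)          # paired, two-tailed by default
    print(f"mean difference = {diff.mean():.3f}, sd = {diff.std(ddof=1):.3f}, "
          f"p = {result.pvalue:.3f}")           # significant if p < 0.05 (alpha)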
Participants in general expressed their like of the motion of the Creature: one described that it “made the Creature seem much more real and lifelike.” One participant noted that she found “feeling the pulse of the Creature was really comforting.” When asked what they liked most about the Creature, a majority of respondents mentioned a positive reaction to the Creature’s warmth on their lap. There were no complaints about the breathing or pulse mechanisms seeming disturbing or disconcerting; most stated that this behavior was in line with their expectations for the Creature. However, most participants did find the pulse mechanism of the Creature to be noisy and moderately distracting. There was an audible clicking sound whenever a pulse took place that was quite noticeable in the quiet of the experiment room. Although comfortable with the Creature, participants were less successful in linking the Creature’s breathing and pulse with their own. There was no consensus on whether the Creature’s breathing and pulse made them more aware of their own breathing and pulse (see Figure 4.11). One participant noted that she became worried about the Creature when its breathing rate changed, an indication that perhaps this participant viewed the Creature as having some form of “life.” Responses were investigated to determine if participants were able to identify the Crea- ture mirroring their breathing and heart rate, and if so, their reported reactions to it. Results are reported in Figure 4.12. As a group, participants were consistently unable to identify the Creature mirroring their own breathing and pulse, with only a single partici- pant able to recognize this behavior. Most thought that there were two or three different 70 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature I found the creature comfortable on my lap ranking (1 = strongly disagree, 5 = strongly agree) n 1 2 3 4 5 0 1 2 3 4 5 6 7 8 9 10 Figure 4.10: Experiment 1 participant responses to statement “I found the creature com- fortable on my lap.” the creature's ___ made me more aware of my own ___ ranking (1 = strongly disagree, 5 = strongly agree) n 1 2 3 4 5 0 1 2 3 4 5 6 7 8 9 10 breathing pulse Figure 4.11: Experiment 1 participant responses to question of whether “creature’s actions made them more aware of their own.” 71 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature operating modes of the Creature; these modes were typically identified as “fast and slow” or “smooth and random,” not as mirroring. Once informed that the second mode of the Creature was mirroring their breathing and pulse, most participants expressed surprise; one participant even stated that he “did not think I was breathing that fast or heavy.” Almost all rated mirroring as very difficult to observe. One participant stated: “mirroring could be made more obvious.” Without any explanation that the Creature would mirror the participant, it appears that there was no expectation that such mirroring could occur. On reflection, when a small animal is placed on our laps, while we may investigate its breathing and heart rate to assess its emotional state, most of us do not immediately compare its breathing rate and heart rate to our own. ranking (1=strongly disagree, 5 = strongly agree) n It was easy to recognize the creature mirroring my... 1 2 3 4 5 0 1 2 3 4 5 6 7 8 9 10 breathing pulse Figure 4.12: Experiment 1 participant responses to statement “It was easy to recognize creature mirroring my. . . ”. 
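For concreteness, the mirroring mode participants were asked about in Figure 4.12 can be sketched as a simple control loop: each beat detected by the EKG sensor triggers the Creature’s pulse, and the respiration sensor output is passed through to the breathing mechanism. The interfaces below (DemoSensors, DemoCreature, new_ekg_beat, respiration_level, trigger_pulse, set_breath_position) are hypothetical stand-ins, not the platform’s actual software.

    # Illustrative mirroring loop with hypothetical sensor and Creature stand-ins.
    import math
    import time

    class DemoSensors:
        """Hypothetical stand-in for the physiological sensor stream."""
        def new_ekg_beat(self):
            # Pretend an R-peak arrives roughly every 0.8 s.
            return (time.monotonic() % 0.8) < 0.02
        def respiration_level(self):
            # Pretend chest expansion follows a 3.5 s breathing cycle (0..1).
            return 0.5 + 0.5 * math.sin(2 * math.pi * time.monotonic() / 3.5)

    class DemoCreature:
        """Hypothetical stand-in for the Haptic Creature actuators."""
        def trigger_pulse(self):
            print("pulse")                        # one felt "heartbeat"
        def set_breath_position(self, level):
            pass                                  # would command the breathing servo

    def mirror(sensors, creature, duration_s=90.0, period_s=0.02):
        """Drive the Creature from live physiological input for duration_s seconds."""
        t_end = time.monotonic() + duration_s
        while time.monotonic() < t_end:
            if sensors.new_ekg_beat():            # beat detected since last poll
                creature.trigger_pulse()
            creature.set_breath_position(sensors.respiration_level())
            time.sleep(period_s)                  # ~50 Hz update loop

    mirror(DemoSensors(), DemoCreature(), duration_s=2.0)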
The one participant who was able to recognize the Creature mirroring her breathing and pulse was unable to offer an explanation for this ability, but did hypothesize that because she plays a musical instrument she may be more cognizant of her own breathing than other people. She had a strongly negative reaction to mirroring, responding that “I really did not like this. I found it difficult to breath normally. It was much better to match my breathing to the Creature.” As she had been exposed to the Creature constant motion stage before the Creature mirroring stage, it is likely that during the Creature constant motion stage she was attempting to match her breathing to that of the Creature. It is possible that the sudden transition from attempting to match the breathing of the Creature to now finding herself guiding the Creature could be disturbing. Indeed, the participant would ultimately find herself in a sort of positive feedback loop until the limits of the Creature’s respiration 72 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature mechanism were reached. Physiological Reactions Initial physiological reactions to the Creature were investi- gated. Physiological reactions to the Creature were generally inconclusive. Comparisons were first made between the breath lengths of participants during each stage. Breath lengths were determined from analysis of the respiration sensor waveform: peaks and troughs were detected and from this breath length was calculated. Where there were obvious noise arti- facts in the signal (most likely from movement or talking), attempts were made to interpolate the breath length by identifying the underlying wave pattern. The respiration rate sensor is particularly sensitive to the motions of the abdomen that occur during speech, as this often greatly overshadows the breathing motion. Figure 4.13 shows the breath lengths of a partic- ipant during the experiment. During the baseline the participant took longer breaths than during the Creature constant motion or Creature mirroring stages, and indeed the mean of both the Creature constant and Creature mirroring stages is close to the commanded 2.5 second breath length of the Creature during the Creature constant motion stage. Figure B.31 show the mean and standard deviation of breath lengths for all participants during the experiment. time [s] br ea th  le ng th  [s ec on ds ] breath lengths during experiment for single subject 0 50 100 150 200 250 300 3500 1 2 3 4 5 6 7 8 baseline creature constant creature mirroring mean Figure 4.13: Breath lengths of a participant during Experiment 1. The activation of the Creature was strongly correlated with a change of breathing rate for the participant in six out of ten of the participants. The same six saw both a change from the Creature still stage to Creature constant motion stage, as well as the Creature still stage to the Creature mirroring stages. An additional three saw a difference between their breath lengths during the constant motion and mirroring stages. It should be noted that the subjects who did not react to the constant motion Creature also did not react to the mirroring Creature, their mean breath lengths remained similar throughout the entire 73 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature experiment, and the standard deviation of breath lengths for them generally remained low and similar for each stage. 
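The breath-length analysis just described, peak detection on the respiration waveform followed by a within-subject comparison of the breath-length series between stages with an unequal-variance t-test, can be sketched as follows. This is a minimal example on synthetic signals, assuming a respiration waveform sampled at a known rate; it is not the thesis’s actual analysis code.

    # Minimal sketch: detect peaks in the respiration waveform, convert
    # peak-to-peak intervals into breath lengths, and compare two stages with a
    # two-tailed unequal-variance (Welch) t-test.
    import numpy as np
    from scipy.signal import find_peaks
    from scipy.stats import ttest_ind

    def breath_lengths(resp, fs, min_breath_s=1.0):
        """Return breath lengths [s] from a respiration waveform sampled at fs Hz."""
        # Require peaks to be at least min_breath_s apart and reasonably
        # prominent, which suppresses small movement artifacts.
        peaks, _ = find_peaks(resp, distance=int(min_breath_s * fs),
                              prominence=0.1 * np.ptp(resp))
        return np.diff(peaks) / fs

    # Synthetic stand-ins for the respiration signals recorded during the
    # "Creature still" and "Creature constant motion" stages.
    fs = 32.0                                   # assumed sensor sampling rate [Hz]
    t = np.arange(0, 90, 1 / fs)
    resp_cs  = np.sin(2 * np.pi * t / 3.8) + 0.05 * np.random.randn(t.size)
    resp_ccm = np.sin(2 * np.pi * t / 2.5) + 0.05 * np.random.randn(t.size)

    bl_cs, bl_ccm = breath_lengths(resp_cs, fs), breath_lengths(resp_ccm, fs)
    res = ttest_ind(bl_cs, bl_ccm, equal_var=False)   # Welch's t-test
    print(f"CS mean {bl_cs.mean():.2f} s, CCM mean {bl_ccm.mean():.2f} s, "
          f"p = {res.pvalue:.3f}")

The prominence threshold here merely stands in for the manual artifact handling described above; real talking artifacts would still require inspection.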
T-tests were also conducted to determine if the participants’ breath lengths were distin- guishable from the Creature’s constant motion breath lengths. They were distinguishable during the baseline for five out of ten participants, and then also during the Creature con- stant motion and Creature mirroring stages by five (different) subjects. This is an indication that the chosen commanded breath rate was similar to that of the average resting respira- tion rate. Two of the six subjects whose breathing rates were affected had breathing rates that were indistinguishable from the Creature’s. The important result in group trends was that overall there was a significantly lower standard deviation of breath lengths during the Creature motion stage as compared to the Creature still stage. This implies that breathing became more “regular” as a result of the active Creature, and that the steady and repeated motion of the Creature was able to induce a similar steadiness in the subject’s breathing. A similar increase in steadiness was shown by the reduction of heart rate and heart rate variability. Analysis of the series of heart rate interbeat intervals for each participant indicates that nine out of ten participants had a change in heart rate from the Creature still stage to the Creature mirroring stage, and seven from the Creature still stage to the Creature constant motion stage. We propose that this heart-rate change was induced by the Creature. Mean heart-rate was significantly less during the Creature constant motion stage than during the baseline, making it likely that this change induced by the Creature was in the negative, i.e. more relaxed, direction. Heart rate standard deviation, or heart rate variability, was also significantly reduced during the Creature constant motion stage as compared to the baseline. The increase in mean skin conductance is likely due to sensor drift during the course of the experiment. Most participants saw a brief peak in skin conductance when the Creature was activated, indicative of the startle response, but there were no other large peaks during the experiment. The increase in skin temperature for both the Creature constant motion and the Crea- ture mirroring stages as compared to the Creature still stage is likely indicative of an increase in relaxation during the experiment. It is unlikely that this was caused directly by the warmth of the Creature as the skin temperature sensor was worn on the back of the ring finger of the non-dominant hand, and therefore was generally placed farther away from the Creature’s main source of warm, its breathing mechanism. A trial experiment with the temperature sensor mounted on the anterior dorsal end of the Creature did not reveal any significant temperature change after five minutes of the Creature’s mechanisms being activated. A summary of the significant results from the experiment is shown in Table 4.5. 74 4.2. Experiment 1: Recognition of Mirroring and Initial Reactions to Creature Table 4.5: Summary of significant results from Experiment 1. 
    physiological metric      comparison                                    mean        SD         p
    breath length sd          Creature still to Creature constant motion   −0.358 s    0.458 s    0.043
    mean heart rate           Creature still to Creature constant motion   −3.90 bpm   4.64 bpm   0.045
    heart rate variability    Creature still to Creature constant motion   −0.023 s    0.024 s    0.022
    mean skin temperature     Creature still to Creature constant motion    0.759 °C   0.585 °C   0.040
                              Creature still to Creature mirroring          0.741 °C   0.582 °C   0.047
    mean skin conductance     Creature still to Creature constant motion    2.18 µS    2.01 µS    0.021
                              Creature still to Creature mirroring          1.96 µS    1.49 µS    0.047

4.2.5 Conclusions

Users did not report any overtly negative reactions to overall interaction with the Creature. Participants had a high awareness of the breathing mechanism of the Creature, but a lower awareness of its pulse mechanism. Participants found the Creature comfortable on their laps and had no disturbing reactions to or adverse opinions of the motion of the Creature during their interactions with it. Nine of the ten participants were not able to recognize the Creature mirroring their own physiological state.

Exposure to the Creature produced a reduction in heart rate variability, mean heart rate, and the standard deviation of breath lengths, as well as an increase in skin temperature, during the Creature constant motion stage as compared to baseline; these are physiological indications of relaxation. The reduced heart rate and breath length standard deviations are closer to the Creature’s, which ran at a constant rate during the constant motion stage.

4.2.6 Feedback for Iterated Design

This experiment provided valuable feedback as to the utility of the haptic anxiety reduction platform. In its first use with test participants, the functioning hardware and software components of the system were validated. Participant reports caused several hardware and procedural modifications to be made to the platform.

The first area of concern was Creature noise. Several participants noted the noise of the Creature as “distracting,” and responses to the questionnaire question about Creature sound indicated a similar reaction. Efforts were therefore made to reduce the sounds emitted by the Creature. The greatest source of noise, the Creature’s pulse mechanism (see Figure 3.6), was removed and lubricated, with foam padding added where the pulse mechanism is attached to the Creature. The Creature’s startup routine was also adjusted to prevent sudden noises emanating from the pulse mechanism if the Creature needed to be reset or lost power during operation. Additionally, the Creature’s breathing servo refresh rate was increased to eliminate a vibration sound that was noticeable when the breathing mechanism was under heavy load. After these modifications, the Creature’s sound output level was noticeably lower, and in observations with noise-canceling headphones little to no Creature sound could be discerned. In extremely quiet environments such as the experiment testing rooms, the use of noise-canceling headphones is now recommended where practical.

Noise emitted by the Creature turned out to be a much more solvable problem than the companion problem: noise emitted by the participant, namely talking. Speech requires air to be directed over the vocal cords, and in the process the normal respiration waveform is disrupted.
The respiration rate sensor proved extremely sensitive to interference from talking; this sensitivity often led to inaccurate estimates of respiration rate that required manual correction. As a result, care is now taken to ensure that the experiment facilitator is out of sight during the experiment, so that the participant is not inclined to speak. If the respiration rate estimate appears to be abnormally high or low additional time is taken on the baseline stage so that the respiration rate can be recalculated. The inability of most participants to recognize mirroring during the experiment may have been symptomatic of a lack of formal introduction to the Creature. Interaction with the Creature is intuitive only when it is viewed as a robotic pet whose mechanisms add the mechanical sensations of life to an otherwise inanimate object. The concept of a robotic pet physiologically linked to its user did not occur to most participants, even after they themselves were equipped with physiological sensors. This is not necessarily surprising, as the physiological sensors are most often used in experiments to record reactions to various stimuli, and very rarely are used as the direct input for another system. Before future experiments, care should be taken to describe the functioning of the Creature: both the various mechanisms and the fact that it is capable of reacting to physiological sensor input from the participant. This will ensure that the participants know what to look for in terms of Haptic Creature activity changes, as well as provide a baseline for expected Creature breathing rate and pulse rate that is near to their own. The strong negative reaction that a participant had upon finding the Creature mirroring their breathing rate indicates that this capability may not be advisable in scenarios where the participant is following the Creature’s breathing, as it could potentially lead to an uncomfortable positive feedback loop. A sudden change to mirroring may be useful as a high-salience indicator to alert the participant during a task. 76 4.3. Experiment 2: Creature Entraining and Reactions During a Task 4.3 Experiment 2: Creature Entraining and Reactions During a Task While Experiment 1 investigated mostly the subjective response to the Creature, answering the questions of “Will people like it?” and “Will people be receptive to it?”, an attempt to manipulate the user’s affect, a key goal of the TAMER platform, had not yet been performed. During Experiment 1, there had been an observation of increased “steadiness,” that is, a decrease in standard deviation of both breathing rate and heart rate during the experiment attributed to the Creature. This had been an encouraging result: it showed that the Creature was able to at least somewhat have influenced the user’s physiological state. It was proposed to further investigate this ability of the Creature, both directly, by asking users to follow the Creature, and indirectly, by examining the Creature’s physiological effect when the user was performing a task. The primary goal of this experiment was to investigate whether a change in Creature “physiological state” as conveyed through its respiration and pulse mechanism has an ef- fect on a participant’s physiological state (as measured through pulse and respiration rate). 
Unlike in the previous Experiment, where the Creature had simply been activated or de- activated, here a more focused change in Creature activity was adopted, one that would also be of use in determining whether participants might find higher or lower activity levels in the Creature more noticeable. In this experiment, the Creature was progressed from a physiological state mirroring the participant’s respiration and pulse (their baseline) to a state with either a faster respiration rate and higher pulse, or a slower respiration rate and a lower pulse. After some time in this new state, the Creature was progressed back again to the original pulse and respiration state baseline. This is shown in Figure 4.15. Time period lengths were chosen to allow the experiment to be completed within a half-hour time period to encourage participant participation: overall experiment lengths were generally greater than in the previous Experiment due to the shorter questionnaire. The gradual adjustment in Creature activity would prevent any disconcertion from the Creature being suddenly activated or deactivated, and would also preclude recognition of a sudden change in Creature activity. A difference of 20 percent from baseline in respiration rate and 20 beats per minute in heart rate was chosen as representing a distinguishable difference in Creature activity levels while not exceeding the capabilities of the platform. Larger deltas resulted in extremely fast and noisy Creature motions, often to a distracting level, during the elevated activity level state. The transitions between the high and low activity levels were generally shorter than the constant motion stages, as where physiological comparisons were made between the high and low activity states a large enough time was needed for participant physiological metrics to stabilize. A secondary goal of this experiment was to determine if the Creature could influence its 77 4.3. Experiment 2: Creature Entraining and Reactions During a Task user when the user was not directly engaging with the Creature. This would help support the role of the TAMER platform in its ultimate end environment: one in which the Haptic Creature acts as merely accompaniment while the user performs another task. There were two stages of interaction with the user to investigate this. In the first, the participant was invited to interact with the Creature in a focused way, through petting or stroking the Creature, for several minutes. In the second, participants held the Creature on their laps while performing a secondary task, in this case reading literature. It was expected that they would find the Creature’s motions and actions comforting, but not distracting from their task. In the previous experiment it had been found that participants required a thorough introduction to the Creature. Even after being equipped with physiological sensors, par- ticipants did not recognize that the Creature could be linked to their own physiological state, and several of the Creature’s mechanisms, particularly the pulse, are not obviously found without careful inspection. As part of the introduction, therefore, it was decided to ask the participant to mirror the Creature’s breathing and heart rate for a brief period, a procedure henceforth called “entrainment” (cf. “mirroring”). This would help accomplish several goals. 
Breathing rate training as part of relaxation therapy is an important part of many anxiety reduction techniques, and the Haptic Creature’s abilities to display con- trolled breathing rates could give it the ability to act as a trainer. If users could successfully mirror the Creature’s breathing, it would help to confirm one possible usage scenario of the TAMER platform. By matching user breathing with the Creature’s, this entraining would also help provide an expected activity level for the Creature of the user’s own breathing rate and heart rate, giving participants a calibration on what activity levels to expect from the Creature for the rest of the session. 4.3.1 Research Questions In this experiment the following research questions were posed: • Can participants consciously mirror the Creature’s respiration rate when instructed to do so? If so, does this mirroring affect the participant’s physiological state? • Does Creature motion affect participants’ physiology either when interacting with the Creature or when performing a task with the Creature on their laps? • Is the Creature distracting to participants when they are asked to perform a simple, non-stimulating mental task? Overall group trends were analyzed. Skin temperature, heart rate variability, heart rate acceleration, and skin conductance were examined for any prevailing trends through pool-wise comparison between stages using two-tailed dependent sample t-tests (α = 0.05). 78 4.3. Experiment 2: Creature Entraining and Reactions During a Task 4.3.2 Procedure Experiments took place in an experiment room which had been removed of all equipment except for a table placed against the wall. During the experiment participants remained seated, facing the wall, at the large table. The physiological sensor encoder was placed on the table, to the right of the participant. The wires from the sensors, the experiment facilitator, the Haptic Creature support equipment, and computers were located behind a fake wall to the right of the participant. A web camera affixed to the top of the wall was used to observe the participant during the experiment. The experiment consisted of the four phases shown in Figure 4.14, and described here. creature still ramped creature motion w/o task user asked to mirror ramped creature motion w/o task user asked to breath normally (i) (ii) (iii) (iv) ramped creature motion w/task user asked to breath normally Ra nd om ize d Figure 4.14: Diagram of Experiment 2 procedure. Introduction and Baseline After signing consent forms, participants were fitted with skin conductance (SCR), blood volume pulse (BVP), and skin temperature (ST) sensors on their non-dominant hand, as well as three-lead electrocardiogram (EKG) and respiration rate (RR) sensors. The sensors were then activated and tested. If necessary, adjustments were made to sensor fit to ensure that they were properly functioning. Participants were then asked to sit calmly for ninety seconds while a baseline was gathered. Stage 1: Creature Still Participants were given the Haptic Creature. It was placed on their lap, and its respiration and pulse mechanisms were described and pointed out. They were instructed to sit quietly with the Creature and to feel free to interact with it by petting, stroking, or touching. Physiological data were continued to be gathered for ninety seconds after the facilitator had moved out of sight of the participant. These ninety seconds are stage 1 in Figure 4.14 79 4.3. 
Experiment 2: Creature Entraining and Reactions During a Task and in other references. Stage 2: Ramped Creature Motion, User Asked to Mirror Participants were then informed that the mechanisms of the Creature would now be acti- vated. After the facilitator had returned behind the screen, the Creature began to mirror the physiological state of the user in both heart rate and respiration. The facilitator then returned to the participant, and invited him/her to mirror the Creature’s breathing with his/her own. Once the facilitator returned behind the screen, the Creature then immedi- ately began a progression consisting of a thirty second “ramp” to a breathing rate and heart rate 20% higher than that of Stage 1, sixty seconds at the new, higher rate, and then a sixty second ramp down to a breathing rate and heart rate 20% lower than that of stage 1, followed by sixty seconds at that rate. The Creature was then deactivated. These two hundred and ten seconds are stage 2 on Figure 4.14 and in other references. HR + 20 bpm 1.2 * Resp. Rate HR - 20 bpm 0.8* Resp. Rate Baseline Gathered Figure 4.15: Ramped Creature motion, as used during experiments. Stage 3: Ramped Creature Motion With User Task, User Asked to Breathe Normally Participants were then assigned a reading task. They were asked to read selections from three Graduate Record Examinations™ [111] reading passages, count the number of words containing four syllables, and write this number at the bottom of the page. They were instructed to keep at least one hand on the Creature at all times, and to keep the reading material on the desk rather than hold it in their hands. During this stage the Creature performed a ramped motion similar to that of stage 2 but longer, consisting of a sixty second “ramp” to a breathing rate and heart rate 20% higher than that of stage 1, one hundred and twenty seconds at the new, higher rate, and then a one hundred and twenty second ramp down to a breathing rate and heart rate 20% lower than that of stage 1, followed 80 4.3. Experiment 2: Creature Entraining and Reactions During a Task by one hundred and twenty seconds at that rate. The Creature was then deactivated. These four hundred and twenty seconds are stage 3 on Figure 4.14 and in other references. Stage 4: Ramped Creature Motion Without User Task, User Asked to Breathe Normally Participants were then instructed to sit calmly with the Creature while the same ramp progression as in stage 3 is performed. These four hundred and twenty seconds are stage 4 on Figure 4.14 and in other references. Questionnaire The Creature was collected, the sensors removed, and a post-experiment questionnaire administered. A copy of the post-experiment questionnaire is included in Appendix B.3.1. The order of stages 3 and 4 was determined randomly. Stage 2 was always performed first to ensure that participants were aware of the Creature’s mechanisms’ location and actions, as well as the intended relation between the Creature’s mechanisms and their own breathing and heart rate. Nine undergraduate or graduate computer science and engineering students between the ages of twenty and thirty, four of whom were female, took part in this experiment. None had participated in the previous experiments. Participants were compensated for their time. 4.3.3 Results Qualitative and then physiological results are reported in this section. Qualitative Results A summary of the questionnaire results is shown in Table 4.6. 
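(For reference before the results: the ramped Creature motion of stages 2–4, shown in Figure 4.15, amounts to a piecewise-linear schedule around the participant’s baseline. The sketch below uses the stage 2 timings and the ±20% respiration / ±20 bpm heart-rate deltas of Figure 4.15; the function and parameter names are illustrative only.)

    # Illustrative stage 2 ramp schedule: 30 s ramp up to 120% of baseline
    # breathing rate (+20 bpm heart rate), 60 s hold, 60 s ramp down to 80% of
    # baseline (-20 bpm), 60 s hold.  Names and timings follow the stage 2
    # description; this is not the platform's actual control code.
    import numpy as np

    def stage2_command(t, base_breath_rate, base_heart_rate):
        """Commanded (breath_rate, heart_rate) at time t seconds into stage 2."""
        breakpoints = [0, 30, 90, 150, 210]        # schedule boundaries [s]
        scale = [1.0, 1.2, 1.2, 0.8, 0.8]          # fraction of baseline breathing rate
        hr_delta = [0, 20, 20, -20, -20]           # heart-rate offset [bpm]
        s = np.interp(t, breakpoints, scale)
        d = np.interp(t, breakpoints, hr_delta)
        return base_breath_rate * s, base_heart_rate + d

    # Example: commanded rates 45 s into the stage, for an assumed baseline of
    # 14 breaths/min and 72 bpm.
    print(stage2_command(45, base_breath_rate=14, base_heart_rate=72))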
Descriptions of quantitative survey results refer to general trends, not statistical analyses. Participants reported a high ability to easily mirror the Creature’s breathing, and generally a high awareness of the Creature’s breathing and pulse. 81 4.3. Experiment 2: Creature Entraining and Reactions During a Task Table 4.6: Questionnaire results from Experiment 2 post-experiment survey (1 = strongly disagree, 5 = strongly agree). When asked to mirror creature: I was able to easily mirror the creature’s breathing I was aware of the creature’s pulse I was comfortable with creature on my lap I was aware of my own breathing I was aware of my own heartrate I found noise of creature distracting 1 2 4 53 While sitting with active creature: I was aware of the creature’s breathing I was aware of the creature’s pulse I noticed changes in the creature’s breathing I noticed changes in the creature’s pulse I was aware of my own breathing I was aware of my own heart rate I was comfortable with creature on my lap 1 2 4 53 During reading task: I was aware of the creature’s breathing I was aware of the creature’s pulse I noticed changes in the creature’s breathing I noticed changes in the creature’s pulse I was aware of my own breathing I was aware of my own heart rate I was comfortable with creature on my lap I found creature’s motion distracting 1 2 4 53 Overall: creature made me more aware of breathing creature made me more aware of heart rate enjoyed interacting 1 2 4 53 82 4.3. Experiment 2: Creature Entraining and Reactions During a Task Physiological Results Breath Lengths Typical physiological results from the experiment are in Figure 4.16, which shows a participant’s breathing rate and heart rate during the second stage of the experiment, in which they were asked to mirror the Creature. In the leftmost frame of the graph the baseline is gathered. At the sixty second mark on the graph the Creature has ramped down to a constant value of 80% of baseline, and here the participant’s mean respiration rate is almost the same as commanded respiration rate — the commanded and mean breath length lines overlap. During this time period the mean heart rate is increased slightly from baseline, but not to near the commanded value of twenty beats per minute greater than the baseline mean heart rate. In the other constant motion stage of the experiment, starting at the one hundred and eighty second mark on the graph, participant respiration rate remains almost constant at the commanded respiration rate of 120% of the baseline, here again the commanded and mean breath lengths overlap. During this period the mean heart rate is increased slightly both from the previous period and the baseline, whereas the commanded heart rate was twenty beats per minute lower than baseline. All participants showed greatly reduced standard deviation of breath lengths when asked to mirror the Creature, and this reduction somewhat tended to stay, with standard devia- tions remaining lower for most participants when both sitting calmly and performing the task than during baseline. On average, standard deviations were slightly but not signif- icantly higher when performing the task than when sitting calmly. 
There was a statistically significant difference in the standard deviation of breath lengths between the baseline and the training stages (M = 1.15 s, SD = 0.535 s, p < 0.05), the baseline and sitting calmly (M = 0.780 s, SD = 0.686 s, p < 0.05), and the baseline and performing a task (M = 0.596 s, SD = 0.524 s, p < 0.05), as well as between the training stage and sitting calmly (M = −0.377 s, SD = 0.411 s, p < 0.05) and the training stage and performing the task (M = −0.560 s, SD = 0.395 s, p < 0.05). In general, mean breath length differed significantly between the faster and slower commanded respiration series during the training stage (M = −2.91 s, SD = 1.98 s, p < 0.05), the Creature with the task (M = 1.46 s, SD = 1.18 s, p < 0.05), and the Creature without the task (M = −0.0540 s, SD = 0.296 s, p < 0.05). Means were calculated for the steady portion of Creature motion, when it was operating at a constant breathing rate, not during the ramp.

Figure 4.16: Breath lengths and heart rate for a participant during stage 2 of Experiment 2, showing measured and commanded breath length [s] and heart rate [bpm] over time [s]. Green vertical bars represent a single breath.

Heart Rate

Heart rate was compared using three metrics: interbeat interval (ibi), heart rate variability, and mean heart rate.

Interbeat Interval

During the training session eight out of nine participants saw a reduction in the standard deviation of heart rate interbeat intervals. All participants saw an effect from the Creature when sitting calmly with it versus the baseline (p < 0.05), and six out of nine saw an effect from the Creature motion during the task versus the baseline (p < 0.05).

Heart Rate Variability

Heart rate variability metrics were calculated for each phase for each subject. Overall, there was no significant difference in heart rate variability between or within stages, except for the percentage of high frequency components, which did not show a significant decrease (p > 0.05) from Stage 1 to Stage 2, but did show a significant difference between Stages 2 and 3 (M = −16.4, SD = 12.5, p < 0.05), 2 and 4 (M = −23.0, SD = 18.2, p < 0.05), and 3 and 4 (M = −6.59, SD = 12.7, p < 0.05).

Mean Heart Rate

There was no significant difference (p > 0.05) in mean heart rate between or within the stages.

Skin Conductance

There was no significant difference (p > 0.05) in skin conductance between or within the stages.

Skin Temperature

There was no significant difference (p > 0.05) in mean skin temperature between or within the stages. A summary of physiological results is shown in Table 4.7.

Table 4.7: Summary of results from Experiment 2; significant results are in bold. Stage 1: Creature Still; Stage 2: Ramped Creature Motion, User Asked to Mirror; Stage 3: Ramped Creature Motion With User Task, User Asked to Breathe Normally; Stage 4: Ramped Creature Motion Without User Task, User Asked to Breathe Normally. Breathing rate data are located in Section 4.3.3.
metric comparison stages unit 1–2 1–3 1–4 2–3 2–4 3–4 breath length sd mean 1.15 0.780 0.596 -0.377 -0.183 -0.560 s sd 0.535 0.686 0.524 0.411 0.205 0.395 p < 0.001 0.002 0.009 0.011 0.183 < 0.001 heart rate mean mean 0.222 1.22 2.11 1.00 1.89 0.889 bpm sd 4.89 2.70 3.31 4.99 5.13 3.81 p 0.901 0.236 0.109 0.586 0.328 0.528 heart rate sd mean -0.100 0.178 0.122 0.278 0.222 -0.056 bpm sd 1.43 1.32 2.44 1.54 2.73 1.72 p 0.848 0.714 0.891 0.623 0.824 0.930 heart rate var rmsssd mean 13.6 14.2 12 0.667 -1.55 -2.22 sd 32.0 43.1 46.3 19.1 23.7 8.89 p 0.265 0.378 0.484 0.924 0.857 0.500 heart rate var pnn50 mean -3.37 -2.99 -3.3 0.382 0.064 -0.318 sd 11.0 7.50 9.08 7.66 11.2 7.13 p 0.409 0.292 0.333 0.891 0.987 0.902 heart rate var hf% mean 30.5 14.1 7.50 -16.4 -23.0 -6.59 % sd 16.4 21.1 28.9 12.5 18.2 12.7 p < 0.001 0.096 0.483 0.006 0.007 0.008 heart rate var lf% mean -1.44 2.56 10.2 4 11.7 7.67 % sd 24.3 16.3 25.4 25.8 24.1 15.3 p 0.871 0.669 0.288 0.673 0.207 0.194 heart rate LF/HF mean -1.54 -0.736 0.146 0.800 1.95 1.15 sd 5.42 4.01 4.21 4.47 3.97 2.49 p 0.446 0.618 0.787 0.627 0.202 0.228 skin temperature mean mean -0.795 -0.807 -1.47 -0.012 -0.676 -0.664 ◦C sd 1.01 1.66 2.05 1.19 2.14 1.91 p 0.055 0.184 0.064 0.977 0.372 0.328 skin temperature sd mean 0.489 0.49 0.305 0.001 -0.185 -0.185 ◦C sd 1.75 1.84 1.98 0.234 0.475 0.457 p 0.427 0.447 0.657 0.996 0.276 0.259 skin conductance mean mean 0.097 0.071 0.081 -0.026 -0.016 0.010 norm sd 0.386 0.287 0.323 0.115 0.098 0.09 p 0.475 0.481 0.476 0.518 0.634 0.75 skin conductance sd mean -0.030 -0.055 -0.074 -0.025 -0.044 -0.020 norm sd 0.118 0.096 0.096 0.077 0.084 0.058 p 0.467 0.125 0.051 0.365 0.150 0.338 86 4.3. Experiment 2: Creature Entraining and Reactions During a Task 4.3.4 Discussion Questionnaire Results Analysis of participant survey results focused on three areas: their comfort with the Crea- ture and awareness of its mechanisms, the effect of the Creature on their awareness of their own breathing rate and pulse, and their reaction to the Creature while they were performing the reading task. Participants reported a greater awareness of the Creature’s mechanisms than their own corresponding activities. Participants were in general aware of the Creature’s breathing during the experiment, although they were slightly less aware during the reading task (see Figure 4.17). The design of the breathing mechanism likely allows for its activity to be monitored with minimal attention from the user. It produces a motion in the Creature’s abdomen that is quite salient over a large area of the Creature, requiring only a brief touch to obtain awareness of the current breathing rate and position. It should be possible to maintain contact with the Creature with minimal attention as only a brief touch is necessary, but required, to monitor its breathing. ranking ( 1= strongly disagree, 5 = strongly agree) n I was aware of the creature's breathing... 1 2 3 4 5 0 1 2 3 4 5 6 7 8 9 while sitting with active creature during reading task Figure 4.17: Experiment 2 participant responses to survey statement “I was aware of the creature’s breathing.” In comparison, the Creature’s pulse is more difficult to locate and much greater effort is required to maintain awareness of the Creature’s heart rate. The effect of the pulse 87 4.3. Experiment 2: Creature Entraining and Reactions During a Task mechanism can only be felt in the “neck” area of the Creature, near the head, and to do so requires placement of the hand in that area. 
Although the neck area is a somewhat natural position to place the hand when interacting with the Creature with both hands, it is not as likely to be regularly touched when the participant is primarily interacting with the Creature with one hand, as during the reading task. This is likely the cause for participants reporting much less awareness of the pulse during the reading task, as expected. In general, however, they showed a high awareness of the Creature’s pulse (see Figure 4.18). I was aware of the creature's pulse... ranking (1 = strongly disagree, 5 = strongly agree) n 1 2 3 4 5 0 1 2 3 4 5 6 7 8 9 when asked to mirror creature while sitting with active creature during reading task Figure 4.18: Experiment 2 participant responses to survey statement “I was aware of the creature’s pulse.” Concerning the research question posed related to whether the Creature’s breathing and pulse would cause the participants to be more aware of their own breathing and pulse, participants reported a very high awareness of their own breathing when asked to mirror the Creature (see Figure 4.19). The task naturally requires concentration on breathing rate and intensity. This awareness carried over into the later stages of the experiment, with all but one participant reporting an awareness of their breathing while sitting with the active Creature. Following the same trend as awareness of the Creature’s breathing, participants’ awareness of their own breathing was less during the reading task, with several participants reporting that they were not aware of their own breathing during the task. It was also a research question as to whether participants would be aware of their own heart rate or that the Creature would be able to increase participants’ awareness of their 88 4.3. Experiment 2: Creature Entraining and Reactions During a Task I was aware of my own breathing... n ranking (1 = strongly disagree, 5 = strongly agree) 1 2 3 4 5 0 1 2 3 4 5 6 7 8 9 when asked to mirror creature while sitting with active creature during reading task Figure 4.19: Experiment 2 participant responses to survey statement “The creature’s breathing made me more aware of my own breathing.” own heart rate. In general, people do not have a high awareness of their own heart rate except in extreme conditions, where it is “pounding,” or beating fast enough that they are able to notice it. This result was shown in the reported results, as all participants reported some level of disagreement with the statement “I was aware of my own heart rate” (see Figure 4.20). Participants reported slightly higher disagreement during the reading task, but overall levels of disagreement for all three stages were quite high. Without extensive training, the most common way of being aware of one’s own heart rate is by taking one’s pulse, and participants were generally precluded from doing this during the experiment by sensor wires and the instruction to attempt to maintain one hand on the Creature at all times. Even if the Creature had invoked an increased mental awareness that they have a pulse, participants would likely have been unable to determine their pulse. As in the first experiment, reaction to interaction with the Creature was positive overall, with participants reporting comfort in having the Creature on their laps, and no discomfort with Creature motions and activity. 
It was desired that participants would not find the Creature overly distracting during their reading assignment; however, user feedback on that subject was mixed and inconclusive (see Figure 4.21). It was noted that during higher levels of engagement with the reading assignment, participants would use at least one hand and sometimes both to assist them in reading the pages; this would preclude haptic interaction 89 4.3. Experiment 2: Creature Entraining and Reactions During a Task I was aware of my own heart rate... ranking ( 1 = strongly disagree, 5 = strongly agree ) n 1 2 3 4 5 0 1 2 3 4 5 6 7 8 9 when asked to mirror creature while sitting with active creature during reading task Figure 4.20: Experiment 2 participant responses to survey statement “The creature’s pulse made me more aware of my own heart rate.” with the device and potentially mitigate some of the potential distracting effect of the Creature. The fact that participants are not forced to monitor the Creature, and that they can always remove their hands from it, may prevent it from becoming an intrusive distraction, but may also make it less effective. 90 4.3. Experiment 2: Creature Entraining and Reactions During a Task ranking (1 = strongly disagree, 5 = strongly agree) n I found the creature's motion distracting during the reading assignment 1 2 3 4 5 0 1 2 3 4 5 6 7 8 9 Figure 4.21: Experiment 2 participant responses to survey statement “I found the creature’s motion distracting during the reading assignment.” Physiological Results Stage 2 (ramped creature motion, user asked to mirror) is always administered prior to Stages 3 and 4 (ramped creature motion with and without task) for reasons of experiment flow and introduction to the Creature. This constitutes a randomization restriction, which might have implications on the interpretation of results incorporating Stages 3 and 4 (e.g. potential confounds with effects of adaptation, learning, habituation or fatigue and bore- dom). We saw this as a necessary constraint. Creature entrainment of breath rate when participants were asked to mirror the Creature was confirmed through the respiration mea- surement. There are likely several reasons why entrainment of heart rate was not similarly successful. In particular, participants were not instructed to mirror the Creature’s heart rate, and even if they had been, most would not have had the ability to do so, as they reported little to no awareness of their own heart rate. It appears likely that entraining had no effect on mean heart rate, as there was no pattern to the trend of mean heart rate between the slow pulse and high pulse stages of the entraining. Skin temperature did, however, increase during the training, an indication of decreased participant arousal. The physiological effects noted during the longer-term interaction with the Creature were also promising, if less pronounced. The standard deviation of breath lengths not only showed a general trend of decreasing greatly during the mirroring stage, as would be 91 4.3. Experiment 2: Creature Entraining and Reactions During a Task expected when commanded to breathe at a steady rhythm, but this reduction in breath rate variability remained even when the participant was not instructed to mirror: breath length variability was less both when sitting calmly and when performing the task than during the baseline. Participant breath length variability was slightly higher when performing the task than when sitting calmly for all participants, but still remained below baseline. 
This suggests that some aspect of the entrainment lingered even after the training stage. This reduction in breath rate variability corresponding to Creature motion was also noted in the previous experiment when the Creature was activated at a constant rate, but not when it was mirroring the participant. A likely explanation for this is that participants, understanding that the Creature was displaying a breathing rate similar to theirs, were identifying with the rhythmic stability of the Creature’s breathing rate, and “keying in” on it to cause an increased stability in their own breathing rate. This could also explain the decrease in standard deviation of heart rate shown during Creature motion in Experiment 1. Such a “stability effect” could potentially serve as an anxiety coping mechanism, by providing comforting reassurance and by reinforcing anxiety-reducing physiological metrics.

A marked decrease in the high frequency percentage of heart rate variability was also noted between the baseline and the mirroring stage. As the high frequency component of heart rate variability is driven primarily by respiration, it is likely that this is partially an effect of the slow breathing exercises undertaken by the participant mirroring the Creature. For many participants, this value remained low during the remainder of the experiment: eight had a lower hf % when sitting calmly with the Creature than during the baseline, and six during the reading task. A summary of significant physiological results is shown in Table 4.8.

Table 4.8: Summary of significant results from Experiment 2. Stage 1: Creature Still; Stage 2: Ramped Creature Motion, User Asked to Mirror; Stage 3: Ramped Creature Motion With User Task, User Asked to Breathe Normally; Stage 4: Ramped Creature Motion Without User Task, User Asked to Breathe Normally.

    physiological metric            comparison                  mean         SD         p
    breath length sd                stage 1–2                    1.15 s      0.535 s    < 0.001
                                    stage 1–3                    0.780 s     0.686 s    0.002
                                    stage 1–4                    0.596 s     0.524 s    0.009
                                    stage 2–3                   −0.377 s     0.411 s    0.011
                                    stage 3–4                   −0.560 s     0.395 s    < 0.001
    mean breath length, fast–slow   training mode               −2.91 s      1.98 s     0.003
                                    Creature with task           1.46 s      1.18 s     0.008
                                    Creature without task       −0.0540 s    0.296 s    0.008
    heart rate hf%                  stage 2–3                   −16.4        12.5       0.006
                                    stage 2–4                   −23.0        18.2       0.007
                                    stage 3–4                   −6.59        12.7       0.008

4.3.5 Conclusions

Participants were able to consciously mirror the Creature’s respiration rate when instructed to do so. This mirroring produced a reduction in the overall mean standard deviation of breath lengths for participants, as well as changes in mean heart rate for eight out of nine participants. Either this training stage or the motion of the Creature also produced physiological effects in participants during the remainder of the experiment. The standard deviation of breath lengths remained significantly less during all stages with the Creature than during the baseline, but was significantly higher during the stages with the task than when training. When the Creature was present, there was a significant difference in overall participant mean breath length between when the Creature was moving at a slow constant rate and a fast constant rate; this was likely a response to Creature motion. The high frequency component of heart rate variability was significantly different between the training stage and both task stages, as well as between the task stages.
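Since several of these conclusions rest on frequency-domain heart rate variability measures, a brief sketch of how such metrics are commonly derived from the interbeat-interval series may be helpful. The 4 Hz resampling rate and the LF (0.04–0.15 Hz) and HF (0.15–0.4 Hz) band limits below are the conventional choices, not necessarily the exact definitions used in this analysis.

    # Sketch of frequency-domain HRV metrics from an interbeat-interval (IBI)
    # series: resample the IBI series to an even time base, estimate its
    # spectrum with Welch's method, and integrate the LF and HF bands.
    import numpy as np
    from scipy.signal import welch

    def hrv_band_percentages(ibi_s, fs_resample=4.0):
        """Return (lf_pct, hf_pct) from a series of interbeat intervals in seconds."""
        beat_times = np.cumsum(ibi_s)                    # time of each beat [s]
        t_even = np.arange(beat_times[0], beat_times[-1], 1.0 / fs_resample)
        ibi_even = np.interp(t_even, beat_times, ibi_s)  # evenly sampled tachogram
        f, pxx = welch(ibi_even - ibi_even.mean(), fs=fs_resample, nperseg=256)
        df = f[1] - f[0]
        total = pxx[(f >= 0.04) & (f < 0.40)].sum() * df # LF + HF power
        lf = pxx[(f >= 0.04) & (f < 0.15)].sum() * df
        hf = pxx[(f >= 0.15) & (f < 0.40)].sum() * df
        return 100 * lf / total, 100 * hf / total

    def pnn50(ibi_s):
        """Percentage of successive IBI differences greater than 50 ms."""
        d = np.abs(np.diff(ibi_s))
        return 100 * np.mean(d > 0.050)

    # Example on a synthetic IBI series around 0.8 s (75 bpm) with a slow
    # respiratory-like modulation; values are illustrative only.
    rng = np.random.default_rng(0)
    ibi = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * np.arange(120)) \
              + 0.01 * rng.standard_normal(120)
    print(hrv_band_percentages(ibi), pnn50(ibi))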
In general participants reported feeling comfortable with the Creature on their lap, and despite finding the Creature a bit noisy, most did not find it disturbing or distracting during their task. Overall, participants typically reported a high awareness of the Creature’s breathing, and a lower awareness of the Creature’s pulse; this corresponded with a much greater awareness of their own breathing than their own pulse. 93 4.4. Experiment 3: Experiment with Children 4.3.6 Feedback for Iterated Design This experiment completed the readiness testing of the TAMER platform. With the re- finements in Creature mechanisms and performance made after the first experiment, no major changes were necessary. However, several modifications were made to the overall platform to improve function during future experiments. These included logistical changes, additional data logging capability, and some cosmetic refinements. While the physiological sensors continued to record adequate data, linking the data to both specific moments in the experiment and Creature activity proved difficult. The logging of data from the Creature was found to malfunction occasionally, with several participants’ Creature logs missing several sections. Software protocols were adjusted to be more robust, and alerting added to notify the facilitator when Creature logging had failed. Finally, several cosmetic improvements were made to tidy up the sensor wiring to re- duce the risk of tangles. Where possible the cables were bundled and rerouted away from commonly accessed areas. Much of the procedure from this experiment was carried over into the next experiment. Of concern was the length of time required to gain meaningful physiological data from interaction with the Creature. After two hundred and ten seconds of sitting still with the Creature on their lap, moving at a fairly constant rate, many participants became bored. They looked away from the Creature and began to search around the room for other stimuli; some even asked the facilitator if the experiment were over yet. This is representative of the maximum amount of time participants can be expected to focus solely on the Creature before it becomes tedious. More varying motions of the Creature could be of use in maintaining engagement, but would not have allowed for the physiological effects sought for in this experiment to be measured. 4.4 Experiment 3: Experiment with Children The experience and success gained from the previous experiments provided the method- ological foundation to conduct an experiment with the Creature in a more representative environment. Due to the potential for increased receptiveness, or at the very least varied physiological responses from this very different age group, it had previously been decided that this school experiment would take place regardless of the findings of Creature success in manipulating physiological metrics in the previous experiments. Therefore, the resulting success of the Creature in affecting breathing and heart rate metrics was encouraging. A subject pool of children was expected to provide a very different experience than that of young adults: children were certain to be more physically demanding upon the Creature, due to either rough play or lack of care, but it was expected that they would also prove 94 4.4. Experiment 3: Experiment with Children more physiologically receptive to the Creature. 
The location for this experiment was the Eaton Arrowsmith School [112], “a co-educational, non-denominational, independent day school for elementary and secondary school students with learning differences/disabilities.” This school was chosen both for its location on the University of British Columbia campus and its staff’s willingness to work with researchers, as well as its unique curriculum and student population. Although the school’s students are not clinically diagnosed with severe emotional, behavioral, or intellectual disorders, they have experienced difficulty functioning academically in the regular school system. During their time at this school they spend several hours each day building cognitive skills through repetitive training exercises. This makes this group an ideal subject pool for the TAMER platform, as they spend the majority of their school day performing timed, intense, stress inducing activities. Many of these activities are performed individually on the computer, allowing for experimental sessions to be performed without disrupting the students’ daily routine. The procedure for this experiment draws heavily from that of the previous experiments, especially Experiment 2. There were several main research goals. The first goal was to confirm that the computer activity performed by the student was able to induce measurable physiological changes, and to determine what are these physiological changes. The computer activity chosen for this experiment was called “Clocks.” This computer program is used as part of the school’s curriculum. During the activity, the screen displays a clock face with tick-marks but no numbers. For each trial, a time is represented using hands of equal length, and the student must input the time displayed based upon the relation between the hands. For example, a clock with one hand pointing towards the 11 position, and another pointing between the 3 and the 4, but close to the 4, must be displaying 3:50; it could not be displaying 11:18. If the hour hand were pointing straight at the 11 position, the minute hand would have to be near the 12 mark on the dial. This exercise is fairly simple for two or three hands, but becomes increasingly difficult as more hands are added (eventually thousandths of a second, second, minute, hour, day, month, year, century, and millennium are displayed on the clock). The students must answer as quickly as possible and are given feedback after each clock and their overall score at the end. The assigned difficulty level is increased after the student masters a level, so that the students are always working at a high level of difficulty for them. Students generally have a high level of engagement with the program and are motivated to produce as high a score as possible as their performance is tracked and assessed. They typically perform this activity for up to half an hour at a time. To investigate this activity, physiological data of students performing the activity were recorded. A second goal was to investigate whether the Creature could be effective in alleviating stress or anxiety during this task. Students were asked to perform the task with the Creature 95 4.4. Experiment 3: Experiment with Children on their lap both still, moving more slowly than their baseline heart rate, and moving more quickly than their baseline heart rate. 
To determine this, physiological data were gathered to assess any changes from Creature presence and Creature motion, and students were asked their impressions of the Creature during the task and whether it helped or distracted them.

The final, and perhaps most important, goal was to evaluate children's receptiveness to and comfort level with the TAMER platform. Informal pilot studies had been conducted with children on a one-to-one basis as well as with non-EAS school groups, but this was the first time a large-scale study was conducted involving the Haptic Creature, physiological sensors, and children. Receptiveness to the sensors and the Creature was observed, and children were asked what they liked about the Creature and how they felt while playing with it.

4.4.1 Research Questions

This experiment investigated the following research questions:

• Do participants find the Haptic Creature calming or engaging, based upon subjective response?

• Do the students' computer activities induce physiological changes, and are any of these linked to an increased level of stress or anxiety?

• Is the Haptic Creature able to induce physiological changes in participants during the experiment, either when still or moving slowly or quickly relative to the participant's own rates?

Similar to previous experiments, the mean heart rate, heart rate standard deviation, heart rate skewness, heart rate rms standard deviation, heart rate variability (pnn50, vlf%, lf%, mf%, and hf%), skin conductance, skin conductance derivative, electromyogram, electromyogram derivative, skin temperature, skin temperature standard deviation, respiration rate, respiration rate standard deviation, respiration amplitude, and respiration amplitude standard deviation (see Section 3.5.2) were calculated and compared among and between all five experiment stages (see Figure 4.22) for all subjects using two-tailed dependent sample t-tests (all α = 0.05). Additionally, the series of each participant's heart rate interbeat intervals (ibi) and breath lengths for each stage were compared within subjects using a two-tailed independent sample t-test (α = 0.05).

4.4.2 Experimental Procedure

This experiment consisted of five major stages, as shown in Figure 4.22. The experiment took place in an office: the participant sat at one end of a table in front of a personal computer; the experimenter and equipment were diagonally opposite, as far away as possible, at the other end of the table. Participants were taken out of their regular classroom activities during the school day for a thirty-minute experiment session, the length of a typical school period, with the timing of each stage chosen to accommodate this length. Twenty-four participants, ages seven to thirteen, took part in the experiment. Participants wore noise-canceling headsets during the experiment. Permission slips were collected from the students' parents and assent forms from the students by the school's teachers before the experiment.

Figure 4.22: The Experiment 3 procedure diagram (stages, in order: introduction to creature; activity w/o creature; activity w/creature inactive; activity w/creature "slow"; activity w/creature "fast").

Introduction to Creature

Each participant was brought into the room, and told that they were about to participate in an experiment with the Creature. The student was then asked if they would mind wearing the physiological sensors.
They were fitted with six sensors: the skin conductance, blood volume pulse, and skin temperature sensors on their non-dominant hand, as well as the heart rate (EKG), EMG on the corrugator muscle of the forehead, and respiration rate sensors. They then had the Creature placed on their lap. An introduction to the Creature was given, in which the mechanisms of the Creature were described and a demonstration of the Creature both mirroring the participant and being actuated at a constant rate were shown. Participants were given an opportunity to pet the Creature and ask questions about it. After the introduction session the Creature was removed. The entire process was scripted to take approximately five minutes, with the gathering of physiological data starting after the sensors were donned, and lasting for about three minutes. This served as the baseline for the Experiment. 97 4.4. Experiment 3: Experiment with Children Activity Without Creature The participant was then instructed to begin their computer activity. He or she continued for about four minutes while physiological data were gathered. Activity With Creature Inactive The participants were then interrupted from their task and given the Creature. They were instructed that it might move, and to resume the activity. The Creature remained motionless for three minutes. Activity With Creature Slow After four minutes the Creature was activated with a breathing rate 20% slower and a heart rate 20 beats per minute less than that of the participant’s during the activity with creature inactive stage. The Creature remained in this state for four minutes. Activity With Creature Fast The Creature then transitioned for ninety seconds from the “slow” rate to the “fast” one: with breathing rate 20% faster and a heart rate 20 beats per minute higher than that of the participant during the activity with creature inactive state. The Creature remained in this state for four minutes, and was then deactivated. The order of the “fast” and the “slow” stages was counterbalanced, with the transition being modified appropriately. The activity without Creature state was always performed first, this minimized the disruption to the participant caused by handing them the Creature or taking it from them, which necessarily distracted them from their computer activity. Experiment Conclusion The Creature was removed from the participant, and then the sensors were removed while the participant was asked to discuss his or her experience. The experimenter initiated a conversation with all subjects during each session to elicit comments on their experience, with the goal of assessing their level of comfort and determining their subjective reactions. Notes were logged immediately following the session to avoid interrupting the flow of the sessions and yet maximize the amount of detail retained related to each session. No explicit questionnaire was used for this discussion. 4.4.3 Results Twenty-six students, 14 female and 12 male, between the ages of 7 and 14 participated in the experiment, with an average age of 10.9. An image of a user during the experiment, 98 4.4. Experiment 3: Experiment with Children attached to the physiological sensors and holding the Haptic Creature, is shown in Figure 4.23. Figure 4.23: Experiment 3 participant during experiment. Data from seven participants were not used for group-wise physiological comparisons. 
Of these, two were unable or unwilling to complete the specified computer activity, two had equipment failures, and for three there were external disruptions during the experiment that made their data unsuitable for comparison. For the remainder, the computer activity induced a reduction in heart rate variability pnn50, heart rate variability hf%, mean skin conductance, and respiration rate standard deviation (all p < 0.05), an increase in heart rate standard deviation and the standard deviation of skin temperature (all p < 0.05), as well as a change in heart rate variability vlf% (p < 0.05) as compared to baseline.

Creature presence during the activity induced an increase in heart rate standard deviation, heart rate variability vlf%, skin conductance derivative standard deviation, mean skin temperature, and skin temperature standard deviation (all p < 0.05) as compared to performing the activity without the Creature. A summary of significant results is shown in Table 4.9. Subjective reactions to the Creature and experiment are discussed in Section 4.4.5; raw data are located in Appendix B.4.1.

Table 4.9: Summary of significant results from Experiment 3.

comparison: baseline to activity without Creature
  physiological metric                   mean            sd              p         unit
  heart rate variability sd              422             418             < 0.001   ms
  heart rate pnn50                       -0.054          0.099           0.013
  heart rate vlf%                        2.12 × 10^-4    2.31 × 10^-4    0.019
  heart rate hf%                         -0.001          0.002           0.015
  norm. skin conductance mean            -0.121          0.170           0.002
  skin temperature sd                    0.124           0.239           0.019     °C
  respiration rate sd                    -29.8           66.8            0.039     bpm

comparison: activity without Creature to activity with Creature
  physiological metric                   mean            sd              p         unit
  heart rate variability sd              -85.4           260             0.037     ms
  heart rate vlf%                        -1.74 × 10^-4   2.51 × 10^-4    0.001
  norm. skin conductance derivative sd   0.015           0.0246          0.011
  skin temperature mean                  1.23            1.67            < 0.001   °C
  skin temperature sd                    0.257           0.507           0.032     °C

4.4.4 Additional Investigation with the Creature

After the first round of experiments was completed, the school at which the experiments were performed asked if the experimenters could return to perform trials with several participants who were not in school during the first round, but were still eager to participate. Due to equipment and space limitations it was not possible to maintain controls consistent with those of the first round of experiments; therefore, their data were not pooled with the others, but subjective results are reported here for completeness. An additional three students were used to pilot different interaction styles with the Creature. For these students the Creature ran for a longer amount of time, or at a different rate than in the previous experiment. Four students who had previously participated in the experiment were brought back to determine second reactions to the Creature. They also participated with the Creature operating continuously for a longer amount of time, and at different speeds than previously. Physiological data from this part of the experiment were not analyzed or reported, as the experimental conditions for this group were comparatively poor (the quiet room previously used for the study was not available, and a different, noisy, high-traffic room was used). However, subjective reactions to the experience of participating with the Creature and the experimental setup were recorded, and those reactions are included in the discussion in the following section.

4.4.5 Discussion

Overall reactions to the TAMER platform and the experiment were quite positive.
Students were excited to participate in the experiment; those who participated were sufficiently motivated to return a signed permission slip from their parents. They were not motivated just by the opportunity to miss class, since the school does not follow a traditional schedule. The Creature was undoubtedly the most appealing part of the experiment. Participants were uniformly enthusiastic about getting to know the Creature: all wanted to pet it, and upon entering the experiment room, most were disappointed that they had to have the physiological sensors attached before they could interact with the Creature.

Physiological Sensors

Reception to the physiological sensors was generally positive: most children were comfortable with the application and wearing of them. Two students expressed extreme apprehension of the sensors: one was calmed with the help of her personal assistant, and another by slowly putting on one sensor at a time. Although the name and purpose of each sensor were explained when they were put on, most seemed uninterested in their descriptions. Several students also expressed interest in viewing their physiological data on the computer. Once the sensors were on, the respiration sensor often required adjustments to ensure proper function.

Although overall sensor performance during the experiment was good, a few common glitches were noticed during the experiment. In particular, both the EKG sensor and the BVP sensor would intermittently drop out, although almost never at the same time. Due to this redundancy, useful heart rate data were collected; however, this did necessitate manually selecting the cleanest signal for each time period (a sketch of how this selection might be automated is given below). The EKG sensor was particularly prone to coming loose during experiments. To make the sensor less intrusive for the child and experimenter, instead of the common three-lead electrodes placed on separate areas of the chest, a single triode electrode was placed in the middle of the chest. This is known to be less sensitive and less reliable, as the weight of the entire sensor is supported by one electrode, and skin adherence of that one electrode can be greatly reduced by perspiration. The blood volume pulse sensor was attached to the finger with a velcro strap that could occasionally become loose or cause the sensor to lose alignment; more often than not this occurred as the participant was petting or stroking the Creature. The skin conductance sensors were also attached by velcro to the fingers, and for two participants the skin conductance sensor electrodes became detached during the experiment.

Once the sensors were attached, participants generally did not express discomfort with them during the experiment. Although participants did not seem particularly encumbered by the hand sensors, more demanding activities would not likely have been possible. They would have been unable to write with the hand-mounted sensors on, and typing would have been difficult, but possible. Participants were naturally cautious of touching objects with their sensor hand. Several would initially hold their hand in the air without touching anything, and participants often had to be told that it was all right to pet and touch the Creature with the hand bearing the sensors. Once they were told that they could touch the Creature, however, their interaction with it did not seem to be affected by the sensors.
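Returning to the heart-rate channel redundancy noted above, the manual selection of the cleanest signal for each time period could plausibly be automated in a future version of the analysis software. The sketch below is illustrative only, written in Python; the selection rule (prefer the channel whose inter-beat intervals look most physiologically plausible in each window) and all names are assumptions for the example, not part of the TAMER code.

    def plausible_fraction(ibis, lo=0.3, hi=1.5):
        """Fraction of inter-beat intervals (in seconds) within a plausible physiological range."""
        if not ibis:
            return 0.0
        return sum(1 for ibi in ibis if lo <= ibi <= hi) / len(ibis)

    def select_cleanest_channel(ekg_ibis, bvp_ibis):
        """Return the label and inter-beat-interval series of the cleaner channel for one window."""
        if plausible_fraction(ekg_ibis) >= plausible_fraction(bvp_ibis):
            return "ekg", ekg_ibis
        return "bvp", bvp_ibis

    # Example: an EKG window corrupted by a dropout loses to the simultaneous BVP window.
    ekg_window = [0.82, 0.80, 2.90, 0.15, 0.81]   # seconds between detected beats
    bvp_window = [0.83, 0.81, 0.80, 0.82, 0.84]
    channel, ibis = select_cleanest_channel(ekg_window, bvp_window)

More sophisticated criteria, such as agreement between the two channels or dedicated signal-quality indices, could be substituted without changing this overall structure.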
A few participants also found the EMG sensor distracting in that the wire, although generally held up by their headphones, could fall down and obstruct vision. The sensor was also difficult to attach to smaller children, who did not have a large forehead area relative to the size of the sensor. As analysis of the EMG data did not reveal any distinguishing characteristics, it was left off for the final subjects.

A combination of the many wires and the logistics of the experiment room did make the sensors more cumbersome than they might otherwise have been. Although the wires had been bundled since the previous experiment, the large number of wires connected to the participant did make it somewhat difficult to pass the Creature to them. Due to the layout of the room, it was necessary to hand the Creature to the participant on the same side as the encoder. Had the Creature been able to be on the other side of the participant, this would not have been a problem. There was also a worry that if a participant decided to hurriedly leave the experiment room they would drag a large number of sensors and wires with them, possibly causing equipment damage, but fortunately this did not happen during trials. The caution most students showed with the sensors also makes this possibility unlikely, although a child in the middle of an anxiety attack might not show such caution.

Reaction to Creature

Reaction to both Creature presence and Creature motion was positive. Almost all participants were comfortable with having the Creature in their laps. One student was reticent about having the Creature on his lap, and desired to interact with the Creature on the desk before he would let it be placed there. Once he achieved initial comfort with the Creature, he was not uncomfortable during the remainder of the experiment. Students were surprised and pleased to find that the Creature was able to emulate their own breathing rate and heart rate. When asked, they felt that the breathing did not seem or sound mechanical. No students complained that the Creature was too heavy or noisy during the experiment. Several students said that the Creature reminded them of their own pets, particularly the warming sensation on the lap. In fact, the majority of students who participated in the experiment reported that they have, or have had, a dog or a cat at home.

Creature motion and Creature activity also generally elicited positive reactions. A common comment after the experiment was that "the Creature felt alive." Many students after the experiment asked if they could have their own Creature. In particular, several students who reported not being able to have a pet at home expressed that they would like to have the Creature as a substitute. Students said that they liked the Creature better when it was moving than when it was not, and preferred even a gently moving Creature to a still one. By sending a null command to the respiration servo, instead of turning it off completely, a gentle humming sound and vibration, similar to a continual purr, could be emitted from the Creature. Students preferred this somewhat active resting state to the Creature not moving at all.

Interaction with the Creature

Two typical interaction styles with the Creature were observed: in one the Creature seems to serve as a "comforting presence," in the other as a "brief reassurance." Most students performed their computer activity as normal, but petted the Creature during the activity.
They reported the Creature as "comforting" and "pleasant" to have on their laps. Several also reported that they found the Creature to be "calming." A few students, however, would take a break from the computer activity periodically to stop and look at the Creature, petting it and occasionally breathing with it. These breaks were usually correlated with either the end of a computer activity "level" or the completion of a computer activity problem. One student suggested that the Creature helped her do better on the activity, and another that the Creature was comforting to her when she got an answer wrong. Although the computer activities are timed, and thus taking a break during a level might not be beneficial for grading purposes, taking a brief break after a level to interact with the Creature, if helpful in reducing stress and anxiety, could have a beneficial effect on performance in the next level.

Just as adults were generally unable to distinguish between different Creature motion states, so were the students. Several students reported after the experiment that they had thought the Creature was mirroring their breathing and pulse the entire time. One student said it "felt like him and I [the Creature] were one." Another said that "I found it calming, it reminds me of my stuffy [stuffed animal]."

Not all reactions to the Creature were positive, however. Several students reported that they felt the Creature to be distracting during the activity, and would have preferred it not move during the activity. The initial activation of the Creature also disturbed several students, who either jumped slightly, or briefly looked at the Creature when it turned on.

A level of anthropomorphization of the Creature was observed in the students' reactions to the Creature. They would become worried when the Creature stopped moving, or after a few minutes into the experiment if the Creature had not yet been activated. A few asked if the Creature was "sleeping" when it was not moving, or whether it was awake when it was moving. Older children tended to ask more often whether the Creature was "on" or "off," and those more self-aware of the experiment would often ask if something was wrong when the Creature stopped moving.

The physiological results from this experiment were consistent with those of previous experiments, in that there were changes in heart rate variability associated with the Creature. This "steadying" effect, also seen in Experiments 1 and 2, was associated with Creature presence. There was no difference between the Creature "fast" and the Creature "slow" activity stages during the experiment. The computer activity reduced heart rate variability, with a reduction in heart rate standard deviation and pnn50 that is consistent with the reduction in heart rate variability typically associated with stress, whereas Creature activity increased these metrics. Skin temperature increased as compared to the baseline during both the activity and Creature presence stages; this change is most likely due to the increased level of physical and mental exertion caused by the computer activity. The "activity without creature" and "activity with creature inactive" stages were always performed first for reasons of experiment flow and introduction to the Creature. This constitutes a randomization restriction, which might have implications for the interpretation of results from this Experiment (e.g.
potential confounds with effects of adaptation, learning, habituation or fatigue and boredom). We saw this as a necessary constraint. A change to a 30% difference from participant levels in breathing rate and pulse rate for the low and high activity states resulted in a Creature respiration rate that was almost uncomfortably fast, and was reported to be distracting by the test subjects. This was also impractical, since such a large difference or a small error in measurement of respiration rate could result in commanded respiration and pulse rates that nearly exceed the capabilities of the mechanism. The longer-term time frame of investigations allowed for meaningful calculations of the mean of various physiological signals and indices that were not possible in the shorter-term experiment. 4.4.6 Conclusions Overall, students had a positive reaction to interaction with the Haptic Affect Platform. The students reported that they found the Creature comforting during the activity, and expressed a wish to interact with it again. Stressful computer activity induced changes in heart rate variability standard deviation, pnn50, and vlf%; skin conductance mean and standard deviation of derivative; and skin temperature mean and standard deviation. The changes in heart rate variability and skin temperature are typical of response to stressful events. The Haptic Creature was able to induce several physiological changes in participants during the experiment. Creature presence induced changes in heart rate variability standard deviation and vlf%; skin conductance standard deviation of derivative; and skin temperature mean and standard deviation. 4.4.7 Feedback for Iterated Design This experiment provided many lessons and suggestions for interactions where children are the primary subject group, as well as valuable feedback on the TAMER platform’s hardware 104 4.4. Experiment 3: Experiment with Children and software. There were several refinements to experimental protocol that should be incorporated into future experiments with the platform. First, the sensors should be placed on the student before they are shown the Creature or other experiment equipment, as most will be more interested in those things than putting on the sensors. Due to the small size of the experiment room in this experiment it was impossible to hide the Creature completely from view as the students were walking in, and many immediately wanted to see and pet the Creature once they started the experiment. The students did not seem to suffer particular distress from having the Creature removed from their laps during the experiment; therefore, this was unlikely to have affected their task performance. Care must also be taken when describing the experiment to the students. Once several were told that they would be performing a computer activity they immediately started performing the computer activity, even before they had sensors attached. In two cases the computer screen had to be turned off so that they would break away from the activity to don the sensors. Unsurprisingly, children in this subject pool had demonstrably less impulse control than previous adult participants, and were not capable of waiting indepen- dently. They did, however, express a high level of enthusiasm and receptiveness towards the Creature. Experiments in which strict adherence to experimental protocols are necessary to main- tain experimental controls are challenging with younger participants, as they may not be able to accurately follow directions. 
For this experiment, there were no criteria for ex- clusion of participants. Several students who participated in the experiment were unable to complete the experiment protocol in a way that allowed for meaningful comparisons of physiological data between them and the other participants. Two had no experience with the computer “Clocks” activity that was being used, and two were unwilling to complete the clocks activity. These students were still enthusiastic to see the Creature and, as users, could potentially derive valuable benefits from the TAMER platform, but are not practical participants when limited experimental time is available. For shy or reticent students, a gradual interaction with the platform was found to be the best way to make them comfortable with it. Students who were wary of the sensors became more comfortable with them once the first sensor was put on and shown to cause no harm, and would eventually allow the remainder of the sensors to be put on them. Similarly, several students did not wish to have the Creature on their lap at first, and instead gently petted the Creature while it sat on the desk. After some time seeing Creature motion, the students would let the Creature be placed on their lap. This progressive interaction with the Creature took much longer than the typical experiment session, but allowed for students who otherwise would not have been able to participate to interact with the Creature. In addition to this gradual interaction, students would also benefit from a more coherent Creature “story,” detailing the expected motions and behavior of the Creature. As men- 105 4.4. Experiment 3: Experiment with Children tioned previously, students, particularly the younger ones, tended to anthropomorphize the Creature, and would become concerned when it stopped moving, started moving, or did not move for a long period of time. A narrative that incorporated both the Creature mecha- nisms and expected Creature actions would help alleviate student anxiety about experiment equipment performance, allowing them to focus more on their activity and emotional state. Separate research is ongoing to have the Creature display coherent emotional states: an ex- planation that the Creature is “sleeping” or “awake” would help children understand what the Creature is capable of doing and what to expect from it during the experiment. At the same time, care must be taken in describing the purpose of the Creature to potential subjects, or the parents of potential subjects. In this experiment there was a general aware- ness that the Creature was part of a study about anxiety and anxiety-reducing techniques, which may have colored self-reported comments from the students. While, as in a drug trial, describing the purpose of the Creature should not interfere with results, a greater emphasis on terms such as “companion” or “assistant” would reduce the concerns that user reports were influenced by the experiment vocabulary. While platform participants were concerned about expected Creature behavior, they were also occasionally confused about their own expected behavior. For this experiment students were not instructed to do anything other than pet the Creature during the activity. Several came up with innovative uses of the Creature, including pausing to relax with the Creature between activities, but several seemed confused by the lack of guidance for Creature interaction. 
Specific behavior instructions, such as pausing to breathe with the Creature or petting the Creature only during certain activities, could lead to additional physiological benefit.

The computer activities chosen for this experiment may not give useful information about the efficacy of the Creature in anxiety reduction. Students in general had various reactions to the computer activity. Some maintained a high level of engagement with the screen, devoting their attention to it rather than the Creature. This was evident in some students' body language, where they would make visible or audible gestures of frustration upon getting an answer wrong, or success upon completing a problem. Others were unenthused by their computer activity, and did not seem to care about their score or success rate. It is possible that this level of engagement with the computer activity influenced the effect of the Creature on the participant. It is also possible that the physiological effects of the computer activity are not constant, but vary during the course of an activity session. If the computer activities are to be used for further experiments with the Creature, longer-term analysis of the physiological effects of the computer activities must be investigated. It is possible that physiological effects and scores on the activity are correlated: if true, this could be a useful measure of engagement with the task.

The platform hardware could also benefit from several further refinements. The number of wires required for the sensors necessitates a stationary subject, and therefore precludes long-term engagement with the platform. Improved sensor form-factor, perhaps in the form of wearable clothing, would allow for the use of the TAMER platform in more diverse user environments. Creature hardware could also still be improved by the development of a quieter pulse mechanism, which would allow for use in a classroom. Although the noise was not loud enough to be noticed by the experiment participant when wearing noise-canceling headphones, it would be disruptive in a quiet classroom environment.

4.5 Reflections on Results

With the completion of these experiments, the TAMER platform has been iteratively revised and developed into a functional and engaging tool that is attractive and intriguing to children and many adults. It has been shown to have an effect on heart rate and breathing rate metrics. Next steps are to commence longer-term studies of interaction with the platform, to determine the functionality of the system's hardware over longer durations as well as to work towards developing effective software strategies to accomplish the anxiety reduction goal. These experiments were mostly undirected in that goal; the breathing and heart rate of the Creature were varied in order to determine what effects are provoked in the human user, without attempting more focused interventions. The fact that physiological effects were produced was promising, but even more important was the interaction data gathered that will allow for future effective use of the TAMER platform. It was always the intention to involve therapists and psychologists in the development of the TAMER platform feedback loop, and now that the TAMER platform hardware has stabilized, it may be time to develop interaction scenarios and assessment strategies for therapeutic benefit.
107 Chapter 5 Conclusions and Recommendations This thesis presents a research platform and a set of research questions and experimental observations related to the platform’s use. This chapter first discusses the research out- comes and a methodological critique of the experiments, followed by the conclusions and recommendation for platform design. 5.1 Experimental Outcomes The overall research objectives were to determine the reactions to the Haptic Creature, and to determine whether physiological reactions can be provoked or manipulated in the user through the use of the TAMER platform. The experiments revealed several behavioral and physiological outcomes from Creature presence and actions. In particular, participants tended to find the Creature’s presence comforting. Although they were not generally able to recognize the Creature mirroring their breathing or pulse, once they were informed of this ability they found it intriguing and comforting. It was a concern that this would be perceived as “creepy” or intrusive, but this was an uncommon response. The mirroring also seemed to give participants, particularly the younger ones, a sense of meaning for the experiment and an understanding of the purpose of the physiological sensors. Participants were able to successfully perform the reverse, matching their breathing to that of the Creature, but this did not have any heart rate effects. Suddenly switching from this user-following mode to Creature mirroring mode without informing them did, on occasion, result in what may be described as a positive feedback loop, which one participant found quite uncomfortable in this experiment. Participants preferred an active Creature to an inactive Creature: they preferred even a gentle purring to no motion or sound at all. There was high receptiveness to the Creature among children, who generally did not find the Creature distracting during other activities. A summary of some significant physiological effects found during the experiments is below. During the pilot experiment: • Disturbing images correlated with changes in mean heart rate, mean heart rate stan- dard deviation, mean heart rate acceleration, mean EMG, mean derivative of skin 108 5.1. Experimental Outcomes conductance, mean arousal. • Creature presence correlated with reduced mean skin conductance, mean derivative of skin conductance, mean arousal. In Experiment 1: • Mean heart rate and heart rate variability significantly different between creature constant motion and creature still stages. • Mean skin conductance and skin temperature higher during creature constant motion and creature mirroring user stages than creature still stage. • Standard deviation of breath lengths less during Creature motion stage than Creature still stage In Experiment 2: • Mean breath length significantly different between entraining, creature with task, and creature without task stages. • Creature presence correlated with a difference in mean heart rate. • Standard deviation of breath lengths significantly less during Creature entraining, creature with task, and creature without task stages than baseline. • Hf % of heart rate variability significantly different between Creature entraining and Creature with task, Creature entraining and Creature without task, and Creature without task and Creature with task stages. In Experiment 3: • Computer activity correlated with reduced mean standard deviation of heart rate, heart rate pnn50, and skin conductance. 
• Computer activity correlated with increased mean derivative of skin conductance, skin temperature, standard deviation of skin temperature, and heart rate vlf %. • Creature presence correlated with increase in mean heart rate standard deviation, heart rate vlf %, standard deviation of derivative of skin conductance, skin tempera- ture, and standard deviation of skin temperature as compared to computer activity without Creature. 109 5.2. Methodological Critique and Recommendations 5.2 Methodological Critique and Recommendations 5.2.1 Platform Presentation The experimental protocol for the TAMER platform used throughout these experiments proved ineffective in certain areas. In particular, there is a great need for explanation when presenting the Creature and the experiment. As mentioned, most participants were not able to recognize the Creature mirroring their breathing and heart rate without an explanation that this would occur. Even after they had just been equipped with physiological sensors that measure their breathing and heart rate, participants were surprised that a link between them and the Creature could be established. Many participants also did not notice the pulse mechanism in the Creature — it was only able to be felt over a small area of the Creature, and should be identified before use. It is necessary to fully describe the mechanisms of the Creature to the participant before the experiment, they are not likely to recognize the mech- anisms on their own. The application of the physiological sensors and interaction with the Creature represents a fairly novel event for most participants, and although the zoomorphic nature of the Creature may imply that it contains certain expressive mechanisms, these are not always evident upon initial investigation. It is also important that the Creature’s mechanisms be activated during this intro- duction, and that the Creature operate at a breathing and pulse rate appropriate for the experiment, in order to set a proper expectation for the behavior of the Creature. The actual breathing and pulse rate of animals the Haptic Creature’s size is different than that of a human, and the awareness of this fact in participants will also vary. The Creature’s normal physiological activity level must be established as similar to that of a human. The timing of this introduction is also important. Consent forms, questionnaires, and physiological sensors should be administered and attached before the participant is able to see the Creature. Participants, in particular children, often wanted to interact with the Creature upon seeing it for the first time, and expressed a desire to hold it and pet it. It was then necessary to temper their enthusiasm in order to attach the physiological sensors, and in the case of several students it was particularly difficult to take away their attention from the Creature in order to setup the experiment. However, there is also the possibility that participants having seen the Creature during this and prior experiments before undergoing their baseline assessment may have inadvertently reduced the effects on the experimental results of any sort of “novelty effect” caused by exposure to the Creature, as it would now also influence their physiological baselines. 
Most participants also found initial Creature motion new and interesting, but the actions of the Creature mechanisms were simple enough that transitions between Creature activity states during the experiments were not likely to induce a significant effect, and indeed the transitions were not often recognized by the participants. Participants, particularly children, were typically excited to interact with 110 5.2. Methodological Critique and Recommendations the Creature, and this enthusiasm lasted through their experimental sessions. A clinical introduction to the Creature should leverage this initial appeal to help develop a long- term working relationship with the TAMER platform. Although in actual operation the Creature’s motion is quite subtle and non-intrusive, allowing for users to focus on other tasks, a more sophisticated model of emotions for the Creature could help improve a user’s attachment to the Creature. The increased amount of time required to discover all of the Creature’s eventual operating behaviors should guarantee adequate observation time for the child’s psychologist to observe his or her interactions with the Creature and train his or her behaviors. 5.2.2 Platform Interaction A Coherent Story In experiments with children, and possibly in experiments with adults, there is also the need for a coherent “story” to explain Creature activity and motions during an experiment. Children, especially, were surprisingly aware of the behavior of the Creature during the experiment. If the Creature had not moved after a long time period, or stopped moving after being active, they would often express worry or be upset that the Creature was broken, or that something in the experiment was not working. An explanation of the Creature’s behavior implying that the Creature may be asleep sometimes, awake other times, and curious or happy for part of the experiment would help to relieve participant anxiety about Creature functionality, and provide an expectation for Creature behavior. Care should be taken, however, to ensure that a story of the Creature does not impose a specific species, with possible confounding associated behavior expectations, onto the Creature. Informal surveys revealed descriptions of the Creature as variously a cat, rabbit, mouse, pig, guinea pig, or simply a “furry thing,” with no one answer predominating. Avoiding identifying the Creature as a specific species, although presenting creative challenges in developing a story, helps to avoid possible negative reactions to the Creature due to a user’s previous interactions with the chosen species. In particular, this greatly simplifies the introduction of the Creature to children, as a detailed investigation of a user’s past interactions with animals is not necessary before presenting the Creature to them. A behavior model for another version of the Haptic Creature has been developed that links various Creature mechanism activity levels to Creature emotional states, as interpreted by users. Integrating this model into the TAMER platform could allow for more advanced interaction during experiments. At the very least, a comparison between the Creature’s typical activity rates when mirroring a human and the emotional states ascribed to the Creature running at those rates could prove informative. Such a behavior model should also be incorporated into the next stage of TAMER 111 5.2. Methodological Critique and Recommendations platform experiments, that of longer-term engagement with the system. 
Creature behav- ior during these experiments was extremely simple, it acted as essentially a physiological metronome on the lap of the subject, occasionally changing tempo but only gradually. While short-term results with this method were promising, the eventual usage targeted for the TAMER platform is longer-term anxiety reduction. Although the physiological results from the experiments presented here were promising, these may not occur in a long-term experiment, and may require more sophisticated Creature behaviors to maintain engage- ment. Longer-term experiments, particularly with a broader subject pool, may also reveal personality or background characteristics that would tend to make certain users particularly more or less receptive to engagement with the Creature. No such trends were observed in these studies. Additional experiments, such as comparing the effects of the Creature to a child’s com- panion stuffed animal, could also provide valuable feedback as to the effectiveness of the TAMER platform compared to typical therapy methods. Interaction Models In order to enable longer duration experiments, as well as to improve the shorter exper- iments, there is a need for more focused and directed interaction with Creature. Except when asked to follow the Creature’s breathing, participants were not given any instructions on how to interact with the Creature. Often participants, particular adult participants, seemed uncertain of how to behave with the Creature. Child participants were aware that the Creature was related to a study on anxiety, and therefore might help to calm them down, but were not aware of how this would actually occur. More detailed instructions to participants about the desired effects of the Creature, and what actions they could take to help achieve them, might help the participants achieve greater success in accomplishing these goals. Observations of interaction with the Creature during the experiment suggest three po- tential interaction models to be experimentally investigated. The first is Creature guided interaction, where the Creature is used to lead the participant through a series of breathing exercises. Experimental data showed that participants could easily follow the Creature’s breathing with their own: this could be of potential use in anxiety inducing situations. Many relaxation techniques involve deep breathing exercises, and the Creature could pro- vide calming and engaging guidance in this task. A short break to breath slowly and deeply with the Creature either before, after, or in the middle of an anxiety-provoking task might be able to produce calming effects in the participant, and allow them to access their previously taught strategies for coping with stress more easily. The second is Creature mirroring of users to improve awareness of their own physiological 112 5.3. Platform Design state. Participants generally reported a low awareness of their own heart rate. Using the Creature to improve the user’s self awareness could help in training them to recognize increased levels of stress or an impending anxiety attack. By having knowledge of their own typical and stressed body states, users could again intervene with situation appropriate coping skills. The third interaction model is that of intervention. Once the TAMER platform is capable of recognizing either anxious states or the precursors to anxiety, the Creature could become active only when the user is approaching an anxiety attack. 
Instead of running all the time mirroring the user, the Creature would activate only when necessary, alerting the user to their anxious state. Once active, the Creature could then attempt to calm down the user. A simple slow, steady breathing rhythm similar to that used in the present experiment might prove sufficient in reducing anxiety, but more sophisticated behaviors are possible. For example, the Creature itself could present an anxious state, either by mirroring the user or acting independently. The user could then be trained to reduce the Creature’s level of anxiety by breathing slowly or performing other therapeutic techniques, and the Creature’s activity level could gradually decrease in response to changes in physiological metrics. These behaviors should be investigated as allowing for longer-term use of the Creature and TAMER platform in anxiety reduction. 5.3 Platform Design 5.3.1 Outcomes Overall, the individual components of the TAMER platform were integrated to produce an effective and reliable system. However, there are several improvements that could be made to improve overall functionality. Creature The Haptic Creature was shown to be an effective device for displaying affect through breathing and heart rate mechanisms, while being comfortable for and engaging with its users. Participants reported high levels of comfort with the Creature: they were receptive to it being placed on their laps and moving around, and found its fur to be soft and pleasing to the touch. In particular, participants responded positively to the warmth of the Creature and its life-like attributes. They desired that it gently purr even when still, as opposed to just sitting as a dead-weight on their lap. The Creature’s breathing mechanism was successfully able to portray breathing, it was recognized as such by users. The Creature’s pulse mechanism was particularly noisy, but it did successfully generate a pulse sensation in a narrow area of the Creature. 113 5.3. Platform Design Sensors Participants, both the adults and mildly-anxious children, were surprisingly receptive to wearing the physiological sensors, and generally did not find them distracting or uncomfort- able during experiments. Several minor problems were associated with sensor functionality during experiments. The EKG sensor, when mounted to the chest with a single triode electrode, instead of three separate electrodes, would occasionally become detached during experiments, leading to loss of signal. The blood volume pulse sensor, and indeed all the sensors attached to the fingers, could occasionally become detached during the experiment. Participants were typically hesitant to touch anything with the hand attached to the sensors, and had to be instructed that it was acceptable to pet the Creature with that hand. Once told, they did not seem encumbered by the sensors. Had the participants attempted to move around during the experiment, however, they would have found their motion constrained by the sensors. The numerous wires required for the sensors were continually getting tangled, and there was a concern that a sudden large motion by a participant, such as a nervous child desiring to leave the room, could cause damage to the equipment. This scenario did not occur during experiments, however. 5.3.2 Recommendations There are several changes recommended for the hardware and software of the TAMER platform. 
The Creature, although functional, requires several modifications that would allow for more effective-longer term experiments, and help to move the Creature from a laboratory environment to a less clinical setting, such as a school or a home. In particular, the noise of the pulse mechanism was moderately audible, and would be noticeable in the quiet of a classroom. A pulse mechanism that created a motion able to be felt over a larger area of the Creature could make the pulse mechanism more effectively able to convey the pulse sensation. Creature noise must be assessed to reduce it to a level that would not bother other students in a quiet room. The gentle vibration that participants preferred to the Creature being completely inactive did have an audio component; the use of the purring mechanism instead of the breathing mechanism linkage to create this sensation should be investigated. In addition, consideration should be made towards eliminating as many wires to the Creature as possible. Presently, with the radio system in use, the power cable is the only wire that must be attached to the Creature. Provisions exist on the electronics board for an internal battery pack to be mounted; the use of this should be investigated. The need for an external power supply not only restricts the usage of the Creature, but there is also the risk that the cable could become inadvertently disconnected during use, disabling the Creature prematurely. The software for reading the physiological sensors and controlling the Creature was 114 5.3. Platform Design adequate for the experiment. However, to support both this platform and other experiments with the physiological sensors, the integration of additional timing and observational inputs, such as video feeds and push-button controls, should be developed. Linking the physiological data to exact moments in the experiment, or indeed to exact times in general is often difficult. A more advanced sensor suite incorporating video and physiological data, as well as a better system for marking notable occurrences during an experiment, would greatly simplify data analysis, and potentially allow for more subtle results to be uncovered. In addition, the physiological data gathered should be used at a higher level than simply mean values. Medical interpretation of the physiological data gathered by the platform, or the use of an inference engine or machine learning techniques to estimate clinical assessments of anxiety, such as the Multidimensional Anxiety Scale for Children [113], based on training data provided by psychologists, could provide more reliable estimations of the effect the platform has on anxiety levels. Better online estimation of anxiety levels could allow for a more effective platform, as the specific Creature activities could be associated with their effects on anxiety and then used appropriately in a therapy regimen. A physiological sensing system with both the form factor and capability to support longer-term observations could also allow for the identification of chronic anxiety with the physiological sensors. This along with medical observations would aid the determination of any effects of the TAMER platform on longer-term, chronic anxiety. Sensor form factor is one of the limiting factors of this platform. The present sensors are somewhat intrusive, and require both a large amount of time to set up and many wires to be connected to the participant. 
Reducing the amount of wires necessary to be attached to the participant, or, ideally, eliminating wires all together, would both improve participant experience and allow the sensors to be used on mobile participants in an actual classroom environment. Combining several sensors into a single form factor, such as a piece of clothing or a glove, could also help participants who were reticent about having the sensors attached to feel more comfortable. Additionally, several of the sensors, particularly heart rate and heart rate variability, can generate useful data over observation periods of several hours or even days, that may be useful in anxiety reduction. Integration of these long-term wearable sensors into the platform could produce improved results and more useful data. Better integration of the sensor data into the experiment procedure might also lead to interesting avenues of investigation and applications for the platform. Several children were interested in viewing their physiological data on the computer, and wanted to know more about the sensor readings. The children who participated in the computer activity were score and goal focused, due to typically having their performance assessed during these activities. Cataloging physiological measures related to anxiety and then displaying them to the user has been shown to be of benefit for adults. Feedback to users of both long- term and short-term physiological data as both a visual and a haptic (Creature) display 115 5.4. Conclusion could assist them in recognizing anxiety inducing behaviors and therefore eliminating them. Even at this young age, these children are already quite familiar with improving score-based performance. Overall TAMER platform functionality was sufficient for the experiment, but these changes recommended should improve platform performance. 5.4 Conclusion This thesis has described the construction of the TAMER platform and the initial testing and experimental verification of the same. The Haptic Creature constructed as part of the TAMER platform distinguishes itself from other robotic companions by recreating physi- ological activities through a solely haptic presentation method, and is, uniquely, capable of reacting to a user’s sensed physiological state or displaying a user’s state with its own mechanisms. This link establishes an advancement in biofeedback technology, as it should be easier, especially for children, to relate to a robot than to a pulse or heart rate moni- tor. Physiological interaction with robots is advanced: the TAMER platform demonstrates real-time reaction to a user’s physiological state and real-time interaction with the potential to guide the user’s physiological state in a controlled feedback loop. Further, the TAMER platform has been demonstrated in a school environment. Results from the experiments in this thesis support the potential of the TAMER platform to be used in anxiety manage- ment therapy. Users of the Haptic Creature, in particular children, reported a strong desire to interact with and work with the Creature; they also found it comforting and calming during tasks. Users were able to follow the Haptic Creature in a breathing related experi- ment. Physiological effects from the Creature were also found in users interacting with the creature — a first step towards fully controlled manipulation of user physiological state. 116 Bibliography [1] P. Rani and N. 
Appendix A

Derivations

A.1 Creature Physiological Mirroring Derivations

The following variables are used throughout these derivations:

y: commanded Creature breathing servo amplitude (roughly how high the abdomen appears) [0, 1]
r: user average respiration rate [breaths per second]
t: time [seconds]
l: user average breath length [seconds]
p: user average heart rate [bpm]
i: commanded Creature interbeat interval (time between heart beats) [seconds]

A.1.1 Derivation of Ramped Breathing Motion Commands

A sinusoidal wave whose frequency increases at rate k:

y = \cos\left(2\pi t\left(f_0 + \tfrac{k}{2}t\right)\right)  (A.1)

General

From t_0 to t_1:

k = \frac{r_2 - r_0}{t_1 - t_0} \Rightarrow y_1 = -\cos\left(2\pi(t - t_0)\left(\frac{(r_2 - r_0)(t - t_0)}{2(t_1 - t_0)} + r_0\right)\right)  (A.2)

y_1 = -\cos\left(2\pi(t - t_0)\left(\frac{(r_2 - r_0)(t - t_0)}{2(t_1 - t_0)} + r_0\right)\right)  (A.3)

and:

y_1(t = t_1) = -\cos\left(2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right)\right)  (A.4)

From t_1 to t_2:

y_2 = -\cos(2\pi r_2 t + \beta)  (A.5)

y_2(t = t_1) = y_1(t = t_1)  (A.6)

-\cos(2\pi r_2 t + \beta) = -\cos\left(2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right)\right)  (A.7)

2\pi r_2 t_1 + \beta = 2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right)  (A.8)

\beta = 2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right) - 2\pi r_2 t_1  (A.9)

y_2 = -\cos(2\pi r_2 t + \beta), where \beta = 2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right) - 2\pi r_2 t_1  (A.10)

and:

y_2(t = t_2) = -\cos(2\pi r_2 t_2 + \beta)  (A.11)

From t_2 to t_3:

k = \frac{r_4 - r_2}{t_3 - t_2} \Rightarrow y_3 = -\cos\left(2\pi(t - t_2)\left(\frac{(r_4 - r_2)(t - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma\right)  (A.12)

y_2(t = t_2) = y_3(t = t_2)  (A.13)

-\cos(2\pi r_2 t_2 + \beta) = -\cos\left(2\pi(t_2 - t_2)\left(\frac{(r_4 - r_2)(t_2 - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma\right)  (A.14)

\gamma = 2\pi r_2 t_2 + \beta  (A.15)

y_3 = -\cos\left(2\pi(t - t_2)\left(\frac{(r_4 - r_2)(t - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma\right)  (A.16)

y_3 = -\cos\left(2\pi(t - t_2)\left(\frac{(r_4 - r_2)(t - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma\right), where \gamma = 2\pi r_2 t_2 + \beta and \beta = 2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right) - 2\pi r_2 t_1  (A.18)

and:

y_3(t = t_3) = -\cos\left(2\pi(t_3 - t_2)\left(\frac{(r_4 - r_2)(t_3 - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma\right)  (A.19)

From t_3 to t_4:

y_4 = -\cos(2\pi r_4 t + \delta)  (A.20)

y_4(t = t_3) = y_3(t = t_3)  (A.21)

-\cos(2\pi r_4 t_3 + \delta) = -\cos\left(2\pi(t_3 - t_2)\left(\frac{(r_4 - r_2)(t_3 - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma\right)  (A.22)

2\pi r_2 t_3 + \delta = 2\pi(t_3 - t_2)\left(\frac{(r_4 - r_2)(t_3 - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma  (A.23)

\delta = 2\pi(t_3 - t_2)\left(\frac{(r_4 - r_2)(t_3 - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma  (A.24)

y_4 = -\cos(2\pi r_4 t + \delta), where \delta = 2\pi(t_3 - t_2)\left(\frac{(r_4 - r_2)(t_3 - t_2)}{2(t_3 - t_2)} + r_0\right) + \gamma, \gamma = 2\pi r_2 t_2 + \beta, and \beta = 2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right) - 2\pi r_2 t_1  (A.25)

Examples

Where r_2 = 1.2 r_0, r_4 = 0.8 r_0, t_0 = 30, t_1 = 90, t_2 = 150, t_3 = 210:

y(t) = \begin{cases} -\cos\left(2\pi(t - 30)\left(r_0 + \tfrac{1}{600} r_0 (t - 30)\right)\right) & t < 30 \\ -\cos\left(\tfrac{12}{5}\pi r_0 t - 84\pi r_0\right) & 30 \le t < 90 \\ -\cos\left(2\pi(t - 150)\left(r_0 - \tfrac{1}{300} r_0 (t - 150)\right)\right) & 90 \le t < 150 \\ -\cos\left(\tfrac{8}{5}\pi r_0 t + 372\pi r_0\right) & 150 \le t < 210 \end{cases}  (A.26)

Where r_2 = 1.2 r_0, r_4 = 0.8 r_0, t_0 = 60, t_1 = 180, t_2 = 300, t_3 = 420:

y(t) = \begin{cases} -\cos\left(2\pi(t - 60)\left(r_0 + \tfrac{1}{1200} r_0 (t - 60)\right)\right) & t < 60 \\ -\cos\left(\tfrac{12}{5}\pi r_0 t - 168\pi r_0\right) & 60 \le t < 180 \\ -\cos\left(2\pi(t - 300)\left(r_0 - \tfrac{1}{600} r_0 (t - 300)\right)\right) & 180 \le t < 300 \\ -\cos\left(\tfrac{8}{5}\pi r_0 t + 744\pi r_0\right) & 300 \le t < 420 \end{cases}  (A.27)

A.1.2 Derivation of Ramped Pulse Rate

General

From t_0 to t_1:

i_1 = \frac{60}{p_0 + \frac{p_2}{t_1 - t_0} t}  (A.28)

From t_1 to t_2:

i_2 = \frac{60}{p_2}  (A.29)

From t_2 to t_3:

i_3 = \frac{60}{p_2 + \frac{p_4}{t_3 - t_2} t}  (A.30)

From t_3 to t_4:

i_4 = \frac{60}{p_4}  (A.31)

Examples

Where p_2 = 1.2 p_0, p_4 = 0.8 p_0, t_1 = 30, t_2 = 90, t_3 = 150, t_4 = 210:

i(t) = \begin{cases} \frac{60}{1.2 p_0 + \frac{p_0}{30} t} & t < 30 \\ \frac{50}{p_0} & 30 \le t < 90 \\ \frac{60}{0.8 p_0 + \frac{1.2 p_0}{60} t} & 90 \le t < 150 \\ \frac{75}{p_0} & 150 \le t < 210 \end{cases}  (A.32)

Where p_2 = 1.2 p_0, p_4 = 0.8 p_0, t_1 = 60, t_2 = 180, t_3 = 300, t_4 = 420:

i(t) = \begin{cases} \frac{60}{1.2 p_0 + \frac{p_0}{30} t} & t < 60 \\ \frac{50}{p_0} & 60 \le t < 180 \\ \frac{60}{0.8 p_0 + \frac{1.2 p_0}{120} t} & 180 \le t < 300 \\ \frac{75}{p_0} & 300 \le t < 420 \end{cases}  (A.33)

A.1.3 Derivation of Ramped Breathing Motion Commands [Simplified Motion]

General

From t_1 to t_2:

y_A = -\cos(2\pi r_A t)  (A.34)

and:

y_A(t = t_2) = -\cos(2\pi r_A t_2)  (A.35)

From t_2 to t_3:

k = \frac{r_C - r_A}{t_3 - t_2} \Rightarrow y_B = -\cos\left(2\pi(t - t_2)\left(\frac{(r_A - r_C)(t - t_2)}{2(t_2 - t_3)} + r_A\right) + \beta\right)  (A.36)

y_B(t = t_2) = y_A(t = t_2)  (A.37)

-\cos(\beta) = -\cos(2\pi r_A t_2)  (A.38)

\beta = 2\pi r_A t_2  (A.39)

y_B = -\cos\left(2\pi(t - t_2)\left(\frac{(r_A - r_C)(t - t_2)}{2(t_2 - t_3)} + r_A\right) + 2\pi r_A t_2\right)  (A.41)

and:

y_B(t = t_3) = 4\pi(t_3 - t_2)(r_A + r_C) + 2\pi r_A t_2  (A.42)

From t_3 to t_4:

y_C = -\cos(2\pi r_C t + \gamma)  (A.43)

\gamma = y_B(t = t_3) = 4\pi(t_3 - t_2)(r_A + r_C) + 2\pi r_A t_2  (A.44)

y_C = -\cos\left(2\pi r_C t + 4\pi(t_3 - t_2)(r_A + r_C) + 2\pi r_A t_2\right)  (A.45)

Experiment 3

In Experiment 3, where r_A = 1.2 r_0, r_C = 0.8 r_0, t_1 = 0, t_2 = 240, t_3 = 350:

y(t) = \begin{cases} y_A = \cos(2.4\pi r_0 t) & t < 240 \\ y_B = \cos\left(2\pi(t - 240)\left(\tfrac{6}{5} r_0 - \tfrac{1}{550} r_0 (t - 240)\right) + 576\pi r_0\right) & 240 \le t < 350 \\ y_C = \cos\left(1456\pi r_0 + \tfrac{8}{5}\pi r_0 t\right) & 350 \le t < 590 \end{cases}  (A.46)

In Experiment 3, where r_A = 0.8 r_0, r_C = 1.2 r_0, t_1 = 0, t_2 = 240, t_3 = 350:

y(t) = \begin{cases} y_A = \cos(1.6\pi r_0 t) & t < 240 \\ y_B = \cos\left(2\pi(t - 240)\left(\tfrac{4}{5} r_0 - \tfrac{1}{550} r_0 (t - 240)\right) + 384\pi r_0\right) & 240 \le t < 350 \\ y_C = \cos\left(1264\pi r_0 + \tfrac{12}{5}\pi r_0 t\right) & 350 \le t < 590 \end{cases}  (A.47)
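The ramped commands above all solve the same practical problem: the Creature's breathing frequency must move gradually from the measured value toward the target value without a visible jump in the waveform, which is why each segment carries a phase constant (β, γ, δ) chosen for continuity, and the pulse command is simply the interbeat interval 60/p for the current pulse rate. A numerically equivalent way to generate such a command is to integrate the instantaneous frequency and take the cosine of the accumulated phase, which removes the need for per-segment offset constants. The Python sketch below is an illustration of that idea only; the function names, sampling step, and baseline rate r0 are assumptions of this example, not the platform's actual control code.

import numpy as np

def ramped_breathing(t, knots, rates):
    """Phase-continuous breathing command y(t) = -cos(phi(t)).

    t     : uniformly spaced sample times [s]
    knots : segment boundary times, e.g. [t0, t1, t2, t3]
    rates : breathing rate [breaths/s] at each boundary; the rate is held or
            ramped linearly between boundaries, as in the derivations above.
    """
    f = np.interp(t, knots, rates)            # instantaneous frequency [Hz]
    dt = t[1] - t[0]
    phi = 2.0 * np.pi * np.cumsum(f) * dt     # integrate frequency -> phase [rad]
    return -np.cos(phi)

def interbeat_interval(p):
    """Commanded interbeat interval i = 60/p for a pulse-rate profile p [bpm]."""
    return 60.0 / p

# Example loosely following (A.26): hold r0, ramp up to 1.2 r0, hold, ramp down to 0.8 r0.
r0 = 0.25                                     # assumed baseline of 15 breaths per minute
t = np.arange(0.0, 210.0, 0.01)
y = ramped_breathing(t, [0, 30, 90, 150, 210],
                     [r0, r0, 1.2 * r0, 1.2 * r0, 0.8 * r0])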
A.1.4 Derivation of Ramped Pulse Rate [Simplified Motion]

General

From t_1 to t_2:

i_A = \frac{60}{p_A}  (A.48)

From t_2 to t_3:

i_B = \frac{60}{p_A + \frac{p_C}{t_3 - t_2} t}  (A.49)

From t_3 to t_4:

i_C = \frac{60}{p_C}  (A.50)

Experiment 3

In Experiment 3, where p_A = 1.2 p_0, p_C = 0.8 p_0, t_1 = 0, t_2 = 240, t_3 = 350:

i(t) = \begin{cases} \frac{50}{p_0} & t < 240 \\ \frac{60}{1.2 p_0 - \frac{0.4 p_0}{90} t} & 240 \le t < 350 \\ \frac{75}{p_0} & 350 \le t < 590 \end{cases}  (A.51)

In Experiment 3, where p_A = 0.8 p_0, p_C = 1.2 p_0, t_1 = 0, t_2 = 240, t_3 = 350:

i(t) = \begin{cases} \frac{75}{p_0} & t < 240 \\ \frac{60}{0.8 p_0 + \frac{0.4 p_0}{90} t} & 240 \le t < 350 \\ \frac{50}{p_0} & 350 \le t < 590 \end{cases}  (A.52)

A.2 Physiological Sensor Data Analysis Methods

During the experiments, the following physiological measures were typically calculated:

• mean heart rate
• heart rate standard deviation
• heart rate skewness
• heart rate rms standard deviation
• heart rate variability:
  – pnn50
  – vlf%
  – lf%
  – mf%
  – hf%
• skin conductance
• skin conductance derivative
• electromyogram
• electromyogram derivative
• skin temperature
• skin temperature standard deviation
• respiration rate
• respiration rate standard deviation
• respiration amplitude
• respiration amplitude standard deviation

For an experiment with n physiological measures m, t stages p, and o subjects s, the value ν of each physiological measure was calculated for each subject during each stage. For pooled (group-wise) comparisons, two-tailed dependent-sample t-tests with α = 0.05 were performed between the stage columns of each measure's matrix:

m_n = \begin{pmatrix} & p_1 & \cdots & p_t \\ s_1 & \nu_{1,1} & & \\ \vdots & & \ddots & \\ s_o & & & \nu_{o,t} \end{pmatrix}

From these comparisons it was possible to state whether the condition difference between stages had an effect on that physiological measure.

Participants' heart rate interbeat intervals (IBIs) and breath lengths were series variables: there were numerous samples for each participant in each stage of each experiment. For these two variables only, two-tailed independent-sample t-tests were used within subjects to determine, for each participant, whether the series of IBIs or breath lengths differed between stages. Between-subjects comparisons were not performed. From these comparisons it was possible to state whether a participant's IBIs or breath lengths differed between stages.

In response to the examination committee: a Bonferroni correction does not seem appropriate for this situation. These results are clearly labeled as exploratory, and the comparisons made here are single analyses on separate sensor channels for each user.

I am not aware of any statistical analysis that can be performed on the qualitative survey results to produce significance values; a graph of the survey results is therefore included where these results are discussed in the text.
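As a concrete illustration of the stage-wise comparisons described above, the sketch below computes both kinds of test with SciPy: dependent-sample t-tests between the stage columns of one measure's subject-by-stage matrix, and an unequal-variance t-test between two within-subject series such as the interbeat intervals recorded in two stages. This is an illustrative sketch only; the function names and the example numbers are assumptions of this write-up, not the analysis code used for the thesis.

import numpy as np
from itertools import combinations
from scipy import stats

def stagewise_paired_tests(nu, stage_names, alpha=0.05):
    """Two-tailed dependent-sample t-tests between every pair of stage columns.

    nu : array of shape (o subjects, t stages) holding one physiological measure.
    Returns {(stage_a, stage_b): (t statistic, p value, p < alpha)}.
    """
    results = {}
    for (i, a), (j, b) in combinations(enumerate(stage_names), 2):
        t_stat, p = stats.ttest_rel(nu[:, i], nu[:, j])
        results[(a, b)] = (t_stat, p, p < alpha)
    return results

def within_subject_series_test(series_a, series_b, alpha=0.05):
    """Two-tailed unequal-variance t-test between two series (e.g. IBIs)
    recorded in two stages for the same participant."""
    t_stat, p = stats.ttest_ind(series_a, series_b, equal_var=False)
    return t_stat, p, p < alpha

# Example with made-up numbers: 4 subjects x 3 stages of mean heart rate [bpm].
nu = np.array([[72.0, 78.0, 70.0],
               [65.0, 71.0, 66.0],
               [80.0, 85.0, 79.0],
               [74.0, 77.0, 73.0]])
print(stagewise_paired_tests(nu, ["still", "constant motion", "mirroring"]))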
Appendix B

Experiment Documents

This chapter contains the experiment data not included in the main body. For each experiment, the pre-experiment questionnaire, post-experiment questionnaire, sample data, sample comparisons, and participant consent form are included where they exist. As explained in each experiment's "Experiment Procedure" section, there was no pre-experiment questionnaire for Experiments 1, 2, and 3, and no post-experiment questionnaire for Experiment 3. The following is a list of what is included in this section:

• Preliminary Experiment
  – Pre-Experiment Questionnaire
  – Post-Experiment Questionnaire
  – Sample Data
  – Sample Comparisons
  – Participant Consent Form
• Experiment 1
  – Post-Experiment Questionnaire
  – Data Tables
  – Sample Data
  – Sample Comparisons
  – Participant Consent Form
• Experiment 2
  – Post-Experiment Questionnaire
  – Data Tables
  – Sample Data
  – Sample Comparisons
  – Participant Consent Form
• Experiment 3
  – Sample Data
  – Sample Comparisons
  – Participant Consent Form
  – Participant Assent Form

These documents are also listed in the Table of Contents.

B.1 Preliminary Experiment

B.1.1 Pre-Experiment Questionnaire

Participant Questionnaire for Haptics and Anxiety Study

1. Age: 18-22 / 23-26 / 27-30 / 30+

2. Gender: Male / Female

3. Profession or Program of Study:

4. Do/Did you have pets, or do you regularly interact with pets? If so, what kind of pets?

5. In general do you enjoy the company of animals? If so, what kind of animals (if different from above)?

6. Do you often interact with young children or babies? Yes / No

7. If yes, list a few of your most pleasurable interactions (e.g., carrying the child, tucking them into bed...):

8. Did you have stuffed toys when you were a child? Yes / No

9. Do you currently interact (e.g., play, cuddle, sleep with, etc.) with a stuffed toy? Yes / No

10. Please rate your comfort with the following "physical touch" situations:

Being hugged by a loved-one: (not comfortable) 1 2 3 4 5 (very comfortable)
Being hugged by a new acquaintance: (not comfortable) 1 2 3 4 5 (very comfortable)
Shaking hands with a colleague: (not comfortable) 1 2 3 4 5 (very comfortable)
Shaking hands with a stranger: (not comfortable) 1 2 3 4 5 (very comfortable)
Patting a family member's back: (not comfortable) 1 2 3 4 5 (very comfortable)
Patting a friend's back: (not comfortable) 1 2 3 4 5 (very comfortable)

Are there other situations that you would like to mention?

B.1.2 Post-Experiment Questionnaire

Post-Experiment Questionnaire for Haptics and Anxiety Study

1. Please answer the following questions on the given scales.

Please rate your emotional state while watching the first set of images:
(Anxious) 1 2 3 4 5 (Relaxed)
(Agitated) 1 2 3 4 5 (Calm)
(Quiescent) 1 2 3 4 5 (Surprised)

Please rate your emotional state while watching the second set of images:
(Anxious) 1 2 3 4 5 (Relaxed)
(Agitated) 1 2 3 4 5 (Calm)
(Quiescent) 1 2 3 4 5 (Surprised)

I found the haptic device comforting while watching the images: (Strongly Disagree) 1 2 3 4 5 (Strongly Agree)
I found the actions of the haptic creature to be a distraction while watching the images: (Strongly Disagree) 1 2 3 4 5 (Strongly Agree)
I feel that the haptic creature would be useful in reducing my anxiety in other situations: (Strongly Disagree) 1 2 3 4 5 (Strongly Agree)

2. Please comment on your reaction to the haptic creature:

Definitions:
anxious: troubled or uneasy in mind.
relaxed: at ease, free from constraint or tension.
agitated: excited, disturbed in mind.
calm: quiet, still, tranquil, serene.
quiescent: being at rest; quiet; still; inactive or motionless.
surprise: to come upon or discover suddenly and unexpectedly.

B.1.3 Sample Data

Figure B.1: Heart rate acceleration for a participant during the Pilot Experiment. Black lines delineate experiment stages.

Figure B.2: Heart rate for a participant during the Pilot Experiment.
Black lines delineate experiment stages.

Figure B.3: Normalized skin conductance for a participant during the Pilot Experiment. Black lines delineate experiment stages.

Figure B.4: Skin conductance derivative for a participant during the Pilot Experiment. Black lines delineate experiment stages.

Figure B.5: Normalized EMG for a participant during the Pilot Experiment. Black lines delineate experiment stages.

B.1.4 Sample Comparisons

Figure B.6: Mean heart rate for participants during Pilot Experiment.

Figure B.7: Standard deviation of normalized heart rates for participants during Pilot Experiment.

Figure B.8: Mean normalized heart rate acceleration for participants during Pilot Experiment.

Figure B.9: Mean normalized skin conductance for participants during Pilot Experiment.

Figure B.10: Mean normalized derivative of skin conductance for participants during Pilot Experiment.

Figure B.11: Mean normalized EMG for participants during Pilot Experiment.

Figure B.12: Mean estimated arousal for participants during Pilot Experiment.
B.1.5 Participant Consent Form

THE UNIVERSITY OF BRITISH COLUMBIA
Department of Computer Science
2366 Main Mall, Vancouver, B.C., Canada V6T 1Z4
tel: (604) 822-3061, fax: (604) 822-4231

(PARTICIPANT'S COPY CONSENT FORM)

Project Title: Physical user interfaces: Communication of information and affect (UBC Ethics #B01-0470)
Principal Investigator: Associate Professor K. MacLean, tel. 604-822-8169

The purpose of this study is to examine the role of haptic (touch sense) feedback on anxiety levels. You will be asked to wear external (i.e., non-invasive) sensors that collect some basic physiological information such as the heart rate, respiration rate, some muscle activity, and perspiration. Please tell the experimenter if you find the sensor positioning uncomfortable, and adjustments will be made. You will be asked to answer questions in two questionnaires as part of the experiment. The study will be viewed by the experimenters in a separate room via a webcam. It will not be recorded. For this study, you will also be asked to view two slide-shows of pictures that you may find disturbing.

The outline of the study is as follows: You will first be asked to answer a questionnaire. You will then be connected to the bio-sensors. Then you will be shown a two-minute slide show of approximately ten pictures. Next you will be given a haptic creature that you will hold while watching another set of pictures shown in the same format as before. Finally, you will complete another questionnaire. If you are not sure about any instructions, do not hesitate to ask.

REIMBURSEMENT: $5 per  hour session
TIME COMMITMENT:  hour session
CONFIDENTIALITY: Your results will be confidential: you will not be identified by name in any study reports. Test results will be stored in a secure Computer Science account accessible only to the experimenters.

You understand that the experimenter will ANSWER ANY QUESTIONS you have about the instructions or the procedures of this study. After participating, the experimenter will answer any questions you have about this study. Your participation in this study is entirely voluntary and you may refuse to participate or withdraw from the study at any time without jeopardy. Your signature below indicates that you have received a copy of this consent form for your own records, and consent to participate in this study. If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Info Line in the UBC Office of Research Services at 604-822-8598.

B.2 Experiment 1

B.2.1 Post-Experiment Questionnaire

Creature Impression

I found the creature comfortable on my lap. (strongly disagree) 1 2 3 4 5 (strongly agree)

Did this impression change at all once the creature started moving?

What changes would you recommend to make the creature more comfortable?

Creature Activity

Describe your overall impression of the creature's activity.

What did you like the most about the creature's activity?

What did you like the least about the creature's activity?

Did you expect this sort of activity from the creature?

I was startled by the activation of the creature. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature's motion disturbing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the noise of the creature distracting. (strongly disagree) 1 2 3 4 5 (strongly agree)
How many distinct creature operating modes were you able to observe?

Please describe all the modes you were able to observe.

Which sequence did you find more pleasurable?

Overall Response

It was easy to recognize the creature mirroring my breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature mirroring my breathing comforting. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature mirroring my breathing disturbing. (strongly disagree) 1 2 3 4 5 (strongly agree)
The creature's breathing made me more aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)

Was it evident that the creature was mirroring your breathing?

It was easy to recognize the creature mirroring my pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature mirroring my pulse comforting. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature mirroring my pulse disturbing. (strongly disagree) 1 2 3 4 5 (strongly agree)
The creature's pulse made me more aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)

Was it evident that the creature was mirroring your pulse?

Describe your overall reaction to the creature mirroring your breathing rate and pulse.

Were you surprised at your breathing rate when you felt it in the creature?

Were you surprised at your heart rate when you felt it in the creature?

B.2.2 Data Tables

Table B.1: Table of results from Experiment 1 questionnaire (1 = strongly disagree, 5 = strongly agree).

Responses (1 2 3 4 5 na) | Statement
6 2 2 1 | It was easy to recognize the creature mirroring my breathing.
1 1 8 | I found the creature mirroring my breathing comforting.
2 1 2 5 | I found the creature mirroring my breathing disturbing.
2 2 2 4 | The creature's breathing made me more aware of my own breathing.
7 1 1 1 1 | It was easy to recognize the creature mirroring my pulse.
1 1 7 | I found the creature mirroring my pulse comforting.
1 2 2 5 | I found the creature mirroring my pulse disturbing.
5 2 1 2 | The creature's pulse made me more aware of my own heart rate.
1 8 1 | I found the creature comfortable on my lap.
3 1 3 3 | I was startled by the activation of the creature.
3 4 3 | I found the creature's motion disturbing.
4 5 1 | I found the noise of the creature distracting.

Table B.2: Results for two-tailed unequal variance t-test between breath lengths for each subject between all stages. 'Y' indicates a significant difference between the two stages.

Condition tested (subjects 1-10) | Σ
still–constant motion: 0.005 0.475 <0.001 0.226 0.007 <0.001 0.7482 0.014 <0.001 0.145 | Σ = 6
still–mirroring: 0.002 0.116 <0.001 0.558 0.033 <0.001 0.631 <0.001 0.011 0.184 | Σ = 7
constant motion–mirroring: 0.049 0.129 0.027 0.385 0.417 0.400 0.451 0.080 0.176 0.738 | Σ = 3
still–2.5 s breaths: 0.050 0.970 0.121 0.021 0.368 0.012 <0.001 0.528 0.018 0.117 | Σ = 5
constant motion–2.5 s breaths: <0.001 0.727 0.942 0.018 0.144 0.167 <0.001 0.020 0.261 0.005 | Σ = 5
mirroring–2.5 s breaths: 0.069 0.313 0.598 0.001 0.157 0.469 0.014 0.016 0.534 0.015 | Σ = 5

Table B.3: Results for two-tailed unequal variance t-test between series of interbeat intervals for each subject between all stages. 'Y' indicates a significant difference between the two stages.
Condition tested (subjects 1-10) | Σ
still–constant motion: 0.314 <0.001 <0.001 0.385 0.078 <0.001 0.120 0.090 <0.001 0.009 | Σ = 7
still–mirroring: <0.001 <0.001 <0.001 0.565 0.005 <0.001 0.085 0.009 <0.001 <0.001 | Σ = 9
constant motion–mirroring: <0.001 0.800 <0.001 0.148 0.478 0.185 0.749 0.218 0.062 <0.001 | Σ = 5
still–70bpm: 0.004 0.478 0.688 <0.001 0.712 0.916 0.228 0.003 0.014 <0.001 | Σ = 5
constant motion–70bpm: 0.010 0.810 0.017 <0.001 0.583 0.684 0.716 0.015 0.806 <0.001 | Σ = 5
mirroring–70bpm: 0.223 0.779 0.059 <0.001 0.390 0.630 0.779 0.065 0.492 <0.001 | Σ = 4

B.2.3 Sample Data

Figure B.13: Heart rate acceleration for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.14: Normalized heart rate acceleration for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.15: Heart rate for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.16: Normalized heart rate for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.17: Normalized heart rate standard deviation for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.18: Skin conductance response for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.19: Normalized skin conductance for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.20: Skin conductance derivative for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.21: Normalized skin conductance derivative for a participant during Experiment 1.
Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.22: EMG for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.23: Normalized EMG for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

Figure B.24: Breath lengths for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

B.2.4 Sample Comparisons

Figure B.25: Mean breath lengths for participants during Experiment 1.

Figure B.26: Breath length standard deviation for participants during Experiment 1.

Figure B.27: Mean heart rate acceleration for participants during Experiment 1.

Figure B.28: Heart rate acceleration standard deviation for participants during Experiment 1.

Figure B.29: Mean skin conductance for participants during Experiment 1.

Figure B.30: Skin conductance standard deviation for participants during Experiment 1.

Figure B.31: Mean and standard deviation of breath lengths of participants during each stage of Experiment 1. Greek letters refer to within-subject mean differences. For each participant, α indicates significant difference between still and constant motion stages. β indicates significant difference between still and mirroring stages.
γ indicates significant difference between constant motion and mirroring stages. δ indicates significant difference between still stage and constant motion at constant 2.5 s breaths. ε indicates significant difference between constant motion stage and constant 2.5 s breaths. ζ indicates significant difference between mirroring stage and constant 2.5 s breaths. The standard deviation of the constant motion stage is at or close to zero.

Figure B.32: Mean and standard deviation of heart rate for participants during Experiment 1. Greek letters refer to within-subject mean differences. For each participant, α indicates significant difference between still and constant motion stages. β indicates significant difference between still and mirroring stages. γ indicates significant difference between constant motion and mirroring stages. δ indicates significant difference between still stage and constant motion at 70bpm. ε indicates significant difference between constant motion stage and constant motion at 70bpm. ζ indicates significant difference between mirroring stage and constant motion at 70bpm. The standard deviation of the constant motion stage is at or close to zero.

B.2.5 Participant Consent Form

Version 1.0 / August 10, 2009

THE UNIVERSITY OF BRITISH COLUMBIA
Department of Computer Science
2366 Main Mall, Vancouver, B.C., Canada V6T 1Z4
tel: (604) 822-3061, fax: (604) 822-4231

(PARTICIPANT'S COPY CONSENT FORM)

Project Title: Investigation of haptic-affect loop through the haptic creature (UBC Ethics #B01-0470)
Principal Investigators: Dr. Karon MacLean, Department of Computer Science, 604-822-8169; Dr. Elizabeth Croft, Department of Mechanical Engineering, 604-822-6614
Student Investigator: Joseph P. Hall III, Department of Mechanical Engineering, jphiii@interchange.ubc.ca

The purpose of this study is to examine your reaction to interaction through touch with a robotic pet. You will be asked to hold and touch a small robot that may gently move. You will be asked to wear external (i.e., non-invasive) sensors that collect some basic physiological information such as heart rate, respiration rate, some muscle activity, and perspiration. Please tell the experimenter if you find the sensors uncomfortable and adjustments will be made. You will be asked to answer questions in a questionnaire as part of the experiment. Parts of this experiment will be videotaped for later analysis. If you are unsure about any instructions, do not hesitate to ask.

TIME COMMITMENT: ½-1 hour session
CONFIDENTIALITY: Your results will be confidential: you will not be identified by name in any study reports. Test results will be stored in a secure computer account accessible only to the experimenters.

You understand that the experimenters will ANSWER ANY QUESTIONS you have about the instructions or the procedures of this study. After participating, the experimenter will answer any other questions you have about this study.
Your participation in this study is entirely voluntary and you may refuse to participate or withdraw from the study at any time without jeopardy. Your signature below indicates that you have received a copy of this consent form for your own records, and consent to participate in this study. If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Info Line in the UBC Office of Research Services at 604-822-8598.

B.3 Experiment 2

B.3.1 Post-Experiment Questionnaire

When asked to mirror the creature:
I was able to easily mirror the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was comfortable with the creature on my lap. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the noise of the creature distracting. (strongly disagree) 1 2 3 4 5 (strongly agree)

While sitting still with the creature:
I was aware of the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I noticed changes in the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I noticed changes in the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was comfortable with the creature on my lap. (strongly disagree) 1 2 3 4 5 (strongly agree)

During the reading assignment:
I was aware of the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I noticed changes in the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I noticed changes in the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was comfortable with the creature on my lap. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature's motion distracting. (strongly disagree) 1 2 3 4 5 (strongly agree)

In general during the experiment:
The creature made me more aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
The creature made me more aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
I enjoyed interacting with the creature. (strongly disagree) 1 2 3 4 5 (strongly agree)

B.3.2 Data Tables

Table B.4: Table of results from Experiment 2 questionnaire (1 = strongly disagree, 5 = strongly agree), n = 10.

Responses (1 2 3 4 5) | Statement
5 2 2 1 | It was easy to recognize the creature mirroring my breathing.
1 1 | I found the creature mirroring my breathing comforting (if noticed).
2 1 2 | I found the creature mirroring my breathing disturbing (if noticed).
2 2 2 4 | The creature's breathing made me more aware of my own breathing.
6 1 1 1 1 | It was easy to recognize the creature mirroring my pulse.
1 1 8 | I found the creature mirroring my pulse comforting.
1 2 2 5 | I found the creature mirroring my pulse disturbing.
5 2 1 2 | The creature's pulse made me more aware of my own heart rate.
1 8 1 | I found the creature comfortable on my lap.
3 1 3 3 | I was startled by the activation of the creature.
3 4 3 | I found the creature's motion disturbing.
3 4 3 | I found the noise of the creature distracting.

Table B.5: Results for two-tailed unequal variance t-test between series of interbeat intervals for each subject between all stages. 'Y' indicates a significant difference between the two stages.

Condition tested (subjects 1-9)
baseline-training: 0.108 0.014 0.000 0.000 0.000 0.000 0.000 0.008 0.018
baseline-task: 0.466 0.001 0.001 0.012 0.000 0.103 0.004 0.000 0.023
baseline-no task: 0.000 0.040 0.000 0.000 0.000 0.000 0.000 0.000 0.000
baseline-70bpm: 0.723 0.146 0.205 0.778 0.000 0.965 0.001 0.571 0.158
task-70bpm: 0.732 0.057 0.794 0.421 0.032 0.581 0.012 0.415 0.297
no task-70bpm: 0.840 0.226 0.042 0.553 0.221 0.899 0.003 0.864 0.436

Table B.6: Questionnaire results from Experiment 2 post-experiment survey (1 = strongly disagree, 5 = strongly agree).

Responses (1 2 3 4 5) | Statement

When asked to mirror creature:
1 4 4 | I was able to easily mirror the creature's breathing
2 1 6 | I was aware of the creature's pulse
5 4 | I was comfortable with creature on my lap
9 | I was aware of own breathing
6 3 | I was aware of own heart rate
4 4 1 | I found noise of creature distracting

While sitting with active creature:
2 7 | I was aware of the creature's breathing
1 1 1 6 | I was aware of the creature's pulse
1 1 4 3 | I noticed changes in the creature's breathing
6 1 2 | I noticed changes in the creature's pulse
1 2 4 2 | I was aware of my own breathing
3 4 2 | I was aware of my own heart rate
1 4 4 | I was comfortable with creature on my lap

During reading task:
1 1 5 2 | I was aware of the creature's breathing
1 4 3 1 | I was aware of the creature's pulse
1 4 1 1 2 | I noticed changes in the creature's breathing
4 4 1 | I noticed changes in the creature's pulse
4 3 2 | I was aware of my own breathing
6 2 1 | I was aware of my own heart rate
1 2 5 1 | I was comfortable with creature on my lap
1 3 1 3 1 | I found creature's motion distracting

Overall:
1 4 4 | creature made me more aware of breathing
1 6 2 | creature made me more aware of heart rate
2 7 | enjoyed interacting

B.3.3 Sample Data

Figure B.33: Heart rate acceleration for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.34: Normalized heart rate acceleration for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.35: Heart rate for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.36: Normalized heart rate for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.
Figure B.37: Normalized heart rate standard deviation for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.38: Skin conductance response for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.39: Normalized skin conductance for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.40: Skin conductance derivative for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.41: Normalized skin conductance derivative for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.42: EMG for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.43: Normalized EMG for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.44: Skin temperature for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

Figure B.45: Breath lengths for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.

B.3.4 Sample Comparisons

Figure B.46: Standard deviation of breath lengths for all participants during Experiment 2 (conditions: baseline; when asked to mirror creature; when sitting calmly; when performing task).

Figure B.47: Mean breath length for all participants during Experiment 2 (conditions: slow training; fast training; slow no task; fast no task; slow task; fast task).
Figure B.48: Mean heart rate for participants during Experiment 2 (conditions: calm images and disturbing images without the creature; calm images and disturbing images with the creature).

Figure B.49: Mean heart rate standard deviation for participants during Experiment 2 (conditions: baseline; training; task; no task).

Figure B.50: High frequency component of heart rate variability during Experiment 2 for all participants for all stages (conditions: baseline; when asked to mirror creature; when sitting calmly; when performing task).

Figure B.51: Mean skin conductance for participants during Experiment 2 (conditions: baseline; training; no task; task).

Figure B.52: Mean derivative of skin conductance for participants during Experiment 2 (conditions: baseline; training; no task; task).

Figure B.53: Mean EMG for participants during Experiment 2 (conditions: baseline; training; no task; task).

Figure B.54: Mean skin temperature for participants during Experiment 2 (conditions: baseline; training; no task; task).

B.3.5 Participant Consent Form

Version 1.1 / December 2, 2009

THE UNIVERSITY OF BRITISH COLUMBIA
Department of Computer Science
2366 Main Mall, Vancouver, B.C., Canada V6T 1Z4
tel: (604) 822-3061  fax: (604) 822-4231

(PARTICIPANT'S COPY CONSENT FORM)

Project Title: Investigation of haptic-affect loop through the haptic creature (UBC Ethics #B01-0470)

Principal Investigators:
Dr. Karon MacLean, Department of Computer Science, 604-822-8169
Dr. Elizabeth Croft, Department of Mechanical Engineering, 604-822-6614

Student Investigator: Joseph P. Hall III, Department of Mechanical Engineering, jphiii@interchange.ubc.ca

The purpose of this study is to examine your reaction to interaction through touch with a robotic pet. You will be asked to hold and touch a small robot that may gently move. You will be asked to wear external (i.e. non-invasive) sensors that collect some basic physiological information such as heart rate, respiration rate, some muscle activity, and perspiration. Please tell the experimenter if you find the sensors uncomfortable and adjustments will be made. You will be asked to answer questions in a questionnaire as part of the experiment. If you are unsure about any instructions, do not hesitate to ask.

TIME COMMITMENT: ½–1 hour session

CONFIDENTIALITY: Your results will be confidential: you will not be identified by name in any study reports. Test results will be stored in a secure computer account accessible only to the experimenters.
You understand that the experimenters will ANSWER ANY QUESTIONS you have about the instructions or the procedures of this study. After participating, the experimenter will answer any other questions you have about this study.

Your participation in this study is entirely voluntary and you may refuse to participate or withdraw from the study at any time without jeopardy. Your signature below indicates that you have received a copy of this consent form for your own records, and consent to participate in this study. If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Info Line in the UBC Office of Research Services at 604-822-8598.

B.4 Experiment 3

B.4.1 Sample Data

Figure B.55: Heart rate acceleration for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.56: Normalized heart rate acceleration for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.57: Heart rate for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.58: Normalized heart rate for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.59: Normalized heart rate standard deviation for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.60: Skin conductance response for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.61: Normalized skin conductance for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.62: Skin conductance derivative for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.
Figure B.63: Normalized skin conductance derivative for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.64: EMG for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.65: Normalized EMG for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.66: Skin temperature for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.67: Breath lengths for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

B.4.2 Sample Comparisons

Figures B.68 through B.80 compare, for each participant, the stages and stage groupings of Figure B.81: baseline, no creature, creature, creature slow, creature fast, activity, creature presence, and creature motion.

Figure B.68: Mean heart rate standard deviation for participants during Experiment 3.

Figure B.69: Mean heart rate pnn50 for participants during Experiment 3.

Figure B.70: Mean skin conductance for participants during Experiment 3.

Figure B.71: Mean skin conductance standard deviation for participants during Experiment 3.

Figure B.72: Mean skin temperature for participants during Experiment 3.
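Several of the comparison figures in this section (for example Figure B.69) and the hr pnn50 rows of Table B.7 use pNN50, the proportion of successive interbeat-interval differences greater than 50 ms. The thesis analysis code is not reproduced here; the sketch below is only an illustration of how the metric can be computed, assuming the interbeat intervals are available in milliseconds (the example series is hypothetical).

# Illustrative sketch only, not the thesis analysis code.
# pNN50: fraction of successive interbeat-interval differences > 50 ms.
import numpy as np

def pnn50(ibi_ms):
    """Return the fraction of successive IBI differences exceeding 50 ms."""
    diffs = np.abs(np.diff(np.asarray(ibi_ms, dtype=float)))
    return float(np.mean(diffs > 50.0))

print(pnn50([812.0, 845.0, 830.0, 901.0, 870.0, 866.0]))  # hypothetical IBI series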
Figure B.73: Mean skin temperature standard deviation for participants during Experiment 3.

Figure B.74: Mean breath length for participants during Experiment 3.

Figure B.75: Heart rate vlf% for participants during Experiment 3.

Figure B.76: Mean derivative of skin conductance standard deviation for participants during Experiment 3.

Figure B.77: Mean breath length standard deviation for participants during Experiment 3.

Figure B.78: Mean heart rate for participants during Experiment 3.

Figure B.79: Mean heart rate rms standard deviation for participants during Experiment 3.

Figure B.80: Mean derivative of skin conductance for participants during Experiment 3.

Experiment stages: (i) introduction to creature, ~90 s; (ii) activity without creature, ~180 s; (iii) activity with creature inactive, ~240 s; (iv) activity with creature "slow", ~240 s; (v) activity with creature "fast", ~240 s. Stage groupings: {ii, iii, iv, v} – activity; {iii, iv, v} – creature presence; {iv, v} – creature motion.

Figure B.81: Summary of comparisons made during Experiment 3.

Table B.7: Summary of results from Experiment 3. Investigated columns are in green, significant results are in bold. See Figure B.81 for the comparisons.
comparison unit 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 unit ibi mean p 0.263 0.181 0.200 0.160 0.178 0.174 0.175 0.513 0.635 0.377 0.476 0.458 0.944 0.578 0.746 0.406 s mean 0.011 0.014 0.014 0.020 0.014 0.015 0.017 0.004 0.003 0.009 0.005 0.006 0.000 0.006 0.003 0.006 sd 0.046 0.050 0.051 0.067 0.048 0.054 0.058 0.026 0.031 0.049 0.032 0.039 0.029 0.048 0.037 0.035 ibi sd p 0.000 0.016 0.465 0.543 0.320 0.608 0.658 0.531 0.272 0.481 0.242 0.311 0.242 0.524 0.295 0.828 s mean-0.022 -0.018 -0.008 -0.010 -0.009 -0.006 -0.006 0.004 0.013 0.012 0.015 0.015 0.010 0.008 0.012 -0.001 sd 0.022 0.034 0.054 0.077 0.044 0.060 0.068 0.027 0.058 0.081 0.062 0.072 0.040 0.063 0.053 0.031 hr mean p 0.308 0.168 0.185 0.157 0.168 0.156 0.161 0.308 0.428 0.366 0.312 0.354 0.893 0.805 0.912 0.642 mean-1.286 -2.019 -1.915 -2.299 -1.838 -2.060 -2.118 -0.733 -0.628 -1.013 -0.774 -0.832 0.105 -0.280 -0.099 -0.385 sd 6.042 6.952 6.859 7.707 6.328 6.883 7.172 3.443 3.813 5.377 3.667 4.308 3.787 5.479 4.356 4.004 hr sd p 0.000 0.003 0.004 0.002 0.004 0.005 0.002 0.916 0.156 0.893 0.037 0.091 0.070 0.985 0.097 0.275 mean 422 410 285 407 148 255 275 -11.7 -137 -14.4 -85 -146 -125 -2.63 -135 122 sd 418 597 441 560 329 405 394 542 456 519 260 406 322 666 381 535 hr skewness p 0.859 0.349 0.703 0.575 0.533 0.656 0.753 0.232 0.556 0.494 0.529 0.631 0.293 0.853 0.232 0.725 mean-0.042 0.298 0.119 0.233 0.176 0.141 0.096 0.340 0.161 0.275 0.183 0.138 -0.179 -0.065 -0.202 0.113 sd 1.144 1.526 1.517 2.001 1.361 1.524 1.472 1.357 1.324 1.933 1.398 1.385 0.812 1.714 0.808 1.556 hr rms ssd p 0.001 0.029 0.371 0.512 0.229 0.473 0.531 0.830 0.575 0.684 0.529 0.596 0.500 0.662 0.540 0.988 mean-0.024 -0.022 -0.015 -0.015 -0.015 -0.013 -0.013 0.002 0.009 0.009 0.011 0.011 0.008 0.008 0.009 0.000 sd 0.030 0.047 0.078 0.108 0.060 0.085 0.098 0.040 0.081 0.110 0.086 0.102 0.055 0.083 0.074 0.041 hr pnn50 p 0.013 0.072 0.025 0.014 0.009 0.017 0.013 0.820 0.784 0.552 0.803 0.643 0.549 0.368 0.409 0.750 mean-0.054 -0.049 -0.060 -0.066 -0.065 -0.059 -0.063 0.005 -0.006 -0.012 -0.005 -0.009 -0.012 -0.017 -0.014 -0.005 sd 0.099 0.127 0.123 0.120 0.104 0.112 0.114 0.113 0.109 0.094 0.091 0.093 0.093 0.090 0.083 0.081 hr vlf % p 0.019 0.064 0.165 0.037 0.004 0.001 0.003 0.267 0.648 0.943 0.001 0.003 0.630 0.247 0.053 0.398 mean 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 sd 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 hr lf % p 0.098 0.142 0.232 0.399 0.268 0.217 0.275 0.932 0.134 0.275 0.115 0.182 0.091 0.222 0.139 0.553 mean 0.000 0.000 0.001 0.001 0.000 0.001 0.001 0.000 0.001 0.001 0.001 0.001 0.001 0.001 0.001 0.000 sd 0.001 0.001 0.004 0.004 0.002 0.003 0.004 0.001 0.004 0.004 0.003 0.004 0.003 0.004 0.004 0.002 hr mf % p 0.051 0.185 0.941 0.624 0.634 0.894 0.706 0.228 0.246 0.324 0.276 0.303 0.295 0.361 0.347 0.425 mean-0.001 -0.001 0.000 0.001 0.000 0.000 0.001 0.000 0.001 0.002 0.001 0.002 0.001 0.002 0.001 0.001 sd 0.002 0.002 0.004 0.009 0.003 0.005 0.007 0.001 0.004 0.009 0.005 0.007 0.003 0.008 0.006 0.005 hr hf % p 0.015 0.010 0.591 0.720 0.623 0.987 0.900 0.601 0.618 0.460 0.501 0.497 0.405 0.389 0.390 0.408 mean-0.001 -0.001 -0.001 0.001 0.000 0.000 0.000 0.000 0.001 0.002 0.001 0.001 0.001 0.002 0.001 0.001 sd 0.002 0.002 0.005 0.012 0.004 0.007 0.009 0.001 0.005 0.012 0.007 0.009 0.004 0.011 0.008 0.008 Continued. . . 205 B .4. 
E x p erim en t 3 unit 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 unit scr norm mean p 0.002 0.000 0.000 0.000 0.000 0.000 0.000 0.874 0.842 0.845 0.967 0.993 0.577 0.890 0.839 0.434 mean-0.121 -0.123 -0.112 -0.126 -0.120 -0.120 -0.119 -0.004 0.007 -0.007 -0.001 0.000 0.011 -0.003 0.004 -0.013 sd 0.170 0.123 0.125 0.133 0.117 0.117 0.122 0.119 0.155 0.168 0.143 0.156 0.089 0.101 0.086 0.081 scr norm sd p 0.057 0.286 0.034 0.695 0.584 0.456 0.357 0.271 0.451 0.156 0.141 0.162 0.242 0.476 0.738 0.098 mean-0.028 -0.012 -0.021 -0.005 -0.005 -0.007 -0.009 0.018 0.009 0.025 0.023 0.021 -0.009 0.007 0.003 0.016 sd 0.069 0.051 0.044 0.057 0.043 0.046 0.047 0.077 0.056 0.082 0.071 0.068 0.036 0.046 0.035 0.044 dscr norm mean p 0.195 0.067 0.336 0.198 0.155 0.159 0.245 0.443 0.917 0.899 0.816 0.984 0.208 0.359 0.240 0.657 mean-0.002 -0.003 -0.002 -0.002 -0.002 -0.002 -0.002 -0.001 0.000 0.000 0.000 0.000 0.001 0.001 0.001 0.000 sd 0.008 0.007 0.008 0.007 0.006 0.007 0.007 0.008 0.007 0.008 0.007 0.007 0.005 0.005 0.005 0.004 dscr norm sd p 0.434 0.028 0.006 0.012 0.010 0.005 0.006 0.270 0.011 0.023 0.011 0.008 0.095 0.214 0.101 0.553 mean 0.006 0.014 0.024 0.022 0.019 0.022 0.023 0.007 0.017 0.014 0.015 0.016 0.010 0.007 0.009 -0.003 sd 0.036 0.029 0.039 0.038 0.032 0.033 0.037 0.027 0.029 0.027 0.024 0.026 0.028 0.028 0.026 0.022 emg norm mean p 0.420 0.630 0.424 0.512 0.739 0.471 0.464 0.207 0.182 0.252 0.195 0.214 0.595 0.750 0.669 0.417 mean 0.022 -0.012 -0.024 -0.019 -0.008 -0.019 -0.021 -0.035 -0.047 -0.042 -0.042 -0.044 -0.012 -0.007 -0.010 0.005 sd 0.133 0.113 0.140 0.135 0.108 0.121 0.136 0.128 0.163 0.171 0.149 0.166 0.108 0.108 0.107 0.029 emg norm sd p 0.053 0.011 0.009 0.007 0.067 0.014 0.010 0.148 0.047 0.052 0.242 0.078 0.406 0.283 0.745 0.916 mean-0.035 -0.062 -0.074 -0.073 -0.031 -0.052 -0.066 -0.027 -0.039 -0.038 -0.017 -0.030 -0.012 -0.011 -0.003 0.002 sd 0.085 0.108 0.124 0.118 0.076 0.094 0.111 0.086 0.090 0.088 0.068 0.079 0.070 0.047 0.049 0.067 skin temp. mean p 0.069 0.001 0.001 0.000 0.001 0.000 0.000 0.002 0.002 0.001 0.001 0.001 0.032 0.008 0.011 0.870 ◦C mean 0.273 1.167 1.699 1.727 1.180 1.570 1.714 0.879 1.411 1.439 1.232 1.426 0.532 0.560 0.547 0.028 sd 0.702 1.508 2.102 1.951 0.739 1.825 1.987 1.186 1.908 1.708 1.671 1.764 1.117 0.925 0.942 0.800 skin temp. 
sd p 0.019 0.814 0.890 0.722 0.000 0.002 0.016 0.056 0.080 0.344 0.032 0.308 0.856 0.763 0.091 0.719 ◦C mean 0.124 0.015 0.008 0.033 0.739 0.344 0.221 -0.114 -0.121 -0.096 0.257 0.092 -0.007 0.019 0.206 0.025 sd 0.239 0.296 0.281 0.444 0.621 0.463 0.405 0.271 0.315 0.474 0.507 0.423 0.171 0.294 0.235 0.332 resp rate mean p 0.260 0.239 0.132 0.081 0.091 0.075 0.100 0.968 0.974 0.829 0.907 0.898 0.993 0.865 0.930 0.369 s mean 36.0 38.9 38.6 45.2 40.4 41.1 41.9 1.37 1.08 7.65 3.61 4.38 -0.29 6.28 3.01 6.57 sd 153 154 118 118 102 106 117 164 159 167 146 162 152 175 163 34.4 resp rate sd p 0.039 0.021 0.029 0.009 0.450 0.171 0.017 0.664 0.861 0.337 0.099 0.778 0.650 0.298 0.520 0.331 s mean -29.8 -27.2 -33.7 -44.9 -8.3 -16.7 -34.9 3.9 -2.6 -13.9 14.4 -3.8 -6.5 -17.7 -7.7 -11.2 sd 66.8 52.5 69.3 75.2 52.0 56.7 64.8 42.3 71.3 67.7 64.6 64.3 67.8 49.4 56.6 54.2 breath lengths meanp 0.261 0.130 0.096 0.101 0.070 0.092 0.092 0.886 0.597 0.222 0.743 0.632 0.311 0.335 0.414 0.496 s mean 1.565 1.363 2.058 2.416 0.789 1.868 1.956 -0.202 0.493 0.851 0.303 0.391 0.695 1.053 0.593 0.359 sd 6.649 4.248 5.816 6.919 1.540 5.208 5.450 6.852 4.504 3.323 4.467 3.941 3.287 5.241 3.488 2.537 breath lengths sd p 0.255 0.251 0.202 0.118 0.155 0.129 0.121 0.096 0.185 0.099 0.114 0.110 0.443 0.116 0.182 0.339 s mean-0.580 0.588 1.230 1.808 1.371 1.207 1.353 1.168 1.810 2.388 1.787 1.933 0.642 1.220 0.764 0.578 sd 2.431 2.450 4.589 5.451 4.562 3.753 4.109 3.292 6.484 6.801 5.333 5.689 4.027 3.660 2.720 2.902 Continued. . . 206 B .4. E x p erim en t 3 unit 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 unit scr mean p 0.859 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.002 0.002 0.001 0.875 S mean 0.032 0.901 1.510 1.485 1.040 1.341 1.498 0.849 1.458 1.433 1.288 1.446 0.609 0.584 0.597 -0.025 sd 0.865 0.970 1.353 1.505 1.108 1.247 1.381 0.743 1.060 1.109 0.900 1.018 0.819 0.799 0.718 0.746 scr sd p 0.740 0.933 0.516 0.594 0.000 0.002 0.045 0.994 0.433 0.542 0.000 0.017 0.240 0.468 0.005 0.138 S mean 0.015 0.004 0.043 -0.026 0.433 0.243 0.142 0.000 0.039 -0.030 0.239 0.138 0.039 -0.030 0.137 -0.069 sd 0.222 0.245 0.313 0.229 0.393 0.325 0.319 0.202 0.235 0.231 0.275 0.256 0.154 0.196 0.213 0.215 dscr mean p 0.029 0.043 0.054 0.034 0.029 0.032 0.038 0.672 0.937 0.659 0.716 0.773 0.720 0.995 0.847 0.560 S mean 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 sd 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 dscr sd p 0.020 0.680 0.115 0.352 0.496 0.192 0.161 0.003 0.001 0.000 0.000 0.000 0.113 0.449 0.154 0.356 S mean 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 sd 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 207 B.4. 
B.4.3 Participant Consent Form

Page 1 of 5 – Version 1.2 - December 3, 2009

THE UNIVERSITY OF BRITISH COLUMBIA
Department of Computer Science
2366 Main Mall, Vancouver, BC, Canada V6T 1Z4
Phone: (604) 822-3061  Fax: (604) 822-4231

PARTICIPANT & PARENT INFORMATION AND CONSENT FORM

Tamer: Touch-guided Anxiety Management via Engagement with a Robot Pet

Principal Investigator: Associate Professor Karon MacLean, PhD, Department of Computer Science, University of British Columbia, (604)-822-8169

Sponsor: Name(s) of industry sponsor or granting agency

INTRODUCTION
You (or your child) are being invited to take part in this research study because we feel that your participation and feedback will greatly assist us in developing anxiety-reducing robotic devices.

YOUR PARTICIPATION IS VOLUNTARY
Your participation is entirely voluntary, so it is up to you to decide whether or not to take part in this study. Before you decide, it is important for you to understand what the research involves. This consent form will tell you about the study, why the research is being done, what will happen to you during the study and the possible benefits, risks, and discomforts. If you wish to participate, you will be asked to sign this form. If you do decide to take part in this study, you are still free to withdraw at any time and without giving any reasons for your decision. If you do not wish to participate, you do not have to provide any reason for your decision not to participate. Please take time to read the following information carefully.

WHO IS CONDUCTING THE STUDY?
The study is being conducted/funded by the Natural Sciences and Engineering Research Council of Canada (NSERC). The Principal Investigator has received funds from this agency to compensate subjects for participating in this study. You are entitled to request any details concerning this compensation from the Principal Investigator.

BACKGROUND
This project's goal is to advance a novel tool and technique to help young children attain independent anxiety regulation skills. Engagement will be utilized to give children access to cognitive training by interacting with an expressive animatronic pet. This robot will be programmed to respond physically to a combination of a child's pattern of touch and biometrically sensed emotional state in a way that rewards patience and progress.

Page 2 of 5 – Version 1.2 - December 3, 2009

WHAT IS THE PURPOSE OF THE STUDY?
The purpose of this study is to examine the role of haptic (touch sense) feedback on anxiety levels. This study investigates your reaction to a small robotic creature that is touch-sensitive and can breathe, purr, and stiffen its ears.

WHO CAN PARTICIPATE IN THE STUDY?
This study is open to children from ages 5-17, particularly those who may have been diagnosed with mild anxiety or learning disorders, as well as adult subjects between the ages of 17-50. We expect to enroll approximately 20 children and 10 adults in this experiment.

WHAT DOES THE STUDY INVOLVE?
You will be asked to wear external (i.e., non-invasive) sensors that collect some basic physiological information such as heart rate, respiration rate, some muscle activity, and perspiration. We request that you tell the experimenter if you find the sensor positioning uncomfortable, and adjustments will be made. You will be invited to answer questions in two questionnaires as part of the experiment. The study will be viewed by the experimenters in a separate room via a webcam.
It will not be recorded. The time commitment required for this session will range from one to three hours.  Image 1 shows a photo of a child attached to the physiological sensors that will be used during these experiments. There are four primary sensors that will be used during these experiments: a. Respiration Rate: A Velcro band is worn around the abdomen outside of the clothing. Expands and contracts with the abdomen to measure respiration rate, waveform, and amplitude. b. Blood Volume Pulse: also known as a photoplethysmograph (PPG) sensor. A small black box attaches to the distal end of a finger with a Velcro strap. Measures heart rate. c. Skin Conductance: Two electrodes attach to Velcro straps, each in turn attached to the distal end of two fingers on the same hand. Measures galvanic skin response (GSR), the electrical resistance of the skin. d. ECG: Three electrodes attach to the upper right and left sides of the chest and the lower abdomen.  Measures heart electrical activity.  IF YOU DECIDE TO JOIN THIS STUDY: SPECIFIC PROCEDURES After being fitted with the sensors, you will be invited to hold the creature in your lap.  The creature may move during this experiment.  You will then complete a questionnaire about your interaction during this experiment. If you are not sure about any instructions, do not hesitate to ask.  WHAT ARE THE POSSIBLE HARMS AND SIDE EFFECTS OF PARTICIPATING?  Image 1: User demonstrating possible physiological sensors: respiration rate (a), blood volume pulse (b), skin conductance (c), ECG (d). 209 B.4. Experiment 3 Page 3 of 5  Version 1.2 - December 3, 2009 There are no expected harms or side effects from participating in this experiment. Nothing will be done to impose stress or anxiety on you.  The biosensors that are worn are non-intrusive, and FDA-approved safe for medical uses.  WHAT ARE THE BENEFITS OF PARTICIPATING IN THIS STUDY? No one knows whether or not you will benefit from this study.  There may or may not be direct benefits to you from taking part in this study.  We hope that the information learned from this study can be used in the future to benefit others.  WHAT HAPPENS IF I DECIDE TO WITHDRAW MY CONSENT TO PARTICIPATE? Your participation in this research is entirely voluntary.  You may withdraw from this study at any time.  If you choose to enter the study and then withdraw at a later time, all data collected about you during your enrolment in the study will be retained for analysis.  By law, this data cannot be destroyed.  WHAT WILL THE STUDY COST ME? You are not expected to incur any personal expenses as a result of your participation in this study.  You will be compensated $5 for each 1/2-hour study session.  WILL MY TAKING PART IN THIS STUDY BE KEPT CONFIDENTIAL? Your confidentiality will be respected.  No information that discloses your identity will be released or published without your specific consent to the disclosure. Research records identifying you may be inspected in the presence of the Investigator or his or her designate by representatives of Health Canada and the UBC Research Ethics Board for the purpose of monitoring the research.  However, no records which identify you by name or initials will be allowed to leave the Investigators’ offices.  WHO DO I CONTACT IF I HAVE QUESTIONS ABOUT THE STUDY DURING MY PARTICIPATION? If you have any questions or desire further information about this study before or during participation, you can contact Karon Maclean at (604)-822-8169.  
WHO DO I CONTACT IF I HAVE ANY QUESTIONS OR CONCERNS ABOUT MY RIGHTS AS A SUBJECT DURING THE STUDY?
If you have any concerns about your rights as a research subject and/or your experiences while participating in this study, contact the Research Subject Information Line in the University of British Columbia Office of Research Services by e-mail at RSIL@ors.ubc.ca or by phone at 604-822-8598.

B.4.4 Participant Assent Form

Page 1 of 2 – Version 1.0 – September 22, 2009

THE UNIVERSITY OF BRITISH COLUMBIA
Department of Computer Science
2366 Main Mall, Vancouver, BC, Canada V6T 1Z4
Phone: (604) 822-3061  Fax: (604) 822-4231

SUBJECT ASSENT FORM

Tamer: Touch-guided Anxiety Management via Engagement with a Robot Pet

INVITATION
I am being invited to be part of a research study. A research study tries to find better treatments to help children like me. It is up to me if I want to be in this study. No one will make me be part of the study. Even if I agree now to be part of the study, I can change my mind later. No one will be mad at me if I choose not to be part of this study.

WHY ARE WE DOING THIS STUDY?
We are doing this study to investigate how a robot may help reduce my anxiety levels. We want to see my reactions to a robot that purrs, breathes, and moves on my lap.

WHAT WILL HAPPEN IN THIS STUDY?
During this experiment you will be asked to wear physiological sensors as shown in Image 1 on your hands and chest. These will allow us to record your heart rate, pulse, breathing rate, and skin conductance (how sweaty you are). If at any time these are uncomfortable please let us know, and we will adjust them for you. We are investigating your reaction to a small robotic creature that is touch-sensitive and can breathe, purr, and stiffen its ears. You will be asked to hold the creature in your lap while you complete some schoolwork. The creature may move during this experiment.

WHO IS DOING THIS STUDY?
Karon MacLean and other investigators from the UBC Computer Science Department will be doing this study. They will answer any questions I have about this study. I can also call them at 604-822-8169 if I am having any problems or if there is an emergency and I cannot talk to my parents.

CAN ANYTHING BAD HAPPEN TO ME?
There is nothing in this study that should cause anything bad to happen to me.

Image 1: User demonstrating possible physiological sensors: respiration rate (a), blood volume pulse (b), skin conductance (c), ECG (d).

WHO WILL KNOW I AM IN THE STUDY?
Only the people who are involved in the study will know I am in it. When the study is finished, the investigators will write a report about what was learned. This report will not say my name or that I was in the study. My parents and I do not have to tell anyone I am in the study if we don't want to.

WHEN DO I HAVE TO DECIDE?
I have as much time as I want to decide to be part of the study. I have also been asked to discuss my decision with my parents. If I put my name at the end of this form, I agree to be in the study.

SUBJECT'S ASSENT TO PARTICIPATE IN RESEARCH
I have had the opportunity to read this consent form, to ask questions about my participation in this research, and to discuss my participation with my parents/guardians. All my questions have been answered. I understand that I may withdraw from this research at any time, and that this will not interfere with the availability to me of other health care. I have received a copy of this consent form.
I assent to participate in this study.

____________________________   ____________________________   ____________
PRINTED NAME OF SUBJECT        SIGNATURE                       DATE

B.5 Experiment Equipment

Figure B.82 shows the command and control scheme used during the experiments. During the experiments the host computer was an IBM Thinkpad T400P with an Intel Core 2 Duo T9400 processor and 2 gigabytes of RAM, running Windows XP. Communication between the sensors and the host computer was by USB. Communication of touch data and hardware state from the Creature to the host computer was by Bluetooth radio, and of Creature commands from the host computer to the Creature by the XBee wireless radio system.

Figure B.82: Diagram of the TAMER command and control scheme (components: user, sensors, Creature, and the host computer's physiological-data software and Creature display; data flows: physiological data, touch data, hardware state, and Creature commands). Arrows represent communications links between system components; dashed arrows identify the connections that are typically wireless.

Appendix C

Schematics

C.1 Radio Base Station Schematics

Figure C.1: The radio base station board.
HALL III DISPLAY POWER XBEE USB ATMEGA G N D 5 0 0 m A 1 5 k R 1 7 R E S E T 1 0 0 R 1 9 R 2 0 1 0 0 B L U E L R S S I G R N L A S S O C G N D G N D Figure C.2: The radio base station schematic. 214 C.1. Radio Base Station Schematics Part Value Device Form Factor Source Part No. C1 100n Ceramic Capacitor .1” Through-holeDigikey PCC1828CT-ND C10 100n Ceramic Capacitor .1” Through-holeDigikey PCC1828CT-ND C11 100n Ceramic Capacitor .1” Through-holeDigikey PCC1828CT-ND C12 100n Ceramic Capacitor .1” Through-holeDigikey PCC1828CT-ND C13 100n Ceramic Capacitor .1” Through-holeDigikey PCC1828CT-ND C14 100n Ceramic Capacitor .1” Through-holeDigikey PCC1828CT-ND C16 10uF Electrolytic Capaci- tor Radial Digikey P975-ND C2 22p Ceramic Capacitor .1” Through-holeDigikey 445-4763-ND C3 22p Ceramic Capacitor .1” Through-holeDigikey 445-4763-ND C4 100n Ceramic Capacitor .1” Through-holeDigikey PCC1828CT-ND C5 100n Ceramic Capacitor .1” Through-holeDigikey PCC1828CT-ND C6 100u Electrolytic Capaci- tor Radial Digikey vP12924-ND C7 100u Electrolytic Capaci- tor Radial Digikey P12924-ND C8 100n Electrolytic Capaci- tor Radial Digikey PCC1828CT-ND C9 100n Electrolytic Capaci- tor .1” Through-holeDigikey PCC1828CT-ND D1 - DIODE-1N4001 SparkFun Digikey 1N4001FSCT-ND F1 500mA Resettable Fuse SMD Digikey MF-MSMF030- 2CT- ND IC1 - ATMEGA8 DIL28-3 Sparfun IC3 - FT232RL USB to Se- rial Converter SSOP28DB Digikey 768-1007-1-ND IC4 - MC33269D-5.0 DPACK IC5 - LM358D Low Dropout OpAmp SO08 Digikey LM358DR2GOSCT- ND J1 - Power Jack Sparkfun JP1 - Front Header Pins Digikey JP2 - Front Header Pins Digikey L WHT Indicator Light 1206 SMD Digikey 160-1737-1-ND L0 YEL LED 5MM Radial Digikey 365-1190-ND L1 WHT LED 5MM Radial Digikey 67-1695-ND LASSOC GRN LED 5MM Radial Digikey C503B-GCN- CY0C0791-ND LED1 - LED Bar Graph Digikey 160-1068-ND LED2 - LED Bar Graph Digikey 160-1068-ND LED3 - LED Bar Graph Digikey 160-1068-ND LPWR ORG Power Light 1206 SMD Digikey 350-2049-1-ND LRGB - Tricolor LED 5MM Radial LRSSI BLUE LED5MM 5MM Radial Digikey C503B-BAN- CY0C0461-ND LRX BLUE LED-1206 1206 SMD Digikey LTX BLUE LED-1206 1206 SMD Digikey Q1 - BC547B TO92 BC547B Q2 16MHz Crystal Oscillator HC49/S CRYTALHC49S R1 10 kOhm Resistor AXIAL-0.3 Digikey P10.0KCACT -ND Continued. . . 215 C.1. Radio Base Station Schematics Part Value Device Form Factor Source Part No. 
R10 10 kOhm Resistor AXIAL-0.3 Digikey P10.0KCACT -ND R11 10 kOhm Resistor AXIAL-0.3 Digikey P10.0KCACT -ND R12 180 Ohm Resistor AXIAL-0.3 Digikey P180CACT-ND R13 180 Ohm Resistor AXIAL-0.3 Digikey P180CACT-ND R14 180 Ohm Resistor AXIAL-0.3 Digikey P180CACT-ND R15 68 Ohm Resistor AXIAL-0.3 Digikey R16 56 Ohm Resistor AXIAL-0.3 Digikey R17 15 kOhm Resistor AXIAL-0.3 Digikey P15.0KCACT -ND R18 10 kOhm Resistor AXIAL-0.3 Digikey P10.0KCACT -ND R19 100 Ohm Resistor AXIAL-0.3 Digikey P100CACT-ND R2 100 Ohm Resistor AXIAL-0.3 Digikey P100CACT-ND R20 100 Ohm Resistor AXIAL-0.3 Digikey P100CACT-ND R21 1 kOhm Resistor AXIAL-0.3 Digikey P1.00KCACT -ND R3 28 kOhm Resistor AXIAL-0.3 Digikey P28.0KCACT -ND R4 180 Ohm Resistor AXIAL-0.3 Digikey P180CACT-ND R5 180 Ohm Resistor AXIAL-0.3 Digikey P180CACT-ND R6 100 Ohm Resistor AXIAL-0.3 Digikey P100CACT-ND R7 180 Ohm Resistor AXIAL-0.3 Digikey P180CACT-ND R8 1 kOhm Resistor AXIAL-0.3 Digikey P1.00KCACT -ND R9 1 kOhm Resistor AXIAL-0.3 Digikey P1.00KCACT -ND RESET-EN - SJ jumper S1 - 6mm Tactile Switch 6mm Digikey SW793-ND T1 NDT2955 PMOSSOT223 SOT223 U$5 MAX7219CNG DIL24-3 MAX7219CNG U$6 LTA-1000GLTA-1000G LTA-1000G X1 USBPTH USBPTH USB-B-PTH XB1 - Xbee Radio - Digikey XB24-AWI-001-ND XBEE CSEL0 - Xbee RX/TX Jumper Pins SparkFun XBEE CSEL0S - Xbee RX/TX Jumper Pins SparkFun XBEE CSEL1 - Xbee RX/TX Jumper Pins SparkFun XBEE CSEL1S - Xbee RX/TX Jumper Pins SparkFun XBEE RESET1- Xbee Reset Jumper Pins SparkFun XBEE RESET2- Xbee Reset Jumper Pins SparkFun 216 C.2. Creature Board Schematics C.2 Creature Board Schematics 217 C.2. Creature Board Schematics Figure C.3: The Creature board board. 218 C.2. Creature Board Schematics Figure C.4: The Creature board schematic. 219 C .2 . C reatu re B oard S ch em atics Part Value Name Number 1W R 4.7k RES 4.70K OHM .25W 1% 1206 SMD RHM4.70KFRCT-ND 3VREG IC LDO REG W/SD 3.3V SOT223-3 LT1129CST-3.3#PBF-ND 5VREG IC LDO REG W/SD 5V SOT223-3 LT1129CST-5#PBF-ND Bluetooth Bluetooth SMD Module - Bluegiga WRL-08771 C C1 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C C2 .01uF CAP .01UF 50V 10% CER RADIAL 399-4148-ND C C3 10uF CAP ELECT 10UF 25V KS RADIAL P975-ND C C4 .01uF CAP .01UF 50V 10% CER RADIAL 399-4148-ND C CA 10uF CAP ELECT 10UF 25V KS RADIAL P975-ND C Q1 TRANS PNP PWR GP 7A 50V TO220AB 2N6109GOS-ND C R1 1.4k RES 1.40K OHM 1/4W 1% 1206 SMD RHM1.40KFCT-ND c R2 150 RES 150K OHM 1/4W 1% 1206 SMD RHM150KFRCT-ND C R3 68k RES 68.0K OHM 1/4W 1% 1206 SMD RHM68.0KFRCT-ND C R4 22k RES 22.0K OHM 1/4W 1% 1206 SMD RHM22.0KFRCT-ND C RS 138 RES 137 OHM 1/4W 1% 1206 SMD RHM137FCT-ND C1 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C10 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C11 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C12 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C13 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C14 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C2 10uF CAP ELECT 10UF 25V KS RADIAL P975-ND C21 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C22 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C3 10uF CAP ELECT 10UF 25V KS RADIAL P975-ND C4 10uF CAP ELECT 10UF 25V KS RADIAL P975-ND C5 10uF CAP ELECT 10UF 25V KS RADIAL P975-ND C6 .1uF CAP .1UF 25V CERAMIC X7R 0805 PCC1828CT-ND C7 .1uF CAP .1UF 25V CERAMIC X7R 0805 PCC1828CT-ND C8 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND C9 .1uF CAP .10UF 50V CERAMIC X7R 10% BC1084CT-ND D1 DIODE GEN PURPOSE 50V 1A DO41 1N4001FSCT-ND DIGIPOT0 100k IC DGTL POT DUAL 256-TAP 14TSSOP MAX5479EUD+-ND Continued. . . 220 C .2 . 
C reatu re B oard S ch em atics Part Value Name Number DIGIPOT1 100k IC DGTL POT DUAL 256-TAP 14TSSOP MAX5479EUD+-ND DIGIPOT2 100k IC DGTL POT DUAL 256-TAP 14TSSOP MAX5479EUD+-ND HB-STPR0 IC QUAD HALF-H DRVR 16-DIP 296-9518-5-ND HB-STPR1 IC QUAD HALF-H DRVR 16-DIP 296-9518-5-ND HEAT-T0 MOSFET P-CH 12V 8.9A 8-SOIC IRF7433PBFCT-ND HEAT-T1 MOSFET P-CH 12V 8.9A 8-SOIC IRF7433PBFCT-ND HEAT-T2 MOSFET P-CH 12V 8.9A 8-SOIC IRF7433PBFCT-ND HEAT-T3 MOSFET P-CH 12V 8.9A 8-SOIC IRF7433PBFCT-ND IC1 IC BATT FASTCHRG NICD/NIMH16SOIC MAX712CSE+-ND LED1 YEL/GRNLED ALINGAP YW/GN CLEAR 1206 SMD 350-2052-1-ND LED2 YEL/GRNLED ALINGAP YW/GN CLEAR 1206 SMD 350-2052-1-ND LED3 GRN LED ALINGAP GREEN CLEAR 1206 SMD 350-2053-1-ND LED4 GRN LED ALINGAP GREEN CLEAR 1206 SMD 350-2053-1-ND LED5 GRN LED ALINGAP GREEN CLEAR 1206 SMD 350-2053-1-ND LED6 WHT LED WHITE YELLOW 260MCD 1206 160-1737-1-ND LED7 GRN LED ALINGAP GREEN CLEAR 1206 SMD 350-2053-1-ND LED8 BLUE LED INGAN BLUE CLEAR 1206 SMD 350-2055-1-ND LEDH0 RED/ORGLED ALINGAP RD/OR CLEAR 1206 SMD 350-2048-1-ND LEDH1 RED/ORGLED ALINGAP RD/OR CLEAR 1206 SMD 350-2048-1-ND LEDH2 RED/ORGLED ALINGAP RD/OR CLEAR 1206 SMD 350-2048-1-ND LEDH3 RED/ORGLED ALINGAP RD/OR CLEAR 1206 SMD 350-2048-1-ND LEDRXB BLUE LED INGAN BLUE CLEAR 1206 SMD 350-2055-1-ND LEDTXB BLUE LED INGAN BLUE CLEAR 1206 SMD 350-2055-1-ND MUX0 IC MUX/DEMUX 1X16 24SOIC 568-4591-5-ND MUX1 IC MUX/DEMUX 1X16 24SOIC 568-4591-5-ND MUX2 IC MUX/DEMUX 1X16 24SOIC 568-4591-5-ND MUX3 IC MUX/DEMUX 1X16 24SOIC 568-4591-5-ND MUXR0 1k RES 1.00K OHM 1/4W 1% 1206 SMD RHM1.00KFRCT-ND MUXR1 1k RES 1.00K OHM 1/4W 1% 1206 SMD RHM1.00KFRCT-ND MUXR2 1k RES 1.00K OHM 1/4W 1% 1206 SMD RHM1.00KFRCT-ND MUXR3 1k RES 1.00K OHM 1/4W 1% 1206 SMD RHM1.00KFRCT-ND OPAMP0 IC OP AMP LOW PWR DUAL 8-SOIC 497-1591-1-ND OPAMP1 IC OP AMP LOW PWR DUAL 8-SOIC 497-1591-1-ND OPAMP2 IC OP AMP LOW PWR DUAL 8-SOIC 497-1591-1-ND Continued. . . 221 C .2 . 
C reatu re B oard S ch em atics Part Value Name Number POWER JACK Q1 BC547B TRANS NPN 45V 100MA TO-92 BC547BTACT-ND R/A HEAD R RST 47k RES 47.0K OHM 1/4W 1% 1206 SMD RHM47.0KFRCT-ND R1 180 RES 180 OHM 1/4W 1% 1206 SMD RHM180FRCT-ND R10 180 RES 180 OHM 1/4W 1% 1206 SMD RHM180FRCT-ND R11 150 RES 150 OHM 1/4W 1% 1206 SMD RHM150FRCT-ND R12 150 RES 150 OHM 1/4W 1% 1206 SMD RHM150FRCT-ND R13 150 RES 150 OHM 1/4W 1% 1206 SMD RHM150FRCT-ND R14 100 RES 100 OHM 1/4W 1% 1206 SMD RHM100FRCT-ND MUXR5 1k RES 1.00K OHM 1/4W 1% 1206 SMD RHM1.00KFRCT-ND MUXR4 1k RES 1.00K OHM 1/4W 1% 1206 SMD RHM1.00KFRCT-ND R17 15k RES 10.0K OHM 1/4W 1% 1206 SMD RHM10.0KFRCT-ND R18 10k RES 15.0K OHM 1/4W 1% 1206 SMD RHM15.0KFRCT-ND R19 150 RES 150 OHM 1/4W 1% 1206 SMD RHM150FRCT-ND R2 180 RES 180 OHM 1/4W 1% 1206 SMD RHM180FRCT-ND R20 100 RES 100 OHM 1/4W 1% 1206 SMD RHM100FRCT-ND R21 1k RES 1.00K OHM 1/4W 1% 1206 SMD RHM1.00KFRCT-ND R3 180 RES 180 OHM 1/4W 1% 1206 SMD RHM180FRCT-ND R4 1.5k RES 1.50K OHM 1/4W 1% 1206 SMD RHM1.50KFRCT-ND R5 1.5k RES 1.50K OHM 1/4W 1% 1206 SMD RHM1.50KFRCT-ND R6 180 RES 180 OHM 1/4W 1% 1206 SMD RHM180FRCT-ND R7 100 RES 100 OHM 1/4W 1% 1206 SMD RHM100FRCT-ND R8 100 RES 100 OHM 1/4W 1% 1206 SMD RHM100FRCT-ND R9 180 RES 180 OHM 1/4W 1% 1206 SMD RHM180FRCT-ND RVM1 20k RES 20.0K OHM 1/4W 1% 1206 SMD RHM20.0KFRCT-ND RVM2 10k RES 10.0K OHM 1/4W 1% 1206 SMD RHM10.0KFRCT-ND S1 SWITCH TACT 6MM 260GF H=4.3MM SW793-ND TEMPB2 IC THERM MICROLAN PROG-RES TO-92 DS18B20+PAR-ND U$9 IC VOLT-LVL TRANSL 2BIT BI SM8 296-21978-1-ND VR HEAT Dimension Engineering 10W Adjustable Switching Regulator VR MOTORS Dimension Engineering 10W Adjustable Switching Regulator VR SERVO Dimension Engineering 10W Adjustable Switching Regulator XBEE MODULE ZIGBEE 100MW W/CHIP ANT XBP24-ACI-001-ND Continued. . . 222 C .2 . C reatu re B oard S ch em atics Part Value Name Number XBPINS 2mm 10pin XBee Socket LEDTXX ORG LED ALINGAP ORN CLEAR 1206 SMD 350-2049-1-ND LEDRXX ORG LED ALINGAP ORN CLEAR 1206 SMD 350-2049-1-ND R15 150 RES 150 OHM 1/4W 1% 1206 SMD RHM150FRCT-ND R16 150 RES 150 OHM 1/4W 1% 1206 SMD RHM150FRCT-ND LEDRXO YEL LED ALINGAP YLW CLEAR 1206 SMD 350-2050-1-ND LEDTXO YEL LED ALINGAP YLW CLEAR 1206 SMD 350-2050-1-ND R22 150 RES 150 OHM 1/4W 1% 1206 SMD RHM150FRCT-ND R23 150 RES 150 OHM 1/4W 1% 1206 SMD RHM150FRCT-ND 223 Appendix D Code Herein is code used for the experiments in thesis. Creature accepts incoming serial byte at 9600bps from XBee radio. If that byte is 0-252, value mapped to breathing servo. If that byte is 253, pulse triggered. Heat, ears, and purr all deactivated. The most recently received value and serial port is sent out via Bluetooth. Table D.1: Haptic Creature communications protocol. Input to Creature f Pulse out ten steps d Pulse in ten steps a Pulse out one step s Pulse in one step r Start reporting temperature sensor data t Stop reporting temperature sensor data Output from Creature R. Current respiration servo position T. Output of breathing servo temperature sensor U. Output of anterior temperature sensor V. Output of electronics board temperature sensor Listing D.1: Haptic Creature Mirroring Code 1 /∗∗∗∗ Arduino code f o r Haptic Creature to a l low mir ro r ing o f ←↩ breath ing and pu l s e ∗∗∗∗ 2 ∗ Joseph P. 
Hall III
 * 03/27/10
 *
 * Accepts incoming serial byte at 9600 bps
 * If that byte is 0-252, value mapped to breathing servo
 * If that byte is 253, pulse triggered
 * heat, ears, and purr off
 *
 * Use with new electronics board:
 *  - Sends data out via Bluetooth, in via XBee
 *
 */

// Connection definitions, pretty much self-explanatory
#define STEPPER_PIN1 48
#define STEPPER_PIN2 49
#define STEPPER_PIN3 50
#define STEPPER_PIN4 41
#define LIMIT_SWITCH_PIN 25
#define PULSE_LS 25

#define BREATH_PIN 23
#define SEAR_PIN 27
#define PEAR_PIN 29

#define PURR_ENABLE_PIN 7
#define PURR_DIR1_PIN 5
#define PURR_DIR2_PIN 6

// OneWire for temperature readings
#include <OneWire.h>
OneWire ds(53);   // start OneWire on pin 53
// OneWire and temperature processing variables
byte present = 0;   // 1 if sensors present
byte data[12];      // data read from sensors
byte addr[8];       // address of sensor from which to read
int HighByte, LowByte, TReading, SignBit, Tc, Tc_100, Whole, Fract;  // vars for converting data to degrees F
byte tempsense1[8] = {40, 136, 25, 15, 2, 0, 0, 15};  // address of temperature sensor in decimal, breathing servo
byte tempsense2[8] = {40, 81, 22, 2, 2, 0, 0, 175};   // address of temperature sensor in decimal, on board
byte tempsense3[8] = {40, 35, 65, 2, 2, 0, 0, 157};   // address of temperature sensor in decimal, creature anterior
boolean t2sflag = false;  // true when we should print a temperature reading
int t2s = 0;              // temperature to be sent over serial
byte senstoread = 0;      // the next temperature sensor to read, typically 1-3

// Stepper motor for pulse
#include <Stepper.h>
Stepper Pulse(200, STEPPER_PIN1, STEPPER_PIN2, STEPPER_PIN3, STEPPER_PIN4);  // 200 steps-per-rotation stepper
int Pulse_dir = 1;         // direction of next pulse step
boolean pulseflag = false; // true from when a pulse command is sent until the pulse is completed
byte pulsecount = 0;       // number of steps into the pulse we are
// Pulse limit switch is attached to pins 43 and 45, is high on depress
boolean pulse_ls_read = true;  // pulse limit switch reading

// Servo variables
#include <Servo.h>
Servo Breathing;
Servo SEar;
Servo PEar;

// Timer library (just to use the temperature sensors :-\)
#include <MsTimer2.h>

// loop index variables
int j = 0;
int i = 0;

void setup() {
  // Set timer to fifteen seconds
  MsTimer2::set(15000, readtemp);
  MsTimer2::start();

  // Serial communication channels
  Serial.begin(9600);     // USB
  Serial1.begin(9600);    // XBee
  Serial2.begin(9600);    // Bluetooth
  // Serial3.begin(9600); // tail wire

  // Set up limit switch power and receiver
  pinMode(LIMIT_SWITCH_PIN, INPUT);
  digitalWrite(LIMIT_SWITCH_PIN, LOW);

  // Attach servos
  Breathing.attach(BREATH_PIN);
  SEar.attach(SEAR_PIN);
  PEar.attach(PEAR_PIN);

  // Set purring motor direction, turn purring motor off
  pinMode(PURR_DIR2_PIN, OUTPUT);
  pinMode(PURR_DIR1_PIN, OUTPUT);
  digitalWrite(PURR_DIR1_PIN, LOW);
  digitalWrite(PURR_DIR2_PIN, HIGH);
  pinMode(PURR_ENABLE_PIN, OUTPUT);
  analogWrite(PURR_ENABLE_PIN, 0);  // purr speed to 0

  // Initiate and zero stepper motor
  Pulse.setSpeed(5);  // slow down the speed for initialization
  // Run until the pulse limit switch is depressed
  for (i = 0; i < 50; i++) {
    if (digitalRead(PULSE_LS) != 1) {
      Pulse.step(Pulse_dir);
      delay(50);  // wait for debounce
    }
  }
  // Take two additional steps to be sure it is depressed
  Pulse.step(Pulse_dir);
  Pulse.step(Pulse_dir);

  // Restore speed
  Pulse.setSpeed(25);

  // Deactivate heaters
  for (j = 10; j < 13; j++) {
    pinMode(j, OUTPUT);
    digitalWrite(j, HIGH);
  }

  // Zero servos
  Breathing.write(0);
  SEar.write(0);
  PEar.write(0);

  // Start temperature reading
  tempPreparetoRead();
}

byte inByte = 0;      // byte read from serial port
byte btarget = 0;     // breathing amplitude to achieve
byte boldtarget = 0;  // previous breathing amplitude to achieve

int bvalue = 50;      // filtered breathing amplitude to achieve
unsigned long millisold = 0;  // time the previous respiration command was received
unsigned long diff = 0;       // amount of time since the previous respiration command was received
int interval = 10;            // milliseconds between respiration commands

unsigned long pulseinterval = 0;  // amount of time since the previous pulse command was received
unsigned long pulseold = 0;       // time the previous pulse command was received

void loop() {
  // Communication and control through USB port
  if (Serial.available() > 0) {
    inByte = Serial.read();
    switch (inByte) {
      case 'f':
        Pulse.step(10);
        break;
      case 'd':
        Pulse.step(-10);
        break;
      case 'a':
        Pulse.step(1);
        break;
      case 's':
        Pulse.step(-1);
        break;
      case 'r':
        // read tail sensor
        MsTimer2::start();
        break;
      case 't':
        MsTimer2::stop();
        break;
    }
    inByte = 0;
  }

  // Breathing and pulse commands through XBee radio
  if (Serial1.available() > 0) {
    boldtarget = btarget;  // store the previous commanded breathing servo value
    inByte = Serial1.read();
    Serial2.print("R.");   // print the new respiration rate
    Serial2.println(inByte, DEC);
    if (inByte == 253) {   // 253 = pulse command
      pulseinterval = millis() - pulseold;  // make sure we're not receiving these too quickly
      pulseold = millis();
      if (pulseinterval > 50) {
        pulseflag = true;
      } else {
        btarget = boldtarget;
      }
    }
    if (inByte < 253) {
      btarget = inByte;
    }
    millisold = millis();
  }

  // Now what we pass through every iteration:
  // Send proper command to breathing servo
  diff = millis() - millisold;
  // Updates are received every ~60 microseconds
  /* if (diff < interval) {
       bvalue = boldtarget + ((btarget - boldtarget) * diff) / interval;
     } else {
       bvalue = btarget;
     }
     if (bvalue < 0) {  // sanity check, this happened once or twice
       bvalue = 0;
     }
  */
  Breathing.writeMicroseconds(map(btarget, 0, 253, 950, 1400));

  // Pulse if necessary
  if (pulseflag == true) {
    if (pulsecount < 5) {  // to pulse, go three steps forward
      Pulse.step(3 * Pulse_dir);
      pulsecount = 6;
    }
    else if (pulsecount > 5 && pulsecount < 10) {
      Pulse.step(-3 * Pulse_dir);  // then three steps back
      pulsecount = 11;
    }
    else {
      pulseflag = false;
      pulsecount = 0;
    }
  }
}

/* *************************** */
// To read the temperature sensors, first we select a sensor, then wait for
// conversion, then read the sensor. Some of this code is from the Arduino OneWire guide.
void tempPreparetoRead() {
  if (senstoread < 4) {  // number of temperature sensors + 1
    senstoread = 1;
  }

  switch (senstoread) {
    case 1:
      for (i = 0; i < 8; i++) {  // copy the 8-byte ROM address
        addr[i] = tempsense1[i];
      }
      break;
    case 2:
      for (i = 0; i < 8; i++) {
        addr[i] = tempsense2[i];
      }
      break;
    case 3:
      for (i = 0; i < 8; i++) {
        addr[i] = tempsense3[i];
      }
      break;
  }

  ds.search(addr);
  // Send the command to read
  ds.reset();
  ds.select(addr);
  ds.write(0x44, 1);
  MsTimer2::start();  // start timer for conversion
}

void readtemp() {
  MsTimer2::stop();
  present = ds.reset();
  ds.select(addr);
  ds.write(0xBE);

  for (i = 0; i < 9; i++) {
    data[i] = ds.read();
  }

  // Convert to Fahrenheit and send
  LowByte = data[0];
  HighByte = data[1];
  TReading = (HighByte << 8) + LowByte;
  SignBit = TReading & 0x8000;
  if (SignBit) {  // negative
    TReading = (TReading ^ 0xffff) + 1;  // 2's complement
  }
  Tc_100 = (6 * TReading) + TReading / 4;
  Tc = Tc_100;

  if (SignBit) {
    Tc = -1 * Tc;
  }
  t2sflag = true;
  t2s = Tc * 9 / 5 + 3200;

  // If the temperature flag is set, send the sensor reading.
  if (t2sflag) {
    if (senstoread == 1) {
      Serial2.print("T.");
    } else if (senstoread == 2) {
      Serial2.print("U.");
    } else if (senstoread == 3) {
      Serial2.print("V.");
    }
    Serial2.println(t2s);
    t2sflag = false;
  }

  tempPreparetoRead();
}
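The single-byte command protocol described in the header comment above (bytes 0-252 set the breathing servo position, byte 253 triggers one pulse, all at 9600 bps over the XBee link) can be exercised from any host with a serial connection to the radio. The short sketch below is illustrative only and is not part of the thesis software; it assumes a host running Python 3 with the pyserial package, and the port name /dev/ttyUSB0 is a placeholder for wherever the host-side XBee adapter enumerates.

# Illustrative host-side driver for the Creature's XBee byte protocol.
# Assumptions (not from the thesis): Python 3 with pyserial installed, and the
# XBee adapter appearing as /dev/ttyUSB0 at 9600 bps (placeholder name).
import time
import serial

BREATH_MIN, BREATH_MAX = 0, 252   # bytes 0-252 map to the breathing servo
PULSE_BYTE = 253                  # byte 253 triggers a single pulse

def breathe_cycle(port, period_s=4.0, steps=40):
    """Ramp the breathing byte up and back down to mimic one breath."""
    for k in list(range(steps + 1)) + list(range(steps, -1, -1)):
        value = BREATH_MIN + (BREATH_MAX - BREATH_MIN) * k // steps
        port.write(bytes([value]))
        time.sleep(period_s / (2 * (steps + 1)))

def pulse(port):
    """Trigger one heartbeat pulse."""
    port.write(bytes([PULSE_BYTE]))

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as xbee:
        pulse(xbee)
        breathe_cycle(xbee)

Because the firmware ignores pulse commands that arrive within 50 ms of the previous one, a host should space pulse bytes at the measured inter-beat interval rather than sending them back to back.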
