UBC Theses and Dissertations
Affecting Affect Effectively: Investigating a Haptic-Affect Platform for Guiding Physiological Responses. Hall, Joseph. 2011.

Full Text

Affecting Affect Effectively
Investigating a haptic-affect platform for guiding physiological responses

by

Joseph P. Hall
B.Sc., Columbia University in the City of New York, 2008

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF APPLIED SCIENCE in The Faculty of Graduate Studies (Mechanical Engineering)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

May 2011

© Joseph P. Hall 2011

Abstract

This thesis describes the development of a platform for touch-guided anxiety management via engagement with a robot pet. An existing physiological sensor suite and "Haptic Creature" robot pet are modified to influence user physiological responses through real-time interaction guided by physiological data. Participant reaction to and perception of the platform is then investigated in several experiments, with the results from these experiments used to refine the platform design. Finally, an experiment is conducted with elementary school children to investigate the ability of the platform to serve as a comforting presence during a stressful task.

It is found that participants were not able to recognize the Creature mimicking their breathing and heart rates. However, once informed of their physiological link to the Creature they were able to use the motion of this device to gain a better awareness of their own physiological state. In addition, the presence of the Creature and its activities are correlated with changes in heart rate, breathing rate, skin conductance, and heart rate variability. These changes are suggestive of a reduction in anxiety. Overall, participant response to the platform was positive, with many participants reporting that they felt the Creature to be comforting and calming. Children in particular were receptive to the Creature, and eager to use it in their stressful environment of school testing. It is found that care must be taken, however, to ensure the platform is presented in an age-appropriate manner, as sudden changes in Creature state can be alarming to the user.

The combination of physiological assessment of user affect with a small, physically comforting robot results in a unique system with the potential to serve as a companion or training aide for children or adults with anxiety disorder, especially in clinical and educational settings.

Preface

Experiments 1, 2, and the pilot experiment in this thesis were performed under UBC BREB certificate no. H01-80470. Experiment 3 was performed under UBC CREB certificate no. H09-02860.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Figures
Glossary
1 Introduction
  1.1 Motivation
  1.2 Research Objectives
  1.3 Thesis Outline
2 Literature Review
  2.1 Robotic Companions
  2.2 Robotic Therapy with Children
  2.3 Biofeedback and Anxiety Therapy
  2.4 Haptics and Affect
  2.5 Physiological Assessment of Emotional State
  2.6 Physiological Interaction with Robots
  2.7 Summary
3 Methods and System Design
  3.1 General Approach and Methods
    3.1.1 Creature Introduction
    3.1.2 Creature Development
  3.2 Hardware Additions and Modifications
    3.2.1 Design Considerations and Challenges
    3.2.2 Additional Display Mechanisms
    3.2.3 Creature Electronics Board
  3.3 Communications: Command and Control
    3.3.1 Design and Construction of Radio System
    3.3.2 Creature User Interface
    3.3.3 Creature Modes
  3.4 Feedback from Testing
    3.4.1 Vibration and Noise
    3.4.2 Temperature / Cooling
    3.4.3 Comfort
  3.5 Online Physiological Assessment
    3.5.1 Physiological Signals
    3.5.2 Physiological Sensors Used
    3.5.3 Sensor Application Notes
4 Experiments
  4.1 Pilot Experiment: Response to Disturbing Images
    4.1.1 Introduction and Motivation
    4.1.2 Experimental Design Considerations
    4.1.3 Research Questions
    4.1.4 Experiment Procedure
    4.1.5 Results
    4.1.6 Discussion
    4.1.7 Conclusions
    4.1.8 Feedback for Iterated Design
  4.2 Experiment 1: Recognition of Mirroring and Initial Reactions to Creature
    4.2.1 Research Questions
    4.2.2 Experiment Procedure
    4.2.3 Results
    4.2.4 Discussion
    4.2.5 Conclusions
    4.2.6 Feedback for Iterated Design
  4.3 Experiment 2: Creature Entraining and Reactions During a Task
    4.3.1 Research Questions
    4.3.2 Procedure
    4.3.3 Results
    4.3.4 Discussion
    4.3.5 Conclusions
    4.3.6 Feedback for Iterated Design
  4.4 Experiment 3: Experiment with Children
    4.4.1 Research Questions
    4.4.2 Experimental Procedure
    4.4.3 Results
    4.4.4 Additional Investigation with the Creature
    4.4.5 Discussion
    4.4.6 Conclusions
    4.4.7 Feedback for Iterated Design
  4.5 Reflections on Results
5 Conclusions and Recommendations
  5.1 Experimental Outcomes
  5.2 Methodological Critique and Recommendations
    5.2.1 Platform Presentation
    5.2.2 Platform Interaction
  5.3 Platform Design
    5.3.1 Outcomes
    5.3.2 Recommendations
  5.4 Conclusion
Bibliography
Appendices
A Derivations
  A.1 Creature Physiological Mirroring Derivations
    A.1.1 Derivation of Ramped Breathing Motion Commands
    A.1.2 Derivation of Ramped Pulse Rate
    A.1.3 Derivation of Ramped Breathing Motion Commands
    A.1.4 Derivation of Ramped Pulse Rate [Simplified Motion]
  A.2 Physiological Sensor Data Analysis Methods
B Experiment Documents
  B.1 Preliminary Experiment
    B.1.1 Pre-Experiment Questionnaire
    B.1.2 Post-Experiment Questionnaire
    B.1.3 Sample Data
    B.1.4 Sample Comparisons
    B.1.5 Participant Consent Form
  B.2 Experiment 1
    B.2.1 Post-Experiment Questionnaire
    B.2.2 Data Tables
    B.2.3 Sample Data
    B.2.4 Sample Comparisons
    B.2.5 Participant Consent Form
  B.3 Experiment 2
    B.3.1 Post-Experiment Questionnaire
    B.3.2 Data Tables
    B.3.3 Sample Data
    B.3.4 Sample Comparisons
    B.3.5 Participant Consent Form
  B.4 Experiment 3
    B.4.1 Sample Data
    B.4.2 Sample Comparisons
    B.4.3 Participant Consent Form
    B.4.4 Participant Assent Form
  B.5 Experiment Equipment
C Schematics
  C.1 Radio Base Station Schematics
  C.2 Creature Board Schematics
D Code

List of Tables

3.1 Heart rate variability frequencies.
4.1 Pilot Experiment: Self-reported Likert-scale responses to anxiety, agitation, and surprise.
4.2 Summary of significant results from Pilot Experiment.
4.3 Table of results from Experiment 1 questionnaire.
4.4 Summary of results from Experiment 1. Significant results are in bold.
4.5 Summary of significant results from Experiment 1.
4.6 Questionnaire results from Experiment 2 post-experiment survey.
4.7 Summary of results from Experiment 2.
4.8 Summary of significant results from Experiment 2.
4.9 Summary of significant results from Experiment 3.
B.1 Table of results from Experiment 1 questionnaire.
B.2 Results for two-tailed unequal variance t-test between breath lengths for each subject between all stages.
B.3 Results for two-tailed unequal variance t-test between series of interbeat intervals for each subject between all stages.
B.4 Table of results from Experiment 2 questionnaire.
B.5 Results for two-tailed unequal variance t-test between series of interbeat intervals for each subject between all stages.
B.6 Questionnaire results from Experiment 2 post-experiment survey.
B.7 Summary of results from Experiment 3. Investigated columns in green, significant results are in bold. See Figure B.81 for comparisons.
D.1 Haptic Creature communications protocol.

List of Figures

1.1 Proposed Creature-user interaction model.
1.2 User-centered diagram of TAMER model.
3.1 Simplified schematic of the Haptic Creature interaction loop; an example of a haptic-affect loop.
3.2 TAMER command and control scheme.
3.3 Diagram showing development of Haptic Creatures.
3.4 The Haptic Creature.
3.5 The Haptic Creature, upside-down, with fur removed, showing silicone skin.
3.6 Haptic Creature pulse mechanism.
3.7 Overview of main functions of Creature electronics board.
3.8 The Haptic Creature electronics board.
3.9 Creature force sensitive resistor circuit.
3.10 Simplified diagram of TAMER command and control scheme.
3.11 The radio base station for the Creature.
3.12 GUI for the Haptic Creature, providing motor, servo, and temperature status.
3.13 Graph of Haptic Creature internal temperature during normal use.
3.14 Overview of measured physiological signals and the physiological metrics derived from them.
3.15 User holding the Haptic Creature and wearing physiological sensors.
3.16 Thought Technology FlexComp™ Infiniti Encoder.
3.17 Thought Technology EKG™ Sensor T9306M, attached to triode electrodes for placement on chest.
3.18 Thought Technology EMG MyoScan-Pro™ Sensor T9401M-60.
3.19 Thought Technology Skin Conductance Sensor SA9309M.
3.20 Thought Technology Blood Volume Pulse (BVP) Sensor SA9308M, front and rear.
3.21 A sample unfiltered blood volume pulse signal, showing four heartbeats.
3.22 Thought Technology Respiration Sensor SA9311M.
3.23 A sample filtered respiration signal.
3.24 Calculation of respiration rate.
3.25 Thought Technology Temperature Sensor SA9310M, showing sensor and connector to encoder.
3.26 A sample skin temperature signal.
4.1 "Wizard of Oz" Haptic Creature Prototype used in pilot experiment, showing bellows used to simulate breathing and heating pad.
4.2 Diagram of Pilot Experiment procedure.
4.3 Preliminary experiment participant responses to statement "Haptic Creature was comforting while viewing the images."
4.4 Preliminary experiment participant responses to statement "Haptic Creature's actions were distracting while viewing the images."
4.5 Preliminary experiment participant responses to statement "Haptic Creature would help reduce my anxiety in other situations."
4.6 Average normalized skin conductance response for disturbing image slideshow with and without Haptic Creature prototype for each participant.
4.7 Typical normalized skin conductance response for a participant during calming image set, the baseline.
4.8 Typical normalized skin conductance response for a participant during disturbing image slideshow.
4.9 Diagram of Experiment 1 procedure.
4.10 Experiment 1 participant responses to statement "I found the creature comfortable on my lap."
4.11 Experiment 1 participant responses to question of whether "creature's actions made them more aware of their own."
4.12 Experiment 1 participant responses to statement "It was easy to recognize creature mirroring my. . . ".
4.13 Breath lengths of a participant during Experiment 1.
4.14 Diagram of Experiment 2 procedure.
4.15 Ramped Creature motion, as used during experiments.
4.16 Breath lengths and heart rate for a participant during stage 2 of Experiment 2.
4.17 Experiment 2 participant responses to survey statement "I was aware of the creature's breathing."
4.18 Experiment 2 participant responses to survey statement "I was aware of the creature's pulse."
4.19 Experiment 2 participant responses to survey statement "The creature's breathing made me more aware of my own breathing."
4.20 Experiment 2 participant responses to survey statement "The creature's pulse made me more aware of my own heart rate."
4.21 Experiment 2 participant responses to survey statement "I found the creature's motion distracting during the reading assignment."
4.22 The Experiment 3 procedure diagram.
4.23 Experiment 3 participant during experiment.
B.1 Heart rate acceleration for a participant during the Pilot Experiment.
B.2 Heart rate for a participant during the Pilot Experiment.
B.3 Normalized skin conductance for a participant during the Pilot Experiment.
B.4 Skin conductance derivative for a participant during the Pilot Experiment.
B.5 Normalized EMG for a participant during the Pilot Experiment.
B.6 Mean heart rate for participants during Pilot Experiment.
B.7 Standard deviation of normalized heart rates for participants during Pilot Experiment.
B.8 Mean normalized heart rate acceleration for participants during Pilot Experiment.
B.9 Mean normalized skin conductance for participants during Pilot Experiment.
B.10 Mean normalized derivative of skin conductance for participants during Pilot Experiment.
B.11 Mean normalized EMG for participants during Pilot Experiment.
B.12 Mean estimated arousal for participants during Pilot Experiment.
B.13 Heart rate acceleration for a participant during Experiment 1.
B.14 Normalized heart rate acceleration for a participant during Experiment 1.
B.15 Heart rate for a participant during Experiment 1.
B.16 Normalized heart rate for a participant during Experiment 1.
B.17 Normalized heart rate standard deviation for a participant during Experiment 1.
B.18 Skin conductance response for a participant during Experiment 1.
B.19 Normalized skin conductance for a participant during Experiment 1.
B.20 Skin conductance derivative for a participant during Experiment 1.
B.21 Normalized skin conductance derivative for a participant during Experiment 1.
B.22 EMG for a participant during Experiment 1.
B.23 Normalized EMG for a participant during Experiment 1.
B.24 Breath lengths for a participant during Experiment 1.
B.25 Mean breath lengths for participants during Experiment 1.
B.26 Breath length standard deviation for participants during Experiment 1.
B.27 Mean heart rate acceleration for participants during Experiment 1.
B.28 Heart rate acceleration standard deviation for participants during Experiment 1.
B.29 Mean skin conductance for participants during Experiment 1.
B.30 Skin conductance standard deviation for participants during Experiment 1.
B.31 Mean and standard deviation of breath lengths of participants during each stage of Experiment 1.
B.32 Mean and standard deviation of heart rate for participants during Experiment 1.
B.33 Heart rate acceleration for a participant during Experiment 2.
B.34 Normalized heart rate acceleration for a participant during Experiment 2.
B.35 Heart rate for a participant during Experiment 2.
B.36 Normalized heart rate for a participant during Experiment 2.
B.37 Normalized heart rate standard deviation for a participant during Experiment 2.
B.38 Skin conductance response for a participant during Experiment 2.
B.39 Normalized skin conductance for a participant during Experiment 2.
B.40 Skin conductance derivative for a participant during Experiment 2.
B.41 Normalized skin conductance derivative for a participant during Experiment 2.
B.42 EMG for a participant during Experiment 2.
B.43 Normalized EMG for a participant during Experiment 2.
B.44 Skin temperature for a participant during Experiment 2.
B.45 Breath lengths for a participant during Experiment 2.
B.46 Standard deviation of breath lengths for all participants during Experiment 2.
B.47 Mean breath length for all participants during Experiment 2.
B.48 Mean heart rate for participants during Experiment 2.
B.49 Mean heart rate standard deviation for participants during Experiment 2.
B.50 High frequency component of heart rate variability during Experiment 2 for all participants for all stages.
B.51 Mean skin conductance for participants during Experiment 2.
B.52 Mean derivative of skin conductance for participants during Experiment 2.
B.53 Mean EMG for participants during Experiment 2.
B.54 Mean skin temperature for participants during Experiment 2.
B.55 Heart rate acceleration for a participant during Experiment 3.
B.56 Normalized heart rate acceleration for a participant during Experiment 3.
B.57 Heart rate for a participant during Experiment 3.
B.58 Normalized heart rate for a participant during Experiment 3.
B.59 Normalized heart rate standard deviation for a participant during Experiment 3.
B.60 Skin conductance response for a participant during Experiment 3.
B.61 Normalized skin conductance for a participant during Experiment 3.
B.62 Skin conductance derivative for a participant during Experiment 3.
B.63 Normalized skin conductance derivative for a participant during Experiment 3.
B.64 EMG for a participant during Experiment 3.
B.65 Normalized EMG for a participant during Experiment 3.
B.66 Skin temperature for a participant during Experiment 3.
B.67 Breath lengths for a participant during Experiment 3.
B.68 Mean heart rate standard deviation for participants during Experiment 3.
B.69 Mean heart rate pnn50 for participants during Experiment 3.
B.70 Mean skin conductance for participants during Experiment 3.
B.71 Mean skin conductance standard deviation for participants during Experiment 3.
B.72 Mean skin temperature for participants during Experiment 3.
B.73 Mean skin temperature standard deviation for participants during Experiment 3.
B.74 Mean breath length for participants during Experiment 3.
B.75 Heart rate vlf% for participants during Experiment 3.
B.76 Mean derivative of skin conductance standard deviation for participants during Experiment 3.
B.77 Mean breath length standard deviation for participants during Experiment 3.
B.78 Mean heart rate for participants during Experiment 3.
B.79 Mean heart rate rms standard deviation for participants during Experiment 3.
B.80 Mean derivative of skin conductance for participants during Experiment 3.
B.81 Summary of comparisons made during Experiment 3.
B.82 Diagram of TAMER command and control scheme.
C.1 The radio base station board.
C.2 The radio base station schematic.
C.3 The Creature board.
C.4 The Creature board schematic.

Glossary

affect (n) emotion or desire, esp. as influencing behavior or action
affect (v) produce an effect on, influence
alpha (α) The probability of falsely rejecting a true hypothesis.
BVP blood volume pulse
Creature The Haptic Creature
ECG Electrocardiogram
effect (n) a result, consequence, impression
effect (v) bring about
EKG Electrocardiogram
EMG electromyogram
ELF Extremely Low Frequency
HALO Haptic-Affect Loop
Haptic Creature A zoomorphic robotic companion for exploring haptic (touch-based) interaction.
HF High Frequency
HR heart rate
HRV heart rate variability
ibi interbeat interval
ICICS Institute for Computing, Information and Cognitive Systems
LF Low Frequency
p-value The probability of obtaining a statistical result as extreme as the result obtained, assuming the null hypothesis is true.
pnn50 The sum of the number of successive interbeat intervals that differ by more than 50 ms, divided by the total number of interbeat intervals counted (see the sketch following this glossary).
QRS complex The series of deflections seen on an EKG during a heartbeat.
RMSSSD The root mean squared standard deviation of heart rate
TAMER Touch-guided Anxiety Management via Engagement with a Robot Pet
TAMER Platform The platform designed and constructed in this thesis
SCR skin conductance response, also known as galvanic skin response (GSR)
ST skin temperature
USB Universal Serial Bus
VLF Very Low Frequency
zoomorphic having or representing animal forms or gods of animal form
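Several of the heart rate metrics defined above can be computed directly from a series of interbeat intervals. The following Python sketch is an editor's illustration only (it is not the analysis code of Appendix D, and all function names are invented); pnn50 follows the glossary definition given above.

```python
# Editor's illustrative sketch, not the thesis's analysis code.

def pnn50(ibis_ms):
    """pnn50 per the glossary: successive interbeat-interval differences
    greater than 50 ms, divided by the total number of interbeat intervals."""
    if not ibis_ms:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(ibis_ms, ibis_ms[1:])]
    return sum(d > 50 for d in diffs) / len(ibis_ms)

def mean_heart_rate(ibis_ms):
    """Average heart rate (beats per minute) implied by interbeat intervals in ms."""
    return 60000.0 / (sum(ibis_ms) / len(ibis_ms))

# Example with a short, hypothetical series of interbeat intervals.
ibis = [812, 845, 790, 880, 835, 901, 860]
print(f"pnn50 = {pnn50(ibis):.2f}, mean HR = {mean_heart_rate(ibis):.1f} bpm")
```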
Chapter 1
Introduction

As robots become better able to infer and assess human affective state, there will be a range of opportunities for robots to assist us with specific physical tasks and with more general social needs, such as playing, entertaining, and skill training. Robots will be able to adapt their behavior based upon that of their operator [1]; in doing so they may also be able to influence his or her emotional state. A gentle touch or hug from a robot that determines you are sad could cheer you up, or decreased interruptions from a robot assistant that detects your happiness with its progress could improve task performance. With physiological sensing, there is even the potential that a robot could become aware of your feelings before you are [2]. A robot detecting an increase in your heart and breathing rate could intervene before you were consciously aware that you were becoming afraid or angry, allowing for reaction times faster than a human alone could achieve.

The platform constructed and experimentally verified in this thesis is designed to begin investigation into this link between robot behavior and user emotion. A small personal robot [3] is utilized as part of an interactive feedback loop incorporating integrated biofeedback from a user wearing physiological sensors. By investigating how the behaviors of a small personal robot can change a user's physiological state, the platform will be capable of reacting to and even guiding user physiological signals in order to produce an effect in the user.

1.1 Motivation

In this work, the proposed end-use application for this platform is as a companion robot for children with anxiety or emotion related disorders, following the interaction model in Figure 1.1. Using biosensors to assess user emotional state, the platform would intervene when appropriate to encourage anxiety-therapy training and coping behaviors [4, 5]. Immune to fatigue, it would provide an untiring, uncompromising tool for a therapist, parent, or teacher, by reinforcing existing therapy techniques [6] and helping to enable their application in the non-clinical world. The haptic interaction channel allows for the interruptions by the platform to be confidential, nonintrusive, and discreet [7]; a small stuffed animal would not seem out of place in a supportive classroom environment or home, nor unusual for a child to possess, and the sense of touch can provoke comforting reactions. This interaction is a variant on the haptic-affect loop principle proposed by MacLean, Croft, and McGrenere, in which a combination of haptic stimuli and physiological sensing is used to manipulate user affective state.

[Figure 1.1: Proposed Creature-user interaction model.]

The overall system block diagram is shown in Figure 1.1, which along with Figure 1.2 represents the concept described by MacLean, Garland, Croft, Van der Loos and O'Brien in earlier proposals. It describes the platform as envisioned in its eventual use. Input from the user is gathered both by touch sensing on the companion robot and from physiological sensors worn by the user. An anxiety assessment is derived from this data: touch sensors are interpreted by a gesture recognition engine that identifies how the user is holding or stroking the creature (e.g., light petting, hard squeezing, ...), and physiological data by an inference engine. This estimate of anxiety is then used to drive the robot's response model. A response rendering engine enforces transitions between commanded response states and coordinates the companion robot's mechanisms so as to depict a coherent presentation of its biomimetic mechanisms. Offline interaction is provided by the therapist to download both user physiological data and interaction information to assess therapy performance and to upload new therapy protocols as the user progresses. The overall platform is called TAMER, for Touch-guided Anxiety Management via Engagement with a Robot pet.
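As a reading aid, the loop just described can be paraphrased in a short Python sketch. This is the editor's summary of the data flow in Figure 1.2, not the TAMER platform's actual software; every function name, threshold, and returned command below is hypothetical.

```python
# Hypothetical paraphrase of the Figure 1.2 loop; all stubs and numbers are invented.

def recognize_gesture(touch_samples):
    """Stand-in for the gesture recognition engine (e.g., petting vs. squeezing)."""
    return "hard_squeeze" if max(touch_samples, default=0) > 0.8 else "light_petting"

def estimate_anxiety(gesture, heart_rate, breath_rate):
    """Stand-in for the inference engine: a crude 0-1 anxiety estimate."""
    arousal = (heart_rate - 60) / 60 + (breath_rate - 12) / 12
    bonus = 0.2 if gesture == "hard_squeeze" else 0.0
    return min(1.0, max(0.0, arousal + bonus))

def render_response(anxiety):
    """Stand-in for the response rendering engine: pick coherent breathing/purr targets."""
    return {"breaths_per_min": 8 if anxiety > 0.5 else 12, "purr": anxiety <= 0.5}

def tamer_step(touch_samples, heart_rate, breath_rate):
    gesture = recognize_gesture(touch_samples)
    anxiety = estimate_anxiety(gesture, heart_rate, breath_rate)
    return render_response(anxiety)   # commands sent to the Creature's mechanisms

print(tamer_step([0.4, 0.9, 0.7], heart_rate=92, breath_rate=20))
```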
Before the TAMER platform can be used for therapy purposes, it is necessary to determine what effect the presence of the platform has on user physiology, and if this effect can be measured or influenced. Only after such an effect has been determined can an investigation into the guidance of user affect begin. This thesis, therefore, describes the construction and testing of a platform that can support the entire model as shown in Figures 1.1 and 1.2, beginning with previously existing robot and physiological sensing platforms, and takes a significant step forward in verifying the platform through experimental observation of the platform's effect on participant physiological signals. Specifically, an existing robot, the Haptic Creature [8], and physiological sensing platform [9] were modified and improved to function as part of the TAMER platform.

[Figure 1.2: User-centered diagram of TAMER model.]

The loop, excepting the therapist and touch sensing boxes from Figure 1.2, and with physiological measures related to anxiety simply measured in lieu of a full anxiety assessment engine, is then tested and verified. The platform construction is guided by feedback from pilot studies, design iterations, and the verification experiments.

1.2 Research Objectives

The overall research objectives are as follows:

1. Determine user physiological reactions to both the presence and motion of a companion robot, particularly when the companion robot is imitating the user's physiological state.

2. Determine whether physiological reactions can be provoked in a user through manipulation of the companion robot's heart rate and breathing mechanisms.

To achieve these objectives, the following contributions were required:

1. Construction of a platform consisting of a small companion robot and physiological sensors that is capable of measuring physiological data from a user and providing haptic feedback that could evoke a physiological response. This platform is based on existing robot and physiological sensing platforms.

2. Verification and improvement of the functionality of this platform through iterated design and participant feedback.

1.3 Thesis Outline

Following a literature review, this thesis presents the design process of the TAMER platform, followed by the experimental design, protocol, and outcome of testing, and finally the conclusions and recommendations toward realizing the vision of a therapeutic robot tool for anxiety therapy.

Chapter 2: An outline of recent literature in the subject areas relevant to the TAMER platform. Research related to robotic companions, robotic therapy, biofeedback and anxiety therapy, haptics and affect, physiological assessment of emotional state, and physiological interaction with robots is discussed.

Chapter 3: A description of the design and construction of the TAMER platform and details of its components: the companion robot, the "Haptic Creature," and integration of the physiological sensing system.

Chapter 4: The experiments performed with the TAMER platform and their results. Four main experiments were performed. The first was a pilot experiment to determine the feasibility of the platform. The second and third investigated the initial user reactions to the Haptic Creature and attempted to determine whether users could recognize it mirroring their physiological state, as well as the reactions to the Creature when the user was performing a task. The final experiment began to investigate reactions to the Creature during a task with child participants.

Chapter 5: The conclusions from the construction and experimental testing of the TAMER platform and recommendations for improvements and future work.

Chapter 2
Literature Review

As the TAMER platform incorporates research from a number of different areas, this literature review will serve as a broad overview of the motivations for the TAMER platform and a summary of the existing work and technology that have been incorporated in it, including basic science and engineering research in psychology and haptics. First, robotic companions and the uses of robots in therapy are discussed, followed by some of the technology and techniques used in their applications. Techniques specific to biofeedback and anxiety are discussed, as well as the general link between haptics and affect.
Finally, work in physio-logical assessment of emotional state is presented, as well as the use of this assessment orother physiological data in interaction with robots.2.1 Robotic CompanionsAround the turn of the twenty- rst century personal robots were introduced to the NorthAmerican consumer market. From Japan came the Sony AIBO [10], a robotic pet dogcapable of learning and responding to verbal commands. Tyco Inc. in the United Statesdesigned the Furby [11], a small furry electronic creature with the ability to move its eyes,ears, and even itself in response to human interaction. Through prolonged conversationand interaction with their owners, Furbies would appear almost as children in developinga growing command of the language around them, gradually mumbling less and less intheir own gibberish language and more in their owner’s. Now, for the  rst time, advancesin computer and arti cial learning technology hold the potential for robots to re ne theirinteractions with us in a way resembling the development of a friendship or companionship.The AIBO, as a commercial robotic pet, was the focus of a large amount of researchinvestigating whether owners would react to this device like a loyal  reside-accompanyingplastic pet or more as they would a television. Friedman et al. investigated how severalhundred AIBO owners described their devices on an online web forum. Although 75 percentof the owners attributed \technological essences" to the creature; still 60 percent describedit as having some sort of \Mental States," in particular believing it to have the ability tohave intentions and feelings [12]. Owners spoke of their AIBO \wanting" to do things,and of feeling \sad" or \happy" based upon both actions the owners had taken and thereactions of AIBO. Of particularly importance to the concept of companionship was that52.1. Robotic Companions60 percent of owners attributed \social rapport" to their robotic canine, with 28 percentof posts describing an emotional connection that they had to their AIBO, and 26 percentexpressing a sense of companionship with their plastic pup. It is surprising that humanswere able to feel many of the same feelings they would have towards a living animal toa robot, especially one with such limited expressive and interactive ability. Although theAIBO, like the Furby, could respond to commands, there was little verbal communicationother than barking, and no software algorithms attempting to emulate a greater emotionalconnection.Melson et al. took on the human-robot and human-animal comparison more directly,and investigated children’s responses to both an AIBO and an Australian shepherd [13].They found that while children were able to recognize that the AIBO was a robot and not anactual dog, they still treated it in dog-like ways, and \a rmed that it had mental states. . .sociality. . . and moral standing." About half of the students even thought AIBO was morelike a dog than a computer. Later research revealed similar results in adults, that \evenwhile the person recognizes that AIBO is a technology, the person still a rms AIBO as acompanion, and as a friend" [14]. It appears that although humans may recognize AIBOand other robotic pets as technological devices and not living creatures, they still are ableto form the bonds of companionship with their robot, and with these can come health andsocial bene ts. Banks et al. showed that elder adults in a nursing home were again able torecognize that AIBO was a robot and not a real dog. 
Interaction with AIBO produced thesame reduction in loneliness that interacting with an actual animal dog provided [15]. Thenursing home residents showed a high level of attachment to both the living dog and therobotic impersonator, but yet the e ect of level of attachment was not su cient to explainthe decrease in loneliness for either interaction, suggesting some additional attachment.Tamura et al., in research with adults with dementia who might not be able to distinguishthe robot from an actual pet, found that residents would often look at, communicate,and care for AIBO, and that this resulted in increased communication from the patientsand improved well-being [16]. This  nding supports the goal of the TAMER platformto capitalize upon these social links. It aims not to blindly reproduce some bene ts ofhuman-animal interaction, but rather to deliver targeted emotional and behavioral therapythrough this medium.The intriguing research prospects of these commercial devices led to the developmentof several robotic platforms strictly for research use. Instead of modifying commercialplatforms designed for entertainment, these were engineered speci cally to investigate thebehavioral e ects that robot animals could have on humans. The most prominent of thesewas Paro, a robotic baby harp seal developed by Shibata et al. [17]. This robot has the abil-ity to move its eyelids,  ippers, and neck, and displays sophisticated animal behaviors, suchas responding to noises and sleeping. Paro also is equipped with reinforcement learning,62.1. Robotic Companionsresponding to positive interactions such as gentle petting as well as negative interactionssuch as slapping. It can recognize and grow accustomed to its owner. During long-terminteraction at a Japanese nursing home, Paro was shown not only to increase the amountof social interaction engaged in by residents with their peers, but also to reduce their stresslevels and improve health, as measured by stress hormone levels [18]. Additional studiesby Kidd et al. had American nursing home residents interact with Paro in a group, ratherthan in one-on-one settings, and saw improvements in community building among the mem-bers [19]. Robotic therapy is particularly appealing to nursing homes and hospitals wherereal animals may be banned for both health and hygienic reasons. The widespread use andexhibition of Paro have also allowed for cross-cultural comparisons of user impressions ofrobotic animals. In a recent study, Shibata et al. found that westerners were more likelyto attribute to Paro a \comfortable feeling like interacting with real animals," while usersfrom Japan and Korea tended to attribute to Paro a \favorable impression to encourageinteraction." They attribute this di erence to cultural di erences in relationships with an-imals [20]. The success of this robot in therapy has been so great that Paro is now beingmanufactured for commercial use, targeted to the elderly and those with dementia [21]. Aconcern noted in these experiments was confounding impressions of speci c species. Notonly may a user have di erent expectations from a robotic dog than a robotic seal, buttwo users from di erent backgrounds may have di erent expectations of proper \dog" be-havior. To help avoid this, the robot companion in the TAMER platform is a zoomorphiccreature, with animal-like characteristics but not resembling a speci c animal. 
Unlike Paro,it communicates solely through the haptic channel, investigating user reactions to a robotdesigned to be a non-speci c species.Another robot speci cally designed for therapy is the Huggable, a robotic stu ed teddybear. This robot was designed by Stiehl et al. speci cally to investigate touch interactions.It features the ability to move its neck, eyebrows, ears, and shoulders (in order to hug, hencethe name) with fully compliant voice coil actuators. Its unique feature is a creature-widesensitive skin to distinguish between various touching behaviors [22]. Much like Paro, theHuggable also contains a behavior system designed to increase companion behaviors. Ithas the ability to look into a person’s face and recognize its owner [23]. More recent workproposes the use of the Huggable in pediatric care, either as a proxy to allow distant friendsand family to interact with a child, or with the hope that a child will use the robotic bearas an \emotional mirror" of themselves, allowing for doctors and nurses to receive valuablefeedback that the child may be unwilling or unable to provide [24].These robots both serve as an inspiration for and feed the design iteration of the robotcompanion for the TAMER platform, the Haptic Creature. It was designed by Yohananet al. to investigate the emotional aspects of our touch interaction with animals [3]. TheHaptic Creature has the ability to purr, breath, heat up, and adjust its ears, and uses these72.1. Robotic Companionsbehaviors in an attempt to determine what common behaviors typical to human-animalinteraction we  nd pleasurable. Work is currently underway to determine the emotionalstates users attribute to various Creature behaviors [8], and how to utilize these to a ectthe emotional state of the user.In addition to robots designed for full-body touch interaction, there are several robots de-signed to investigate the potentials of human-robot companionship through primarily visualor audio means. In keeping with the theme of hugs, the Huggable Robot Probo was designed\as a tele-interface for entertainment, communication, and medical assistance" [25]. Therobot has the appearance of something like a robotic green anteater, with a long nose, andis capable of actuating most of its face, neck, and trunk to display various emotional states.Research is currently ongoing to map its facial expressions to emotional states [26]. TheiCat is a small yellow robot consisting primarily of a large face that is capable of displayinga wide range of emotions and coherently changing between emotional states [27]. The iCatis intended to investigate what emotional states, as displayed through facial expressions, areto be expected from such a robot in long-term engagement, and which would best be ableto maintain engagement in the creature, thereby building a relationship. Work is currentlyongoing in measuring participant engagement with the creature [28]. The Kismet robot byBreazeal is another expressive robotic face; this is designed to investigate the use of facialexpressions in interacting vocally with users [29].Unlike an animal, a robot has the potential to communicate with humans in theirown language, and there are many potential uses for a robot that essentially acts as theembodiment of a voice to develop a relationship with users. Heerink et al. investigated theuse of the iCat to elderly nursing home residents with conversation skills [30]. 
They foundthat while the residents were generally excited and interested to interact with the robot, theconversation activity might have been too oriented towards assisting and not su cientlyenjoyable. They hypothesized that viewing the iCat as an assistant rather than a companionwould lead to less than expected utilization of the robot. Kanda et al. experienced similarresults when using robots to help teach English in Japanese elementary schools [31]. Afterthe initial excitement from introduction of the robot faded, interaction with the robot fello markedly. However, those who continued to interact with the robot showed signs ofimproved English skills. While a robot designed for entertainment may not be successfullyadapted to a role in therapy, a therapeutic robot must maintain a degree of entertainmentand companionship in order to attract repeated, long-term engagement and use.The TAMER platform aims to build upon these examples of robotic companions tocreate an engaging device with therapeutic bene ts. People are capable of developinga pet-like attachment towards their robots, and the technology exists to construct smallpersonal robots with various interaction devices. In this system we endeavor to leveragethis emotional connection to manipulate user a ect and feelings.82.2. Robotic Therapy with Children2.2 Robotic Therapy with ChildrenMuch research into robotic companions and robotic therapy has involved the use of children,the target audience of the TAMER platform. Children, like the elderly, often have a varietyof special needs that require care, and they are generally receptive to robots under the guiseof a new toy. Several robots and applications of robots to therapy are described here that,like the TAMER platform, target children and attempt to in uence child behavior.Through play, robots may have the ability to elicit emotions more reliably and repeatedlythan a human caregiver. Kozima et al. developed Keepon, a small, bright yellow robotthat resembles a snowman, with the ability to orient its eyes on a target and bob or rockaround to display emotion [32]. Primarily designed for toddlers or babies, they foundthat children were able to develop a steady emotional reaction to the creature. Theyhypothesized that by appearing so di erent from a human but \perceiving and acting"as we do, Keepon \motivates children to explore and communicate with it," a necessityfor human-robot interaction. Their intended use for this robot is to promote interactionand engagement in children with autism spectrum disorder. Plaisant et al. developed arobot that, rather than communicate to the child, attempts to enable the child to bettercommunicate with others [33]. The child controls the robot via arm-bands and a headset,and the robot attempts to promote \therapeutic play," either by having the child displayemotions appropriate to a story in the robot to gain awareness of the emotions, or byhaving the robot perform rewarding tasks when certain motion goals as part of physicaltherapy are met. They note the advantage of a robot controlled by a child in educationand therapy: giving a child the ability to have control over part of his or her environmentprevents frustration and encourages success. Kronreif et al. developed the PlayROB, arobot that assists several disabled children in assembling LEGO™ structures by handlingthe bricks. 
They found that children were able to quickly adapt to use the system, and build LEGO™ structures that they may not have otherwise had the physical ability to assemble [34].

Many applications from the use of robotics in rehabilitation have also been found applicable to children. Out of many: Cook et al. used a robotic arm to assist children with motor development disabilities in gaining motor control [35], and Krebs et al. successfully used a robot exoskeleton to assist children with cerebral palsy in developing proper walking motion as well as muscle strength [36].

Robotic therapy is particularly well-suited to provide benefits to children who are developmentally disabled, particularly those with autism spectrum disorder. These children are often unable to express their emotions and can have difficulties communicating; this often leads to rapid frustration in a social environment. A robot can adapt its behavior to a child's emotional state through the use of physiological sensing to access a communications channel unavailable to humans, and can use machine learning techniques to correlate the child's activity and signals with emotional states.

Dautenhahn et al. developed Robota, a humanoid robotic doll capable of moving its legs, arms, and head [37]. This was part of the Aurora project, the goal of which was to study the role of robots in autism therapy. Robota was used in an attempt to develop interaction skills in children with autism. The robot was programmed both to dance to music and to react to the pressing of controls on a control pad by moving its limbs. They found that the robot was able to become a source of interaction for the children, a device about which they could communicate with their teacher. More importantly, after they had become comfortable interacting with the robot, they were then interested in communicating with the creator of the robot, who was a stranger to them [38].

Salter et al. developed a small spherical robot called Roball for interacting with autistic young children [39]. The robot is designed to resemble a ball to facilitate ease of play, and current research is ongoing in how best to adapt the ball's behavior to children's actions in order to maximize engagement and attention.

Liu et al. mounted a basketball hoop on a typical industrial pick-and-place robot to develop an engaging video game for children with autism [40]. They developed a basketball-shooting game with three levels of difficulty by varying the motions of the robot arm. They then attempted to maximize user engagement and liking through the use of physiological sensors to detect emotional state. They found that they were effectively able to increase child liking of their game session through the use of physiological feedback.

Robins et al. do caution, however, that robots designed for the target audience of children with autism must not simply encourage interaction with the robot itself, but must instead serve as a tool to eventually encourage interaction with other humans [41].

Leveraging this finding of receptivity to robotic therapy and interaction by young children, the primary user group of the TAMER platform will be children.
Guidance intothe interaction loop from therapists and child educators will help ensure that while timewith the Creature and platform is playful and fun, it also provides important therapeuticbene ts.2.3 Biofeedback and Anxiety TherapyPhysiological training exercises such as yoga and other meditation have long been usedfor calming purposes. These approaches, however, require careful and repeated trainingunder the supervision of an instructor to be e ective. For a novice practitioner, oftenthe concentration needed to achieve anxiety reduction cannot be established in the veryanxiety-inducing situations for which they would wish them to be e ective. Feedback-guided102.4. Haptics and A ecttraining would seem an e ective learning technique, however it is only relatively recentlythat we have been able to measure and quantify muscle relaxation. With this technologyhas emerged a new  eld of biofeedback-guided therapy: patients are trained to reduceor stimulate certain physiological indices to help them reduce their stress or anxiety levels.Raskin et al. used biofeedback to teach adult patients to relax their frontalis muscle, used tolift the eyebrows, and found that several patients had their anxiety markedly or moderatelyimproved through this technique, in one of the  rst pilot studies on anxiety patients. Theystate that \in many ways biofeedback techniques represent a modern electronic version ofthese older approaches" [42]. Townsend et al. found that electromyogram (EMG) relaxationtraining was superior to group psychotherapy in decreasing mood disturbances, as well asboth trait and state anxiety [43]. In an investigation into the physiological symptoms ofanxiety, Lehrer et al. found that biofeedback training to increase heart rate variability isalso e ective in reducing anxiety [44], and that \various forms of breathing retraining havebeen found to be e ective treatments and/or treatment adjuncts for anxiety disorders" [45].They also note the bene t of biofeedback guided training over simple verbal guidance.With the advent of portable physiological sensing devices, recent studies have examinedthe use of real-time biofeedback. In this technique patients are alerted when they areexceeding certain physiological thresholds associated with anxiety, in order that they mightbegin calming procedures. Murphy et al. used a heart rate variability feedback device forpatients with generalized anxiety disorder; they found that, in combination with cognitivebehavioral therapy, such biofeedback could be just as e ective as EMG relaxation trainingin reducing anxiety [46]. Reiner reported similar results: he equipped patients with aportable heart rate variability monitor, and they were instructed to monitor their heart ratevariability throughout the day. Patient reported outcomes included reductions in anxietyand anger and improved sleep; participants found the feedback device to be more helpfulthan meditation and yoga [6]. Reiner also found that patients who were most compliantwith the monitoring and training reported the greatest bene ts. These results suggest theprospect that a biofeedback enabled robot, such as the Haptic Creature, could take on therole of either tutor, by training users to reduce their anxiety using its breathing mechanism,or alerter, by making users aware of their current state of heightened anxiety.2.4 Haptics and A ectHerteinstein states that \touch is capable of communicating valenced and discrete emotionsas well as speci c information" [7]. 
Touch can be used both to communicate and to elicit emotions. The simple act of touching has been shown to be capable of influencing user actions and opinions. Although an often under-considered element of human-robot interaction, recent attention has been focused on how to reproduce communicative touch in robots.

Various studies have confirmed that touch, even unnoticed, can have a profound impact on our behavior. Fisher et al. found that the brief touch from a librarian handing back a library card produced an increase in positive opinion of the librarian and library [47]. This effect held even when the participant was consciously unaware of the touch. Willis et al. asked passers-by on a campus and in a mall to sign a petition or complete a survey, respectively; they found that combining the request with a casual touch almost doubled participant compliance [48]. It is not only touch by a human that can induce these effects: Vormbrock et al. found that touching a dog is correlated with changes in blood pressure [49], and Shiloh et al. found that touching rabbits and turtles reduced state anxiety [50]. Touching toy versions of the animals, however, did not have a similar effect on anxiety.

Several more recent studies have confirmed that artificial, active touch can provoke positive reactions. Haans et al. investigated whether an armband with vibrotactile actuators could produce the same increase in altruism and compliance associated with human touch; they found that both man and machine had similar success rates [51]. Touching can even make a robot seem more humanlike: Cramer et al. found that proactive robots seemed less machine-like when they touched users [52], but also that a user's opinion of touch was influenced by robot behavior: touching reactive robots made them seem less dependable.

Tactile pleasure should be of concern in designing interaction devices. Salminen et al. investigated responses to stimulation by a fingerprint friction stimulator: stimuli rated as unpleasant, arousing, dominating, and less approachable produced faster reaction times than those considered more pleasant [53]. Swindells et al. observed physiological reactions to operation of a haptic knob along with emotional reports, concluding that "analyzing both affective and performance measures together is crucial for good design" [54].

Both the effect of haptics on affect and the effect of affect on haptic use have been investigated through several haptic devices that attempt to communicate or influence user emotions. The intimate nature of touch in relationships has inspired several researchers to see if mechanical devices can substitute for interpersonal touch. Smith et al. concluded that users were able to communicate emotion through knobs during various tasks [55]. Chang et al. developed the Lumitouch, a pair of linked picture frames designed to provide a sense of presence across distance. A frame would light up when a user was in front of the partner frame: by touching the frame a user could cause colors to light up on the other frame; the colors varied depending upon the location, intensity, and duration of touch [56]. Couples found this generally appealing: several developed their own "haptic language" for remote communication. Mueller et al. invented the "Hug over a Distance," in which couples could wirelessly activate an inflatable vest worn by their partner, simulating a hug [57]. This was generally well received, although thought impractical for every-day use [58].
The TapTap was a similar device, essentially a haptic scarf designed to record and display touch interactions [59]. This device was proposed to enable a single user to provide therapeutic touch asynchronously to several people without the necessity of their presence.

As touch is an important link to emotion, the TAMER platform aims to use this intimate channel to affect the user's emotional state. The haptic channel seems uniquely suited for this sort of task. Through it, the Haptic Creature will be able to unobtrusively display information and even communicate discreetly.

2.5 Physiological Assessment of Emotional State

While biofeedback therapy may have used physiological measurement in order to adjust and moderate emotional responses in patients, these physiological metrics could also be used to assess emotional state. Humans are sophisticated enough that single-sensor metrics are not particularly generalizable to all emotional states, but with the advent of improved computer pattern recognition and machine learning techniques, it became possible to develop online recognition of emotional state through physiological measurements. When even humans may have trouble reading the verbal and visual clues of their fellow humans, the use of non-conscious channels for emotional communication with robots appears ideal. Humans are not typically accustomed to openly and consciously assessing and sharing their feelings, and whereas body language, posture, and gaze may be difficult for a robot to assess directly, requiring sophisticated cameras and visual processing techniques, much work has been done in using small, simple physiological sensors for the assessment of emotional state.

Picard et al. were among the first to apply the machine learning techniques that had originally been used for vocal and facial emotional analysis to physiological data [2]. They used psychological techniques to instill in participants 8 different emotions, and they achieved a success rate of 81 percent in recognizing these from blood volume pulse, skin conductance, and respiration rate sensors. They state that at the time "there were doubts in the literature that physiological information shows any differentiation other than arousal level," making this the first proof of concept of machine emotional recognition through physiological signals. Kim et al. attempted similar emotion recognition in children, using a support vector machine to classify emotional state based upon blood volume pulse, skin conductance, and skin temperature sensors. They were able to achieve a success rate of 78 percent in recognizing sadness, anger, and stress in users [60]. Wagner et al. attempted a more robust emotion classification system, using feature reduction to improve valence and arousal recognition in users strapped to electrocardiogram, skin conductance, respiration rate, and electromyography sensors and subjected to emotion-inducing music. They were able to achieve 92 percent accuracy in identifying emotional state [61].

Kulić et al. attempted not to estimate discrete emotional state, but rather to develop online recognition of a user's valence and affect levels. They utilized a fuzzy-logic based inference engine to assess user arousal based upon electrocardiogram, skin conductance, and electromyography sensors, and used this as the basis for human-robot interaction [62]. They later refined their results using a Hidden Markov Model [63] to achieve an average recognition rate of 72 percent [9].
Liu et al. applied support vector machines to identify emotional state in children with autism [64], using this as the input for the robot basketball game mentioned previously. Their work is unique in that they trained their system not by progressing the user through various emotional states, but by using both therapists and parents to assess the emotional state of the children, who would not themselves be able to communicate this effectively. They were able to achieve a recognition success rate of approximately 83 percent, and to improve the children's liking of the game.

Rani et al., in a recent summary of applying several machine learning techniques to a large data set, achieved an overall emotional classification success rate of 86 percent using support vector machines [65].

A major limitation of all these physiological assessment engines is that they are often not generalizable to every-day practical use, having been calibrated in specific, often sterile environments for specific uses. Bethel et al. caution that "research should focus on developing a diverse set of complimentary [sic] measures that capture the full range of human-robot interactions" [66].

Although the TAMER hardware and software support the ability to provide online assessment of physiological state through computer learning methods, the training of an assessment engine specifically for anxiety is beyond the scope of this thesis.

2.6 Physiological Interaction with Robots

Despite the limitations of these physiological assessment engines, they have already been used to some success in human-robot interactions. Takahashi et al. used skin conductance sensors in an eating assistance robot for people with disabilities [67]. By measuring skin conductance response they were able to distinguish between erratic behavior that was under user control and that which was robot generated, and used this input to fine-tune their control algorithms.

Itoh et al. utilized physiological sensing to reduce user stress when interacting with a large personal robot during interaction tasks [68]. If user stress rose above a certain value the robot would stop the activity and shake hands with the user. They found that subject stress was significantly reduced by the robot's motion. Rani et al. demonstrated affect-based control of a robot: upon sensing an increase in anxiety from its user the robot would interrupt its own task and return to assist the user [69]. Hanajima et al. programmed a robot to reduce its speed of approach towards a user based upon skin conductance response and found that this improved subjective response to the robot [70].

Kulić et al. used their previously mentioned algorithms to assess user emotional response to slow, medium, and fast robot trajectories with various behaviors of approach towards the user [62]. They then analyzed this information to estimate user arousal during interaction with the robot, reducing the velocity of the robot when sensed arousal was high, as this could be a dangerous condition [71]. Such a reaction has promising applications in situations where humans must work in close proximity to a robot: a robot reacting to a user's physiological indicators of danger could potentially stop much more quickly than if the user had to find and hit an emergency stop button.
Thus, while a generalized emotion recognition system is still some distance away, physiology-based input has been shown to be a successful input to a robot control system for reducing stress related to human-robot interaction.

The eventual goal of the TAMER platform is to have a robot that reacts to changes in the assessed valence of a user's emotional state of anxiety. Although at present platform behavior and effectiveness are not based on aggregated sensor data, but rather on simpler, single-sensor readings, that capability exists and can be implemented once the appropriate inference engines have been researched.

2.7 Summary

While recent works have begun to apply physiological monitoring to human-robot interaction, the TAMER platform uniquely attempts to integrate this broad background of technologies in order to guide physiological responses. Chapter 3 will describe how an existing companion robot and physiological sensing suite were combined to produce this platform. Incorporating biofeedback training techniques into a robotic companion has the potential to provide users with an untiring, consistent trainer for developing important coping techniques to deal with stress and anxiety. Results from Chapter 4 will show that users are able to successfully use the TAMER platform to mimic the breathing of the Haptic Creature. Experimental results will show that activation of the TAMER platform has a statistically significant relationship to a user's physiological measures, even when the user is performing a separate task. Physiological sensing allows a robot companion to be not simply an alerting mechanism, informing the user of his or her undesired physical and emotional reactions to conditions, but also a teacher, targeting these conditions for reinforcement of previously learned coping skills. The platform can potentially act as a proxy for a therapist who cannot always be present with the user, and at the same time gather physiological data for further analysis and feedback. For this, children are an ideal target population, as they require constant and consistent reinforcement, but must receive such therapy without belittling and disparagement from their peers. Orienting cognitive-based therapy through the haptic sense allows for non-intrusive and inconspicuous communication even in a social environment, while also using a channel that has been shown to have great effect on behavior and affect. Results from Section 4.3 will show that school-aged children are amenable to working with the Creature, and find interacting with it comforting and pleasurable. In addition, when the TAMER platform is used during computerized cognitive activities in school, it will be found to have a statistically significant relationship to physiological changes in the children, who typically enjoy having the companionship of the Haptic Creature during this stressful activity.

Chapter 3

Methods and System Design

The TAMER platform expands upon two existing technologies: a "Haptic Creature" and a physiological sensing suite. These are combined into a unified platform aimed at guiding user affect through haptic interaction, particularly for anxiety reduction purposes. The design of this system proceeded in two main parts: the construction of a new Haptic Creature, with design modifications made to support this particular use, and modification of the sensor suite both to interface with the Creature and to record physiological data related to anxiety that could be used to drive the Creature.
This chapter outlines the overall platform approach and methods in Section 3.1, the hardware modifications developed in Section 3.2, the TAMER command and control framework in Section 3.3, and the modifications made due to feedback from user testing in Section 3.4. Finally, in Section 3.5, the modified physiological sensing suite for use in the platform is presented.

3.1 General Approach and Methods

The TAMER platform pairs a robotic creature designed for haptic interactions with physiological sensors in order to guide physiological responses related to affect. Three main components are needed in order to effectively manipulate affect: a sensing suite to measure physiological signals, a response engine using this information, and an interaction device. These components function in a haptic-affect loop, of which the block diagram in Figure 3.1 is an instance. The sensing suite serves to provide online feedback for the platform. Physiological data from all available sensors are analyzed in real time by the response engine. When used to assess user affect it is trained using questionnaires or surveys from previous interactions. In the initial studies for this thesis the primary goal is not to manipulate affect directly, but rather to take the more preliminary step of attempting manipulation of user physiological indicators, such as breathing rate or heart rate. In this case the response engine is not trained to recognize specific emotional states, but rather commands actions directly from sensor output, and adapts based upon user response to these actions. The interaction device for this platform is the "Haptic Creature" described in Section 3.1.1; in this platform it serves as a robotic companion. It is desired that through the coupling between user affect and robot actions a genuine affection and sense of connection with the robot will be engendered in the user.

Figure 3.1: Simplified schematic of the Haptic Creature interaction loop, an example of a haptic-affect loop: the creature presents a haptic display to the user, and haptic input and physiological sensing flow from the user back to the creature.

Figure 3.2 describes the overall TAMER platform. Physiological sensors attached to the user collect and transmit physiological data to the physiological sensor software. The sensor software, based on these data and the desired Creature behavior, sends motion commands to the Creature over radio, USB, Bluetooth, or a wire. Having received these commands, the Creature's microcontroller activates the breathing, pulse, or heating mechanisms to perform the desired motions. In typical use these commands are breathing servo position data or a pulse command. At the same time, sensor data are sent by the microcontroller through radio, USB, Bluetooth, or a wire to the Creature display software, which displays the sensor information. These components are described in the following sections.

Figure 3.2: TAMER command and control scheme, showing the physiological sensors on the user, the sensor encoder and USB converter, the host computer(s) running the physio software and Creature display, the communications links (radio, USB, Bluetooth, or wire) carrying Creature commands and sensor data, and the Creature's microcontroller, touch and temperature sensors, and pulse, breathing, heat, ear, and purr mechanisms.

3.1.1 Creature Introduction

The concept of the Haptic Creature was initially created by Yohanan and MacLean [72], who constructed a manually-actuated prototype version of the Haptic Creature followed by a robotic version.
It is similar to the original, but was intended not for fundamental research into the haptic expression of emotion, but for the TAMER platform, and thus minor modifications were made for this application, under the supervision of Yohanan et al. and the author. Development of an entirely unique companion robot for the TAMER platform would have been outside the scope of this thesis. The robot used in this thesis incorporates a pulse mechanism and heating pads as additional display mechanisms, which the original did not have, as well as electronics systems designed for integration into the TAMER control loop. Some of the design improvements made on this thesis's robot have been brought back to the original. For the Creature in this thesis, physical construction of the shell, heating pads, pulse, breathing mechanism, ears, and fur was in collaboration with Yohanan et al. and several undergraduate student design groups. However, all electronics and communication protocols presented in this thesis are unique to this thesis and wholly the work of the author. A summary of this is shown in Figure 3.3. A description of the Haptic Creature development process follows.

Figure 3.3: Diagram showing the development of the Haptic Creatures over time, from the "Wizard of Oz" prototype (shell and mechanism concepts, concept verification) through the original creature (testing and development) to the second creature, which adds the pulse, heating pads, and TAMER connectivity and feeds design improvements back to the original. The second Creature (green) is used as part of this thesis.

3.1.2 Creature Development

The robotic companion for this platform is the "Haptic Creature" (see Figures 3.4 and 3.5), developed by Yohanan and MacLean [72] to investigate affective touch in human-robot interaction (see Section 2.1). While much research has focused on the effect of robot appearance in interactions with humans, the Creature is innovative in that it is among the first to explore in depth our touch-based interactions with robots, and how the tactile qualities of a robot influence our perceptions of it. Such investigation is necessary: robots are no longer constructs of cold metal and motors in factories, where human contact would be dangerous, but have become smaller, more personal devices that interact with their users in more intimate ways. This research both draws from, and can serve as an aid to, the domains of human-animal and human-human touch interaction. In those fields it is typically difficult to eliminate the many confounding variables that are present in touch studies: touching or being touched by other humans is almost always emotionally loaded, and perceptions of animal touch can be positively or negatively altered by previous experiences with animals.

Figure 3.4: The Haptic Creature.

Figure 3.5: The Haptic Creature, upside-down, with fur removed, showing the silicone skin.

The Creature allows the individual components of human-animal interaction to be studied separately. Actions such as breathing, warmth, and purring can be emulated in isolation as well as combined in both natural and abnormal ways; this is much easier and more practical than, for example, training a cat to purr repeatedly, but not to move or breathe!
By manipulating these individual components, a more complete model of how each contributes to our perception of the emotional "state" of the Creature can be developed. The relations between these actions and perceived emotional states should help to develop a more fundamental understanding of the affective nature of our touch interactions.

As originally conceived by Yohanan et al. [72], the Creature had several main mechanisms with which to interact with users: purring, breathing, ear display, and warmth. These were drawn from the actions typical of small domestic mammals, but designed to be zoomorphic: resembling a generic animal more than any one species, to avoid confounding effects. Care was also taken to ensure that the Creature's display mechanisms were purely haptic, with minimal aural or visual components, to again reduce confounding effects and narrow the investigative scope. The first version constructed by Yohanan et al. was a "Wizard of Oz" prototype (this version was utilized in the pilot experiment of the Haptic Affect Platform, described in Section 4.1) with all mechanisms present but with the breathing and ear display manually actuated by a human operator. Initial studies by Yohanan et al. [72] investigated how these mechanisms could be combined into coherent emotional states. They found that participants were successful in identifying and distinguishing between the device being asleep, content, happy, upset, and playing dead [72].

Following that testing, a robotic version of the Creature was constructed by Yohanan et al. Several form factor and mechanism iterations followed, leading to the version depicted in Figure 3.4: the first robotic model of the Haptic Creature, and the one used in subsequent experiments by Yohanan et al. [3]. This version consists of a hard fiberglass shell with force sensitive resistors encompassing the structure to detect touch (separate research has been ongoing to classify these sensor inputs as common gestures, such as petting or striking [73]). The shell is covered with soft synthetic fur on all sides except the bottom, where a softer, felt-like fabric, like the abdomen of a dog or cat, is present. A servo mechanism moves the upper rear part of the abdomen to simulate a breathing motion; the mechanism is attached through springs to improve its compliance. Inside the Creature, a motor with an uneven weight attached to its shaft spins to emulate the vibrations characteristic of purring, along with a slight purring sound. The ears are constructed from the rubber bulb of a blood pressure cuff. A servo adjusts a valve connected to the outlet of the bulb to increase or decrease the rate of airflow from the cuff, adjusting the perceived ear stiffness when the ear is squeezed.

From this "base" design of the Creature, and following the preliminary studies reported in Section 4.1, an additional creature was constructed for use in the TAMER platform by the author in collaboration with Yohanan et al. [8] and undergraduate student teams. It includes the shell and mechanisms described above, and incorporates additional mechanisms and modifications necessary for use in the TAMER platform. Unless mentioned otherwise, all references to the Creature henceforth refer to this newer, second version: the robot integrated into the TAMER platform and used during the experiments in this thesis.
While the original Creature and the elements mentioned above were developed by Yohanan et al., the modifications made to the additional Creature, in particular the electronics and the applications thereof, are unique contributions of this thesis.

3.2 Hardware Additions and Modifications

The use of the Haptic Creature in this platform results from an important characteristic of the device revealed in initial prototypes: its calming potential. In casual interaction the warmth and gentle breathing sensation from the Creature were often perceived as comforting. However, in order to fully investigate this behavior there were a number of challenges and concerns to be addressed in modifying the platform from its original intended purpose of investigating affective touch. The Creature required additional robustness for longer-term operation in a less laboratory-like environment. Additional mechanical actuators were needed that, while staying within the solely haptic mode of interaction, could better represent physiological states. As part of this thesis, electronics for motor power and control, sensor input, and communication were constructed, and a communications protocol to incorporate the Creature into the TAMER interaction loop, shown in Figure 3.1, was developed. Feedback from user testing was incorporated into the design process to both test and refine these hardware changes. In this section the design considerations and challenges inherent in the TAMER platform are described, followed by the details of the modifications made to the Creature's display mechanisms and electronics. The following sections describe the TAMER platform's communications and control systems, and finally the refinements to these modifications based on feedback from user interactions.

3.2.1 Design Considerations and Challenges

The two primary considerations in designing the Creature element of the TAMER platform were robustness and engagement. Robustness was a paramount design goal: the eventual usage environment for the TAMER platform includes home and school environments, where the Creature will be subject to the not-gentle handling of children. In these environments, it is expected that the Creature will be dropped, struck, and generally played with. It is necessary that the Creature be rugged enough to withstand this treatment, as well as to degrade gracefully in the event of failure, in a way that does not cause harm to the user. Compliance was necessary in the Creature's mechanisms: a child hugging the Creature could obstruct the motion of the breathing or pulse mechanisms, potentially causing too high a load on the servo or motor driving the mechanism. The Creature also had to be capable of surviving longer-term experiments of several hours or an entire school day. In addition, the nature of this ultimate user group demanded consideration of the Creature's engagement ability. While acting through channels of limited expressiveness, the Creature had to be initially intriguing to the user, inducing a desire for contact and interaction, and had to maintain this desire during long-term encounters, while not being so engaging as to distract the user from his or her ordinary tasks.

To help foster this engagement, a command and control framework that can be readily adapted to changing environments was necessary for the TAMER platform. The platform must be able to react quickly to short-term changes in physiological state, as well as subtly to longer-term user responses.
It must also be capable of rapidly communicating interaction data, such as touch patterns, which are applicable to its present operation. The platform must be able to generate and store performance and interaction data for later analysis. As it is anticipated that experimental time with the ultimate user group may be limited, it was imperative that the experimenter be able to modify engagement parameters and Creature behavior quickly; therefore, the Creature hardware and software also had to be adjustable and reprogrammable. All of these parameters had to be fulfilled within the small size of the present shell, and, for time and budget purposes, without a whole-scale revamping of the previously existing Creature mechanisms. All modifications had to support a robust and reliable device capable of withstanding repeated long-endurance experimental trials. The modifications made to achieve these goals are described in the following sections.

3.2.2 Additional Display Mechanisms

In consideration of the primarily haptic nature of the device, the Creature's expressive channels were limited to those which produce effects discernible by touch. In order to increase the Creature's expressiveness, two additional display mechanisms were added to the Creature under the supervision of the author: a pulse mechanism to replicate the presence of a heartbeat, and heating pads to generate warmth.

Pulse

A pulse mechanism, designed and constructed by an Undergraduate Mechatronics Capstone Design Project Course team under the supervision of Yohanan et al. [8] and the author, was added to the Creature (see Figure 3.6). This expressive channel was well-suited to the TAMER platform for several reasons. As heart rate and heart beats are directly measured by the physiological sensor suite, this mechanism permits representation of a user's heartbeat in the Creature. Heart rate and heart rate variability are also linked to human affective state, in particular anxiety; therefore, display or manipulation of this activity could potentially affect the user's physiological state. Having a pulse also increases the "life-like" nature of the Creature in a way that maintains its zoomorphic behavior. Incorporating heart rate into the Creature's affect presentation allows the Creature to present its own emotional states with greater fidelity and higher accuracy; these more expressive emotional states could potentially allow for increased growth of user companionship with the Creature.

Figure 3.6: Haptic Creature pulse mechanism.

The pulse mechanism consists of a bipolar stepper motor attached to a pulley. Two rods, one on each side, each with a cork on the end, are attached to the pulley via a revolute joint. The rods pass through a support bracket near the sides of the Creature. As the stepper motor rotates the pulley, these brackets force the rods to move linearly outwards and inwards. A limit switch mounted near the pulley prevents over-rotation. The net effect of a rapid clockwise then counterclockwise motion (or vice versa) of the stepper motor is to create a brief tap or "pulse" at the point impacted by the corks. The pulse mechanism is mounted transversely near the front of the Creature, approximately where its "neck" would be if it had one, and with fur on the Creature this mechanism produces a pulse locatable in the immediate area of the mechanism. It does not, however, produce a discernible tactile effect in any other area of the Creature. A maximum heart rate of approximately 160 beats per minute was achieved in bench testing.
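As a rough illustration of how such a tap can be produced, the following Arduino-style sketch drives a bipolar stepper a short distance forward and then back. The pin assignments, step count, and speed are placeholder values chosen for illustration rather than those used in the Creature, and the limit switch is omitted.

#include <Stepper.h>

// Placeholder wiring: H-bridge inputs on pins 8-11; 200 steps per
// revolution is typical for a small bipolar stepper.
const int STEPS_PER_REV = 200;
Stepper pulseStepper(STEPS_PER_REV, 8, 9, 10, 11);

const int PULSE_STEPS = 20;    // small excursion of the pulley: out and back

void setup() {
  pulseStepper.setSpeed(120);  // rpm; placeholder value
}

// One "beat": rotate the pulley forward then immediately back, so the
// cork-tipped rods press outward against the shell and then retract.
void beatOnce() {
  pulseStepper.step(PULSE_STEPS);
  pulseStepper.step(-PULSE_STEPS);
}

void loop() {
  beatOnce();
  delay(750);  // roughly 80 beats per minute for bench testing
}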
Heating Pads

Many users responded positively to the warmth produced by a heating pad in the "Wizard of Oz" prototype Haptic Creature [72]. Therefore, three heating pads were added to the bottom of the Creature to reproduce this warmth. The heating pads are large, flat resistors that dissipate heat when voltage is supplied. They are not noticeably felt through the fur. When operated on 500 mA of current, heat from the pads can be felt through the Creature's fur in approximately one minute. Feedback from DS18B20 1-wire digital thermometers mounted around the Creature's shell can be used to monitor temperature levels and deactivate the heating pads when the desired temperature is achieved.

3.2.3 Creature Electronics Board

The Creature's main electronics board was designed by the author to support the basic functionality necessary for the TAMER platform, while allowing for easy maintainability and upgradability. The board was designed to attach to the Arduino Mega, an "open-source electronics prototyping platform based on flexible, easy-to-use hardware and software" [74]. The mating of a custom board with an off-the-shelf component served to provide increased functionality, improved reliability, and easier maintainability of the control system. The use of the Arduino helped to reduce potential assembly and design errors in the microcontroller and its supporting hardware, which were among the most complex parts of the electronics. It also allowed the system to be programmed in a free, open-source development environment and programming language based on the common C programming language: this allows future programmers without knowledge of assembly language to maintain the codebase. The Arduino can also be reprogrammed without the need to remove the chip or use special programmers. Full schematics of the electronics board, as well as sample code and a parts list, can be found in Appendices C.1, C.2, and D. The Creature's electronics board and its components are shown in Figures 3.7 and 3.8.

Power Supply

The power supply for the Creature comprises several components for delivering power at the voltages and currents necessary for its mechanisms. Power for the Arduino and control board components is supplied by 5 V and 3.3 V linear voltage regulators. Filter capacitors are placed as close as possible to all integrated circuit chips to reduce line noise. Power for the motors, servos, and heaters is supplied by three Dimension Engineering 25 W step-down adjustable switching regulators [75]. Adjustable voltage regulators were required to allow for motors or servos to be replaced, as well as for overall current regulation of the heaters. Switching regulators were chosen both for their efficiency (they waste less power than traditional linear regulators, reducing overall current draw) and for their reduced heat production (heat buildup is a concern in the small enclosed space of the Creature). Typical operation of the heaters at 10 V, the servos at 7.2 V, and the motors at 5 V allows for maximum current draws of 2.5 A, 3.47 A, and 5 A respectively, although in typical usage this total current draw is not reached.

Power input to the Creature is provided by a 12 V wall power supply capable of supplying 5 A. The linear voltage regulators are low drop-out, allowing microcontroller power, and therefore radio communications, to be maintained when input power is as low as 5.7 V.
This is of particular importance when the provisions for internal powering of the Creature with a battery are utilized. Connectors are present to allow the Creature to be powered by a 12 V NiMH battery, similar to that used in remote control cars, eliminating the need for a "tail" wire to the Creature. A Maxim MAX712CPE-ND battery charging chip and supporting circuitry allow the battery to be charged from a wall outlet without disassembling the shell. Battery voltage can be monitored via the onboard microcontroller.

Motor Controls

The control board is capable of controlling one bipolar stepper motor and two bidirectional or four unidirectional DC motors. Motor control is provided by two Texas Instruments L293DNE dual H-bridges, each capable of supplying 1.2 A of continuous current and utilizing integrated clamping diodes to protect against back-emf. DC motor speed control is provided by PWM output at 64 kHz, 32 kHz, 8 kHz, 1 kHz, or 500 Hz. Heater control is provided by four p-channel MOSFETs capable of providing 9 A of current each; typical control is on-off with hysteresis.

Figure 3.7: Overview of the main functions of the Creature electronics board: power, motor controls and motors, sensor input (touch and temperature sensors), and communications, command and control (Bluetooth and XBee radios).

Figure 3.8: Creature board with power (orange), motor controls (purple), sensor input (yellow), communications and command and control (blue), and temperature sensing (white) areas highlighted.

Sensor Inputs

The control board supports acquisition of touch sensor data from the sixty-four force sensitive resistors (FSRs) arrayed around the Creature. There are four individual sensing circuits: each comprises sixteen FSRs connected to a single sixteen-to-one signal multiplexer. The output of that multiplexer is connected to a circuit as shown in Figure 3.9. Two of the multiplexers share an operational amplifier and digital potentiometer to reduce hardware requirements. Sensor output runs from 0 V to 2.5 V. For greater fidelity and to aid in sensor calibration a digital potentiometer, the 100 kΩ Maxim MAX5479EUD+-ND, is used as the resistor in the sensor circuit; in general, larger resistor values cause the FSRs to saturate less quickly but lose fidelity. Sensor operation was not addressed in this thesis, but resistor values were chosen so as to gain two to three amplitude levels of touch sensing per resistor. The sensor outputs are connected to the analog input pins of the microcontroller, and the digital pins for controlling the multiplexers and digital potentiometers to the digital input and output pins of the microcontroller. Control of the digital potentiometer is via the 3-wire SPI protocol. A triple-axis accelerometer with analog output, capable of sensing up to ±3 g, is also present on the control board.

Figure 3.9: Creature force sensitive resistor circuit. An operational amplifier, supplied at +5 V with a +2.5 V reference, combines the FSR and the digital potentiometer such that V_OUT = 2.5(1 - R_DIGIPOT/R_FSR).
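As an illustration of how one of these sensing circuits might be polled, the Arduino-style sketch below selects each of the sixteen multiplexer channels in turn and reads the op-amp output. The select-line and analog pin assignments are hypothetical rather than those of the actual board.

// Hypothetical pin assignments for one of the four FSR sensing circuits.
const int MUX_SELECT[4] = {22, 23, 24, 25};  // S0-S3 of the 16:1 multiplexer
const int FSR_ANALOG_IN = A0;                // op-amp output for this circuit

// Route one of the sixteen FSRs on this multiplexer to the op-amp.
void selectChannel(int channel) {
  for (int bit = 0; bit < 4; bit++) {
    digitalWrite(MUX_SELECT[bit], (channel >> bit) & 1);
  }
}

void setup() {
  for (int bit = 0; bit < 4; bit++) {
    pinMode(MUX_SELECT[bit], OUTPUT);
  }
  Serial.begin(57600);
}

void loop() {
  // Sweep all sixteen channels and report the 0-2.5 V sensor outputs.
  for (int channel = 0; channel < 16; channel++) {
    selectChannel(channel);
    delayMicroseconds(50);                // let the multiplexer output settle
    int raw = analogRead(FSR_ANALOG_IN);  // 10-bit reading over 0-5 V
    float volts = raw * 5.0 / 1023.0;
    Serial.print(volts);
    Serial.print(channel == 15 ? '\n' : '\t');
  }
  delay(20);
}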
Microcontroller

The control board uses the Arduino Mega as its main controller. The Arduino Mega is a standalone microcontroller board containing an Atmel ATmega1280 AVR microcontroller running the Arduino bootloader. The microcontroller operates at 16 MHz, with 128 kB of Flash memory. It has 54 digital output pins, 14 of which can provide pulse width modulation (PWM) output, and 16 analog input pins, as well as 4 UART serial communications channels. The use of the Arduino board allowed the microcontroller and its supporting devices to be connected with smaller solder traces than would be possible on a non-mass-manufactured board, allowing for a smaller overall footprint. The Mega also supports the SPI (Serial Peripheral Interface Bus) and I2C (Inter-Integrated Circuit) communications protocols.

Communications Equipment

There are several digital input and output methods provided by the control board. Primary input to the Creature is via the universal serial bus (USB) port on the Arduino Mega board; this is also the channel through which the microcontroller code and firmware are loaded and updated. Access to this port is somewhat difficult without disassembling the Creature's shell; therefore, a "tail" consisting of a short USB cable and power cord surrounded by fur is typically attached to the Creature. A Digi XBee® 802.15.4 RF module allows for wireless radio communication between the radio base station (see Section 3.3.1) and the Creature. The range of the XBee has been experimentally measured at greater than twenty meters, line of sight, which is more than sufficient for typical operations. A Bluegiga Bluetooth communications module (WRL-08771) allows for Bluetooth communication between the Creature and a Bluetooth-enabled computer. Both radio communication devices emulate serial ports on the Creature and the host computer; typical communications speed is 57600 bps, bidirectional. Headers on the control board allow a wired tail to be attached for additional serial communication with the microcontroller or other devices.

3.3 Communications: Command and Control

The control board was designed to support communication through several different methods and media, in order to support communication and monitoring in diverse environments. The Creature as part of the TAMER platform must be capable of both receiving data from and providing data to the physiological sensing suite, and it must be able to do this reliably and effectively. It must also be able to report Creature hardware status, in particular internal temperatures and battery voltages. In typical operation the Creature is controlled by a host computer, communicating wirelessly via the XBee radios. In locations with high electromagnetic interference, or for testing purposes, a wired serial connection from the radio base station may be used. The radio base station, the Creature status interface, and several typical usage cases of the Creature communications systems are described in this section. A diagram of the command and control scheme is shown in Figure 3.10. The computer hardware used during the experiments is described in Appendix B.5.

Figure 3.10: Simplified diagram of the TAMER command and control scheme: the sensors and user provide physio data to the physio software, which sends Creature commands to the Creature; the Creature returns touch data and hardware state to the Creature display. Arrows represent communications links between system components; dashed arrows identify the connections that are typically wireless.
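The commands carried over these links are simple: as noted in Section 3.1, they are typically a breathing servo position or a pulse trigger, with the actual firmware and sample code given in the appendices. As a sketch only, the Arduino-style code below shows what a Creature-side command handler could look like; it assumes a purely hypothetical ASCII protocol ('B' followed by an angle to set the breathing servo, 'P' to trigger one pulse) over the 57600 bps serial link, not the platform's actual format.

#include <Servo.h>

Servo breathingServo;                 // drives the abdomen shell
const int BREATHING_SERVO_PIN = 5;    // placeholder pin assignment

void setup() {
  Serial.begin(57600);                // matches the radio link speed
  breathingServo.attach(BREATHING_SERVO_PIN);
}

// Hypothetical ASCII command format (not the platform's actual protocol):
//   "B<angle>\n"  set the breathing servo position, angle 0-180
//   "P\n"         trigger one beat of the pulse mechanism
void loop() {
  if (Serial.available() == 0) {
    return;
  }
  char command = Serial.read();
  if (command == 'B') {
    int angle = Serial.parseInt();    // reads the digits that follow
    breathingServo.write(constrain(angle, 0, 180));
  } else if (command == 'P') {
    triggerPulse();
  }
}

void triggerPulse() {
  // Step the pulse mechanism out and back (see the pulse sketch in
  // Section 3.2.2); omitted here.
}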
3.3.1 Design and Construction of Radio System

Wireless communication with the Creature necessarily requires two radios: one inside the Creature and another to send commands to it. A radio base station was designed and constructed to contain this second radio and allow for a secure and safe connection with the host computer. The radio base station (see Figure 3.11) consists of a Digi XBee® 802.15.4 radio, as well as supporting components. The radio base station supports input from several sources: USB communication with a host computer, as well as two-wire serial input from any other serial device. In addition, several digital and analog input and output pins on the front cover of the unit allow switches or potentiometers to be used to control the Creature. For future applications, an Atmel ATmega328 chip can be attached to the radio base station board to allow operation of the base station without a host computer. This chip is typically programmed with the Arduino bootloader to allow use of the Arduino programming environment, and can make use of the several LEDs and a 4-digit 7-segment display on the unit's control board for user feedback. Schematics and parts lists for the radio base station can be found in Appendix C.1.

Figure 3.11: The radio base station for the Creature.

3.3.2 Creature User Interface

Use of the Creature in experiments revealed a need for additional monitoring of and feedback from the Haptic Creature. A computer graphical user interface (GUI) in the Processing environment was developed to receive feedback from the Creature, and is shown in Figure 3.12. This GUI can utilize whichever communication methods are not currently being used to send commands to the Creature; during typical operation this is the Bluetooth transceiver. It provides the status of the breathing servo and pulse motor, as well as the internal temperature of the Creature. Data received are logged to a text file for use in after-experiment performance analysis.

Figure 3.12: GUI for the Haptic Creature, providing motor, servo, and temperature status. A display of commanded respiration rate is shown in the upper left-hand corner. To the right of that graph, temperature readings from the internal temperature sensors are displayed. On the far right is a timer system for experiments.

3.3.3 Creature Modes

The Creature is capable of operating as part of the TAMER loop, controlled by other programs, radio controlled via the radio base station, or autonomously via onboard firmware. These methods are described in the following sections.

Physiological Sensor Suite Input (e.g., mirroring)

In typical operation the Creature mechanisms are directly controlled by the physiological sensor suite. The physiological sensors are connected to a computer running the physiological sensing software, which is connected to a radio for command transmission to the Creature. Creature mechanisms are controlled by the physiological software according to programmed algorithms. In the simple but common usage case of the Creature mirroring the user's breathing and pulse, the physiological software commands the position of the breathing servo to match that of the respiration sensor on the user, and triggers a pulse in the Creature when it detects one in the user. Input from the software does not have to be direct motor or servo commands; the software can also control the Creature hardware at a more general level, such as commanding a transition between pre-programmed emotional "states" on the Creature.
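For the mirroring case just described, the host-side logic amounts to mapping each respiration sample onto a breathing servo position and forwarding detected heartbeats. The C++ sketch below is only an illustration of that mapping, reusing the hypothetical command strings from the earlier sketch; it is not the physiological software's actual code, and the sensor values are passed in directly so the example stands alone.

#include <algorithm>
#include <cstdio>

// Map one respiration sample onto the breathing servo's 0-180 degree
// range and emit the (hypothetical) serial commands for the Creature.
void mirrorOnce(double resp, double respMin, double respMax, bool beatDetected) {
  double normalized = (resp - respMin) / (respMax - respMin);
  normalized = std::max(0.0, std::min(1.0, normalized));
  int angle = static_cast<int>(normalized * 180.0);

  std::printf("B%d\n", angle);        // breathing servo position command
  if (beatDetected) {
    std::printf("P\n");               // echo the user's heartbeat as a pulse
  }
}

int main() {
  // Example: a mid-inhalation sample with a detected heartbeat.
  mirrorOnce(0.6, 0.0, 1.0, true);
  return 0;
}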
Direct Software Input

The Creature can also be controlled by any other software program that has access to the serial port, such as programs written in the Processing language [76].

Radio Stand-Alone

The radio base station, as mentioned previously, can act as a standalone device by installing the ATmega328 microcontroller into the unit and programming it using the Arduino environment. The digital and analog input pins on the radio base station can be connected to, for example, potentiometers or switches to drive the Creature. This is useful when operating the Creature for demonstrations or testing, where a host computer is not available.

Autonomous Operation

The Creature can also be programmed to act autonomously, running a preset program without external input. As this does not incorporate the functionality gained by incorporation in the TAMER loop, it is typically used only for testing or demonstration.

3.4 Feedback from Testing

After construction of the Creature for the TAMER platform, testing and informal pilot studies revealed several design concerns and suggestions for improvement that were implemented in the device. The three primary areas of redesign were related to unwanted vibrations in the Creature, temperature and cooling issues, and user comfort.

3.4.1 Vibration and Noise

During operation of the Creature as part of this thesis, a slight vibration and noise were present from the breathing and pulse mechanisms. The noise from the breathing servo was predominantly from the servo attempting to maintain a constant position against gravity pushing down on the abdomen shell. The shell would fall a small amount and then be raised by the servo, creating sound from the action of the servo and vibration from the motion of the shell. The refresh rate of the servo was increased to give the shell less time in which to fall before the servo would react: this had the result of eliminating the vibration, and changed the sound emitted from a choppy one to a lower-volume purr. These changes both increased the rate at which breathing servo commands are sent from the physio software and increased the smoothness of the breathing mechanism when mirroring a user's respiration. The noise from the pulse mechanism was reduced by placing vibration dampers at the mechanism mounting points. This somewhat muffled the sound, but there is still a fairly audible click when the pulse mechanism is operated.

3.4.2 Temperature / Cooling

A similar version of the Haptic Creature experienced a servo failure due to overheating. To prevent this, the Creature was equipped with a temperature monitoring system by the author. Up to eight DS18B20 1-wire digital thermometers can be located throughout the Creature; in typical operation one is placed on the breathing mechanism servo, and another near the voltage regulators on the control board, the two primary heat-generating components inside the shell. Temperature readings are taken every five seconds during Creature operation and passed to the control computer, if present. The Creature is shut down when internal temperature rises above 48.8 °C, above which damage to the internal components may occur.
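One common way to read DS18B20 sensors from an Arduino is through the OneWire and DallasTemperature libraries; whether the Creature's firmware uses these particular libraries is not stated here, so the sketch below, with a placeholder data pin and a stubbed shutdown routine, is an assumption about how such monitoring could be written rather than the actual implementation.

#include <OneWire.h>
#include <DallasTemperature.h>

const int ONE_WIRE_PIN = 7;       // placeholder data pin for the 1-wire bus
const float SHUTDOWN_C = 48.8;    // shut down above this temperature

OneWire oneWire(ONE_WIRE_PIN);
DallasTemperature sensors(&oneWire);

void setup() {
  Serial.begin(57600);
  sensors.begin();
}

void loop() {
  sensors.requestTemperatures();                   // poll every DS18B20 on the bus
  for (int i = 0; i < sensors.getDeviceCount(); i++) {
    float tempC = sensors.getTempCByIndex(i);
    Serial.print(tempC);                           // report to the host, if present
    Serial.print('\t');
    if (tempC > SHUTDOWN_C) {
      shutDownMechanisms();                        // protect the servos and regulators
    }
  }
  Serial.println();
  delay(5000);                                     // one set of readings every five seconds
}

void shutDownMechanisms() {
  // Disable the heater MOSFETs, detach the servos, and stop the motors
  // (details omitted).
}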
Figure 3.13 shows the Creature internal temperature during an experiment. Temperature is monitored in two places: on the breathing servo, and at the anterior of the Creature, located the farthest away from the breathing servo and expected to be the coolest part of the Creature. Breathing servo temperature increased by eight degrees during fifty minutes of use. Although overheating is therefore not a problem during typical operation, some wires were rerouted and neatened to increase the available space around the primary heat-generating components in the Creature, namely the servos and the voltage regulators on the control board.

Figure 3.13: Graph of Haptic Creature internal temperature (°C) versus time (minutes) during normal use while active on a lap, showing the anterior and breathing servo sensors over approximately sixty minutes.

3.4.3 Comfort

While the fur was generally found to be soft and comfortable, the fiberglass shell could still be easily felt underneath. This was particularly evident when the Creature was resting on the lap: the Creature's bottom seemed particularly hard and bony. To alleviate this, a silicone skin was developed for the Creature by the author in collaboration with Yohanan et al. [8] and constructed by the author. It attaches to the fiberglass shell underneath the fur (see Figure 3.5). This skin consists of an approximately 0.25 inch thick piece of silicone in the shape of the Creature's shell. Part of the skin stretches over the ends of the shell to secure it in place.

The pad improved the comfort of the device markedly, and had the added benefit of increasing the zoomorphic characteristics of the Creature. The feel of the silicone under the fur also attempts to replicate the feel of a dog or a cat, where there is a harder layer of skeleton under the fur coat. The silicone, combined with the fur, also helps spread out the force from any touch of a skinned surface, resulting in registration of a touch by more force sensing resistors on the shell.

3.5 Online Physiological Assessment

The second major component of the TAMER platform is the physiological sensor suite, allowing for user feedback via physical channels. Figure 3.14 summarizes the physiological signals collected in this platform and the physiological metrics derived from them. Six sensors, EKG (Electrocardiogram), EMG (Electromyogram), BVP (Blood Volume Pulse), Skin Conductance, Respiration, and Skin Temperature, are currently used within this platform, both to derive the physical state of the user and to drive the actuators of the Haptic Creature. The capability exists for additional sensors to be integrated into this platform as they become available. Section 3.5.1 describes the key physiological signals used for this platform, Section 3.5.2 describes the sensors and encoder used, and Section 3.5.3 describes the reactions to and limitations of these sensors.

Figure 3.14: Overview of measured physiological signals and the physiological metrics derived from them. The raw and filtered signals (ECG, BVP, skin conductance response, respiration, EMG, skin temperature, and acceleration) yield metrics including normalized signals and derivatives, beat detection, heart rate, heart rate variability (standard deviation, pNN50, and frequency components), amplitude, slope, and respiration rate.
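As a rough illustration of the kind of derivation summarized in Figure 3.14, a respiration rate can be estimated by counting peaks in the respiration waveform. The platform's actual filtering and peak detection follow reference [97]; the C++ sketch below is a simplified stand-in using a naive local-maximum test on synthetic data.

#include <cstdio>
#include <vector>

// Estimate a breathing rate (breaths per minute) from a respiration
// waveform by counting local maxima above a threshold. This is a
// simplified stand-in for the platform's actual processing (see [97]).
double breathingRate(const std::vector<double>& resp, double sampleRateHz,
                     double threshold) {
  int peaks = 0;
  for (size_t i = 1; i + 1 < resp.size(); i++) {
    bool isPeak = resp[i] > resp[i - 1] && resp[i] >= resp[i + 1];
    if (isPeak && resp[i] > threshold) {
      peaks++;
    }
  }
  double seconds = resp.size() / sampleRateHz;
  return peaks * 60.0 / seconds;
}

int main() {
  // Four synthetic breath cycles over eight seconds, sampled at 4 Hz.
  std::vector<double> resp;
  for (int cycle = 0; cycle < 4; cycle++) {
    double samples[] = {0.1, 0.4, 0.8, 1.0, 0.8, 0.4, 0.1, 0.0};
    resp.insert(resp.end(), samples, samples + 8);
  }
  std::printf("%.1f breaths per minute\n", breathingRate(resp, 4.0, 0.5));
  return 0;
}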
3.5.1 Physiological Signals

The bio-sensor suite generates a large number of physiological metrics, of which several are particularly important due to their relation to participant anxiety. Literature specifically regarding the links between the physiological sensors used within this thesis and anxiety is discussed here in order that the use of these specific sensors may be justified.

Anxiety can be thought of as the "fight or flight" response of the autonomic nervous system: the body prepares to respond to a threat by optimizing the performance of critical systems. Heart rate, body temperature, and blood flow to muscles are increased, while activities such as those of the digestive system or blood flow to the extremities are reduced until after the danger. When properly triggered by an external stimulus this is considered "fear," but without such a trigger it is considered "anxiety." When this stress response is activated improperly in a person, such as in social situations, the effects can be crippling as well as unhealthy, and in cases of long-term anxiety disorders this response may be chronic and debilitating. Physiological sensors can be used to detect the physiological changes characteristic of anxiety, and are particularly useful in situations where a person is unable to consciously detect the stress response beginning. Current research investigates both the short-term responses to stimuli that can be easily gathered in a laboratory environment and the longer-term physiological differences between those suffering from chronic anxiety and control subjects. Both are difficult due to inherent between-subject variations in common physiological metrics; the latter also due to within-subject fluctuations over the longer time periods. Indeed, in persons with chronic anxiety disorder it may be impossible to gather a non-anxious baseline for physiological comparison; Craske states that "the autonomic system may reestablish a balance over long periods of stress, such that dysfunction is no longer apparent except during acute panic attacks" [77].

Although it may be difficult to assess anxiety quantitatively, it is the eventual goal of the TAMER project to incorporate into the TAMER platform advanced machine learning algorithms for analyzing data from the physiological sensors. By training the system based on anxiety assessments from medical professionals it should be able to identify various levels of anxiety in a user, and might eventually even be able to distinguish levels of anxiety in sufferers of chronic anxiety, which would otherwise be difficult. For the experiments in this thesis it is assumed that participants did not suffer from a chronic anxiety disorder in which their physiological responses would be reduced, and therefore comparisons are made to a physiological baseline gathered during the experiment. Disturbing images, intensive tasks, and timed and scored cognitive training exercises are used in an attempt to induce physiological changes in the participants that would be similar to anxiety, as they have been both self-reported to induce anxiety and used in other studies that purport to induce anxiety in their subjects. Eventual comparison of the physiological data from these experiments to physiological data from patients undergoing anxiety as determined by a medical professional would confirm whether or not anxiety was actually induced. Here, changes in both long-term and short-term heart rate, skin conductance, heart rate variability, respiration rate, skin temperature, and corrugator muscle activity are discussed.
These are the primary sensors and metrics used in the physiological sensing suite of the TAMER platform.

Heart Rate and Skin Conductance

Heart rate and skin conductance response are perhaps the two physiological metrics most highly and most often correlated with anxiety. Both are primarily associated with short-term, induced anxiety: indeed, the increase in skin conductance is often called the "startle response" due to its quick onset and rapid disappearance. Bankart et al. induced anxiety by informing subjects that they were likely to receive an electric shock after a countdown period [78]. The probability that they would be shocked was varied. They found that during the countdown period both heart rate and skin conductance increased linearly. After the first shock, heart rate quickly ceased to increase and stabilized, but maintained an elevated rate compared to baseline throughout the experiment. Skin conductance continued to increase throughout the experiment. Telling subjects that their shock would be mild reduced this effect, but it was still present. Öhman et al. investigated whether this response was driven by conscious activity [79]. They showed pictures of snakes and spiders to users for 30 milliseconds, too short a duration to consciously perceive the image, and found that the skin conductance response was similar to that of groups who had been shown the pictures for long enough to consciously perceive them. They also found that those who had previously expressed fear of spiders and snakes had more elevated skin conductance responses than those who had not, and that they felt more negative, more aroused, and less dominant after their exposure to the images. These, among other studies in the literature, suggest that elevated skin conductance and heart rate are correlated with experimentally induced anxiety, and that the level of such elevation is increased by an increased perception of anxiety. Assessment of non-experimental anxiety confirms these results. Caprara et al. measured the skin conductance levels of patients about to undergo a dental procedure [80]. They found that increased skin conductance was an objective and reliable test for identifying anxiety in patients.

Hoehn-Saric et al., in several studies, investigated the effects of a clinical anxiety diagnosis on skin conductance and heart rate response to stressful tasks. They found that when given a stress-inducing task, subjects tended to show reduced skin conductance and heart rate variability (heart rate variability as standard deviation), and that this reduction in variability was greater in those who had been diagnosed with chronic anxiety [81]. They further examined this lack of variability to conclude that chronic anxiety patients typically react with less physiological flexibility to every-day stress, but have an increased reaction to anxiety-provoking stimuli [82] compared to control subjects.

Heart Rate Variability

Various changes in heart rate variability have been correlated with an increase in either general or specific anxiety responses, and a reduction in heart rate variability is now commonly associated with anxiety. Dishman et al. measured heart rate variability in gym patrons for five minutes while resting [83]. They found a correlation between a reduced normalized high-frequency component of heart rate variability and the patron having perceived emotional stress during the previous week. They did not find a correlation between self-reported susceptibility towards anxiety and heart rate variability.
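The time-domain heart rate variability measures referred to throughout this section, and listed in Figure 3.14, can be computed directly from a series of interbeat intervals: the standard deviation of the intervals (often called SDNN) and pNN50, the fraction of successive intervals differing by more than 50 ms. A minimal C++ sketch, assuming the intervals are given in milliseconds, is shown below; it illustrates the definitions only and is not the platform's implementation.

#include <cmath>
#include <cstdio>
#include <vector>

// Standard deviation of the interbeat (R-R) intervals, in milliseconds.
double intervalStandardDeviation(const std::vector<double>& ibiMs) {
  double mean = 0.0;
  for (double ibi : ibiMs) mean += ibi;
  mean /= ibiMs.size();
  double variance = 0.0;
  for (double ibi : ibiMs) variance += (ibi - mean) * (ibi - mean);
  return std::sqrt(variance / ibiMs.size());
}

// pNN50: fraction of successive interbeat intervals differing by more than 50 ms.
double pnn50(const std::vector<double>& ibiMs) {
  int count = 0;
  for (size_t i = 1; i < ibiMs.size(); i++) {
    if (std::fabs(ibiMs[i] - ibiMs[i - 1]) > 50.0) count++;
  }
  return static_cast<double>(count) / (ibiMs.size() - 1);
}

int main() {
  // Eight example interbeat intervals in milliseconds.
  std::vector<double> ibiMs = {820, 790, 850, 900, 810, 880, 760, 840};
  std::printf("SDNN = %.1f ms, pNN50 = %.2f\n",
              intervalStandardDeviation(ibiMs), pnn50(ibiMs));
  return 0;
}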
They did not find a correlation between self-reported susceptibility towards anxiety and heart rate variability.

Generalized anxiety disorder is also associated with resting variations in heart rate variability. Blom et al. investigated heart rate variability in subjects with generalized anxiety disorder or major depressive disorder, and found them to have lower high frequency and low frequency components of heart rate variability, as well as a reduced standard deviation of heart rate interbeat intervals, compared to controls [84]. Thayer et al. also investigated this subject pool [85]. They found that subjects with generalized anxiety disorder had shorter interbeat intervals and a lower high frequency component of heart rate variability even while resting, and that when instructed to worry they had even shorter interbeat intervals, a lower high frequency component of heart rate variability, and a reduction in the number of successive interbeat intervals that differed by more than 50 milliseconds. Friedman et al. subjected subjects susceptible to severe panic attacks, subjects severely afraid of blood, and controls to stressful tasks in a lab [86]. They found that the control subjects had longer heart rate interbeat intervals, higher variance in heart rate interbeat intervals, a greater high frequency component of heart rate variability, and lower low frequency to high frequency ratios than those susceptible to panic attacks.

Respiration

While hyperventilation is the most obvious respiration-related indicator of anxiety, several studies have investigated whether more subtle variations in respiration rate could be an indicator of increased anxiety. Several results were not promising: Suess et al. induced anxiety by threat of electric shock, and while they saw an increase in heart rate during the task, this was not correlated with a change in respiratory activity [87]. However, more recent work does suggest a link between respiratory variability and anxiety. Martinez et al. found that patients diagnosed with panic disorder had greater respiratory variability in both rate and amplitude than controls, even after receiving medication for the disorder [88]. Niccolai et al., in a recent meta-analysis of the literature, confirm that increased respiratory variability is well correlated with panic disorder [89].

Skin Temperature

Skin temperature has often been used to help identify emotions, and has been associated with both anxiety and relaxation. In general, a decrease in skin temperature is correlated with an increase in anxiety. Rimm-Kaufman et al. found that hand skin temperature increased when participants were exposed to a video designed to generate happiness, but decreased when they were asked threatening personal questions [90]. Mittelmann et al. induced anxiety in subjects by questioning them during psychoanalysis: they found that a decrease in finger skin temperature was associated with anxiety [91]. Boudewyns et al. again subjected subjects to electric shock in order to induce anxiety; they found that finger skin temperature decreased during the stressful condition and increased during relaxation, and was correlated with participant self-reports of arousal [92].

Electromyogram

Surface electromyography of various muscle groups has been used to assist in the classification of various emotional states. Increased muscle tension has been associated with anxiety disorders and stress, and brief muscle responses can be associated with startle events.
Smith et al. investigated corrugator (eyebrow) muscle response to disturbing images designed to induce anxiety, and found that these images were correlated with increased EMG activity [93]. They also found that baseline images of increased anxiety shown before the disturbing images were associated with a larger response than neutral photos. Cacioppo et al. found that EMG corrugator muscle activity could be used to distinguish between positive and negative emotion-inducing pictures, even when there were visible changes in facial expression [94]. Dimberg concluded that the "facial EMG technique may be a sensitive tool for measuring emotional reactions" [95], and found that anger-inducing photos increased corrugator muscle activity compared to neutral photos [96].

It is important to note that while the above data show correlations between various physiological metrics and anxiety, the actual inference of anxiety from physiological data, especially in an online modality, is challenging. The human body is a complex organism, and the physiological metrics measured are affected by the activities of numerous bodily systems, all of which can have different short- and long-term reactions to stimuli. Responses are often not consistent across the population, and in some cases are not even present at all. Laboratory induction of anxiety in a controlled environment can help in identifying these effects, but in a real-world environment they are often obscured by the noise from every-day interactions. While the various low-frequency signals are useful in the classification of anxiety disorders, and have been used to judge the effects of various robotic interventions, their utility for short-term human-robot interaction is limited.

In recognition of these limitations, the initial use of the TAMER platform has been either in controlled laboratory settings or in scenarios in the outside world with limited reaction to external stimuli: participants were at a computer in a classroom, for example, but not interacting with their classmates. Additional sensor platform training in the recognition of anxiety will be necessary before the platform can be used in every-day activity.

3.5.2 Physiological Sensors Used

In this section the primary sensors used for collecting physiological information are described. The hardware and software platform for physiological sensing is a later version of that used by Kulić et al. in their human-robot interaction research in the CARIS lab. The initial usage of the sensor platform by Kulić et al. was to detect anxiety in human-robot interaction: see Section 2.5. Additions made by this author include the porting of the software to a more recent operating system, as well as the capability for the software to communicate with the most recent Thought Technology hardware and the Haptic Creature. Signal processing of the physiological signals is described in this section only where changes have been made; otherwise see reference [97] for signal processing and filtering details. An image of a user wearing the physiological sensors typically used with the TAMER platform is shown in Figure 3.15.

Figure 3.15: User holding the Haptic Creature and wearing physiological sensors: respiration rate (a), blood volume pulse (b), skin conductance (c), and EKG (d). The EMG sensor is not shown; it would have been placed on the forehead.

Encoder

The data acquisition device used for this platform is the Thought Technology [98] FlexComp™ Infiniti encoder (pictured in Figure 3.16). This encoder is designed for clinical physiological measurement and biofeedback training. It has ten channels capable of recording at 2048 samples per second, although data are sampled at 256 Hz within the platform.
Data are transferred from the encoder via a fiber optic cable to a converter located near the host computer, and from there over USB to the host computer. The encoder is powered by four AA batteries.

Figure 3.16: Thought Technology FlexComp™ Infiniti Encoder.

EKG (Electrocardiogram)

EKG (electrocardiogram) or heart electrical activity is measured by the EKG Sensor T9306M (see Figure 3.17), a 3-lead electrocardiography sensor. The sensor is connected either to a 3-terminal electrode, as in Figure 3.18(c), attached to the center of the chest, or to an extender cable as shown in Figure 3.17. In the latter case, three electrodes are attached to the participant's chest: a negative electrode on the right shoulder, a positive electrode to the left of the navel, and a ground electrode on the upper left portion of the chest. This was the method typically used during experiments. Although the extender cable required the use of additional cabling, the use of three smaller electrodes attached to the periphery of the chest instead of one large electrode in the center of the chest reduced the amount of body hair contacted by the electrode glue, resulting in greatly improved participant comfort (particularly for male participants) when removing the sensors. It also provided better signal quality, as there was less susceptibility to noise from fidgeting of the body core, and the single electrodes proved less susceptible to losing their connection due to perspiration. Similar electrodes are used for the EMG sensor, and in all cases the electrodes used are single-use and disposable. Participants were typically asked to attach the sensors themselves.

Figure 3.17: Thought Technology EKG™ Sensor T9306M, attached to triode electrodes for placement on chest.

A QRS detection algorithm [99] is then applied to the signal data to detect the occurrence of a heart beat. From these data heart rate, heart acceleration, and heart rate variability are calculated, as are normalized versions of the same.
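As a concrete illustration of how these derived metrics follow from the detected beats, the short sketch below converts a series of beat timestamps into interbeat intervals, instantaneous heart rate, and heart rate acceleration. It is illustrative only: the function and variable names are hypothetical and do not correspond to the platform software.

    import numpy as np

    def heart_metrics(beat_times):
        """Derive heart metrics from a series of QRS beat timestamps [s]."""
        beat_times = np.asarray(beat_times, dtype=float)
        ibi_ms = np.diff(beat_times) * 1000.0      # interbeat intervals [ms]
        hr_bpm = 60000.0 / ibi_ms                  # instantaneous heart rate [bpm]
        # heart acceleration: change in heart rate between successive beats
        hr_accel = np.diff(hr_bpm) / np.diff(beat_times[1:])
        return ibi_ms, hr_bpm, hr_accel

    # example with beats detected at these times (seconds)
    ibi, hr, accel = heart_metrics([0.0, 0.82, 1.66, 2.51, 3.34])

Normalized versions of these metrics are computed separately by the platform and are not shown here.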
EMG (Electromyogram)

Electromyogram or muscle activity is measured by the EMG MyoScan-Pro™ Sensor T9401M-60 (see Figure 3.18), a pre-amplified surface electromyography sensor. This sensor is typically connected to the forehead to measure the activity of the corrugator supercilii muscle; for this location care must be taken to ensure the sensor cable does not interfere with the user's vision. It can also be attached to other muscles to measure their electrical activity. This signal is filtered and then normalized as in Kulić et al. [97].

Figure 3.18: Thought Technology EMG MyoScan-Pro™ Sensor T9401M-60: (a) back of sensor, (b) front of sensor, (c) side of sensor with electrode attached.

Skin Conductance

Skin Conductance Response (SCR), or Galvanic Skin Response (GSR), is measured by the Skin Conductance Sensor SA9309M, as shown in Figure 3.19. The sensor measures the electrical resistance of the skin, and is the same type of sensor used in lie detector tests. Skin conductance is affected by the amount of moisture present in the skin, as released by glands when sweating or in response to stress or fear. During experiments the Skin Conductance Sensor is worn on the index and middle fingers of the participant's non-dominant hand. The sensor electrodes must be cleaned with alcohol after each use, and are replaced after fifty uses.

This signal is then filtered, and the derivative taken to produce a skin conductance derivative measurement. Both are normalized as in Kulić et al. [97].

Figure 3.19: Thought Technology Skin Conductance Sensor SA9309M.

BVP (Blood Volume Pulse)

Sensor. The Blood Volume Pulse Sensor SA9308M (shown in Figure 3.20) is a photoplethysmography sensor. It measures the reflectivity of the skin to infrared light, a property dependent upon the amount of blood present in the underlying tissues. A heartbeat causes a sudden increase in the amount of blood present; therefore, this sensor is able to measure the occurrence of a pulse. This sensor is typically attached to the distal end of the thumb of the participant's non-dominant hand, and secured in place by a velcro strap. It is used when the more-invasive EKG sensor is not desired or appropriate; however, care must be taken to ensure the sensor is attached tightly enough to the finger to record a signal, but not so tightly as to impede circulation and cause discomfort to the participant. The sensor does not measure in absolute units, but rather percentage change in blood volume.

Processing. An example sensor signal is shown in Figure 3.21. The raw blood volume pulse signal is passed through a 7th-order low pass Butterworth filter with a 3 Hz cutoff, as a user's heart rate should not exceed about 120 beats per minute during experiments. The filtered signal is then passed through a peak-detection algorithm, which looks for a change in the first derivative to determine the occurrence of a heartbeat. From this time series, heart rate and heart rate variability can be extracted.

Figure 3.20: Thought Technology Blood Volume Pulse (BVP) Sensor SA9308M, front and rear.

Figure 3.21: A sample unfiltered blood volume pulse signal, showing four heartbeats.
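A minimal sketch of this filter-then-peak-detect chain is given below, assuming the platform's 256 Hz sample rate and using SciPy's filter design routines in place of the platform's own implementation; the synthetic demonstration signal is purely illustrative.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    FS = 256.0  # sample rate [Hz]

    def detect_bvp_beats(bvp, fs=FS):
        """Low-pass filter a raw BVP trace and return sample indices of beats."""
        # 7th-order low-pass Butterworth filter with a 3 Hz cutoff
        sos = butter(7, 3.0, btype="low", fs=fs, output="sos")
        smooth = sosfiltfilt(sos, bvp)
        # a beat is marked where the first derivative changes from positive to negative
        d = np.diff(smooth)
        return np.where((d[:-1] > 0) & (d[1:] <= 0))[0] + 1

    # synthetic 10 s pulse-like signal at roughly 70 beats per minute
    t = np.arange(0, 10, 1.0 / FS)
    demo = 36.0 + 0.5 * np.sin(2 * np.pi * (70.0 / 60.0) * t)
    beat_times = detect_bvp_beats(demo) / FS   # beat times in seconds

The same derivative-sign-change idea is reused for the respiration signal described later in this section.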
Heart Rate Variability. Heart rate variability is a general term describing several metrics derived from heart rate that describe activity of the autonomic nervous system. It is computed from a series of interbeat intervals (IBIs), or times between heart beats, and can therefore be calculated from either the electrocardiogram or the blood volume pulse sensor. Use of the electrocardiogram (EKG) sensor theoretically gives better performance, as the EKG directly measures the electrical activity of the heart, whereas the blood volume pulse sensor measures a more distant effect of the heart beat: the increase in blood in a distal digit. In practice this difference is minimal, and often one sensor will offer higher reliability than the other due to the physical characteristics of the particular user: the EKG sensor can be difficult to attach on a subject with a large amount of chest hair, and the blood volume pulse (BVP) sensor can shift and become detached if the subject's thumb moves too often. Several variability metrics are calculated, as defined in the following paragraphs.

Root Mean Squared Standard Deviation. The root mean squared standard deviation of heart rate is calculated as follows, where n is the number of observations:

    SD_{RMS} = \sqrt{ \frac{ \sum_{i=1}^{n} ( ibi_{i+1} - ibi_i )^2 }{ n } }     (3.1)

A running 10-second average of the root mean squared standard deviation is generated by the physiological sensing software; this value can be computed for longer time periods as well.

PNN50. PNN50 is calculated as the number of successive interbeat intervals that differ by more than 50 ms, divided by the total number of interbeat intervals counted:

    PNN50 = \frac{ \#\{ (ibi_{i+1} - ibi_i) > 50\ \mathrm{ms} \} }{ n }     (3.2)

Frequency Analysis. Frequency variation in the interbeat interval series is calculated for extremely low, very low, low, and high frequency bands using commonly accepted ranges [100], as shown in Table 3.1. The integral of the power spectral density function of the signal is used to calculate the power of each frequency band. For samples less than five minutes in length, extremely low frequency and very low frequency data are typically unreliable [101]. Also calculated is the LF/HF ratio:

    \mathrm{LF/HF\ ratio} = \frac{ \text{low frequency power} }{ \text{high frequency power} }     (3.3)

Table 3.1: Heart rate variability frequencies.

    band                            lower limit [Hz]    upper limit [Hz]
    extremely low frequency (ELF)   0                   0.0033
    very low frequency (VLF)        0.0033              0.04
    low frequency (LF)              0.04                0.15
    high frequency (HF)             0.15                0.4
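These variability metrics can be sketched compactly as follows; this is an illustrative reimplementation (the 4 Hz resampling rate, the Welch spectral estimate, and the function names are assumptions, not the platform's actual code).

    import numpy as np
    from scipy.signal import welch

    def hrv_metrics(ibi_ms):
        """Simple HRV metrics from a series of interbeat intervals [ms]."""
        ibi_ms = np.asarray(ibi_ms, dtype=float)
        diffs = np.diff(ibi_ms)

        sd_rms = np.sqrt(np.mean(diffs ** 2))               # Equation (3.1)
        pnn50 = np.sum(np.abs(diffs) > 50.0) / ibi_ms.size  # Equation (3.2)

        # frequency analysis: resample the IBI series evenly, estimate its
        # power spectral density, and integrate over the bands of Table 3.1
        beat_times = np.cumsum(ibi_ms) / 1000.0             # seconds
        fs = 4.0
        t = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)
        even = np.interp(t, beat_times, ibi_ms)
        f, psd = welch(even - even.mean(), fs=fs, nperseg=min(256, t.size))

        def band_power(lo, hi):
            m = (f >= lo) & (f < hi)
            return np.trapz(psd[m], f[m])

        lf_hf = band_power(0.04, 0.15) / band_power(0.15, 0.40)  # Equation (3.3)
        return sd_rms, pnn50, lf_hf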
Respiration Sensor

Respiration rate, amplitude, and waveform are measured by the Respiration Sensor SA9311M, as shown in Figure 3.22. This sensor consists of a strain gauge connected to a large velcro strap. The strap is worn around the upper abdomen over the participant's clothing, and tightened so that the strain gauge is on the front of the abdomen. The strain gauge expands and contracts with the user's breathing. The sensor does not measure expansion in absolute units, but rather percentage expansion.

Figure 3.22: Thought Technology Respiration Sensor SA9311M.

Processing. An example respiration signal is shown in Figure 3.23, and the processed signal in Figure 3.24. The raw respiration signal is passed through a 5th-order low pass Butterworth filter with a 1 Hz cutoff. The filtered signal is then passed through a peak-detection algorithm, which looks for a change in the first derivative to determine the peak of a breath (peaks are identified by blue triangles in the figure, troughs by purple triangles), similar to how peaks in the blood volume pulse signal are detected. The peak-to-peak distance between breaths (L1 or L2 in Figure 3.24) is then used to calculate the participant's breathing rate. The normalized current respiration amplitude is calculated as:

    \text{normalized amplitude} = \frac{ \text{Respiration Amplitude}_{current} - \text{Respiration Amplitude}_{min} }{ \text{Respiration Amplitude}_{max} - \text{Respiration Amplitude}_{min} }     (3.4)

Normalization from smallest to largest expansion is necessary because the percentage compression and expansion for a single breath can vary widely between users. Typically a deep breath spans about five to six percent of the full sensor range. On some subjects, particularly very small ones, the upper abdomen may not give a large enough range of motion, and the sensor may have to be placed lower on the abdomen, around the belly. Such a placement is undesirable, as belly motion can be affected heavily by speech. Although the algorithm has generally proved robust to short phrases or questions, longer periods of speech can result in erroneous data. Respiration rate and breath length are the terms typically used to describe user breathing; they are the inverses of each other.

Figure 3.23: A sample filtered respiration signal.

Figure 3.24: Calculation of respiration rate from the peak-to-peak breath lengths L1 and L2.
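The breathing-rate and amplitude calculations above reduce to a few lines; the sketch below is illustrative only, with hypothetical inputs (detected peak times and the sensor readings at each peak and preceding trough) rather than the platform's internal data structures.

    import numpy as np

    def respiration_features(peak_times_s, peak_vals, trough_vals):
        """Breath lengths, breathing rate, and normalized amplitude."""
        peak_times_s = np.asarray(peak_times_s, dtype=float)
        amplitude = np.asarray(peak_vals, dtype=float) - np.asarray(trough_vals, dtype=float)

        breath_lengths = np.diff(peak_times_s)     # peak-to-peak distances [s]
        breathing_rate = 60.0 / breath_lengths     # breaths per minute

        # Equation (3.4): normalize each amplitude against the participant's
        # observed minimum and maximum expansion
        norm_amplitude = (amplitude - amplitude.min()) / (amplitude.max() - amplitude.min())
        return breath_lengths, breathing_rate, norm_amplitude

    # example: four breaths roughly four seconds apart, ~5 % chest expansion
    bl, br, na = respiration_features([2.1, 6.3, 10.2, 14.4],
                                      [27.0, 27.4, 26.8, 27.1],
                                      [22.2, 22.0, 21.9, 22.3])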
Skin Temperature

The Temperature Sensor SA9310M (shown in Figure 3.25) measures the skin temperature of a peripheral digit. This sensor was worn on the ring finger of the participant's non-dominant hand, and attached by a piece of medical tape. No filter is used, as the signal is relatively noise-free and slow moving; a sample signal is shown in Figure 3.26. Data are recorded in degrees Celsius. As with all sensors placed on the fingers, care must be taken to ensure that the sensor does not become detached during use, and that it does not slip down to the underside of the finger, which may be in contact with a warmer surface.

Figure 3.25: Thought Technology Temperature Sensor SA9310M, showing sensor and connector to encoder.

Figure 3.26: A sample skin temperature signal.

3.5.3 Sensor Application Notes

Actual use of the sensors in brainstorming sessions, pilot studies, and experiments as part of this thesis provided valuable feedback on their use in experimental environments. In particular, maintaining proper sensor functioning is a challenge inherent to physiological experiments: most physiological experiments have at least a few people whose data are unusable. The sensors attached to the fingers were particularly prone to coming loose during experiments, as subjects typically made contact with the Creature with the hand wearing the sensors. The most sensitive to motion is the blood volume pulse sensor, which requires a tight fit on the thumb to record a proper signal. Motion in the body core could also affect sensor readings: fidgeting could result in noise in the EKG signal, and talking resulted in disruption of the respiration signal. In general, however, the large number of wires required to physiologically monitor a subject is a greater hindrance to non-experimental use of the sensors than these motion concerns.

Chapter 4: Experiments

A series of four experimental trials was performed to evaluate the functionality of the TAMER platform and to investigate its efficacy in guiding physiological responses. The first, a pilot experiment, was a preliminary examination of the feasibility of using the Haptic Creature as an anxiety-reducing device: participants were asked to view disturbing images with a proof-of-concept version of the Haptic Creature. Overall results were encouraging enough to support construction of the TAMER platform. Experiment 1 and Experiment 2 investigated the ability of the Haptic Creature to influence physiological responses. In Experiment 1, the initial physiological and subjective responses to the Creature were observed, and participants were exposed to the Creature mimicking their breathing and pulse. In Experiment 2, participants were asked to use the Creature as a training tool, breathing with it, and then had the Creature in their lap as they performed a task. In the final experiment, Experiment 3, the Creature was tested in a target environment, namely an elementary school that supports children with learning challenges, many of them anxiety related. Children were introduced to the Creature and then given the Creature to have during a stressful activity. In this chapter the experiments and experimental results are described.

4.1 Pilot Experiment: Response to Disturbing Images

4.1.1 Introduction and Motivation

Initial reactions to the first Haptic Creature prototype revealed the potential for the device to provoke a comforting and calming response in its users [72]. The Creature's similarity to both a stuffed animal and an actual animal suggested the potential for the Creature to produce similar comforting effects. Here, a pilot study was undertaken to investigate this general hypothesis. Information obtained from this experiment was also desired to assist in the design of the second Haptic Creature prototype.

4.1.2 Experimental Design Considerations

The first step in developing such an experiment was to determine both how best to induce anxiety in adult participants, and whether the physiological sensing would be able to recognize such anxiety. Inducing anxiety in experimental participants is difficult both practically and ethically: participants can have widely differing responses to the same stimuli, anxiety-inducing scenarios are limited, and threats of harming or actual physical harm to the participant are not permitted under ethics regulations. It is also necessary to have a non-anxious baseline from the participants to help determine the physiological indicators of anxiety. Therefore, although long-term anxiety or general stressful situations such as the middle of exam week could be ideal scenarios in which relaxation therapy would be effective, the determination of this more chronic and persistent anxiety would be beyond the time-scale of the preliminary experiments and the clinical capabilities of the researchers, thus necessitating an investigation of short-term anxiety induction and response. In addition, in order to produce a measurable effect during the limited time-span of an experiment, the anxiety stimulus must be able to induce anxiety quickly. Typical psychological methods of inducing anxiety in experiments are procedures such as rapid-fire yelling of math questions to be answered, or playing a stressful puzzle or video game. These were deemed impractical for two reasons.
First, they were viewed as too distracting from the Haptic Creature prototype; second, they required the use of the participant's hands. Participants would need to keep their hands, which would also be encumbered with the physiological sensors, on the Creature during any experiment, as the hands are the primary channel through which the Creature communicates. It is important, eventually, to have the hands available for other activities while using the Creature. Potentially hand-reliant tasks could raise the questions of how much physical interaction with the Creature would be required for it to be effective, and how inhibiting the hand sensors would be. These are discussed below in Sections 4.3.4, 4.3.6, 4.4.5, and 5.3.2. Therefore, additional anxiety-inducing methods were investigated.

In a pilot study, six participants were asked to watch a two-minute video clip of a movie picked for its believed ability to induce anxiety [102] while physiological data (skin conductance, EKG, and EMG) were collected. Analysis showed an increase in skin conductance and heart rate during the movie. This response, however, was inconsistent across trials, highly transient, and dependent upon an individual's engagement with the video. In most cases, this response peaked for only part of the scene, remaining at a lower level for the majority of the film. While clearly real, these responses were neither sustained nor controllable enough for use during an experiment. A more stable visual source of anxiety was therefore sought. The International Affective Picture System [103] is a set of images designed to provoke either positive or negative reactions in subjects, and correlated with physiological effects in both skin conductance and corrugator muscle activity [104], both of which are directly measured by the physiological sensor suite. Images such as mutilations, snakes, insects, and dead bodies, corresponding to high anxiety induction, were selected. By using a variety of images, it was expected that participants would be more likely to experience at least one anxiety-inducing stimulus.

Since this pilot experiment was done prior to the construction of the Haptic Creature version described in Section 3.1.1, the "Wizard of Oz" prototype constructed by Yohanan et al. [72] and shown in Figure 4.1 was used during the experiment. This prototype is a manually actuated predecessor of the present Haptic Creature. It consists of a warming element, a purring mechanism, inflatable ear-like appendages, and a pneumatically activated breathing mechanism.
In operation during the experiment, the breathing and purring mechanisms were activated at a constant, moderate rate by a facilitator.

Figure 4.1: "Wizard of Oz" Haptic Creature prototype used in the pilot experiment, showing the bellows used to simulate breathing and the heating pad.

4.1.3 Research Questions

There were two main research questions for this preliminary experiment.

• Would the prototype Haptic Creature be effective in reducing the level of anxiety experienced by a participant during the viewing of disturbing images, as measured by physiological sensors and surveyed self-assessments?

• What changes would be measurable or captured by the physiological sensors during the experiment, and could they be correlated with anxiety?

Physiological data were investigated both for an EMG reaction to the disturbing images, due to their visual nature, and for changes in average heart rate and skin conductance, which are two commonly accepted methods of measuring anxiety [105, 106]. A description of the calculations performed for this and the following experiments is included in Appendix A.2.

4.1.4 Experiment Procedure

This experiment took place in an ICICS experiment room that had been cleared of equipment. Participants sat in an office chair facing an HDTV television screen affixed to the wall. The encoder for the physiological sensors was placed on a small table beside the participant. Wiring from both the biosensors and the "Wizard of Oz" prototype Haptic Creature ran from the participant to a fake wall placed to the participant's right. The wall served to hide the prototype's actuators, computer equipment, and the experiment facilitators. During the experiment participants were viewed through cameras present in the room; unusual interactions with the prototype were noted. There were three main parts to this experiment: a preliminary questionnaire, two separate slideshow viewings, and a post-experiment questionnaire. The overall experiment procedure is outlined in Figure 4.2.

Figure 4.2: Diagram of the Pilot Experiment procedure: (i) baseline with calm images, (ii) disturbing images without the Creature, (iii) baseline with calm images, (iv) disturbing images with the Creature; the with/without-Creature ordering was randomized.

Preliminary Questionnaire

After signing consent forms, participants were given a written survey asking for general demographic information as well as the participant's experience and comfort with touch-based interaction. A copy of the questionnaire is included in Appendix B.1.1.

Sensor Attachment and Baseline

The participants were then fitted with three physiological sensors: skin conductance (SCR) on their non-dominant hand, three-lead electrocardiogram (EKG) on their chest, and surface electromyogram (EMG) on the corrugator muscle of their forehead. Sensor functionality was tested and confirmed before the facilitators retreated behind the wall. The participants then viewed a slideshow of calming nature scenes for two minutes whilst baseline data were gathered.

Disturbing Images Slideshows

The participants were given the "Wizard of Oz" prototype for either the first or second slideshow; the order was determined randomly. Once given the prototype, the participant was asked to sit with it for two minutes to gain familiarity with the device. While in the participant's lap, the prototype was manually actuated by an experiment facilitator to generate a breathing and purring sensation.
Participants were instructed to focus their eyes on the screen and not on the Haptic Creature during slideshow viewing. When the prototype was taken from the participant, it was removed to behind the fake wall.

Each slideshow consisted of twelve disturbing images, each shown for ten seconds, for a total of 120 seconds of disturbing images. The order of images shown was randomly determined for each participant from the total set of 24 images. After the first slideshow, the prototype was then given or taken away, and the participant was again shown two minutes of calming nature scenes while another baseline was gathered, giving the participant time to recover from the influence of the previous slideshow. The second slideshow then followed.

Concluding Questionnaire

After the second slideshow was completed, the physiological sensors were removed from the participant, and they were asked to rate their responses to both the images and the haptic device via survey. A copy of the questionnaire is included in Appendix B.1.2. Before beginning the "Overall Response" section of the questionnaire, participants were informed of the two operating modes of the Creature during the experiment.

4.1.5 Results

Ten participants, seven male and three female, between the ages of 20 and 30 took part in the experiment. All were undergraduate and graduate computer science and engineering students, and were compensated for their time (approximately 30 minutes). Due to an equipment malfunction one participant's physiological data were not usable, but his questionnaire data were included.

Self-Reported Results

Participants were surveyed as to their states of anxiety, agitation, and surprise during the disturbing image slideshows, both with and without the "Wizard of Oz" Haptic Creature prototype, on a 5-point Likert scale, with adjectives used previously for reporting affective state during human-robot interaction experiments [107]. Descriptions of quantitative survey results refer to general trends, not statistical analyses. Results are shown in Table 4.1. Participants had lower self-reported mean anxiety, agitation, and surprise with the prototype than without.

Table 4.1: Pilot Experiment: Self-reported Likert-scale responses to anxiety, agitation, and surprise (1 = strongly felt, 5 = weakly felt).

                 Prototype Present       No Prototype
    State        Mean     Std. Dev.      Mean     Std. Dev.
    Anxious      2.3      1.2            1.7      0.6
    Agitated     2.0      1.1            1.7      0.7
    Surprised    2.8      1.2            1.7      0.7

Participants were also surveyed as to their levels of comfort with the prototype during the experiment; these results are shown in Figure 4.3. Nine out of ten participants found the Creature comforting.

Figure 4.3: Preliminary experiment participant responses to the statement "Haptic Creature was comforting while viewing the images."

Participants were also surveyed as to whether they felt that the motions of the prototype were distracting while viewing the images; these results are shown in Figure 4.4. Participants generally expressed agreement with this statement; only two mildly disagreed. Participants were also surveyed as to whether they felt that the Creature would help them reduce their anxiety in other situations; these results are shown in Figure 4.5.
As a group, participants did not express any conclusive general opinion. There were no particular patterns identified within individual data, likely due to the small sample size. Participants were not given detailed interviews about their survey responses; they were, however, asked to provide comments on the Creature and the experiment.

Figure 4.4: Preliminary experiment participant responses to the statement "Haptic Creature's actions were distracting while viewing the images."

Figure 4.5: Preliminary experiment participant responses to the statement "Haptic Creature would help reduce my anxiety in other situations."

Physiological Results

Counting and visual inspection revealed that all subjects had a skin conductance response to at least six disturbing images in each slideshow, as marked by an increase in skin conductance when the image was presented. Therefore, statistical comparisons were made using the five images with the highest initial skin conductance responses for each subject. An example of skin conductance response for a subject during the calming images is shown in Figure 4.7, and for the same subject during the disturbing images, showing the initial response to images, in Figure 4.8. Note the large transients that occur at the onset of several new images; these indicate an orienting response. Regardless of whether they were holding the prototype, all subjects responded to at least six images with a jump in skin conductance of more than 20%. None had a significant response to all twelve images, and there was no order-related trend in these responses. The mean normalized skin conductance response was significantly lower with the Creature than without it (difference: M = 0.261, SD = 0.143, p < 0.05). Mean skin conductance responses per participant are shown in Figure 4.6.

Figure 4.6: Average normalized skin conductance response for the disturbing image slideshow with and without the Haptic Creature prototype, for each participant.

Figure 4.7: Typical normalized skin conductance response for a participant during the calming image set (the baseline). The vertical lines represent the start of a new image. The baseline is typically less than five percent of the maximum response.

Figure 4.8: Typical normalized skin conductance response for a participant during the disturbing image slideshow. The vertical lines represent the start of a new image.
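For reference, the per-image response measure described above can be sketched as follows. This is illustrative only: it assumes a simple session-wise min-max normalization (the platform's actual normalization follows Kulić et al. [97]), a fixed response window, and hypothetical variable names.

    import numpy as np

    FS = 256  # samples per second

    def image_scr_responses(scr, image_onsets_s, window_s=5.0, fs=FS):
        """Normalized skin conductance and the rise following each image onset."""
        scr = np.asarray(scr, dtype=float)
        norm = (scr - scr.min()) / (scr.max() - scr.min())   # session-wise [0, 1]

        responses = []
        for onset in image_onsets_s:
            start = int(onset * fs)
            stop = int((onset + window_s) * fs)
            segment = norm[start:stop]
            # response amplitude: rise from the level at onset to the peak
            # reached within the window after the image appears
            responses.append(segment.max() - segment[0])
        return norm, np.array(responses)

    # under this reading, a response greater than 0.2 corresponds to the
    # "jump of more than 20%" criterion mentioned above (an interpretation,
    # not necessarily the thesis's exact definition)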
As summarized in Table 4.2, the disturbing images induced significant changes in mean heart rate, mean normalized EMG, mean normalized heart rate acceleration, mean normalized derivative of skin conductance, heart rate standard deviation, and arousal as compared to the calming images. Arousal (as per Kulić et al. [108]), normalized skin conductance, and the normalized derivative of skin conductance were significantly lower during the disturbing images with the Creature than without the Creature.

Table 4.2: Summary of significant results from the Pilot Experiment.

    physiological metric              comparison                                  mean       SD        p
    mean heart rate [bpm]             calming images to disturbing images         1.86       3.59      < 0.001
                                      without creature
    mean normalized EMG               calming images to disturbing images         0.032      0.081     0.05
                                      without creature
    mean normalized heart             calming images to disturbing images         -0.00630   0.00290   0.002
    rate acceleration                 without creature
    mean normalized skin              calming images to disturbing images         0.0204     0.037     0.007
    conductance derivative            without creature
    heart rate standard               calming images to disturbing images         -3.32      5.19      0.046
    deviation [bpm]                   without creature
    arousal                           calming images to disturbing images         0.0522     0.0802    0.016
                                      without creature
    mean normalized skin              images without creature to                  -0.261     0.143     0.007
    conductance                       images with creature
    mean normalized skin              images without creature to                  -0.0154    0.0130    0.047
    conductance derivative            images with creature
    arousal                           images without creature to                  -0.0602    0.0769    0.001
                                      images with creature

4.1.6 Discussion

Responses from the surveys revealed many useful comments and several general trends. Participants reported either feelings or strong feelings of anxiety, agitation, and surprise, and all responded to at least six of the disturbing images in each set with the peak in skin conductance typically associated with a startle response [109]. There was no order-related trend in which images produced this response, suggesting that the participants did not become acclimatized to the disturbing images during the session. No participant had a physiological response to all of the images. Mean heart rate, EMG, heart rate acceleration, and heart rate standard deviation were also affected by the disturbing images. The EMG reaction to the disturbing images was likely due to their visual component, and the heart rate changes are consistent with a more anxious or aroused state. After the experiment, many subjects also reported to the facilitators that they found some of the images disturbing. It is likely that the disturbing images were successful in inducing anxiety in the participants.

In general, participants reported lower anxiety, agitation, and surprise with the Haptic Creature prototype than without. In addition, skin conductance response and inferred arousal (as per Kulić et al. [9]) during the disturbing images were significantly lower with the Creature than without. With such a small sample size, physiological results were encouraging, indicating that this approach was worthy of further research. Survey data indicated that subjects generally found the Haptic Creature prototype a comfort while viewing the disturbing images: this was encouraging feedback both for the form factor of the Creature and for the idea that a small robotic creature would be of any help in reducing a subject's anxiety. In comments, many subjects specifically mentioned the Creature's warmth as comforting, and several mentioned finding its simulated breathing prominent.
Some indicated that they found the gentle breathing of the Creature pleasant; interestingly, a few volunteered that this caused them to become more aware of their own breathing. It is therefore also likely that the Haptic Creature prototype had an effect on the participants.

Participants did, however, report that the prototype caused moderate to high levels of distraction during the image viewing. A device that purely distracts from sources of anxiety would be of limited utility, as this distraction would be of short duration and would preclude the accomplishment of other tasks. It is, however, possible that some subjects found the entire experience of the Haptic Creature unusual and hence distracting, and that their subjective reporting of distraction would decrease after spending additional time with the Creature. Although some participants may have found the Creature distracting, most subjects did not seem to find the prototype so distracting as to be annoying. There was also a net-positive but varied response to the proposition that the Haptic Creature prototype might reduce anxiety in stressful situations other than that of viewing disturbing images. There is also an experimental concern in that the Creature was never presented to the user in its inactive state, so it could not be determined whether the presence of the Haptic Creature prototype alone was sufficient to induce the observed effects.

4.1.7 Conclusions

Not all participants reacted to every disturbing image, but all had a skin conductance (SCR) response, marked by a peak in skin conductance, to at least six of the disturbing images in each set. A change in mean EMG, heart rate, heart rate acceleration, and heart rate standard deviation was also correlated with the images. The presence of the Haptic Creature prototype was correlated with reduced levels of both mean and normalized skin conductance response values, as well as inferred arousal, during the anxiety-inducing disturbing image task. Participants generally reported the Haptic Creature as comforting during the experiment, particularly liking its warmth and gentle breathing.

4.1.8 Feedback for Iterated Design

The overall positive feedback on the prototype device encouraged further investigation, and provided valuable guidance for Creature and TAMER platform experiment design, as well as experimental methods. Many of the lessons learned from this experiment were incorporated into the design of the Haptic Creature used for the subsequent experiments. Participants' favorable opinion of the warmth that the prototype was able to produce through its heating pad led to the installation of additional heating pads in the Creature. Due to the high comfort rating participants attributed to the plushness of the Creature, additional padding was added to the new Creature. In designing the control system of the new prototype, particular attention was paid to ensuring that the Creature would be able to interface with the physiological sensor suite directly, without requiring an additional experimenter to operate the Creature. This also reduced the complexity of using the system.

Deficiencies in the physiological sensing platform were also recognized and addressed. This preliminary experiment revealed that the existing physiological sensor software was insufficient for longer-term affect-based experiments.
In particular, it was difficult to correlate the sensor data logs with specific experimental conditions: the various stages of the experiment had to be identified by carefully timing the start of the experiment and noting at what time various events occurred relative to it, a potentially error-prone procedure when dealing with shorter-term physiological events. Participants remarked upon the breathing activity of the prototype, and many felt that the Creature's breathing increased their awareness of their own breathing. As breathing exercises and training are an important aspect of current anxiety treatment, it was necessary to add the respiration rate sensor to the physiological sensor set. As a result of rewriting the sensor software to support the respiration rate sensor, the ability to use both the skin temperature and blood volume pulse sensors was also gained.

This experiment also formed the basis for several methodological changes in the following experiments. Inducing anxiety ethically was always a challenging task. While the IAPS images seemed effective at inducing anxiety, they provoked an emotionally loaded encounter: many participants remarked upon the gruesomeness of the images, and expressed displeasure at having to view them. Longer-term studies along this vein would involve the viewing of many more images, which would not only be extremely unpleasant for participants, making recruitment difficult, but was highly unlikely to be approved (and would indeed be inappropriate) for the platform's targeted age group of children. There were also limitations on the sensor suite's ability to recognize anxiety: the existing inference engine proved unable to adequately measure anxiety and, more importantly, levels of anxiety in participants. The engine had been trained primarily on visual stimuli, and may not have been able to recognize the more subtle human reactions to changes in emotional state. As work to improve the emotional state recognition engine was already ongoing in a separate process, it was decided to focus the trial experiments of the platform on what the physiological sensors were capable of doing well: measuring effects on raw physiological metrics such as breathing rate, heart rate, and skin conductance. While ongoing work was investigating self-reported emotional responses to the Haptic Creature, there had not yet been any research investigating physiological reactions to interaction with the Haptic Creature. If the Haptic Creature did produce physiological reactions, there could be the potential to command these reactions through particular motions and activity states of the Creature in order to reduce the physiological metrics related to anxiety.

4.2 Experiment 1: Recognition of Mirroring and Initial Reactions to Creature

Following the preliminary experiment, the TAMER platform, as described in Chapter 3, was constructed. The following experiments, Experiment 1 and Experiment 2, describe small-scale studies that were intended as much for obtaining feedback and verification of the platform systems as for beginning to explore the potential physiological effects of the Creature, and possible roles for the Creature in anxiety reduction. The first experiment had two primary motivations: to begin the investigation of human physiological response to interaction with the Haptic Creature, and to determine whether participants could recognize the Haptic Creature mirroring their breathing rate and heart rate.
By linking the Creature's pulse and breathing mechanisms to those of the participant, as recorded by the physiological sensors, the Creature has the ability to "mirror" a user's breathing rate and pulse. This ability has several possible applications, some of which are particularly applicable for use within the TAMER platform, such as an alerting scenario in which the Creature attempts to inform its user of his or her own breathing rate and heart rate by mirroring. In a stressful or anxiety-inducing situation, participants may not recognize that they are becoming more stressed and anxious, or the degree to which that is the case. By seeing their own breathing and heart rate in the Creature, users could gain increased awareness of their own physiological state and take appropriate coping actions. Therefore, the primary goal of this first experiment was to determine user reaction to this mirroring: both their subjective responses and whether they could recognize it in the Creature.

A second goal of this first experiment was to determine whether the programmed actions of the Creature's mechanisms were recognized as both lifelike and appropriate to the Creature. Pilot studies and informal initial interactions suggested that users were able to distinguish between various "states" of the active Creature through the application of behavioral terms typically associated with a living animal: e.g., the Haptic Creature, when its breathing mechanism displayed fast breathing, would be perceived as "breathing heavily," whereas a slower breathing rate and lower intensity in the Creature would be perceived as "resting." It was not evident, however, whether a human participant's breathing rate and heart rate imposed on the Creature would be perceived in the same way. The small creatures that humans are generally familiar with, such as dogs or cats, typically have a higher heart rate and breathing rate than their owners. Consequently, the expected "normal" baseline activity of the Creature could in fact be at this level, which would be around the level of an excited human; normal human resting breathing rates and heart rates could appear lethargic in the Creature. This would affect both user recognition of mirroring and user determination of the Creature's emotional state. Accordingly, participant subjective responses as to their perceptions of Creature motion were collected and discussed.

Physiological manipulation of the user was approached indirectly in this experiment. Interacting with a pet has been associated with physiological reactions such as decreased heart rate [49] and breathing rate, as well as reduced levels of anxiety [110]. There was, therefore, the potential that the zoomorphic appearance and behavior of the Creature would allow it to provoke similar results. In order to have such effects, it was necessary to confirm that the Haptic Creature was, in fact, able to convey a sensation of both breathing and heart rate to the user, and that this could be recognized. At the very least, however, the Creature's similarity to a stuffed animal could also potentially provide comfort.
To investigate this, user physiological data were collected both for the initial reaction to the Creature and during the entire interaction session.

4.2.1 Research Questions

These motivations led to three primary research questions and goals:

• Examine participants' qualitative opinions of overall Creature feel and their reaction to medium-term interaction with the Creature. Are participants able to identify the breathing and pulse mechanisms, and do they find these mechanisms appropriate to the Creature?

• Determine whether participants are able to identify the Creature mirroring their breathing and heart rate, and if so, what are their reported reactions to it?

• Examine initial physiological reaction to the Creature. Does the Creature's state, whether motionless, breathing steadily, or mirroring the user, have an effect on physiological metrics of the participant?

In order for participants to recognize the Creature mirroring their physiological state, they would have to be able to distinguish the motions of their own breathing and heart rate in the Creature from those of the Creature operating at a constant breathing and heart rate. Therefore, physiological responses in skin conductance, blood volume pulse, EKG, and respiration rate were measured while the Creature was inactive, actively breathing at a constant rate, and then mirroring the participant's respiration and heart rate, each for ninety-second periods. This length was chosen to allow the experiment to be completed within a half-hour period to encourage participation; differentiation between stages was seen in pilot studies of this length. Participants were surveyed as to their impressions of the Creature's mechanisms and their reactions to the physiological mirroring.

4.2.2 Experiment Procedure

Experiments took place in an experiment room that had been emptied of all equipment except for a table placed against the wall. During the experiment participants remained seated at the large table, facing the wall. The physiological sensor encoder was placed on the table, to the right of the participant. The wiring from the sensors, the experiment facilitator, the Haptic Creature support equipment, and the computers were located behind a fake wall to the right of the participant. A web camera affixed to the top of the wall was used to observe the participant during the experiment. Participants wore noise-canceling headphones during the experiment. The experiment consisted of the four phases shown in Figure 4.9 and described here.

Figure 4.9: Diagram of the Experiment 1 procedure: (i) no Creature, (ii) Creature still, (iii) Creature mirroring subject, (iv) Creature constant rate motion; the order of stages (iii) and (iv) was randomized.

After signing consent forms, participants were fitted with skin conductance (SCR), blood volume pulse (BVP), and skin temperature (ST) sensors on their non-dominant hand, as well as three-lead electrocardiogram (EKG) and respiration rate (RR) sensors. The sensors were then activated and tested. If necessary, adjustments were made to sensor fit to ensure that they were functioning properly.

(i) No Creature

Participants were then asked to sit calmly for ninety seconds while a baseline was gathered, which began once the facilitator had returned behind the wall.
As this stage was the initial baseline gathered for the participant, it was necessarily always performed first.

(ii) Creature Still (CS)

The participants were then introduced to the Haptic Creature, which was placed in their lap. They were instructed to sit quietly with the Creature on their lap, and to feel free to pet and interact with it. They were requested to try to maintain at least one hand on the Creature at all times during their interaction. After the facilitator had returned behind the wall, physiological data were gathered for ninety seconds. As this stage incorporated a combination of the initial reaction to the Creature and the reaction to the still Creature, it was always performed second.

(iii) Creature Mirroring Subject (CM)

The facilitator then returned to the participant and informed him or her that the mechanisms of the Creature would now be activated. After the facilitator returned behind the screen, the Creature was turned on. It began mirroring the participant's breathing and heart rate: a detected pulse from the EKG sensor triggered a pulse on the Creature, and the output of the respiration rate sensor was commanded onto the Creature's breathing mechanism. This continued for ninety seconds, during which physiological data continued to be gathered. The order of this stage and of the "Creature constant motion" stage (iv) was counterbalanced.
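Functionally, the mirroring mode is a direct pass-through from the sensed signals to the Creature's actuators. The sketch below is purely illustrative: the sensor and Creature interfaces shown (new_heartbeat, respiration_level, send_pulse, set_breathing) and the loop rate are hypothetical placeholders, not the platform's actual API.

    import time

    def mirror_loop(sensors, creature, duration_s=90.0, rate_hz=50.0):
        """Drive the Creature from the participant's own physiological signals."""
        start = time.time()
        while time.time() - start < duration_s:
            # each heartbeat detected on the EKG channel triggers one Creature pulse
            if sensors.new_heartbeat():
                creature.send_pulse()
            # the respiration sensor output is commanded directly onto the
            # Creature's breathing mechanism
            creature.set_breathing(sensors.respiration_level())
            time.sleep(1.0 / rate_hz)  # assumed control-loop rate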
(iv) Creature Constant Motion (CCM)

The Creature's constant motion stage was then begun. In this mode the Creature breathes at a constant rate and intensity of twelve breaths per minute, with a pulse rate of seventy beats per minute, typical of a resting human adult. The transition from the previous stage to this mode occurred without comment from the facilitator. This stage continued for ninety seconds, during which physiological data continued to be gathered. The order of this stage and the "Creature mirroring subject" stage was counterbalanced; in both cases transitions occurred smoothly and without comment.

(v) Experiment Ending and Questionnaire

The physiological data collection was then ended, and the Creature removed from the subject. The participant then removed the sensors, and a post-experiment questionnaire was administered; a copy is included in Appendix B.2.1.

4.2.3 Results

Ten subjects, three female and seven male, took part in this experiment. None had participated in previous experiments. All were graduate or undergraduate engineering or computer science students between the ages of eighteen and thirty.

Self-Reported Results

Descriptions of quantitative survey results refer to general trends, not statistical analyses. Only two subjects were able to recognize the Creature's behavior during the mirroring stage as mirroring their breathing and heart rate. The responses from the post-experiment questionnaire are shown in Table 4.3.

Table 4.3: Results from the Experiment 1 questionnaire (1 = strongly disagree, 5 = strongly agree), n = 10. The statements rated were:

    It was easy to recognize the creature mirroring my breathing.
    I found the creature mirroring my breathing comforting (if noticed).
    I found the creature mirroring my breathing disturbing (if noticed).
    The creature's breathing made me more aware of my own breathing.
    It was easy to recognize the creature mirroring my pulse.
    I found the creature mirroring my pulse comforting.
    I found the creature mirroring my pulse disturbing.
    The creature's pulse made me more aware of my own heart rate.
    I found the creature comfortable on my lap.
    I was startled by the activation of the creature.
    I found the creature's motion disturbing.
    I found the noise of the creature distracting.

Physiological Responses

Group-wise and within-subjects comparisons were performed for several physiological metrics. Pool-wise comparisons, based on two-tailed dependent-sample t-tests (α = 0.05), are summarized in Table 4.4. Within-subjects comparisons were performed where more than one data point existed for each participant in each stage, namely for the series of individual breath lengths and heart rate interbeat intervals.

Table 4.4: Summary of results from Experiment 1 (CS = Creature still, CM = Creature mirroring, CCM = Creature constant motion). Comparisons significant at p < 0.05 are marked with an asterisk.

    physiological metric           CS-CM      CS-CCM     CM-CCM     units
    breath length mean      mean   0.263      0.115      -0.148     ms
                            sd     0.801      0.928      0.439
                            p      0.351      0.718      0.338
    breath length sd        mean   -0.358     -0.104     0.254      ms
                            sd     0.458      0.649      0.420
                            p      0.043*     0.643      0.102
    heart rate mean         mean   -3.90      +2.00      -1.54      bpm
                            sd     4.64       4.17       2.23
                            p      0.045*     0.212      0.075
    heart rate variability  mean   -0.023     -0.012     0.010      bpm/ms
                            sd     0.024      0.025      0.023
                            p      0.022*     0.186      0.215
    skin temperature mean   mean   0.759      0.741      -0.008     °C
                            sd     0.585      0.582      0.003
                            p      0.040*     0.047*     0.956
    skin temperature sd     mean   0.017      0.052      0.034      °C
                            sd     0.199      0.069      0.155
                            p      0.808      0.066      0.548
    skin conductance mean   mean   2.18       1.96       -0.515     µS
                            sd     2.01       1.49       0.404
                            p      0.022*     0.047*     0.104
    skin conductance sd     mean   -0.127     0.007      0.134      µS
                            sd     0.364      0.202      0.310
                            p      0.322      0.925      0.227

Breath Lengths

The series of breath lengths for each subject were compared between the Creature still, Creature constant motion, and Creature mirroring stages using two-tailed within-subjects unequal-variance t-tests (α = 0.05). Six of ten participants were found to have a significant difference (p < 0.05) between breath lengths with the Creature still and with the Creature in constant motion, and seven between breath lengths with the Creature still and with the Creature mirroring the subject. Of those seven, three also had a significant difference (p < 0.05) between breath lengths with the Creature in constant motion and with the Creature mirroring the subject.
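Both kinds of test reported above are standard; as a point of reference, the sketch below reproduces them with SciPy on illustrative (made-up) arrays, not the experiment's actual data.

    import numpy as np
    from scipy import stats

    # pool-wise comparison: one value per participant for each of two stages,
    # compared with a two-tailed dependent-sample (paired) t-test
    cs_means = np.array([3.9, 4.1, 3.6, 4.4, 3.8, 4.0, 3.7, 4.2, 3.9, 4.1])
    cm_means = np.array([3.7, 4.0, 3.4, 4.1, 3.9, 3.8, 3.6, 4.0, 3.8, 3.9])
    t_pool, p_pool = stats.ttest_rel(cs_means, cm_means)

    # within-subject comparison: one participant's series of breath lengths in
    # two stages, compared with a two-tailed unequal-variance (Welch) t-test
    cs_breaths = np.array([4.2, 3.9, 4.5, 4.1, 3.8, 4.3])
    cm_breaths = np.array([3.1, 2.8, 3.3, 2.9, 3.0, 2.7])
    t_within, p_within = stats.ttest_ind(cs_breaths, cm_breaths, equal_var=False)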
The mean and standard deviation of breath lengths are graphed in Figure B.31.

4.2.4 Discussion

Qualitative Results

Overall initial reactions to the Creature were investigated through survey questions and interview responses to determine participants' qualitative opinions of the overall Creature feel and whether they found the Creature's actions appropriate to it. These responses were typically positive, with no overtly negative opinions of the Creature's feel or behavior, nor of interaction with it. Most participants, upon their introduction to the Creature, expressed a desire to touch and feel it. Participants generally agreed with the statement "I found the Creature comfortable on my lap" (see Figure 4.10). This comfort level with the Creature was important both in that participants were able to tolerate the placement of a new device on their lap, and in that they were comfortable with such a device moving and being "active" in such a personal and private part of the body. Participants generally expressed that they liked the motion of the Creature: one described that it "made the Creature seem much more real and lifelike." One participant noted that she found "feeling the pulse of the Creature was really comforting." When asked what they liked most about the Creature, a majority of respondents mentioned a positive reaction to the Creature's warmth on their lap. There were no complaints about the breathing or pulse mechanisms seeming disturbing or disconcerting; most stated that this behavior was in line with their expectations for the Creature. However, most participants did find the pulse mechanism of the Creature to be noisy and moderately distracting. There was an audible clicking sound whenever a pulse took place that was quite noticeable in the quiet of the experiment room.

Figure 4.10: Experiment 1 participant responses to statement "I found the creature comfortable on my lap" (number of responses per ranking, 1 = strongly disagree, 5 = strongly agree).

Although comfortable with the Creature, participants were less successful in linking the Creature's breathing and pulse with their own. There was no consensus on whether the Creature's breathing and pulse made them more aware of their own breathing and pulse (see Figure 4.11). One participant noted that she became worried about the Creature when its breathing rate changed, an indication that perhaps this participant viewed the Creature as having some form of "life."

Figure 4.11: Experiment 1 participant responses to the question of whether the creature's actions made them more aware of their own (separate responses for breathing and pulse).

Responses were investigated to determine if participants were able to identify the Creature mirroring their breathing and heart rate, and if so, their reported reactions to it. Results are reported in Figure 4.12. As a group, participants were consistently unable to identify the Creature mirroring their own breathing and pulse, with only a single participant able to recognize this behavior. Most thought that there were two or three different operating modes of the Creature; these modes were typically identified as "fast and slow" or "smooth and random," not as mirroring.
Once informed that the second mode of the Creature was mirroring their breathing and pulse, most participants expressed surprise; one participant even stated that he "did not think I was breathing that fast or heavy." Almost all rated mirroring as very difficult to observe. One participant stated: "mirroring could be made more obvious." Without any explanation that the Creature would mirror the participant, it appears that there was no expectation that such mirroring could occur. On reflection, when a small animal is placed on our laps, while we may investigate its breathing and heart rate to assess its emotional state, most of us do not immediately compare its breathing rate and heart rate to our own.

Figure 4.12: Experiment 1 participant responses to statement "It was easy to recognize the creature mirroring my..." (separate responses for breathing and pulse).

The one participant who was able to recognize the Creature mirroring her breathing and pulse was unable to offer an explanation for this ability, but did hypothesize that because she plays a musical instrument she may be more cognizant of her own breathing than other people. She had a strongly negative reaction to mirroring, responding that "I really did not like this. I found it difficult to breathe normally. It was much better to match my breathing to the Creature." As she had been exposed to the Creature constant motion stage before the Creature mirroring stage, it is likely that during the Creature constant motion stage she was attempting to match her breathing to that of the Creature. It is possible that the sudden transition from attempting to match the breathing of the Creature to now finding herself guiding the Creature could be disturbing. Indeed, the participant would ultimately find herself in a sort of positive feedback loop until the limits of the Creature's respiration mechanism were reached.

Physiological Reactions

Initial physiological reactions to the Creature were investigated; these were generally inconclusive. Comparisons were first made between the breath lengths of participants during each stage. Breath lengths were determined from analysis of the respiration sensor waveform: peaks and troughs were detected, and breath lengths were calculated from the intervals between them. Where there were obvious noise artifacts in the signal (most likely from movement or talking), attempts were made to interpolate the breath length by identifying the underlying wave pattern. The respiration rate sensor is particularly sensitive to the motions of the abdomen that occur during speech, as these often greatly overshadow the breathing motion. Figure 4.13 shows the breath lengths of a participant during the experiment. During the baseline the participant took longer breaths than during the Creature constant motion or Creature mirroring stages, and indeed the mean of both the Creature constant motion and Creature mirroring stages is close to the commanded 2.5 second breath length of the Creature during the Creature constant motion stage.
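As an illustration of the breath-length extraction just described, peaks in the respiration waveform can be detected and the intervals between successive peaks taken as breath lengths. The sketch below uses SciPy's find_peaks on a synthetic signal; it is a simplified stand-in for the actual procedure (which also required manual interpolation around speech and movement artifacts), and the sampling rate, thresholds, and names are assumptions.

    # Simplified breath-length extraction from a respiration waveform (illustrative only).
    import numpy as np
    from scipy.signal import find_peaks

    def breath_lengths(resp, fs, min_breath_s=1.0):
        """Return breath lengths in seconds, measured peak to peak."""
        resp = np.asarray(resp, dtype=float)
        # Peaks must be at least min_breath_s apart and reasonably prominent,
        # so that small movement artifacts are not counted as breaths.
        peaks, _ = find_peaks(resp,
                              distance=int(min_breath_s * fs),
                              prominence=0.5 * np.std(resp))
        return np.diff(peaks) / fs

    # Synthetic example: 12 breaths per minute sampled at 32 Hz with a little noise.
    fs = 32.0
    t = np.arange(0, 120, 1.0 / fs)
    resp = np.sin(2 * np.pi * (12.0 / 60.0) * t) + 0.05 * np.random.randn(t.size)
    lengths = breath_lengths(resp, fs)
    print(round(lengths.mean(), 2))   # approximately 5.0 s per breath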
Figure B.31 shows the mean and standard deviation of breath lengths for all participants during the experiment.

Figure 4.13: Breath lengths of a participant during Experiment 1 (breath length in seconds versus time, with the baseline, Creature constant, and Creature mirroring stages and their means marked).

The activation of the Creature was strongly correlated with a change of breathing rate in six of the ten participants. The same six showed a change both from the Creature still stage to the Creature constant motion stage and from the Creature still stage to the Creature mirroring stage. An additional three showed a difference between their breath lengths during the constant motion and mirroring stages. It should be noted that the subjects who did not react to the constant motion Creature also did not react to the mirroring Creature: their mean breath lengths remained similar throughout the entire experiment, and the standard deviation of their breath lengths generally remained low and similar for each stage.

T-tests were also conducted to determine if the participants' breath lengths were distinguishable from the Creature's constant motion breath lengths. They were distinguishable during the baseline for five out of ten participants, and also during the Creature constant motion and Creature mirroring stages for five (different) subjects. This is an indication that the chosen commanded breath rate was similar to the average resting respiration rate. Two of the six subjects whose breathing rates were affected had breathing rates that were indistinguishable from the Creature's.

The important result in group trends was that overall there was a significantly lower standard deviation of breath lengths during the Creature motion stage as compared to the Creature still stage. This implies that breathing became more "regular" as a result of the active Creature, and that the steady and repeated motion of the Creature was able to induce a similar steadiness in the subject's breathing. A similar increase in steadiness was shown by the reduction of heart rate and heart rate variability.

Analysis of the series of heart rate interbeat intervals for each participant indicates that nine out of ten participants had a change in heart rate from the Creature still stage to the Creature mirroring stage, and seven from the Creature still stage to the Creature constant motion stage. We propose that this heart-rate change was induced by the Creature. Mean heart rate was significantly less during the Creature constant motion stage than during the baseline, making it likely that this change induced by the Creature was in the negative, i.e. more relaxed, direction. Heart rate standard deviation, or heart rate variability, was also significantly reduced during the Creature constant motion stage as compared to the baseline.

The increase in mean skin conductance is likely due to sensor drift during the course of the experiment. Most participants showed a brief peak in skin conductance when the Creature was activated, indicative of the startle response, but there were no other large peaks during the experiment.

The increase in skin temperature for both the Creature constant motion and the Creature mirroring stages as compared to the Creature still stage is likely indicative of an increase in relaxation during the experiment.
It is unlikely that this was caused directly by the warmth of the Creature, as the skin temperature sensor was worn on the back of the ring finger of the non-dominant hand, and therefore was generally placed farther away from the Creature's main source of warmth, its breathing mechanism. A trial experiment with the temperature sensor mounted on the anterior dorsal end of the Creature did not reveal any significant temperature change after five minutes of the Creature's mechanisms being activated.

A summary of the significant results from the experiment is shown in Table 4.5.

Table 4.5: Summary of significant results from Experiment 1.

    physiological metric     comparison                                    mean       SD         p
    breath length sd         Creature still to Creature constant motion   -0.358 s   0.458 s    0.043
    mean heart rate          Creature still to Creature constant motion   -3.90 bpm  4.64 bpm   0.045
    heart rate variability   Creature still to Creature constant motion   -0.023 s   0.024 s    0.022
    mean skin temperature    Creature still to Creature constant motion   0.759 °C   0.585 °C   0.040
                             Creature still to Creature mirroring         0.741 °C   0.582 °C   0.047
    mean skin conductance    Creature still to Creature constant motion   2.18 µS    2.01 µS    0.021
                             Creature still to Creature mirroring         1.96 µS    1.49 µS    0.047

4.2.5 Conclusions

Users did not report any overtly negative reactions to overall interaction with the Creature. Participants had a high awareness of the breathing mechanism of the Creature, but a lower awareness of its pulse mechanism. Participants found the Creature comfortable on their laps and had no disturbing reactions to or adverse opinions of the motion of the Creature during their interactions with it. Nine of the ten participants were not able to recognize the Creature mirroring their own physiological state. Exposure to the Creature produced a reduction in heart rate variability, mean heart rate, and the standard deviation of breath lengths, as well as an increase in skin temperature, during the Creature constant motion stage as compared to baseline; these are physiological indications of relaxation. The reduced heart rate and breath length standard deviations are closer to the Creature's, which ran at a constant rate during the constant motion stage.

4.2.6 Feedback for Iterated Design

This experiment provided valuable feedback as to the utility of the haptic anxiety reduction platform. In its first use with test participants, the functioning hardware and software components of the system were validated. Participant reports led to several hardware and procedural modifications to the platform.

The first area of concern was Creature noise. Several participants described the noise of the Creature as "distracting," and responses to the questionnaire question about Creature sound indicated a similar reaction. Efforts were therefore made to reduce the sounds emitted by the Creature. The greatest source of noise, the Creature's pulse mechanism (see Figure 3.6), was removed and lubricated, with foam padding added where the pulse mechanism is attached to the Creature. The Creature's startup routine was also adjusted to prevent sudden noises emanating from the pulse mechanism if the Creature needed to be reset or lost power during operation. Additionally, the Creature's breathing servo refresh rate was increased to eliminate a vibration sound that was noticeable when the breathing mechanism was under heavy load.
After these modifications, the Creature's sound output level was noticeably lower, and in observations with noise canceling headphones little to no Creature sound could be discerned. In extremely quiet environments such as the experiment testing rooms, the use of noise canceling headphones is now recommended where practical.

Noise emitted by the Creature turned out to be a much more solvable problem than the companion problem: noise emitted by the participant, namely talking. Speech requires air to be directed over the vocal cords, and in the process the normal respiration waveform is disrupted. The respiration rate sensor proved extremely sensitive to interference from talking; this sensitivity often led to inaccurate estimates of respiration rate that required manual correction. As a result, care is now taken to ensure that the experiment facilitator is out of sight during the experiment, so that the participant is not inclined to speak. If the respiration rate estimate appears to be abnormally high or low, additional time is taken on the baseline stage so that the respiration rate can be recalculated.

The inability of most participants to recognize mirroring during the experiment may have been symptomatic of a lack of formal introduction to the Creature. Interaction with the Creature is intuitive only when it is viewed as a robotic pet whose mechanisms add the mechanical sensations of life to an otherwise inanimate object. The concept of a robotic pet physiologically linked to its user did not occur to most participants, even after they themselves were equipped with physiological sensors. This is not necessarily surprising, as the physiological sensors are most often used in experiments to record reactions to various stimuli, and very rarely are used as the direct input for another system. Before future experiments, care should be taken to describe the functioning of the Creature: both the various mechanisms and the fact that it is capable of reacting to physiological sensor input from the participant. This will ensure that participants know what to look for in terms of Haptic Creature activity changes, as well as provide a baseline expectation of Creature breathing rate and pulse rate that is near to their own. The strong negative reaction that a participant had upon finding the Creature mirroring her breathing rate indicates that this capability may not be advisable in scenarios where the participant is following the Creature's breathing, as it could potentially lead to an uncomfortable positive feedback loop. A sudden change to mirroring may be useful as a high-salience indicator to alert the participant during a task.

4.3 Experiment 2: Creature Entraining and Reactions During a Task

While Experiment 1 investigated mostly the subjective response to the Creature, answering the questions of "Will people like it?" and "Will people be receptive to it?", an attempt to manipulate the user's affect, a key goal of the TAMER platform, had not yet been performed. During Experiment 1, there had been an observation of increased "steadiness," that is, a decrease in the standard deviation of both breathing rate and heart rate during the experiment attributed to the Creature.
This had been an encouraging result: it showed that the Creature was able to at least somewhat influence the user's physiological state. It was proposed to further investigate this ability of the Creature, both directly, by asking users to follow the Creature, and indirectly, by examining the Creature's physiological effect when the user was performing a task.

The primary goal of this experiment was to investigate whether a change in Creature "physiological state," as conveyed through its respiration and pulse mechanisms, has an effect on a participant's physiological state (as measured through pulse and respiration rate). Unlike in the previous experiment, where the Creature had simply been activated or deactivated, here a more focused change in Creature activity was adopted, one that would also be of use in determining whether participants might find higher or lower activity levels in the Creature more noticeable. In this experiment, the Creature was progressed from a physiological state mirroring the participant's respiration and pulse (their baseline) to a state with either a faster respiration rate and higher pulse, or a slower respiration rate and a lower pulse. After some time in this new state, the Creature was progressed back again toward the original pulse and respiration baseline. This is shown in Figure 4.15. Time period lengths were chosen to allow the experiment to be completed within a half-hour period to encourage participation; overall, experiment stage lengths were generally greater than in the previous experiment, which the shorter questionnaire allowed.

The gradual adjustment in Creature activity would prevent any disconcertion from the Creature being suddenly activated or deactivated, and would also preclude recognition of a sudden change in Creature activity. A difference of 20 percent from baseline in respiration rate and 20 beats per minute in heart rate was chosen as representing a distinguishable difference in Creature activity levels while not exceeding the capabilities of the platform. Larger deltas resulted in extremely fast and noisy Creature motions, often to a distracting level, during the elevated activity state. The transitions between the high and low activity levels were generally shorter than the constant-rate periods, since where physiological comparisons were made between the high and low activity states, enough time was needed for participant physiological metrics to stabilize.

A secondary goal of this experiment was to determine if the Creature could influence its user when the user was not directly engaging with the Creature. This would help support the role of the TAMER platform in its ultimate end environment: one in which the Haptic Creature acts as mere accompaniment while the user performs another task. There were two stages of interaction with the user to investigate this. In the first, the participant was invited to interact with the Creature in a focused way, through petting or stroking the Creature, for several minutes. In the second, participants held the Creature on their laps while performing a secondary task, in this case reading literature. It was expected that they would find the Creature's motions and actions comforting, but not distracting from their task.

In the previous experiment it had been found that participants required a thorough introduction to the Creature.
Even after being equipped with physiological sensors, participants did not recognize that the Creature could be linked to their own physiological state, and several of the Creature's mechanisms, particularly the pulse, are not obvious without careful inspection. As part of the introduction, therefore, it was decided to ask the participant to mirror the Creature's breathing and heart rate for a brief period, a procedure henceforth called "entrainment" (cf. "mirroring"). This would help accomplish several goals. Breathing rate training as part of relaxation therapy is an important part of many anxiety reduction techniques, and the Haptic Creature's ability to display controlled breathing rates could allow it to act as a trainer. If users could successfully mirror the Creature's breathing, it would help to confirm one possible usage scenario of the TAMER platform. By matching user breathing with the Creature's, this entraining would also help provide an expected activity level for the Creature near the user's own breathing rate and heart rate, giving participants a calibration of what activity levels to expect from the Creature for the rest of the session.

4.3.1 Research Questions

In this experiment the following research questions were posed:

• Can participants consciously mirror the Creature's respiration rate when instructed to do so? If so, does this mirroring affect the participant's physiological state?

• Does Creature motion affect participants' physiology either when interacting with the Creature or when performing a task with the Creature on their laps?

• Is the Creature distracting to participants when they are asked to perform a simple, non-stimulating mental task?

Overall group trends were analyzed. Skin temperature, heart rate variability, heart rate acceleration, and skin conductance were examined for any prevailing trends through pool-wise comparison between stages using two-tailed dependent sample t-tests (α = 0.05).

4.3.2 Procedure

Experiments took place in an experiment room from which all equipment had been removed except for a table placed against the wall. During the experiment participants remained seated, facing the wall, at the large table. The physiological sensor encoder was placed on the table, to the right of the participant. The wires from the sensors, the experiment facilitator, the Haptic Creature support equipment, and computers were located behind a false wall to the right of the participant. A web camera affixed to the top of the wall was used to observe the participant during the experiment. The experiment consisted of the four phases shown in Figure 4.14 and described here.

Figure 4.14: Diagram of the Experiment 2 procedure: creature still; ramped creature motion without task, user asked to mirror; ramped creature motion with task, user asked to breathe normally; ramped creature motion without task, user asked to breathe normally (order of the last two randomized).

Introduction and Baseline

After signing consent forms, participants were fitted with skin conductance (SCR), blood volume pulse (BVP), and skin temperature (ST) sensors on their non-dominant hand, as well as three-lead electrocardiogram (EKG) and respiration rate (RR) sensors. The sensors were then activated and tested. If necessary, adjustments were made to sensor fit to ensure that they were functioning properly. Participants were then asked to sit calmly for ninety seconds while a baseline was gathered.

Stage 1: Creature Still

Participants were given the Haptic Creature.
It was placed on their lap, and its respiration and pulse mechanisms were described and pointed out. They were instructed to sit quietly with the Creature and to feel free to interact with it by petting, stroking, or touching. Physiological data continued to be gathered for ninety seconds after the facilitator had moved out of sight of the participant. These ninety seconds are stage 1 in Figure 4.14 and in other references.

Stage 2: Ramped Creature Motion, User Asked to Mirror

Participants were then informed that the mechanisms of the Creature would now be activated. After the facilitator had returned behind the screen, the Creature began to mirror the physiological state of the user in both heart rate and respiration. The facilitator then returned to the participant and invited him or her to mirror the Creature's breathing with his or her own. Once the facilitator returned behind the screen, the Creature immediately began a progression consisting of a thirty second "ramp" to a breathing rate 20% higher and a heart rate 20 beats per minute higher than those of stage 1, sixty seconds at the new, higher rate, then a sixty second ramp down to a breathing rate 20% lower and a heart rate 20 beats per minute lower than those of stage 1, followed by sixty seconds at that rate. The Creature was then deactivated. These two hundred and ten seconds are stage 2 in Figure 4.14 and in other references.

Figure 4.15: Ramped Creature motion, as used during experiments: from the gathered baseline, the commanded state ramps to heart rate + 20 bpm and 1.2 × respiration rate, then down to heart rate - 20 bpm and 0.8 × respiration rate.

Stage 3: Ramped Creature Motion With User Task, User Asked to Breathe Normally

Participants were then assigned a reading task. They were asked to read selections from three Graduate Record Examinations™ [111] reading passages, count the number of words containing four syllables, and write this number at the bottom of the page. They were instructed to keep at least one hand on the Creature at all times, and to keep the reading material on the desk rather than hold it in their hands. During this stage the Creature performed a ramped motion similar to that of stage 2 but longer, consisting of a sixty second "ramp" to a breathing rate 20% higher and a heart rate 20 beats per minute higher than those of stage 1, one hundred and twenty seconds at the new, higher rate, then a one hundred and twenty second ramp down to a breathing rate 20% lower and a heart rate 20 beats per minute lower than those of stage 1, followed by one hundred and twenty seconds at that rate. The Creature was then deactivated. These four hundred and twenty seconds are stage 3 in Figure 4.14 and in other references.

Stage 4: Ramped Creature Motion Without User Task, User Asked to Breathe Normally

Participants were then instructed to sit calmly with the Creature while the same ramp progression as in stage 3 was performed. These four hundred and twenty seconds are stage 4 in Figure 4.14 and in other references.

Questionnaire

The Creature was collected, the sensors removed, and a post-experiment questionnaire administered. A copy of the post-experiment questionnaire is included in Appendix B.3.1.

The order of stages 3 and 4 was determined randomly.
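For reference, the ramped command profiles of stages 2-4 can be viewed as piecewise-linear schedules around the participant's stage 1 baseline, with the 20% offsets applied to respiration rate and the 20 beats per minute offsets to heart rate (as in Figure 4.15). The sketch below is an illustration only: the segment durations follow the stage 2 description above, while the function names and data structures are hypothetical and are not taken from the platform's software.

    # Illustrative stage 2 ramp schedule for the Creature's commanded rates (hypothetical names).
    def stage2_schedule(base_breath_bpm, base_heart_bpm):
        """Segments of (label, duration_s, start_rates, end_rates), where rates are
        (breaths per minute, beats per minute): 30 s ramp up, 60 s hold at the high
        rates, 60 s ramp down, 60 s hold at the low rates."""
        base = (base_breath_bpm, base_heart_bpm)
        hi = (base_breath_bpm * 1.2, base_heart_bpm + 20)
        lo = (base_breath_bpm * 0.8, base_heart_bpm - 20)
        return [("ramp", 30, base, hi),
                ("hold", 60, hi, hi),
                ("ramp", 60, hi, lo),
                ("hold", 60, lo, lo)]

    def command_at(schedule, t):
        """Linearly interpolate the commanded (breath, heart) rates at time t seconds."""
        for _, duration, start, end in schedule:
            if t <= duration:
                f = t / duration
                return tuple(s + f * (e - s) for s, e in zip(start, end))
            t -= duration
        return schedule[-1][3]   # hold the final rates after the schedule ends

    sched = stage2_schedule(base_breath_bpm=15, base_heart_bpm=70)
    print(command_at(sched, 45))   # during the first hold: (18.0, 90.0)

The stage 3 and 4 schedules follow the same pattern, with a sixty second ramp and one hundred and twenty second holds.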
Stage 2 was always performed first to ensure that participants were aware of the location and actions of the Creature's mechanisms, as well as the intended relation between the Creature's mechanisms and their own breathing and heart rate.

Nine undergraduate or graduate computer science and engineering students between the ages of twenty and thirty, four of whom were female, took part in this experiment. None had participated in the previous experiments. Participants were compensated for their time.

4.3.3 Results

Qualitative and then physiological results are reported in this section.

Qualitative Results

A summary of the questionnaire results is shown in Table 4.6. Descriptions of quantitative survey results refer to general trends, not statistical analyses. Participants reported a high ability to easily mirror the Creature's breathing, and generally a high awareness of the Creature's breathing and pulse.

Table 4.6: Questionnaire results from the Experiment 2 post-experiment survey (responses on a 1-5 scale, 1 = strongly disagree, 5 = strongly agree). The rated statements were:

When asked to mirror creature:
• I was able to easily mirror the creature's breathing
• I was aware of the creature's pulse
• I was comfortable with creature on my lap
• I was aware of my own breathing
• I was aware of my own heart rate
• I found noise of creature distracting

While sitting with active creature:
• I was aware of the creature's breathing
• I was aware of the creature's pulse
• I noticed changes in the creature's breathing
• I noticed changes in the creature's pulse
• I was aware of my own breathing
• I was aware of my own heart rate
• I was comfortable with creature on my lap

During reading task:
• I was aware of the creature's breathing
• I was aware of the creature's pulse
• I noticed changes in the creature's breathing
• I noticed changes in the creature's pulse
• I was aware of my own breathing
• I was aware of my own heart rate
• I was comfortable with creature on my lap
• I found creature's motion distracting

Overall:
• creature made me more aware of breathing
• creature made me more aware of heart rate
• enjoyed interacting

Physiological Results

Breath Lengths

Typical physiological results from the experiment are shown in Figure 4.16, which shows a participant's breathing rate and heart rate during the second stage of the experiment, in which they were asked to mirror the Creature. In the leftmost frame of the graph the baseline is gathered. At the sixty second mark on the graph the Creature has ramped down to a constant commanded breath length of 80% of baseline, and here the participant's mean respiration rate is almost the same as the commanded respiration rate (the commanded and mean breath length lines overlap). During this time period the mean heart rate is increased slightly from baseline, but not to near the commanded value of twenty beats per minute greater than the baseline mean heart rate. In the other constant motion stage of the experiment, starting at the one hundred and eighty second mark on the graph, participant respiration rate remains almost constant at the commanded breath length of 120% of the baseline; here again the commanded and mean breath lengths overlap.
During this period the mean heart rate is increased slightly both from the previous period and from the baseline, whereas the commanded heart rate was twenty beats per minute lower than baseline.

All participants showed a greatly reduced standard deviation of breath lengths when asked to mirror the Creature, and this reduction tended to persist, with standard deviations remaining lower for most participants both when sitting calmly and when performing the task than during the baseline. On average, standard deviations were slightly but not significantly higher when performing the task than when sitting calmly. There was a statistically significant difference in the standard deviation of breath lengths between the baseline and the training stage (M = 1.15 s, SD = 0.535 s, p < 0.05), the baseline and sitting calmly (M = 0.780 s, SD = 0.686 s, p < 0.05), and the baseline and performing the task (M = 0.596 s, SD = 0.524 s, p < 0.05), as well as between the training stage and sitting calmly (M = -0.377 s, SD = 0.411 s, p < 0.05) and between sitting calmly and performing the task (M = -0.560 s, SD = 0.395 s, p < 0.05).

In general, mean breath length differed significantly between the faster and slower commanded respiration series during the training stage (M = -2.91 s, SD = 1.98 s, p < 0.05), with the Creature during the task (M = 1.46 s, SD = 1.18 s, p < 0.05), and with the Creature without the task (M = -0.0540 s, SD = 0.296 s, p < 0.05). Means were calculated for the steady portion of Creature motion, when it was operating at a constant breathing rate, not during the ramps.

Figure 4.16: Breath lengths and heart rate for a participant during stage 2 of Experiment 2 (breath length in seconds and heart rate in bpm versus time, showing measured values alongside the commanded and mean breath lengths and heart rates; green vertical bars represent a single breath).

Heart Rate

Heart rate was compared using three metrics: interbeat interval (IBI), heart rate variability, and mean heart rate.

Interbeat Interval

During the training session eight out of nine participants saw a reduction in the standard deviation of heart rate interbeat intervals. All participants saw an effect from the Creature when sitting calmly with it versus the baseline (p < 0.05), and six out of nine saw an effect from the Creature motion during the task versus the baseline (p < 0.05).

Heart Rate Variability

Heart rate variability metrics were calculated for each phase for each subject. Overall, there was no significant difference in heart rate variability between or within stages, except for the percentage of high frequency components, which did not show a significant decrease (p > 0.05) from Stage 1 to Stage 2, but did show a significant difference between Stages 2 and 3 (M = -16.4, SD = 12.5, p < 0.05), 2 and 4 (M = -23.0, SD = 18.2, p < 0.05), and 3 and 4 (M = -6.59, SD = 12.7, p < 0.05).

Mean Heart Rate

There was no significant difference (p > 0.05) in mean heart rate between or within the stages.

Skin Conductance

There was no significant difference (p > 0.05) in skin conductance between or within the stages.

Skin Temperature

There was no significant difference (p > 0.05) in mean skin temperature between or within the stages.

A summary of physiological results is shown in Table 4.7.
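For reference, the time-domain heart rate variability quantities used above (and summarized in Table 4.7 below) have standard definitions over the interbeat-interval series. The following sketch is a generic illustration with assumed variable names rather than the analysis pipeline used for these experiments; the frequency-domain percentages (vlf%, lf%, hf%, and LF/HF) would additionally require a spectral estimate of the interbeat-interval series and are not shown.

    # Generic time-domain HRV metrics from an interbeat-interval (IBI) series (illustrative only).
    import numpy as np

    def mean_heart_rate(ibi_ms):
        """Mean heart rate in beats per minute from IBIs given in milliseconds."""
        return float(60000.0 / np.mean(np.asarray(ibi_ms, dtype=float)))

    def rmssd(ibi_ms):
        """Root mean square of successive IBI differences, in milliseconds."""
        diffs = np.diff(np.asarray(ibi_ms, dtype=float))
        return float(np.sqrt(np.mean(diffs ** 2)))

    def pnn50(ibi_ms):
        """Percentage of successive IBI differences greater than 50 ms."""
        diffs = np.abs(np.diff(np.asarray(ibi_ms, dtype=float)))
        return float(100.0 * np.mean(diffs > 50.0))

    # Hypothetical intervals around 70 beats per minute (~860 ms):
    ibi = [850, 900, 820, 880, 860, 910, 840]
    print(mean_heart_rate(ibi), rmssd(ibi), pnn50(ibi))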
Table 4.7: Summary of results from Experiment 2; significant results (p < 0.05) are marked with an asterisk. Stage 1: Creature Still; Stage 2: Ramped Creature Motion, User Asked to Mirror; Stage 3: Ramped Creature Motion With User Task, User Asked to Breathe Normally; Stage 4: Ramped Creature Motion Without User Task, User Asked to Breathe Normally. Breathing rate data are located in Section 4.3.3.

    metric                   measure   1–2       1–3       1–4       2–3       2–4       3–4       unit
    breath length sd         mean      1.15      0.780     0.596     -0.377    -0.183    -0.560    s
                             sd        0.535     0.686     0.524     0.411     0.205     0.395
                             p         <0.001*   0.002*    0.009*    0.011*    0.183     <0.001*
    heart rate mean          mean      0.222     1.22      2.11      1.00      1.89      0.889     bpm
                             sd        4.89      2.70      3.31      4.99      5.13      3.81
                             p         0.901     0.236     0.109     0.586     0.328     0.528
    heart rate sd            mean      -0.100    0.178     0.122     0.278     0.222     -0.056    bpm
                             sd        1.43      1.32      2.44      1.54      2.73      1.72
                             p         0.848     0.714     0.891     0.623     0.824     0.930
    heart rate var rmssd     mean      13.6      14.2      12        0.667     -1.55     -2.22
                             sd        32.0      43.1      46.3      19.1      23.7      8.89
                             p         0.265     0.378     0.484     0.924     0.857     0.500
    heart rate var pnn50     mean      -3.37     -2.99     -3.3      0.382     0.064     -0.318
                             sd        11.0      7.50      9.08      7.66      11.2      7.13
                             p         0.409     0.292     0.333     0.891     0.987     0.902
    heart rate var hf%       mean      30.5      14.1      7.50      -16.4     -23.0     -6.59     %
                             sd        16.4      21.1      28.9      12.5      18.2      12.7
                             p         <0.001*   0.096     0.483     0.006*    0.007*    0.008*
    heart rate var lf%       mean      -1.44     2.56      10.2      4         11.7      7.67      %
                             sd        24.3      16.3      25.4      25.8      24.1      15.3
                             p         0.871     0.669     0.288     0.673     0.207     0.194
    heart rate LF/HF         mean      -1.54     -0.736    0.146     0.800     1.95      1.15
                             sd        5.42      4.01      4.21      4.47      3.97      2.49
                             p         0.446     0.618     0.787     0.627     0.202     0.228
    skin temperature mean    mean      -0.795    -0.807    -1.47     -0.012    -0.676    -0.664    °C
                             sd        1.01      1.66      2.05      1.19      2.14      1.91
                             p         0.055     0.184     0.064     0.977     0.372     0.328
    skin temperature sd      mean      0.489     0.49      0.305     0.001     -0.185    -0.185    °C
                             sd        1.75      1.84      1.98      0.234     0.475     0.457
                             p         0.427     0.447     0.657     0.996     0.276     0.259
    skin conductance mean    mean      0.097     0.071     0.081     -0.026    -0.016    0.010     norm
                             sd        0.386     0.287     0.323     0.115     0.098     0.09
                             p         0.475     0.481     0.476     0.518     0.634     0.75
    skin conductance sd      mean      -0.030    -0.055    -0.074    -0.025    -0.044    -0.020    norm
                             sd        0.118     0.096     0.096     0.077     0.084     0.058
                             p         0.467     0.125     0.051     0.365     0.150     0.338

4.3.4 Discussion

Questionnaire Results

Analysis of participant survey results focused on three areas: their comfort with the Creature and awareness of its mechanisms, the effect of the Creature on their awareness of their own breathing rate and pulse, and their reaction to the Creature while they were performing the reading task.

Participants reported a greater awareness of the Creature's mechanisms than of their own corresponding activities. Participants were in general aware of the Creature's breathing during the experiment, although they were slightly less aware during the reading task (see Figure 4.17). The design of the breathing mechanism likely allows its activity to be monitored with minimal attention from the user. It produces a motion in the Creature's abdomen that is quite salient over a large area of the Creature, requiring only a brief touch to obtain awareness of the current breathing rate and position.
It should be possible to maintain contact with the Creature with minimal attention, as only a brief touch is necessary, though some touch is required, to monitor its breathing.

Figure 4.17: Experiment 2 participant responses to survey statement "I was aware of the creature's breathing" (while sitting with the active creature and during the reading task; rankings from 1 = strongly disagree to 5 = strongly agree).

In comparison, the Creature's pulse is more difficult to locate, and much greater effort is required to maintain awareness of the Creature's heart rate. The effect of the pulse mechanism can only be felt in the "neck" area of the Creature, near the head, and to do so requires placement of the hand in that area. Although the neck area is a somewhat natural position to place the hand when interacting with the Creature with both hands, it is not as likely to be regularly touched when the participant is primarily interacting with the Creature with one hand, as during the reading task. This is likely the cause of participants reporting much less awareness of the pulse during the reading task, as expected. In general, however, they showed a high awareness of the Creature's pulse (see Figure 4.18).

Figure 4.18: Experiment 2 participant responses to survey statement "I was aware of the creature's pulse" (when asked to mirror the creature, while sitting with the active creature, and during the reading task).

Concerning the research question of whether the Creature's breathing and pulse would cause participants to be more aware of their own breathing and pulse, participants reported a very high awareness of their own breathing when asked to mirror the Creature (see Figure 4.19). The task naturally requires concentration on breathing rate and intensity. This awareness carried over into the later stages of the experiment, with all but one participant reporting an awareness of their breathing while sitting with the active Creature. Following the same trend as awareness of the Creature's breathing, participants' awareness of their own breathing was lower during the reading task, with several participants reporting that they were not aware of their own breathing during the task.

Figure 4.19: Experiment 2 participant responses to the statement "I was aware of my own breathing" (when asked to mirror the creature, while sitting with the active creature, and during the reading task).

It was also a research question whether participants would be aware of their own heart rate, or whether the Creature would be able to increase participants' awareness of their own heart rate. In general, people do not have a high awareness of their own heart rate except in extreme conditions, where it is "pounding," or beating fast enough that they are able to notice it. This was reflected in the reported results, as all participants reported some level of disagreement with the statement "I was aware of my own heart rate" (see Figure 4.20). Participants reported slightly higher disagreement during the reading task, but overall levels of disagreement for all three stages were quite high.
Without extensive training, the most common way of being aware of one's own heart rate is by taking one's pulse, and participants were generally precluded from doing this during the experiment by the sensor wires and the instruction to attempt to maintain one hand on the Creature at all times. Even if the Creature had invoked an increased mental awareness that they have a pulse, participants would likely have been unable to determine their pulse.

Figure 4.20: Experiment 2 participant responses to the statement "I was aware of my own heart rate" (when asked to mirror the creature, while sitting with the active creature, and during the reading task).

As in the first experiment, reaction to interaction with the Creature was positive overall, with participants reporting comfort in having the Creature on their laps, and no discomfort with Creature motions and activity. It was desired that participants would not find the Creature overly distracting during their reading assignment; however, user feedback on that subject was mixed and inconclusive (see Figure 4.21). It was noted that during higher levels of engagement with the reading assignment, participants would use at least one hand and sometimes both to assist them in reading the pages; this would preclude haptic interaction with the device and potentially mitigate some of the potential distracting effect of the Creature. The fact that participants are not forced to monitor the Creature, and that they can always remove their hands from it, may prevent it from becoming an intrusive distraction, but may also make it less effective.

Figure 4.21: Experiment 2 participant responses to survey statement "I found the creature's motion distracting during the reading assignment."

Physiological Results

Stage 2 (ramped creature motion, user asked to mirror) was always administered prior to stages 3 and 4 (ramped creature motion with and without task) for reasons of experiment flow and introduction to the Creature. This constitutes a randomization restriction, which might have implications for the interpretation of results incorporating stages 3 and 4 (e.g. potential confounds with effects of adaptation, learning, habituation, or fatigue and boredom). We saw this as a necessary constraint. Creature entrainment of breath rate when participants were asked to mirror the Creature was confirmed through the respiration measurement. There are likely several reasons why entrainment of heart rate was not similarly successful. In particular, participants were not instructed to mirror the Creature's heart rate, and even if they had been, most would not have had the ability to do so, as they reported little to no awareness of their own heart rate. It appears likely that entraining had no effect on mean heart rate, as there was no pattern to the trend of mean heart rate between the slow pulse and high pulse stages of the entraining. Skin temperature did, however, increase during the training, an indication of decreased participant arousal.

The physiological effects noted during the longer-term interaction with the Creature were also promising, if less pronounced.
The standard deviation of breath lengths notonly showed a general trend of decreasing greatly during the mirroring stage, as would be914.3. Experiment 2: Creature Entraining and Reactions During a Taskexpected when commanded to breathe at a steady rhythm, but this reduction in breath ratevariability remained even when the participant was not instructed to mirror: breath lengthvariability was less both when sitting calmly and when performing the task than duringthe baseline. Participant breath length variability was slightly higher when performingthe task than when sitting calmly for all participants, but still remained below baseline.This suggests that some aspect of the entrainment lingered even after the training stage.This reduction in breath rate variability corresponding to Creature motion was also notedin the previous experiment when the Creature was activated at a constant rate, but notwhen it was mirroring the participant. A likely explanation for this is that participants,understanding that the Creature was displaying a breathing rate similar to theirs, wereidentifying with the rhythmic stability of the Creature’s breathing rate, and \keying in" onit to cause an increased stability in their own breathing rate. This could also explain thedecrease in standard deviation of heart rate shown during Creature motion in Experiment1. Such a \stability e ect" could potentially serve as an anxiety coping mechanism, byproviding comforting reassurance and by reinforcing anxiety-reducing physiological metrics.A marked decrease in the high frequency percentage of heart rate variability was also notedbetween the baseline and the mirroring stage. As the high frequency component of heartrate variability is driven primarily by respiration, it is likely that this is partially an e ectof the slow breathing exercises undertaken by the participant mirroring the Creature. Formany participants, this value remained low during the remainder of the experiment: eighthad a lower hf % when sitting calmly with the Creature than during the baseline, and sixduring the reading task.A summary of signi cant physiological results is shown in Table 4.8.924.3. Experiment 2: Creature Entraining and Reactions During a TaskTable 4.8: Summary of signi cant results from Experiment 2. Stage 1: Creature Still; Stage2: Ramped Creature Motion, User Asked to Mirror; Stage 3: Ramped Creature MotionWithout User Task, User Asked to Breathe Normally; Stage 4: Ramped Creature MotionWithout User Task, User Asked to Breathe Normally.physiological metric comparison mean SD pbreath length sd stage 1{2 1:15 s 0:535 s < 0:001stage 1{3 0:780 s 0:686 s 0.002stage 1{4 0:596 s 0:524 s 0.009stage 2{3  0:377 s 0:411 s 0.011stage 3{4  0:560 s 0:395 s < 0:001mean breath length fast{slow training mode  2:91 s 1:98 s 0.003Creature with task 1:46 s 1:18 s 0.008Creature without task  0:0540 s 0:296 s 0.008heart rate hf% stage 2{3 -16.4 12.5 0.006stage 2{4 -23.0 18.2 0.007stage 3{4 -6.59 12.7 0.0084.3.5 ConclusionsParticipants were able to consciously mirror the Creature’s respiration rate when instructedto do so. This mirroring produced a reduction in the overall mean standard deviation ofbreath lengths for participants, as well as changes in mean heart rate for eight out ofnine participants. Either this training stage or the motion of the Creature also producedphysiological e ects in participants during the remainder of the experiment. 
The standard deviation of breath lengths remained significantly less during all stages with the Creature than during the baseline, but was significantly higher during the stages with the task than when training. When the Creature was present, there was a significant difference in overall participant mean breath length between when the Creature was moving at a slow constant rate and at a fast constant rate; this was likely a response to Creature motion.

The high frequency component of heart rate variability was significantly different between the training stage and both task stages, as well as between the task stages. In general participants reported feeling comfortable with the Creature on their lap, and despite finding the Creature a bit noisy, most did not find it disturbing or distracting during their task. Overall, participants typically reported a high awareness of the Creature's breathing and a lower awareness of the Creature's pulse; this corresponded with a much greater awareness of their own breathing than of their own pulse.

4.3.6 Feedback for Iterated Design

This experiment completed the readiness testing of the TAMER platform. With the refinements in Creature mechanisms and performance made after the first experiment, no major changes were necessary. However, several modifications were made to the overall platform to improve function during future experiments. These included logistical changes, additional data logging capability, and some cosmetic refinements.

While the physiological sensors continued to record adequate data, linking the data to both specific moments in the experiment and Creature activity proved difficult. The logging of data from the Creature was found to malfunction occasionally, with several participants' Creature logs missing several sections. Software protocols were adjusted to be more robust, and alerting was added to notify the facilitator when Creature logging had failed.

Finally, several cosmetic improvements were made to tidy up the sensor wiring to reduce the risk of tangles. Where possible the cables were bundled and rerouted away from commonly accessed areas.

Much of the procedure from this experiment was carried over into the next experiment. Of concern was the length of time required to gain meaningful physiological data from interaction with the Creature. After two hundred and ten seconds of sitting still with the Creature on their lap while it moved at a fairly constant rate, many participants became bored. They looked away from the Creature and began to search around the room for other stimuli; some even asked the facilitator if the experiment were over yet. This is representative of the maximum amount of time participants can be expected to focus solely on the Creature before it becomes tedious. More varied motions of the Creature could be of use in maintaining engagement, but would not have allowed the physiological effects sought in this experiment to be measured.

4.4 Experiment 3: Experiment with Children

The experience and success gained from the previous experiments provided the methodological foundation to conduct an experiment with the Creature in a more representative environment. Due to the potential for increased receptiveness, or at the very least varied physiological responses from this very different age group, it had previously been decided that this school experiment would take place regardless of the findings of Creature success in manipulating physiological metrics in the previous experiments.
Therefore, the resulting success of the Creature in affecting breathing and heart rate metrics was encouraging. A subject pool of children was expected to provide a very different experience than that of young adults: children were certain to be more physically demanding of the Creature, through either rough play or lack of care, but it was expected that they would also prove more physiologically receptive to the Creature.

The location for this experiment was the Eaton Arrowsmith School [112], "a co-educational, non-denominational, independent day school for elementary and secondary school students with learning differences/disabilities." This school was chosen both for its location on the University of British Columbia campus and its staff's willingness to work with researchers, as well as for its unique curriculum and student population. Although the school's students are not clinically diagnosed with severe emotional, behavioral, or intellectual disorders, they have experienced difficulty functioning academically in the regular school system. During their time at this school they spend several hours each day building cognitive skills through repetitive training exercises. This makes this group an ideal subject pool for the TAMER platform, as they spend the majority of their school day performing timed, intense, stress-inducing activities. Many of these activities are performed individually on the computer, allowing experimental sessions to be performed without disrupting the students' daily routine.

The procedure for this experiment draws heavily from that of the previous experiments, especially Experiment 2. There were several main research goals. The first goal was to confirm that the computer activity performed by the student was able to induce measurable physiological changes, and to determine what these physiological changes are. The computer activity chosen for this experiment was called "Clocks." This computer program is used as part of the school's curriculum. During the activity, the screen displays a clock face with tick-marks but no numbers. For each trial, a time is represented using hands of equal length, and the student must input the time displayed based upon the relation between the hands. For example, a clock with one hand pointing towards the 11 position, and another pointing between the 3 and the 4, but close to the 4, must be displaying 3:55; it could not be displaying 11:18. If the hour hand were pointing straight at the 11 position, the minute hand would have to be near the 12 mark on the dial. This exercise is fairly simple for two or three hands, but becomes increasingly difficult as more hands are added (eventually thousandths of a second, second, minute, hour, day, month, year, century, and millennium are displayed on the clock). The students must answer as quickly as possible and are given feedback after each clock, and their overall score at the end. The assigned difficulty level is increased after the student masters a level, so that the students are always working at a level of difficulty that is high for them. Students generally have a high level of engagement with the program and are motivated to produce as high a score as possible, as their performance is tracked and assessed. They typically perform this activity for up to half an hour at a time. To investigate this activity, physiological data of students performing the activity were recorded.

A second goal was to investigate whether the Creature could be effective in alleviating stress or anxiety during this task.
Students were asked to perform the task with the Creature on their lap while it was still, while it was moving more slowly than their baseline heart rate, and while it was moving more quickly than their baseline heart rate. To determine this, physiological data were gathered to assess any changes from Creature presence and Creature motion, and students were asked their impressions of the Creature during the task and whether it helped or distracted them.

The final, and perhaps most important, goal was to evaluate children's receptiveness to and comfort level with the TAMER platform. Informal pilot studies had been conducted with children on a one-to-one basis as well as with non-EAS school groups, but this was the first time a large-scale study was conducted involving the Haptic Creature, physiological sensors, and children. Receptiveness to the sensors and the Creature was observed, and children were asked what they liked about the Creature and how they felt while playing with it.

4.4.1 Research Questions

This experiment investigated the following research questions:

• Do participants find the Haptic Creature calming or engaging, based upon subjective response?

• Do the students' computer activities induce physiological changes, and are any of these linked to an increased level of stress or anxiety?

• Is the Haptic Creature able to induce physiological changes in participants during the experiment, either when still or when moving slowly or quickly relative to the participant's own rates?

Similar to the previous experiments, the mean heart rate, heart rate standard deviation, heart rate skewness, heart rate rms standard deviation, heart rate variability (pnn50, vlf%, lf%, mf%, and hf%), skin conductance, skin conductance derivative, electromyogram, electromyogram derivative, skin temperature, skin temperature standard deviation, respiration rate, respiration rate standard deviation, respiration amplitude, and respiration amplitude standard deviation (see Section 3.5.2) were calculated and compared among and between all five experiment stages (see Figure 4.22) for all subjects using two-tailed dependent sample t-tests (all α = 0.05). Additionally, the series of each participant's heart rate interbeat intervals (IBI) and breath lengths for each stage were compared within subjects using a two-tailed independent sample t-test (α = 0.05).

4.4.2 Experimental Procedure

This experiment consisted of five major stages, as shown in Figure 4.22. It took place in an office: the participant sat at one end of a table in front of a personal computer, and the experimenter and equipment were diagonally opposite, as far away as possible, at the other end of the table. Participants were taken out of their regular classroom activities during the school day for a thirty minute experiment session, the length of a typical school period, with the timing of each stage chosen to accommodate this length. Twenty-four participants, ages seven to thirteen, took part in the experiment. Participants wore noise-canceling headsets during the experiment. Permission slips were collected from the students' parents, and assent forms from the students, by the school's teachers before the experiment.

Figure 4.22: The Experiment 3 procedure diagram: introduction to creature; activity without creature; activity with creature inactive; activity with creature "slow"; activity with creature "fast."

Introduction to Creature

Each participant was brought into the room and told that they were about to participate in an experiment with the Creature.
4.4.2 Experimental Procedure

This experiment consisted of five major stages, as shown in Figure 4.22. The experiment took place in an office: the participant sat at one end of a table in front of a personal computer; the experimenter and equipment were diagonally opposite, as far away as possible, at the other end of the table. Participants were taken out of their regular classroom activities during the school day for a thirty-minute experiment session, the length of a typical school period, with the timing of each stage chosen to accommodate this length. Twenty-four participants, ages seven to thirteen, took part in the experiment. Participants wore noise-canceling headsets during the experiment. Permission slips were collected from the students' parents and assent forms from the students by the school's teachers before the experiment.

Figure 4.22: The Experiment 3 procedure diagram (stages: introduction to creature; activity w/o creature; activity w/ creature inactive; activity w/ creature "slow"; activity w/ creature "fast").

Introduction to Creature

Each participant was brought into the room and told that they were about to participate in an experiment with the Creature. The student was then asked if they would mind wearing the physiological sensors. They were fitted with six sensors: the skin conductance, blood volume pulse, and skin temperature sensors on their non-dominant hand, as well as the heart rate (EKG), EMG (on the corrugator muscle of the forehead), and respiration rate sensors. They then had the Creature placed on their lap. An introduction to the Creature was given, in which the mechanisms of the Creature were described and a demonstration of the Creature both mirroring the participant and being actuated at a constant rate was shown. Participants were given an opportunity to pet the Creature and ask questions about it. After the introduction session the Creature was removed. The entire process was scripted to take approximately five minutes, with the gathering of physiological data starting after the sensors were donned and lasting for about three minutes. This served as the baseline for the experiment.

Activity Without Creature

The participant was then instructed to begin their computer activity. He or she continued for about four minutes while physiological data were gathered.

Activity With Creature Inactive

The participants were then interrupted from their task and given the Creature. They were instructed that it might move, and to resume the activity. The Creature remained motionless for three minutes.

Activity With Creature Slow

After four minutes the Creature was activated with a breathing rate 20% slower and a heart rate 20 beats per minute lower than that of the participant during the activity with creature inactive stage. The Creature remained in this state for four minutes.

Activity With Creature Fast

The Creature then transitioned over ninety seconds from the "slow" rate to the "fast" one, with a breathing rate 20% faster and a heart rate 20 beats per minute higher than that of the participant during the activity with creature inactive stage. The Creature remained in this state for four minutes, and was then deactivated.

The order of the "fast" and "slow" stages was counterbalanced, with the transition modified appropriately. The activity without Creature stage was always performed first; this minimized the disruption to the participant caused by handing them the Creature or taking it from them, which necessarily distracted them from their computer activity.

Experiment Conclusion

The Creature was removed from the participant, and then the sensors were removed while the participant was asked to discuss his or her experience. The experimenter initiated a conversation with all subjects during each session to elicit comments on their experience, with the goal of assessing their level of comfort and determining their subjective reactions. Notes were logged immediately following each session to avoid interrupting the flow of the sessions and yet maximize the amount of detail retained related to each session. No explicit questionnaire was used for this discussion.
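For clarity, the commanded rates for the "slow" and "fast" stages and the ninety-second transition between them can be summarized in a short sketch. This is an illustrative calculation only, taking the participant's rates from the inactive stage as inputs; the function names and one-second update step are assumptions, not the platform's actual control code.

```python
# Illustrative computation of the commanded Creature rates for the "slow" and "fast"
# stages, and a linear ramp for the ninety-second transition between them.
# (Names and the one-second update step are assumptions, not the platform's control code.)

def commanded_rates(breath_rate_bpm, heart_rate_bpm, mode):
    """Derive Creature targets from the participant's rates in the inactive stage."""
    if mode == "slow":
        return breath_rate_bpm * 0.8, heart_rate_bpm - 20.0   # 20% slower, 20 bpm lower
    if mode == "fast":
        return breath_rate_bpm * 1.2, heart_rate_bpm + 20.0   # 20% faster, 20 bpm higher
    raise ValueError(mode)

def ramp(start, end, duration_s=90.0, step_s=1.0):
    """Yield intermediate setpoints for a linear transition between two rates."""
    steps = int(duration_s / step_s)
    for i in range(steps + 1):
        yield start + (end - start) * (i / steps)

# Example: a participant breathing at 18 breaths/min with a 95 bpm pulse.
slow = commanded_rates(18.0, 95.0, "slow")   # (14.4 breaths/min, 75.0 bpm)
fast = commanded_rates(18.0, 95.0, "fast")   # (21.6 breaths/min, 115.0 bpm)
breath_setpoints = list(ramp(slow[0], fast[0]))  # one setpoint per second of transition
```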
4.4.3 Results

Twenty-six students, 14 female and 12 male, between the ages of 7 and 14, participated in the experiment, with an average age of 10.9. An image of a user during the experiment, attached to the physiological sensors and holding the Haptic Creature, is shown in Figure 4.23.

Figure 4.23: Experiment 3 participant during the experiment.

Data from seven participants were not used for group-wise physiological comparisons. Of these, two were unable or unwilling to complete the specified computer activity, two had equipment failures, and for three there were external disruptions during the experiment that made their data unsuitable for comparison. For the remainder, the computer activity induced a reduction in heart rate variability pnn50, heart rate variability hf%, mean skin conductance, and respiration rate standard deviation (all p < 0.05), an increase in heart rate standard deviation and the standard deviation of skin temperature (all p < 0.05), as well as a change in heart rate variability vlf% (p < 0.05) as compared to baseline.

Creature presence during the activity induced an increase in heart rate standard deviation, heart rate variability vlf%, skin conductance derivative standard deviation, mean skin temperature, and skin temperature standard deviation (all p < 0.05) as compared to performing the activity without the Creature. A summary of significant results is shown in Table 4.9.

Subjective reactions to the Creature and the experiment are discussed in Section 4.4.5; raw data are located in Appendix B.4.1.

Table 4.9: Summary of significant results from Experiment 3.

Baseline to activity without Creature:
  heart rate variability sd:                mean 422,           sd 418,            p < 0.001  (ms)
  heart rate pnn50:                         mean -0.054,        sd 0.099,          p = 0.013
  heart rate vlf%:                          mean 2.12 × 10^-4,  sd 2.31 × 10^-4,   p = 0.019
  heart rate hf%:                           mean -0.001,        sd 0.002,          p = 0.015
  norm. skin conductance mean:              mean -0.121,        sd 0.170,          p = 0.002
  skin temperature sd:                      mean 0.124,         sd 0.239,          p = 0.019  (°C)
  respiration rate sd:                      mean -29.8,         sd 66.8,           p = 0.039  (bpm)

Activity without Creature to activity with Creature:
  heart rate variability sd:                mean -85.4,         sd 260,            p = 0.037  (ms)
  heart rate vlf%:                          mean -1.74 × 10^-4, sd 2.51 × 10^-4,   p = 0.001
  norm. skin conductance derivative sd:     mean 0.015,         sd 0.0246,         p = 0.011
  skin temperature mean:                    mean 1.23,          sd 1.67,           p < 0.001  (°C)
  skin temperature sd:                      mean 0.257,         sd 0.507,          p = 0.032  (°C)

4.4.4 Additional Investigation with the Creature

After the first round of experiments was completed, the school at which the experiments were performed asked if the experimenters could return to perform trials with several participants who were not in school during the first round but were still eager to participate. Due to equipment and space limitations it was impossible to maintain controls adequate for comparison with the first round of experiments. Therefore, their data were not pooled with the others, but subjective results are reported here for completeness. An additional three students were used to pilot different interaction styles with the Creature. For these students the Creature ran for a longer amount of time, or at a different rate, than in the previous experiment. Four students who had previously participated in the experiment were brought back to determine second reactions to the Creature. They also participated with the Creature operating continuously for a longer amount of time, and at different speeds than previously. Physiological data from this part of the experiment were not analyzed or reported, as the experimental conditions for this group were comparatively poor (the quiet room previously used for the study was not available, and a different, noisy, high-traffic room was used).
However, subjective reactions to the experience of participating with the Creature and the experimental setup were recorded, and those reactions are included in the discussion in the following section.

4.4.5 Discussion

Overall reactions to the TAMER platform and the experiment were quite positive. Students were excited to participate in the experiment; those who participated were sufficiently motivated to return a signed permission slip from their parents. They were not motivated just by the opportunity to miss class, since the school does not follow a traditional schedule. The Creature was undoubtedly the most appealing part of the experiment. Participants were uniformly enthusiastic about getting to know the Creature: all wanted to pet it, and upon entering the experiment room, most were disappointed that they had to have the physiological sensors attached before they could interact with the Creature.

Physiological Sensors

Reception to the physiological sensors was generally positive: most children were comfortable with the application and wearing of them. Two students expressed extreme apprehension about the sensors; one was calmed with the help of her personal assistant, and another by slowly putting on one sensor at a time. Although the name and purpose of each sensor was explained when they were put on, most seemed uninterested in these descriptions. Several students also expressed interest in viewing their physiological data on the computer. Once the sensors were on, the respiration sensor often required adjustments to ensure proper function. Although overall sensor performance during the experiment was good, a few common glitches were noticed during the experiment. In particular, both the EKG sensor and the BVP sensor would intermittently drop out, although almost never at the same time. Due to this redundancy, useful heart rate data were collected; however, this necessitated manually selecting the cleanest signal for each time period. The EKG sensor was particularly prone to coming loose during experiments. To make the sensor less intrusive for the child and experimenter, instead of the common three-lead electrodes placed on separate areas of the chest, a single triode electrode was placed in the middle of the chest. This is known to be less sensitive and less reliable, as the weight of the entire sensor is supported by one electrode, and skin adherence of that one electrode can be greatly reduced by perspiration. The blood volume pulse sensor was attached to the finger with a velcro strap that could occasionally become loose or cause the sensor to lose alignment; more often than not this occurred as the participant was petting or stroking the Creature. The skin conductance sensors were also attached by velcro to the fingers, and for two participants the skin conductance sensor electrodes became detached during the experiment.

Once the sensors were attached, participants generally did not express discomfort with them during the experiment. Although participants did not seem particularly encumbered by the hand sensors, more demanding activities would not likely have been possible. They would have been unable to write with the hand-mounted sensors on, and typing would have been difficult, but possible. Participants were naturally cautious of touching objects with their sensor hand.
Several would initially hold their hand in the air without touching anything, and participants often had to be told that it was all right to pet and touch the Creature with the hand bearing the sensors. Once they were told that they could touch the Creature, however, their interaction with it did not seem to be affected by the sensors. A few participants also found the EMG sensor distracting in that the wire, although generally held up by their headphones, could fall down and obstruct vision. The sensor was also difficult to attach to smaller children, who did not have a large forehead area relative to the size of the sensor. Because analysis of the EMG data did not reveal any distinguishing characteristics, the EMG sensor was left off for the final subjects.

A combination of the many wires and the logistics of the experiment room did make the sensors more cumbersome than they might otherwise have been. Although the wires had been bundled since the previous experiment, the large number of wires connected to the participant did make it somewhat difficult to pass the Creature to them. Due to the layout of the room, it was necessary to hand the Creature to the participant on the same side as the encoder. Had the Creature been able to be on the other side of the participant, this would not have been a problem. There was also a worry that if a participant decided to hurriedly leave the experiment room they would drag a large number of sensors and wires with them, possibly causing equipment damage, but fortunately this did not happen during trials. The caution most students showed with the sensors also makes this possibility unlikely, although a child in the middle of an anxiety attack might not show such caution.

Reaction to Creature

Reaction to both Creature presence and Creature motion was positive. Almost all participants were comfortable with having the Creature in their laps. One student was reticent about having the Creature on his lap, and wanted to interact with the Creature on the desk before he would let it be placed there. Once he achieved initial comfort with the Creature, he was not uncomfortable during the remainder of the experiment. Students were surprised and pleased to find that the Creature was able to emulate their own breathing rate and heart rate. When asked, they felt that the breathing did not seem or sound mechanical. No students complained that the Creature was too heavy or noisy during the experiment. Several students said that the Creature reminded them of their own pets, particularly the warming sensation on the lap. In fact, the majority of students who participated in the experiment reported that they had, or had previously had, a dog or a cat at home.

Creature motion and Creature activity also generally elicited positive reactions. A common comment after the experiment was that "the Creature felt alive." Many students asked after the experiment if they could have their own Creature. In particular, several students who reported not being able to have a pet at home expressed that they would like to have the Creature as a substitute. Students said that they liked the Creature better when it was moving than when it was not, and preferred a gently moving Creature to a still one. By sending a null command to the respiration servo, instead of turning it off completely, a gentle humming sound and vibration, similar to a continual purr, could be emitted from the Creature.
Students preferred this somewhat active resting state to the Creature not moving at all.

Interaction with the Creature

Two typical interaction styles with the Creature were observed. In one the Creature seems to serve as a "comforting presence," in the other as a "brief reassurance." Most students performed their computer activity as normal, but petted the Creature during the activity. They reported the Creature as "comforting" and "pleasant" to have on their laps. Several also reported that they found the Creature to be "calming." A few students, however, would periodically take a break from the computer activity to stop and look at the Creature, petting it and occasionally breathing with it. These breaks were usually correlated with either the end of a computer activity "level" or the completion of a computer activity problem. One student suggested that the Creature helped her do better on the activity, another that the Creature was comforting to her when she got an answer wrong. Although the computer activities are timed, and thus taking a break during a level might not be beneficial for grading purposes, taking a brief break after a level to interact with the Creature, if helpful in reducing stress and anxiety, could have a beneficial effect on performance in the next level.

Just as adults were generally unable to distinguish between different Creature motion states, so were the students. Several students reported after the experiment that they had thought the Creature was mirroring their breathing and pulse the entire time. One student said it "felt like him [the Creature] and I were one." Another said, "I found it calming, it reminds me of my stuffy [stuffed animal]." Not all reactions to the Creature were positive, however. Several students reported that they felt the Creature to be distracting during the activity, and would have preferred that it not move during the activity. The initial activation of the Creature also disturbed several students, who either jumped slightly or briefly looked at the Creature when it turned on.

A level of anthropomorphization of the Creature was observed in the students' reactions to the Creature. They would become worried when the Creature stopped moving, or after a few minutes into the experiment if the Creature had not yet been activated. A few asked if the Creature was "sleeping" when it was not moving, or whether it was awake when it was moving. Older children tended to ask more whether the Creature was "on" or "off," and those more self-aware of the experiment would often ask if something was wrong when the Creature stopped moving.

The physiological results from this experiment were consistent with previous experiments, in that there were changes in heart rate variability associated with the Creature. This "steadying" effect, also seen in Experiments 1 and 2, was associated with Creature presence. There was no difference between the effects of the Creature "fast" and Creature "slow" stages during the experiment. The computer activity reduced heart rate variability, with a reduction in heart rate standard deviation and pnn50 that is consistent with the reduction in heart rate variability typically associated with stress, whereas Creature activity increased these metrics. Skin temperature increased as compared to the baseline during both the activity and Creature presence stages; this change is most likely due to the increased level of physical and mental exertion caused by the computer activity.
The "activity without creature" and "activity with creature inactive" stages were always performed first for reasons of experiment flow and introduction to the Creature. This constitutes a randomization restriction, which might have implications for the interpretation of results from this experiment (e.g. potential confounds with effects of adaptation, learning, habituation, or fatigue and boredom). We saw this as a necessary constraint.

A change to a 30% difference from participant levels in breathing rate and pulse rate for the low and high activity states resulted in a Creature respiration rate that was almost uncomfortably fast, and was reported to be distracting by the test subjects. This was also impractical, since such a large difference, or a small error in measurement of respiration rate, could result in commanded respiration and pulse rates that nearly exceed the capabilities of the mechanism. The longer time frame of these investigations allowed for meaningful calculation of the means of various physiological signals and indices that was not possible in the shorter-term experiments.

4.4.6 Conclusions

Overall, students had a positive reaction to interaction with the Haptic Affect Platform. The students reported that they found the Creature comforting during the activity, and expressed a wish to interact with it again. The stressful computer activity induced changes in heart rate variability standard deviation, pnn50, and vlf%; skin conductance mean and standard deviation of derivative; and skin temperature mean and standard deviation. The changes in heart rate variability and skin temperature are typical of responses to stressful events. The Haptic Creature was able to induce several physiological changes in participants during the experiment. Creature presence induced changes in heart rate variability standard deviation and vlf%; skin conductance standard deviation of derivative; and skin temperature mean and standard deviation.

4.4.7 Feedback for Iterated Design

This experiment provided many lessons and suggestions for interactions where children are the primary subject group, as well as valuable feedback on the TAMER platform's hardware and software. There were several refinements to the experimental protocol that should be incorporated into future experiments with the platform. First, the sensors should be placed on the student before they are shown the Creature or other experiment equipment, as most will be more interested in those things than in putting on the sensors. Due to the small size of the experiment room in this experiment it was impossible to hide the Creature completely from view as the students were walking in, and many immediately wanted to see and pet the Creature once they started the experiment. The students did not seem to suffer particular distress from having the Creature removed from their laps during the experiment; therefore, this was unlikely to have affected their task performance.

Care must also be taken when describing the experiment to the students. Once several were told that they would be performing a computer activity, they immediately started performing the computer activity, even before they had sensors attached. In two cases the computer screen had to be turned off so that they would break away from the activity to don the sensors. Unsurprisingly, children in this subject pool had demonstrably less impulse control than previous adult participants, and were not capable of waiting independently.
They did, however, express a high level of enthusiasm and receptiveness towards the Creature.

Experiments in which strict adherence to experimental protocol is necessary to maintain experimental controls are challenging with younger participants, as they may not be able to accurately follow directions. For this experiment, there were no criteria for exclusion of participants. Several students who participated in the experiment were unable to complete the experiment protocol in a way that allowed for meaningful comparisons of physiological data between them and the other participants. Two had no experience with the computer "Clocks" activity that was being used, and two were unwilling to complete the Clocks activity. These students were still enthusiastic to see the Creature and, as users, could potentially derive valuable benefits from the TAMER platform, but they are not practical participants when limited experimental time is available.

For shy or reticent students, a gradual interaction with the platform was found to be the best way to make them comfortable with it. Students who were wary of the sensors became more comfortable with them once the first sensor was put on and shown to cause no harm, and would eventually allow the remainder of the sensors to be put on them. Similarly, several students did not wish to have the Creature on their lap at first, and instead gently petted the Creature while it sat on the desk. After some time seeing Creature motion, the students would let the Creature be placed on their lap. This progressive interaction with the Creature took much longer than the typical experiment session, but allowed students who otherwise would not have been able to participate to interact with the Creature.

In addition to this gradual interaction, students would also benefit from a more coherent Creature "story," detailing the expected motions and behavior of the Creature. As mentioned previously, students, particularly the younger ones, tended to anthropomorphize the Creature, and would become concerned when it stopped moving, started moving, or did not move for a long period of time. A narrative that incorporated both the Creature mechanisms and expected Creature actions would help alleviate student anxiety about experiment equipment performance, allowing them to focus more on their activity and emotional state. Separate research is ongoing to have the Creature display coherent emotional states: an explanation that the Creature is "sleeping" or "awake" would help children understand what the Creature is capable of doing and what to expect from it during the experiment. At the same time, care must be taken in describing the purpose of the Creature to potential subjects, or to the parents of potential subjects. In this experiment there was a general awareness that the Creature was part of a study about anxiety and anxiety-reducing techniques, which may have colored self-reported comments from the students. While, as in a drug trial, describing the purpose of the Creature should not interfere with results, a greater emphasis on terms such as "companion" or "assistant" would reduce concerns that user reports were influenced by the experiment vocabulary.

While platform participants were concerned about expected Creature behavior, they were also occasionally confused about their own expected behavior.
For this experiment students were not instructed to do anything other than pet the Creature during the activity. Several came up with innovative uses of the Creature, including pausing to relax with the Creature between activities, but several seemed confused by the lack of guidance for Creature interaction. Specific behavior instructions, such as pausing to breathe with the Creature or petting the Creature only during certain activities, could lead to additional physiological benefit.

The computer activities chosen for this experiment may not give useful information about the efficacy of the Creature in anxiety reduction. Students in general had varied reactions to the computer activity. Some maintained a high level of engagement with the screen, devoting their attention to it rather than to the Creature. This was evident in some students' body language: they would make visible or audible gestures of frustration upon getting an answer wrong, or of success upon completing a problem. Others were unenthused by their computer activity, and did not seem to care about their score or success rate. It is possible that this level of engagement with the computer activity influenced the effect of the Creature on the participant. It is also possible that the physiological effects of the computer activity are not constant, but vary during the course of an activity session. If the computer activities are to be used for further experiments with the Creature, longer-term analysis of the physiological effects of the computer activities must be investigated. It is possible that physiological effects and scores on the activity are correlated; if true, this could be a useful measure of engagement with the task.
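One way to carry out that longer-term analysis is sketched below: compute a physiological metric over sliding windows within each session, then test whether a session-level summary tracks the activity score. The window length, sampling rate, and placeholder data are assumptions for illustration only, not results from the study.

```python
# Sketch of the proposed follow-up analysis: track a metric over the course of a
# session in consecutive windows, then test whether session summaries track scores.
# Data layout, sampling rate, and window length are assumptions, not thesis code.
import numpy as np
from scipy import stats

def windowed_mean(signal, fs_hz, window_s=60.0):
    """Mean of a physiological signal over consecutive windows of window_s seconds."""
    n = int(window_s * fs_hz)
    return [float(np.mean(signal[i:i + n])) for i in range(0, len(signal) - n + 1, n)]

# Hypothetical per-participant session data: a skin conductance trace and a Clocks score.
rng = np.random.default_rng(1)
traces = [rng.normal(5.0, 0.5, 4 * 60 * 32) for _ in range(19)]   # 4 min sampled at 32 Hz
scores = rng.integers(40, 100, size=19)

session_means = [float(np.mean(windowed_mean(tr, fs_hz=32))) for tr in traces]
r, p = stats.pearsonr(session_means, scores)   # proposed engagement-vs-performance check
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```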
The platform hardware could also benefit from several further refinements. The amount of wiring necessary for the sensors necessitates a stationary subject, and therefore precludes long-term engagement with the platform. Improved sensor form factor, perhaps in the form of wearable clothing, would allow for the use of the TAMER platform in more diverse user environments. Creature hardware could also still be improved by the development of a quieter pulse mechanism, which would allow for use in a classroom. Although the noise was not loud enough to be noticed by the experiment participant when wearing noise-canceling headphones, it would be disruptive in a quiet classroom environment.

4.5 Reflections on Results

With the completion of these experiments, the TAMER platform has been iteratively revised and developed into a functional and engaging tool that is attractive and intriguing to children and many adults. It has been shown to have an effect on heart rate and breathing rate metrics. Next steps are to commence longer-term studies of interaction with the platform, both to determine the functionality of the system's hardware over longer durations and to work towards developing effective software strategies to accomplish the anxiety reduction goal. These experiments were mostly undirected in that goal; the breathing and heart rate of the Creature were varied in order to determine what effects are provoked in the human user, without attempting more focused interventions. The fact that physiological effects were produced was promising, but even more important were the interaction data gathered, which will allow for future effective use of the TAMER platform. It was always the intention to involve therapists and psychologists in the development of the TAMER platform feedback loop, and now that the TAMER platform hardware has stabilized, it may be time to develop interaction scenarios and assessment strategies for therapeutic benefit.

Chapter 5

Conclusions and Recommendations

This thesis presents a research platform and a set of research questions and experimental observations related to the platform's use. This chapter first discusses the research outcomes and a methodological critique of the experiments, followed by conclusions and recommendations for platform design.

5.1 Experimental Outcomes

The overall research objectives were to determine user reactions to the Haptic Creature, and to determine whether physiological reactions can be provoked or manipulated in the user through the use of the TAMER platform. The experiments revealed several behavioral and physiological outcomes from Creature presence and actions. In particular, participants tended to find the Creature's presence comforting. Although they were not generally able to recognize the Creature mirroring their breathing or pulse, once they were informed of this ability they found it intriguing and comforting. It was a concern that this would be perceived as "creepy" or intrusive, but this was an uncommon response. The mirroring also seemed to give participants, particularly the younger ones, a sense of meaning for the experiment and an understanding of the purpose of the physiological sensors. Participants were able to successfully perform the reverse, matching their breathing to that of the Creature, but this did not have any heart rate effects. Suddenly switching from this user-following mode to Creature mirroring mode without informing them did, on occasion, result in what may be described as a positive feedback loop, which one participant found quite uncomfortable in this experiment.

Participants preferred an active Creature to an inactive Creature: they preferred even a gentle purring to no motion or sound at all. There was high receptiveness to the Creature among children, who generally did not find the Creature distracting during other activities. A summary of some significant physiological effects found during the experiments is given below.

During the pilot experiment:

• Disturbing images correlated with changes in mean heart rate, mean heart rate standard deviation, mean heart rate acceleration, mean EMG, mean derivative of skin conductance, and mean arousal.
• Creature presence correlated with reduced mean skin conductance, mean derivative of skin conductance, and mean arousal.

In Experiment 1:

• Mean heart rate and heart rate variability were significantly different between the creature constant motion and creature still stages.

• Mean skin conductance and skin temperature were higher during the creature constant motion and creature mirroring user stages than during the creature still stage.

• The standard deviation of breath lengths was lower during the Creature motion stage than during the Creature still stage.

In Experiment 2:

• Mean breath length was significantly different between the entraining, creature with task, and creature without task stages.

• Creature presence correlated with a difference in mean heart rate.

• The standard deviation of breath lengths was significantly lower during the Creature entraining, creature with task, and creature without task stages than during baseline.

• The hf% of heart rate variability was significantly different between the Creature entraining and Creature with task, Creature entraining and Creature without task, and Creature without task and Creature with task stages.

In Experiment 3:

• Computer activity correlated with reduced mean standard deviation of heart rate, heart rate pnn50, and skin conductance.

• Computer activity correlated with increased mean derivative of skin conductance, skin temperature, standard deviation of skin temperature, and heart rate vlf%.

• Creature presence correlated with increases in mean heart rate standard deviation, heart rate vlf%, standard deviation of the derivative of skin conductance, skin temperature, and standard deviation of skin temperature as compared to the computer activity without the Creature.

5.2 Methodological Critique and Recommendations

5.2.1 Platform Presentation

The experimental protocol for the TAMER platform used throughout these experiments proved ineffective in certain areas. In particular, there is a great need for explanation when presenting the Creature and the experiment. As mentioned, most participants were not able to recognize the Creature mirroring their breathing and heart rate without an explanation that this would occur. Even after they had just been equipped with physiological sensors that measure their breathing and heart rate, participants were surprised that a link between them and the Creature could be established. Many participants also did not notice the pulse mechanism in the Creature; it could only be felt over a small area of the Creature, and should be identified before use. It is necessary to fully describe the mechanisms of the Creature to the participant before the experiment, as they are not likely to recognize the mechanisms on their own. The application of the physiological sensors and interaction with the Creature represents a fairly novel event for most participants, and although the zoomorphic nature of the Creature may imply that it contains certain expressive mechanisms, these are not always evident upon initial investigation.

It is also important that the Creature's mechanisms be activated during this introduction, and that the Creature operate at a breathing and pulse rate appropriate for the experiment, in order to set a proper expectation for the behavior of the Creature. The actual breathing and pulse rates of animals the Haptic Creature's size are different from those of a human, and participants' awareness of this fact will also vary.
The Creature's normal physiological activity level must therefore be established as similar to that of a human.

The timing of this introduction is also important. Consent forms, questionnaires, and physiological sensors should be administered and attached before the participant is able to see the Creature. Participants, in particular children, often wanted to interact with the Creature upon seeing it for the first time, and expressed a desire to hold it and pet it. It was then necessary to temper their enthusiasm in order to attach the physiological sensors, and in the case of several students it was particularly difficult to draw their attention away from the Creature in order to set up the experiment. However, there is also the possibility that participants having seen the Creature, in this and prior experiments, before undergoing their baseline assessment inadvertently reduced any "novelty effect" of exposure to the Creature on the experimental results, as the novelty would then also influence their physiological baselines. Most participants also found initial Creature motion new and interesting, but the actions of the Creature mechanisms were simple enough that transitions between Creature activity states during the experiments were not likely to induce a significant effect, and indeed the transitions were often not recognized by the participants. Participants, particularly children, were typically excited to interact with the Creature, and this enthusiasm lasted through their experimental sessions. A clinical introduction to the Creature should leverage this initial appeal to help develop a long-term working relationship with the TAMER platform. Although in actual operation the Creature's motion is quite subtle and non-intrusive, allowing users to focus on other tasks, a more sophisticated model of emotions for the Creature could help improve a user's attachment to the Creature. The increased amount of time required to discover all of the Creature's eventual operating behaviors should guarantee adequate observation time for the child's psychologist to observe his or her interactions with the Creature and train his or her behaviors.

5.2.2 Platform Interaction

A Coherent Story

In experiments with children, and possibly in experiments with adults, there is also the need for a coherent "story" to explain Creature activity and motions during an experiment. Children, especially, were surprisingly aware of the behavior of the Creature during the experiment. If the Creature had not moved after a long time period, or stopped moving after being active, they would often express worry or be upset that the Creature was broken, or that something in the experiment was not working. An explanation of the Creature's behavior implying that the Creature may be asleep sometimes, awake at other times, and curious or happy for part of the experiment would help to relieve participant anxiety about Creature functionality, and provide an expectation for Creature behavior. Care should be taken, however, to ensure that a story of the Creature does not impose a specific species, with possibly confounding associated behavior expectations, onto the Creature. Informal surveys revealed descriptions of the Creature as variously a cat, rabbit, mouse, pig, guinea pig, or simply a "furry thing," with no one answer predominating.
Avoiding identifying the Creature as a specific species, although presenting creative challenges in developing a story, helps to avoid possible negative reactions to the Creature due to a user's previous interactions with the chosen species. In particular, this greatly simplifies the introduction of the Creature to children, as a detailed investigation of a user's past interactions with animals is not necessary before presenting the Creature to them.

A behavior model for another version of the Haptic Creature has been developed that links various Creature mechanism activity levels to Creature emotional states, as interpreted by users. Integrating this model into the TAMER platform could allow for more advanced interaction during experiments. At the very least, a comparison between the Creature's typical activity rates when mirroring a human and the emotional states ascribed to the Creature running at those rates could prove informative.

Such a behavior model should also be incorporated into the next stage of TAMER platform experiments, that of longer-term engagement with the system. Creature behavior during these experiments was extremely simple: it acted essentially as a physiological metronome on the lap of the subject, occasionally changing tempo, but only gradually. While short-term results with this method were promising, the eventual usage targeted for the TAMER platform is longer-term anxiety reduction. Although the physiological results from the experiments presented here were promising, they may not occur in a long-term experiment, and more sophisticated Creature behaviors may be required to maintain engagement. Longer-term experiments, particularly with a broader subject pool, may also reveal personality or background characteristics that tend to make certain users more or less receptive to engagement with the Creature. No such trends were observed in these studies. Additional experiments, such as comparing the effects of the Creature to those of a child's companion stuffed animal, could also provide valuable feedback as to the effectiveness of the TAMER platform compared to typical therapy methods.

Interaction Models

In order to enable longer-duration experiments, as well as to improve the shorter experiments, there is a need for more focused and directed interaction with the Creature. Except when asked to follow the Creature's breathing, participants were not given any instructions on how to interact with the Creature. Often participants, particularly adult participants, seemed uncertain of how to behave with the Creature. Child participants were aware that the Creature was related to a study on anxiety, and therefore might help to calm them down, but were not aware of how this would actually occur. More detailed instructions to participants about the desired effects of the Creature, and what actions they could take to help achieve them, might help the participants achieve greater success in accomplishing these goals.

Observations of interaction with the Creature during the experiment suggest three potential interaction models to be experimentally investigated. The first is Creature-guided interaction, where the Creature is used to lead the participant through a series of breathing exercises.
Experimental data showed that participants could easily follow the Creature's breathing with their own; this could be of potential use in anxiety-inducing situations. Many relaxation techniques involve deep breathing exercises, and the Creature could provide calming and engaging guidance in this task. A short break to breathe slowly and deeply with the Creature either before, after, or in the middle of an anxiety-provoking task might be able to produce calming effects in the participant, and allow them to access their previously taught strategies for coping with stress more easily.

The second is Creature mirroring of users to improve awareness of their own physiological state. Participants generally reported a low awareness of their own heart rate. Using the Creature to improve the user's self-awareness could help in training them to recognize increased levels of stress or an impending anxiety attack. By having knowledge of their own typical and stressed body states, users could again intervene with situation-appropriate coping skills.

The third interaction model is that of intervention. Once the TAMER platform is capable of recognizing either anxious states or the precursors to anxiety, the Creature could become active only when the user is approaching an anxiety attack. Instead of running all the time mirroring the user, the Creature would activate only when necessary, alerting the user to their anxious state. Once active, the Creature could then attempt to calm the user down. A simple slow, steady breathing rhythm similar to that used in the present experiment might prove sufficient in reducing anxiety, but more sophisticated behaviors are possible. For example, the Creature itself could present an anxious state, either by mirroring the user or acting independently. The user could then be trained to reduce the Creature's level of anxiety by breathing slowly or performing other therapeutic techniques, and the Creature's activity level could gradually decrease in response to changes in physiological metrics. These behaviors should be investigated to allow for longer-term use of the Creature and TAMER platform in anxiety reduction.
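The intervention model can be summarized as a simple feedback loop, sketched below: the Creature stays idle until a stress indicator crosses a threshold, then breathes slightly slower than the user and relaxes back to idle as the user settles. The thresholds, gains, update period, and sensor/actuator interfaces are all assumptions for illustration, not the platform's implemented behavior.

```python
# Sketch of the intervention model as a feedback loop: idle until a stress indicator
# crosses a threshold, then lead the user's breathing gently downward and return to
# an idle purr once the user settles. All thresholds, gains, and interfaces are
# assumptions; read_breath_rate and set_creature_breath stand in for real I/O.
import time

RESTING_BREATH_BPM = 10.0

def intervention_loop(read_breath_rate, set_creature_breath,
                      threshold_bpm=22.0, period_s=5.0):
    """read_breath_rate() -> latest user breaths/min; set_creature_breath(bpm or None)."""
    active = False
    while True:
        user_bpm = read_breath_rate()
        if not active and user_bpm > threshold_bpm:
            active = True                                   # user appears stressed
        if active:
            # Lead the user gently: a little slower than they are, never below resting.
            set_creature_breath(max(RESTING_BREATH_BPM, 0.9 * user_bpm))
            if user_bpm < threshold_bpm - 4.0:              # hysteresis before idling
                active = False
                set_creature_breath(None)                   # back to the idle purr
        time.sleep(period_s)
```

The hysteresis band keeps the Creature from switching on and off repeatedly around the threshold, which the child observations suggest would itself be distracting.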
5.3 Platform Design

5.3.1 Outcomes

Overall, the individual components of the TAMER platform were integrated to produce an effective and reliable system. However, several improvements could be made to its overall functionality.

Creature

The Haptic Creature was shown to be an effective device for displaying affect through its breathing and heart rate mechanisms, while being comfortable for and engaging to its users. Participants reported high levels of comfort with the Creature: they were receptive to it being placed on their laps and moving around, and found its fur to be soft and pleasing to the touch. In particular, participants responded positively to the warmth of the Creature and its life-like attributes. They desired that it gently purr even when still, as opposed to just sitting as a dead weight on their lap. The Creature's breathing mechanism was successfully able to portray breathing; it was recognized as such by users. The Creature's pulse mechanism was particularly noisy, but it did successfully generate a pulse sensation in a narrow area of the Creature.

Sensors

Participants, both the adults and the mildly anxious children, were surprisingly receptive to wearing the physiological sensors, and generally did not find them distracting or uncomfortable during experiments. Several minor problems were associated with sensor functionality during experiments. The EKG sensor, when mounted to the chest with a single triode electrode instead of three separate electrodes, would occasionally become detached during experiments, leading to loss of signal. The blood volume pulse sensor, and indeed all the sensors attached to the fingers, could occasionally become detached during the experiment. Participants were typically hesitant to touch anything with the hand attached to the sensors, and had to be instructed that it was acceptable to pet the Creature with that hand. Once told, they did not seem encumbered by the sensors. Had the participants attempted to move around during the experiment, however, they would have found their motion constrained by the sensors. The numerous wires required for the sensors were continually getting tangled, and there was a concern that a sudden large motion by a participant, such as a nervous child desiring to leave the room, could cause damage to the equipment. This scenario did not occur during the experiments, however.

5.3.2 Recommendations

There are several changes recommended for the hardware and software of the TAMER platform. The Creature, although functional, requires several modifications that would allow for more effective longer-term experiments, and help to move the Creature from a laboratory environment to a less clinical setting, such as a school or a home. In particular, the noise of the pulse mechanism was moderately audible, and would be noticeable in the quiet of a classroom. A pulse mechanism that created a motion able to be felt over a larger area of the Creature would convey the pulse sensation more effectively. Creature noise must be assessed and reduced to a level that would not bother other students in a quiet room. The gentle vibration that participants preferred to the Creature being completely inactive did have an audio component; the use of the purring mechanism, instead of the breathing mechanism linkage, to create this sensation should be investigated. In addition, consideration should be given to eliminating as many wires to the Creature as possible. Presently, with the radio system in use, the power cable is the only wire that must be attached to the Creature. Provisions exist on the electronics board for an internal battery pack to be mounted; the use of this should be investigated. The need for an external power supply not only restricts the usage of the Creature, but also carries the risk that the cable could become inadvertently disconnected during use, disabling the Creature prematurely.

The software for reading the physiological sensors and controlling the Creature was adequate for the experiment. However, to support both this platform and other experiments with the physiological sensors, the integration of additional timing and observational inputs, such as video feeds and push-button controls, should be developed. Linking the physiological data to exact moments in the experiment, or indeed to exact times in general, is often difficult. A more advanced sensor suite incorporating video and physiological data, as well as a better system for marking notable occurrences during an experiment, would greatly simplify data analysis, and potentially allow more subtle results to be uncovered. In addition, the physiological data gathered should be used at a higher level than simply as mean values. Medical interpretation of the physiological data gathered by the platform, or the use of an inference engine or machine learning techniques to estimate clinical assessments of anxiety, such as the Multidimensional Anxiety Scale for Children [113], based on training data provided by psychologists, could provide more reliable estimations of the effect the platform has on anxiety levels. Better online estimation of anxiety levels could allow for a more effective platform, as specific Creature activities could be associated with their effects on anxiety and then used appropriately in a therapy regimen. A physiological sensing system with both the form factor and the capability to support longer-term observations could also allow for the identification of chronic anxiety with the physiological sensors. This, along with medical observations, would aid the determination of any effects of the TAMER platform on longer-term, chronic anxiety.
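The suggested inference step could take a form like the sketch below: a mapping from per-window physiological features to clinician-provided anxiety labels learned with an off-the-shelf classifier. The feature set and labels here are random placeholders; in practice the labels would come from psychologists' assessments (for example, MASC-informed ratings), not from this code.

```python
# Sketch of the suggested inference approach: learn a mapping from physiological
# features to clinician-provided anxiety ratings. Features and labels are random
# placeholders; a real model would be trained on annotated session data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
# One row per analysis window: e.g. mean HR, HR sd, pnn50, breathing rate, skin conductance.
X = rng.normal(size=(200, 5))
y = rng.integers(0, 2, size=200)          # 0 = calm, 1 = anxious (clinician label)

model = RandomForestClassifier(n_estimators=100, random_state=0)
accuracy = cross_val_score(model, X, y, cv=5).mean()
print(f"cross-validated accuracy on placeholder data: {accuracy:.2f}")
```

Whatever model is chosen, validation against clinical assessments rather than against the placeholder-style labels used here would be essential before relying on its output to drive Creature behavior.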
Sensor form factor is one of the limiting factors of this platform. The present sensors are somewhat intrusive, and require both a large amount of time to set up and many wires to be connected to the participant. Reducing the number of wires attached to the participant, or, ideally, eliminating wires altogether, would both improve the participant experience and allow the sensors to be used on mobile participants in an actual classroom environment. Combining several sensors into a single form factor, such as a piece of clothing or a glove, could also help participants who were reticent about having the sensors attached to feel more comfortable. Additionally, several of the sensors, particularly those for heart rate and heart rate variability, can generate useful data over observation periods of several hours or even days, which may be useful in anxiety reduction. Integration of these long-term wearable sensors into the platform could produce improved results and more useful data.

Better integration of the sensor data into the experiment procedure might also lead to interesting avenues of investigation and applications for the platform. Several children were interested in viewing their physiological data on the computer, and wanted to know more about the sensor readings. The children who participated in the computer activity were score- and goal-focused, due to typically having their performance assessed during these activities. Cataloging physiological measures related to anxiety and then displaying them to the user has been shown to be of benefit for adults. Feedback to users of both long-term and short-term physiological data, as both a visual and a haptic (Creature) display, could assist them in recognizing anxiety-inducing behaviors and therefore eliminating them. Even at this young age, these children are already quite familiar with improving score-based performance.

Overall TAMER platform functionality was sufficient for the experiment, but the changes recommended here should improve platform performance.

5.4 Conclusion

This thesis has described the construction of the TAMER platform and the initial testing and experimental verification of the same. The Haptic Creature constructed as part of the TAMER platform distinguishes itself from other robotic companions by recreating physiological activities through a solely haptic presentation method, and is, uniquely, capable of reacting to a user's sensed physiological state or displaying a user's state with its own mechanisms.
This link establishes an advancement in biofeedback technology, as it should be easier, especially for children, to relate to a robot than to a pulse or heart rate monitor. Physiological interaction with robots is also advanced: the TAMER platform demonstrates real-time reaction to a user's physiological state and real-time interaction with the potential to guide the user's physiological state in a controlled feedback loop. Further, the TAMER platform has been demonstrated in a school environment. Results from the experiments in this thesis support the potential of the TAMER platform to be used in anxiety management therapy. Users of the Haptic Creature, in particular children, reported a strong desire to interact with and work with the Creature; they also found it comforting and calming during tasks. Users were able to follow the Haptic Creature in a breathing-related experiment. Physiological effects from the Creature were also found in users interacting with the Creature, a first step towards fully controlled manipulation of user physiological state.

Bibliography

[1] P. Rani and N. Sarkar, "Emotion-sensitive robots - a new paradigm for human-robot interaction," in 2004 4th IEEE/RAS International Conference on Humanoid Robots, pp. 149-167, 2004.

[2] R. Picard, E. Vyzas, and J. Healey, "Toward machine emotional intelligence: Analysis of affective physiological state," IEEE Transactions on Pattern Analysis and Machine Intelligence, pp. 1175-1191, 2001.

[3] S. Yohanan and K. MacLean, "The Haptic Creature Project: Social Human-Robot Interaction through Affective Touch," in Proceedings of the AISB 2008 Symposium on the Reign of Catz & Dogs: The Second AISB Symposium on the Role of Virtual Creatures in a Computerised Society, vol. 1, pp. 7-11, 2008.

[4] C. Suveg, P. Kendall, J. Comer, and J. Robin, "Emotion-focused cognitive-behavioral therapy for anxious youth: A multiple-baseline evaluation," Journal of Contemporary Psychotherapy, vol. 36, no. 2, pp. 77-85, 2006.

[5] G. Macklem, Practitioner's Guide to Emotion Regulation in School-Aged Children. Springer Verlag, 2007.

[6] R. Reiner, "Integrating a portable biofeedback device into clinical practice for patients with anxiety disorders: Results of a pilot study," Applied Psychophysiology and Biofeedback, vol. 33, no. 1, pp. 55-61, 2008.

[7] M. Hertenstein, "Touch: Its communicative functions in infancy," Human Development, vol. 45, no. 2, pp. 70-94, 2002.

[8] S. Yohanan and K. MacLean, "A tool to study affective touch," in Proceedings of the 27th International Conference Extended Abstracts on Human Factors in Computing Systems, pp. 4153-4158, ACM, 2009.

[9] D. Kulić and E. Croft, "Affective State Estimation for Human-Robot Interaction," IEEE Transactions on Robotics, vol. 23, no. 5, pp. 991-1000, 2007.

[10] "Sony AIBO Support." http://support.sony-europe.com/aibo/index.asp, March 2010.

[11] "Furby wikipedia entry." http://en.wikipedia.org/wiki/Furby, March 2010.

[12] B. Friedman, P. Kahn Jr, and J. Hagman, "Hardware companions?: What online AIBO discussion forums reveal about the human-robotic relationship," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 273-280, ACM New York, NY, USA, 2003.

[13] G. Melson, P. Kahn Jr, A. Beck, B. Friedman, T. Roberts, and E. Garrett, "Robots as dogs?: Children's interactions with the robotic dog AIBO and a live Australian Shepherd," in CHI '05 Extended Abstracts on Human Factors in Computing Systems, p. 1652, ACM, 2005.
[14] G. Melson, P. Kahn Jr, A. Beck, and B. Friedman, "Robotic pets in human lives: Implications for the human-animal bond and for human relationships with personified technologies," Journal of Social Issues, vol. 65, no. 3, pp. 545-567, 2009.

[15] M. Banks, L. Willoughby, and W. Banks, "Animal-assisted therapy and loneliness in nursing homes: use of robotic versus living dogs," Journal of the American Medical Directors Association, 2007.

[16] T. Tamura, S. Yonemitsu, A. Itoh, D. Oikawa, A. Kawakami, Y. Higashi, T. Fujimoto, and K. Nakajima, "Is an entertainment robot useful in the care of elderly people with severe dementia?," Journals of Gerontology Series A: Biological and Medical Sciences, vol. 59, no. 1, p. 83, 2004.

[17] T. Shibata, T. Mitsui, K. Wada, A. Touda, T. Kumasaka, K. Tagami, and K. Tanie, "Mental commit robot and its application to therapy of children," in 2001 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, 2001. Proceedings, vol. 2, 2001.

[18] K. Wada and T. Shibata, "Robot therapy in a care house - its sociopsychological and physiological effects on the residents," in Proc. IEEE ICRA, pp. 3966-3971, 2006.

[19] C. Kidd, W. Taggart, and S. Turkle, "A sociable robot to encourage social interaction among the elderly," in IEEE International Conference on Robotics and Automation (ICRA), Orlando, Florida, USA, Citeseer, 2006.

[20] T. Shibata, K. Wada, Y. Ikeda, and S. Sabanovic, "Cross-Cultural Studies on Subjective Evaluation of a Seal Robot," Advanced Robotics, vol. 23, no. 4, pp. 443-458, 2009.

[21] "Paro Robots Inc." http://parorobots.com/, March 2010.

[22] W. Stiehl, J. Lieberman, C. Breazeal, L. Basel, L. Lalla, and M. Wolf, "Design of a therapeutic robotic companion for relational, affective touch," Robot and Human Interactive Communication, 2005. ROMAN 2005. IEEE International Workshop on, pp. 408-415, Aug. 2005.

[23] W. Stiehl, C. Breazeal, K. Han, J. Lieberman, L. Lalla, A. Maymin, J. Salinas, D. Fuentes, R. Toscano, C. Tong, et al., "The huggable: a therapeutic robotic companion for relational, affective touch," in ACM SIGGRAPH 2006 Emerging Technologies, p. 15, ACM, 2006.

[24] W. Stiehl, J. Lee, C. Breazeal, M. Nalin, A. Morandi, and A. Sanna, "The huggable: a platform for research in robotic companions for pediatric care," in Proceedings of the 8th International Conference on Interaction Design and Children, pp. 317-320, ACM New York, NY, USA, 2009.

[25] J. Saldien, K. Goris, S. Yilmazyildiz, W. Verhelst, and D. Lefeber, "On the design of the huggable robot Probo," Journal of Physical Agents, vol. 2, no. 2, p. 3, 2008.

[26] K. Goris, J. Saldien, I. Vanderniepen, and D. Lefeber, "The Huggable Robot Probo, a Multi-disciplinary Research Platform," in Proceedings of the EUROBOT Conference, pp. 22-24, Springer, 2008.

[27] A. van Breemen, X. Yan, and B. Meerbeek, "iCat: an animated user-interface robot with personality," in AAMAS '05: Proceedings of the Fourth International Joint Conference on Autonomous Agents and Multiagent Systems, (New York, NY, USA), pp. 143-144, ACM, 2005.

[28] G. Castellano, A. Pereira, I. Leite, A. Paiva, and P. W. McOwan, "Detecting user engagement with a robot companion using task and social interaction-based features," in ICMI-MLMI '09: Proceedings of the 2009 International Conference on Multimodal Interfaces, (New York, NY, USA), pp. 119-126, ACM, 2009.

[29] C. Breazeal, "Toward sociable robots," Robotics and Autonomous Systems, vol. 42, no. 3-4, pp. 167-175, 2003.
Appendix A

Derivations

A.1 Creature Physiological Mirroring Derivations

The following variables are used throughout these derivations:

y : commanded Creature breathing servo amplitude (roughly how high the abdomen appears), in [0, 1]
r : user average respiration rate [breaths per second]
t : time [seconds]
l : user average breath length [seconds]
p : user average heart rate [beats per minute]
i : commanded Creature interbeat interval (time between heart beats) [seconds]

A.1.1 Derivation of Ramped Breathing Motion Commands

A sinusoidal wave whose frequency increases at rate k is

\[ y = \cos\!\big(2\pi t\,(f_0 + \tfrac{k}{2}t)\big) \tag{A.1} \]

General

From t_0 to t_1, the breathing rate ramps from r_0 to r_2, so

\[ k = \frac{r_2 - r_0}{t_1 - t_0} \;\Rightarrow\; y_1 = \cos\!\big(2\pi(t - t_0)\,(r_0 + \tfrac{k}{2}(t - t_0))\big) \tag{A.2} \]

\[ y_1 = \cos\!\left(2\pi(t - t_0)\left(\frac{(r_2 - r_0)(t - t_0)}{2(t_1 - t_0)} + r_0\right)\right) \tag{A.3} \]

and:

\[ y_1(t = t_1) = \cos\!\left(2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right)\right) \tag{A.4} \]
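As a quick check (this step is left implicit in the original derivation), differentiating the phase of (A.3) confirms that the commanded breathing rate ramps linearly from r_0 at t_0 to r_2 at t_1:

\[
f(t) = \frac{1}{2\pi}\,\frac{d}{dt}\!\left[2\pi(t - t_0)\!\left(\frac{(r_2 - r_0)(t - t_0)}{2(t_1 - t_0)} + r_0\right)\right]
     = \frac{(r_2 - r_0)(t - t_0)}{t_1 - t_0} + r_0,
\]

so that f(t_0) = r_0 and f(t_1) = r_2. The remaining stages are then constructed so that the commanded amplitude is continuous at each stage boundary.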
From t_1 to t_2, the rate is held constant at r_2:

\[ y_2 = \cos(2\pi r_2 t + \varphi) \tag{A.5} \]
\[ y_2(t = t_1) = y_1(t = t_1) \tag{A.6} \]
\[ \cos(2\pi r_2 t_1 + \varphi) = \cos\!\left(2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right)\right) \tag{A.7} \]
\[ 2\pi r_2 t_1 + \varphi = 2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right) \tag{A.8} \]
\[ \varphi = 2\pi(t_1 - t_0)\left(\frac{(r_2 - r_0)(t_1 - t_0)}{2(t_1 - t_0)} + r_0\right) - 2\pi r_2 t_1 \tag{A.9} \]
\[ y_2 = \cos(2\pi r_2 t + \varphi), \quad \text{with } \varphi \text{ as in (A.9)} \tag{A.10} \]

and:

\[ y_2(t = t_2) = \cos(2\pi r_2 t_2 + \varphi) \tag{A.11} \]

From t_2 to t_3, the rate ramps from r_2 to r_4:

\[ k = \frac{r_4 - r_2}{t_3 - t_2} \;\Rightarrow\; y_3 = \cos\!\left(2\pi(t - t_2)\left(\frac{(r_4 - r_2)(t - t_2)}{2(t_3 - t_2)} + r_0\right) + \theta\right) \tag{A.12} \]
\[ y_2(t = t_2) = y_3(t = t_2) \tag{A.13} \]
\[ \cos(2\pi r_2 t_2 + \varphi) = \cos\!\left(2\pi(t_2 - t_2)\left(\frac{(r_4 - r_2)(t_2 - t_2)}{2(t_3 - t_2)} + r_0\right) + \theta\right) \tag{A.14} \]
\[ \theta = 2\pi r_2 t_2 + \varphi \tag{A.15} \]
\[ y_3 = \cos\!\left(2\pi(t - t_2)\left(\frac{(r_4 - r_2)(t - t_2)}{2(t_3 - t_2)} + r_0\right) + \theta\right), \quad \text{with } \theta = 2\pi r_2 t_2 + \varphi \text{ and } \varphi \text{ as in (A.9)} \tag{A.16} \]

and:

\[ y_3(t = t_3) = \cos\!\left(2\pi(t_3 - t_2)\left(\frac{(r_4 - r_2)(t_3 - t_2)}{2(t_3 - t_2)} + r_0\right) + \theta\right) \tag{A.19} \]

From t_3 to t_4, the rate is held constant at r_4:

\[ y_4 = \cos\!\big(2\pi r_4 (t - t_3) + \psi\big) \tag{A.20} \]
\[ y_4(t = t_3) = y_3(t = t_3) \tag{A.21} \]
\[ \cos(\psi) = \cos\!\left(2\pi(t_3 - t_2)\left(\frac{(r_4 - r_2)(t_3 - t_2)}{2(t_3 - t_2)} + r_0\right) + \theta\right) \tag{A.22} \]
\[ \psi = 2\pi(t_3 - t_2)\left(\frac{(r_4 - r_2)(t_3 - t_2)}{2(t_3 - t_2)} + r_0\right) + \theta \tag{A.24} \]
\[ y_4 = \cos\!\big(2\pi r_4(t - t_3) + \psi\big), \quad \text{with } \psi \text{ as in (A.24)}, \; \theta \text{ as in (A.15)}, \; \varphi \text{ as in (A.9)} \tag{A.25} \]

Examples

Where r_2 = 1.2r_0, r_4 = 0.8r_0, t_0 = 30, t_1 = 90, t_2 = 150, t_3 = 210:

\[
y(t) =
\begin{cases}
\cos\!\big(2\pi(t-30)\,(r_0 + \tfrac{1}{600}r_0(t-30))\big) & 30 \le t < 90 \\
\cos\!\big(\tfrac{12}{5}\pi r_0 t - 84\pi r_0\big) & 90 \le t < 150 \\
\cos\!\big(2\pi(t-150)\,(r_0 - \tfrac{1}{300}r_0(t-150)) + 276\pi r_0\big) & 150 \le t < 210 \\
\cos\!\big(\tfrac{8}{5}\pi r_0(t-210) + 372\pi r_0\big) & t \ge 210
\end{cases}
\tag{A.26}
\]

Where r_2 = 1.2r_0, r_4 = 0.8r_0, t_0 = 60, t_1 = 180, t_2 = 300, t_3 = 420:

\[
y(t) =
\begin{cases}
\cos\!\big(2\pi(t-60)\,(r_0 + \tfrac{1}{1200}r_0(t-60))\big) & 60 \le t < 180 \\
\cos\!\big(\tfrac{12}{5}\pi r_0 t - 168\pi r_0\big) & 180 \le t < 300 \\
\cos\!\big(2\pi(t-300)\,(r_0 - \tfrac{1}{600}r_0(t-300)) + 552\pi r_0\big) & 300 \le t < 420 \\
\cos\!\big(\tfrac{8}{5}\pi r_0(t-420) + 744\pi r_0\big) & t \ge 420
\end{cases}
\tag{A.27}
\]

A.1.2 Derivation of Ramped Pulse Rate

General

From t_0 to t_1:

\[ i_1 = \frac{60}{p_0 + \dfrac{p_2 - p_0}{t_1 - t_0}\,(t - t_0)} \tag{A.28} \]

From t_1 to t_2:

\[ i_2 = \frac{60}{p_2} \tag{A.29} \]

From t_2 to t_3:

\[ i_3 = \frac{60}{p_2 + \dfrac{p_4 - p_2}{t_3 - t_2}\,(t - t_2)} \tag{A.30} \]

From t_3 to t_4:

\[ i_4 = \frac{60}{p_4} \tag{A.31} \]

Examples

Where p_2 = 1.2p_0, p_4 = 0.8p_0, t_1 = 30, t_2 = 90, t_3 = 150, t_4 = 210:

\[
i(t) =
\begin{cases}
\dfrac{60}{p_0 + \frac{1.2p_0 - p_0}{30}\,t} & t < 30 \\[1.5ex]
\dfrac{50}{p_0} & 30 \le t < 90 \\[1.5ex]
\dfrac{60}{1.2p_0 + \frac{0.8p_0 - 1.2p_0}{60}\,(t - 90)} & 90 \le t < 150 \\[1.5ex]
\dfrac{75}{p_0} & 150 \le t < 210
\end{cases}
\tag{A.32}
\]

Where p_2 = 1.2p_0, p_4 = 0.8p_0, t_1 = 60, t_2 = 180, t_3 = 300, t_4 = 420:

\[
i(t) =
\begin{cases}
\dfrac{60}{p_0 + \frac{1.2p_0 - p_0}{60}\,t} & t < 60 \\[1.5ex]
\dfrac{50}{p_0} & 60 \le t < 180 \\[1.5ex]
\dfrac{60}{1.2p_0 + \frac{0.8p_0 - 1.2p_0}{120}\,(t - 180)} & 180 \le t < 300 \\[1.5ex]
\dfrac{75}{p_0} & 300 \le t < 420
\end{cases}
\tag{A.33}
\]
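The derivations above give closed-form commands for each stage. The sketch below shows one way to generate equivalent ramp schedules numerically, by accumulating phase from a piecewise-linear rate profile; it is illustrative only and is not the implementation used in the thesis. The function names, the use of NumPy, the mapping of the cosine onto [0, 1], and the example baseline values (r0 = 0.25 Hz, p0 = 70 bpm) are assumptions introduced here.

```python
# Illustrative sketch (not the thesis implementation) of ramped, phase-continuous
# breathing and pulse commands for a schedule r0 -> r2 -> r4 (and p0 -> p2 -> p4).
import numpy as np

def rate_profile(t, breakpoints, rates):
    # Piecewise-linear rate: 'rates' gives the value at each breakpoint;
    # before the first breakpoint and after the last one the rate is held.
    return np.interp(t, breakpoints, rates)

def breathing_command(t, breakpoints, rates_hz):
    # Servo amplitude in [0, 1]: accumulate phase from the instantaneous
    # breathing rate so each ramp joins the next stage without a jump
    # (the property the phase constants enforce in the closed-form version).
    dt = np.diff(t, prepend=t[0])
    phase = 2.0 * np.pi * np.cumsum(rate_profile(t, breakpoints, rates_hz) * dt)
    return 0.5 * (1.0 + np.cos(phase))  # assumed mapping of the cosine onto [0, 1]

def interbeat_intervals(t, breakpoints, rates_bpm):
    # Commanded interbeat interval i = 60 / p(t), as in (A.28)-(A.31).
    return 60.0 / rate_profile(t, breakpoints, rates_bpm)

# Example baseline values (chosen here for illustration) on the stage
# boundaries used in (A.26) and (A.32).
t = np.arange(0.0, 211.0, 0.02)
r0, p0 = 0.25, 70.0
y = breathing_command(t, [30, 90, 150, 210], [r0, 1.2 * r0, 1.2 * r0, 0.8 * r0])
i = interbeat_intervals(t, [0, 30, 90, 150], [p0, 1.2 * p0, 1.2 * p0, 0.8 * p0])
```

Because the phase is accumulated continuously, the generated amplitude is continuous at every stage boundary. The same approach covers the simplified Experiment 3 schedules of Sections A.1.3 and A.1.4 by changing the breakpoints and rates.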
A.1.3 Derivation of Ramped Breathing Motion Commands [Simplified Motion]

General

From t_1 to t_2, the rate is held constant at r_A:

\[ y_A = \cos(2\pi r_A t) \tag{A.34} \]

and:

\[ y_A(t = t_2) = \cos(2\pi r_A t_2) \tag{A.35} \]

From t_2 to t_3, the rate ramps from r_A to r_C:

\[ k = \frac{r_C - r_A}{t_3 - t_2} \;\Rightarrow\; y_B = \cos\!\left(2\pi(t - t_2)\left(\frac{(r_A - r_C)(t - t_2)}{2(t_2 - t_3)} + r_A\right) + \varphi\right) \tag{A.36} \]
\[ y_B(t = t_2) = y_A(t = t_2) \tag{A.37} \]
\[ \cos(\varphi) = \cos(2\pi r_A t_2) \tag{A.38} \]
\[ \varphi = 2\pi r_A t_2 \tag{A.39} \]
\[ y_B = \cos\!\left(2\pi(t - t_2)\left(\frac{(r_A - r_C)(t - t_2)}{2(t_2 - t_3)} + r_A\right) + 2\pi r_A t_2\right) \tag{A.41} \]

and:

\[ y_B(t = t_3) = \cos\!\big(4\pi(t_3 - t_2)(r_A + r_C) + 2\pi r_A t_2\big) \tag{A.42} \]

From t_3 onward, the rate is held constant at r_C:

\[ y_C = \cos(2\pi r_C t + \lambda) \tag{A.43} \]
\[ \lambda = 4\pi(t_3 - t_2)(r_A + r_C) + 2\pi r_A t_2 \tag{A.44} \]
\[ y_C = \cos\!\big(2\pi r_C t + 4\pi(t_3 - t_2)(r_A + r_C) + 2\pi r_A t_2\big) \tag{A.45} \]

Experiment 3

In Experiment 3, where r_A = 1.2r_0, r_C = 0.8r_0, t_1 = 0, t_2 = 240, t_3 = 350:

\[
y(t) =
\begin{cases}
y_A = \cos(2.4\pi r_0 t) & t < 240 \\
y_B = \cos\!\big(2\pi(t-240)\,(\tfrac{6}{5}r_0 - \tfrac{1}{550}r_0(t-240)) + 576\pi r_0\big) & 240 \le t < 350 \\
y_C = \cos\!\big(1456\pi r_0 + \tfrac{8}{5}\pi r_0 t\big) & 350 \le t < 590
\end{cases}
\tag{A.46}
\]

In Experiment 3, where r_A = 0.8r_0, r_C = 1.2r_0, t_1 = 0, t_2 = 240, t_3 = 350:

\[
y(t) =
\begin{cases}
y_A = \cos(1.6\pi r_0 t) & t < 240 \\
y_B = \cos\!\big(2\pi(t-240)\,(\tfrac{4}{5}r_0 + \tfrac{1}{550}r_0(t-240)) + 384\pi r_0\big) & 240 \le t < 350 \\
y_C = \cos\!\big(1264\pi r_0 + \tfrac{12}{5}\pi r_0 t\big) & 350 \le t < 590
\end{cases}
\tag{A.47}
\]

A.1.4 Derivation of Ramped Pulse Rate [Simplified Motion]

General

From t_1 to t_2:

\[ i_A = \frac{60}{p_A} \tag{A.48} \]

From t_2 to t_3:

\[ i_B = \frac{60}{p_A + \dfrac{p_C - p_A}{t_3 - t_2}\,(t - t_2)} \tag{A.49} \]

From t_3 to t_4:

\[ i_C = \frac{60}{p_C} \tag{A.50} \]

Experiment 3

In Experiment 3, where p_A = 1.2p_0, p_C = 0.8p_0, t_1 = 0, t_2 = 240, t_3 = 350:

\[
i(t) =
\begin{cases}
\dfrac{50}{p_0} & t < 240 \\[1.5ex]
\dfrac{60}{1.2p_0 + \frac{0.8p_0 - 1.2p_0}{110}\,(t - 240)} & 240 \le t < 350 \\[1.5ex]
\dfrac{75}{p_0} & 350 \le t < 590
\end{cases}
\tag{A.51}
\]

In Experiment 3, where p_A = 0.8p_0, p_C = 1.2p_0, t_1 = 0, t_2 = 240, t_3 = 350:

\[
i(t) =
\begin{cases}
\dfrac{75}{p_0} & t < 240 \\[1.5ex]
\dfrac{60}{0.8p_0 + \frac{1.2p_0 - 0.8p_0}{110}\,(t - 240)} & 240 \le t < 350 \\[1.5ex]
\dfrac{50}{p_0} & 350 \le t < 590
\end{cases}
\tag{A.52}
\]

A.2 Physiological Sensor Data Analysis Methods

During the experiments, the following physiological measures were typically calculated:

• mean heart rate
• heart rate standard deviation
• heart rate skewness
• heart rate RMS standard deviation
• heart rate variability:
  – pNN50
  – VLF %
  – LF %
  – MF %
  – HF %
• skin conductance
• skin conductance derivative
• electromyogram
• electromyogram derivative
• skin temperature
• skin temperature standard deviation
• respiration rate
• respiration rate standard deviation
• respiration amplitude
• respiration amplitude standard deviation

For an experiment with n physiological measures m, t stages p, and o subjects s, each physiological measure ψ was calculated for every subject in every stage. For pooled, group-wise comparisons, two-tailed dependent-sample t-tests with α = 0.05 were performed between the columns of Ψ:

\[
\Psi_{m_n} \;=\;
\begin{array}{c|ccc}
      & p_1        & \cdots & p_t \\ \hline
s_1   & \psi_{1,1} &        &     \\
\vdots&            & \ddots &     \\
s_o   &            &        & \psi_{o,t}
\end{array}
\]

From these comparisons it was possible to state whether the condition difference between stages had an effect on that physiological measure.

Participants' heart rate interbeat intervals (IBIs) and breath lengths were series variables: there were numerous samples for each participant in each stage of each experiment. For these two variables only, two-tailed independent-sample t-tests were used within subjects to determine, for each participant, whether the series of IBIs or breath lengths differed between stages. Between-subjects comparisons were not performed. From these comparisons it was possible to state whether a participant's IBIs or breath lengths differed between stages.

In response to the examination committee, a Bonferroni correction does not seem appropriate for this situation. These results are also clearly labelled as exploratory, and the comparisons made here are single analyses on separate sensor channels for the users.

There are no statistical analyses that I am aware of that can be performed on the qualitative survey results to produce significant results. A graph of the survey results is therefore included where these are discussed in the text.
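To make the stage-wise comparisons described above concrete, the following sketch builds the per-stage measure matrix and runs the two kinds of t-tests. It is illustrative only and is not the analysis code used in the thesis: the function names, the use of SciPy, and the choice of pNN50 as the example measure are assumptions introduced here.

```python
# Illustrative sketch (not the thesis pipeline) of the comparison procedure.
import numpy as np
from scipy import stats

def pnn50(ibi_s):
    # pNN50: percentage of successive interbeat-interval differences > 50 ms
    # (ibi_s is a 1-D array of interbeat intervals in seconds).
    diffs_ms = np.abs(np.diff(ibi_s)) * 1000.0
    return 100.0 * np.mean(diffs_ms > 50.0)

def stage_matrix(measure, data):
    # Psi: rows = subjects, columns = stages; data[subject][stage] holds the
    # raw sensor series recorded for that subject in that stage.
    return np.array([[measure(series) for series in subject] for subject in data])

def groupwise_tests(psi, alpha=0.05):
    # Two-tailed dependent-sample t-tests between every pair of stage columns.
    n_stages = psi.shape[1]
    results = {}
    for a in range(n_stages):
        for b in range(a + 1, n_stages):
            t_stat, p = stats.ttest_rel(psi[:, a], psi[:, b])
            results[(a, b)] = (t_stat, p, p < alpha)
    return results

def within_subject_series_tests(series_by_stage, alpha=0.05):
    # For one participant: two-tailed independent-sample (unequal-variance)
    # t-tests between the IBI or breath-length series of each pair of stages.
    n_stages = len(series_by_stage)
    results = {}
    for a in range(n_stages):
        for b in range(a + 1, n_stages):
            t_stat, p = stats.ttest_ind(series_by_stage[a], series_by_stage[b],
                                        equal_var=False)
            results[(a, b)] = (t_stat, p, p < alpha)
    return results
```

The unequal-variance form used in the within-subject tests matches the two-tailed unequal-variance t-tests reported in Tables B.2, B.3, and B.5.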
Appendix B

Experiment Documents

This chapter contains the experiment data not included in the main body. For each experiment, a pre-experiment questionnaire, post-experiment questionnaire, sample data, sample comparisons, and participant consent form are included where applicable. As explained in each experiment's "Experiment Procedure" section, there was no pre-experiment questionnaire for Experiments 1, 2, and 3, and no post-experiment questionnaire for Experiment 3. The following is a list of what is included in this appendix:

• Preliminary Experiment
  – Pre-Experiment Questionnaire
  – Post-Experiment Questionnaire
  – Sample Data
  – Sample Comparisons
  – Participant Consent Form
• Experiment 1
  – Post-Experiment Questionnaire
  – Data Tables
  – Sample Data
  – Sample Comparisons
  – Participant Consent Form
• Experiment 2
  – Post-Experiment Questionnaire
  – Data Tables
  – Sample Data
  – Sample Comparisons
  – Participant Consent Form
• Experiment 3
  – Sample Data
  – Sample Comparisons
  – Participant Consent Form
  – Participant Assent Form

These sections are also listed in the Table of Contents.

B.1 Preliminary Experiment

B.1.1 Pre-Experiment Questionnaire

Participant Questionnaire for Haptics and Anxiety Study

1. Age: 18-22  23-26  27-30  30+
2. Gender: Male  Female
3. Profession or Program of Study:
4. Do/Did you have pets, or do you regularly interact with pets? If so, what kind of pets?
5. In general, do you enjoy the company of animals? If so, what kind of animals (if different from above)?
6. Do you often interact with young children or babies? Yes  No
7. If yes, list a few of your most pleasurable interactions (e.g., carrying the child, tucking them into bed...):
8. Did you have stuffed toys when you were a child? Yes  No
9. Do you currently interact (e.g., play, cuddle, sleep with, etc.) with a stuffed toy? Yes  No
10. Please rate your comfort with the following "physical touch" situations:
   Being hugged by a loved-one: (not comfortable) 1 2 3 4 5 (very comfortable)
   Being hugged by a new acquaintance: (not comfortable) 1 2 3 4 5 (very comfortable)
   Shaking hands with a colleague: (not comfortable) 1 2 3 4 5 (very comfortable)
   Shaking hands with a stranger: (not comfortable) 1 2 3 4 5 (very comfortable)
   Patting a family member's back: (not comfortable) 1 2 3 4 5 (very comfortable)
   Patting a friend's back: (not comfortable) 1 2 3 4 5 (very comfortable)
   Are there other situations that you would like to mention?

B.1.2 Post-Experiment Questionnaire

Post-Experiment Questionnaire for Haptics and Anxiety Study

1. Please answer the following questions on the given scales.
   Please rate your emotional state while watching the first set of images:
   (Anxious) 1 2 3 4 5 (Relaxed)
   (Agitated) 1 2 3 4 5 (Calm)
   (Quiescent) 1 2 3 4 5 (Surprised)
   Please rate your emotional state while watching the second set of images:
   (Anxious) 1 2 3 4 5 (Relaxed)
   (Agitated) 1 2 3 4 5 (Calm)
   (Quiescent) 1 2 3 4 5 (Surprised)
   I found the haptic device comforting while watching the images: (Strongly Disagree) 1 2 3 4 5 (Strongly Agree)
   I found the actions of the haptic creature to be a distraction while watching the images: (Strongly Disagree) 1 2 3 4 5 (Strongly Agree)
   I feel that the haptic creature would be useful in reducing my anxiety in other situations: (Strongly Disagree) 1 2 3 4 5 (Strongly Agree)
2. Please comment on your reaction to the haptic creature:

Definitions:
anxious: troubled or uneasy in mind.
relaxed: at ease, free from constraint or tension.
agitated: excited, disturbed in mind.
calm: quiet, still, tranquil, serene.
quiescent: being at rest; quiet; still; inactive or motionless.
surprise: to come upon or discover suddenly and unexpectedly.

B.1.3 Sample Data

Figure B.1: Heart rate acceleration for a participant during the Pilot Experiment. Black lines delineate experiment stages.
Figure B.2: Heart rate for a participant during the Pilot Experiment. Black lines delineate experiment stages.
Figure B.3: Normalized skin conductance for a participant during the Pilot Experiment. Black lines delineate experiment stages.
Figure B.4: Skin conductance derivative for a participant during the Pilot Experiment. Black lines delineate experiment stages.
Figure B.5: Normalized EMG for a participant during the Pilot Experiment. Black lines delineate experiment stages.

B.1.4 Sample Comparisons

Figure B.6: Mean heart rate for participants during the Pilot Experiment.
Figure B.7: Standard deviation of normalized heart rates for participants during the Pilot Experiment.
Figure B.8: Mean normalized heart rate acceleration for participants during the Pilot Experiment.
Figure B.9: Mean normalized skin conductance for participants during the Pilot Experiment.
Figure B.10: Mean normalized derivative of skin conductance for participants during the Pilot Experiment.
Figure B.11: Mean normalized EMG for participants during the Pilot Experiment.
Figure B.12: Mean estimated arousal for participants during the Pilot Experiment.

B.1.5 Participant Consent Form

THE UNIVERSITY OF BRITISH COLUMBIA
Department of Computer Science
2366 Main Mall, Vancouver, B.C.,
Canada V6T 1Z4; tel: (604) 822-3061; fax: (604) 822-4231

(PARTICIPANT'S COPY CONSENT FORM)

Project Title: Physical user interfaces: Communication of information and affect (UBC Ethics #B01-0470)
Principal Investigator: Associate Professor K. MacLean, tel. 604-822-8169

The purpose of this study is to examine the role of haptic (touch sense) feedback on anxiety levels. You will be asked to wear external (i.e., non-invasive) sensors that collect some basic physiological information such as heart rate, respiration rate, some muscle activity, and perspiration. Please tell the experimenter if you find the sensor positioning uncomfortable, and adjustments will be made. You will be asked to answer questions in two questionnaires as part of the experiment. The study will be viewed by the experimenters in a separate room via a webcam. It will not be recorded. For this study, you will also be asked to view two slide-shows of pictures that you may find disturbing.

The outline of the study is as follows: You will first be asked to answer a questionnaire. You will then be connected to the bio-sensors. Then you will be shown a two-minute slide show of approximately ten pictures. Next you will be given a haptic creature that you will hold while watching another set of pictures shown in the same format as before. Finally, you will complete another questionnaire. If you are not sure about any instructions, do not hesitate to ask.

REIMBURSEMENT: $5 per  hour session
TIME COMMITMENT:  hour session
CONFIDENTIALITY: Your results will be confidential: you will not be identified by name in any study reports. Test results will be stored in a secure Computer Science account accessible only to the experimenters.

You understand that the experimenter will ANSWER ANY QUESTIONS you have about the instructions or the procedures of this study. After participating, the experimenter will answer any questions you have about this study. Your participation in this study is entirely voluntary and you may refuse to participate or withdraw from the study at any time without jeopardy. Your signature below indicates that you have received a copy of this consent form for your own records, and consent to participate in this study. If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Info Line in the UBC Office of Research Services at 604-822-8598.

B.2 Experiment 1

B.2.1 Post-Experiment Questionnaire

Creature Impression

I found the creature comfortable on my lap. (strongly disagree) 1 2 3 4 5 (strongly agree)
Did this impression change at all once the creature started moving?
What changes would you recommend to make the creature more comfortable?

Creature Activity

Describe your overall impression of the creature's activity.
What did you like the most about the creature's activity?
What did you like the least about the creature's activity?
Did you expect this sort of activity from the creature?
I was startled by the activation of the creature. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature's motion disturbing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the noise of the creature distracting. (strongly disagree) 1 2 3 4 5 (strongly agree)
How many distinct creature operating modes were you able to observe?
Please describe all the modes you were able to observe.
Which sequence did you find more pleasurable?
Overall Response

It was easy to recognize the creature mirroring my breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature mirroring my breathing comforting. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature mirroring my breathing disturbing. (strongly disagree) 1 2 3 4 5 (strongly agree)
The creature's breathing made me more aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
Was it evident that the creature was mirroring your breathing?
It was easy to recognize the creature mirroring my pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature mirroring my pulse comforting. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature mirroring my pulse disturbing. (strongly disagree) 1 2 3 4 5 (strongly agree)
The creature's pulse made me more aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
Was it evident that the creature was mirroring your pulse?
Describe your overall reaction to the creature mirroring your breathing rate and pulse.
Were you surprised at your breathing rate when you felt it in the creature?
Were you surprised at your heart rate when you felt it in the creature?

B.2.2 Data Tables

Table B.1: Table of results from the Experiment 1 questionnaire (1 = strongly disagree, 5 = strongly agree).

Responses (1 2 3 4 5 na)   Statement
6 2 2 1       It was easy to recognize the creature mirroring my breathing.
1 1 8         I found the creature mirroring my breathing comforting.
2 1 2 5       I found the creature mirroring my breathing disturbing.
2 2 2 4       The creature's breathing made me more aware of my own breathing.
7 1 1 1 1     It was easy to recognize the creature mirroring my pulse.
1 1 7         I found the creature mirroring my pulse comforting.
1 2 2 5       I found the creature mirroring my pulse disturbing.
5 2 1 2       The creature's pulse made me more aware of my own heart rate.
1 8 1         I found the creature comfortable on my lap.
3 1 3 3       I was startled by the activation of the creature.
3 4 3         I found the creature's motion disturbing.
4 5 1         I found the noise of the creature distracting.

Table B.2: Results of two-tailed unequal-variance t-tests between breath lengths for each subject between all stages. 'Y' indicates a significant difference between the two stages.

Condition tested (subjects 1-10, then Σ):
still - constant motion: 0.005, 0.475, <0.001, 0.226, 0.007, <0.001, 0.748, 0.014, <0.001, 0.145 | 6
still - mirroring: 0.002, 0.116, <0.001, 0.558, 0.033, <0.001, 0.631, <0.001, 0.011, 0.184 | 7
constant motion - mirroring: 0.049, 0.129, 0.027, 0.385, 0.417, 0.400, 0.451, 0.080, 0.176, 0.738 | 3
still - 2.5 s breaths: 0.050, 0.970, 0.121, 0.021, 0.368, 0.012, <0.001, 0.528, 0.018, 0.117 | 5
constant motion - 2.5 s breaths: <0.001, 0.727, 0.942, 0.018, 0.144, 0.167, <0.001, 0.020, 0.261, 0.005 | 5
mirroring - 2.5 s breaths: 0.069, 0.313, 0.598, 0.001, 0.157, 0.469, 0.014, 0.016, 0.534, 0.015 | 5
Table B.3: Results of two-tailed unequal-variance t-tests between series of interbeat intervals for each subject between all stages. 'Y' indicates a significant difference between the two stages.

Condition tested (subjects 1-10, then Σ):
still - constant motion: 0.314, <0.001, <0.001, 0.385, 0.078, <0.001, 0.120, 0.090, <0.001, 0.009 | 7
still - mirroring: <0.001, <0.001, <0.001, 0.565, 0.005, <0.001, 0.085, 0.009, <0.001, <0.001 | 9
constant motion - mirroring: <0.001, 0.800, <0.001, 0.148, 0.478, 0.185, 0.749, 0.218, 0.062, <0.001 | 5
still - 70 bpm: 0.004, 0.478, 0.688, <0.001, 0.712, 0.916, 0.228, 0.003, 0.014, <0.001 | 5
constant motion - 70 bpm: 0.010, 0.810, 0.017, <0.001, 0.583, 0.684, 0.716, 0.015, 0.806, <0.001 | 5
mirroring - 70 bpm: 0.223, 0.779, 0.059, <0.001, 0.390, 0.630, 0.779, 0.065, 0.492, <0.001 | 4

B.2.3 Sample Data

Figure B.13: Heart rate acceleration for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.14: Normalized heart rate acceleration for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.15: Heart rate for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.16: Normalized heart rate for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.17: Normalized heart rate standard deviation for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.18: Skin conductance response for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.19: Normalized skin conductance for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.20: Skin conductance derivative for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.21: Normalized skin conductance derivative for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.22: EMG for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.23: Normalized EMG for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.
Figure B.24: Breath lengths for a participant during Experiment 1. Black lines delineate experiment stages ii, iii, and iv as listed in Figure 4.9.

B.2.4 Sample Comparisons

Figure B.25: Mean breath lengths for participants during Experiment 1.
Figure B.26: Breath length standard deviation for participants during Experiment 1.
Figure B.27: Mean heart rate acceleration for participants during Experiment 1.
Figure B.28: Heart rate acceleration standard deviation for participants during Experiment 1.
Figure B.29: Mean skin conductance for participants during Experiment 1.
Figure B.30: Skin conductance standard deviation for participants during Experiment 1.
Figure B.31: Mean and standard deviation of breath lengths of participants during each stage of Experiment 1. Greek letters refer to within-subject mean differences. For each participant, α indicates a significant difference between the still and constant motion stages; β indicates a significant difference between the still and mirroring stages; γ indicates a significant difference between the constant motion and mirroring stages; δ indicates a significant difference between the still stage and constant motion at constant 2.5 s breaths; ε indicates a significant difference between the constant motion stage and constant 2.5 s breaths; ζ indicates a significant difference between the mirroring stage and constant 2.5 s breaths. The standard deviation of the constant motion stage is at or close to zero.
Figure B.32: Mean and standard deviation of heart rate for participants during Experiment 1. Greek letters refer to within-subject mean differences. For each participant, α indicates a significant difference between the still and constant motion stages; β indicates a significant difference between the still and mirroring stages; γ indicates a significant difference between the constant motion and mirroring stages; δ indicates a significant difference between the still stage and constant motion at 70 bpm; ε indicates a significant difference between the constant motion stage and constant motion at 70 bpm; ζ indicates a significant difference between the mirroring stage and constant motion at 70 bpm. The standard deviation of the constant motion stage is at or close to zero.

B.2.5 Participant Consent Form

Version 1.0 / August 10, 2009

THE UNIVERSITY OF BRITISH COLUMBIA
Department of Computer Science
2366 Main Mall, Vancouver, B.C., Canada V6T 1Z4; tel: (604) 822-3061; fax: (604) 822-4231

(PARTICIPANT'S COPY CONSENT FORM)

Project Title: Investigation of haptic-affect loop through the haptic creature (UBC Ethics #B01-0470)
Principal Investigators: Dr. Karon MacLean, Department of Computer Science, 604-822-8169; Dr. Elizabeth Croft, Department of Mechanical Engineering, 604-822-6614
Student Investigator: Joseph P. Hall III, Department of Mechanical Engineering, jphiii@interchange.ubc.ca

The purpose of this study is to examine your reaction to interaction through touch with a robotic pet. You will be asked to hold and touch a small robot that may gently move. You will be asked to wear external (i.e., non-invasive) sensors that collect some basic physiological information such as heart rate, respiration rate, some muscle activity, and perspiration. Please tell the experimenter if you find the sensors uncomfortable and adjustments will be made. You will be asked to answer questions in a questionnaire as part of the experiment. Parts of this experiment will be videotaped for later analysis. If you are unsure about any instructions, do not hesitate to ask.

TIME COMMITMENT: ½-1 hour session
CONFIDENTIALITY: Your results will be confidential: you will not be identified by name in any study reports. Test results will be stored in a secure computer account accessible only to the experimenters.

You understand that the experimenters will ANSWER ANY QUESTIONS you have about the instructions or the procedures of this study. After participating, the experimenter will answer any other questions you have about this study. Your participation in this study is entirely voluntary and you may refuse to participate or withdraw from the study at any time without jeopardy. Your signature below indicates that you have received a copy of this consent form for your own records, and consent to participate in this study.
If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Info Line in the UBC Office of Research Services at 604-822-8598.

B.3 Experiment 2

B.3.1 Post-Experiment Questionnaire

When asked to mirror the creature
I was able to easily mirror the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was comfortable with the creature on my lap. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the noise of the creature distracting. (strongly disagree) 1 2 3 4 5 (strongly agree)

While sitting still with the creature
I was aware of the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I noticed changes in the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I noticed changes in the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was comfortable with the creature on my lap. (strongly disagree) 1 2 3 4 5 (strongly agree)

During the reading assignment
I was aware of the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I noticed changes in the creature's breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I noticed changes in the creature's pulse. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
I was comfortable with the creature on my lap. (strongly disagree) 1 2 3 4 5 (strongly agree)
I found the creature's motion distracting. (strongly disagree) 1 2 3 4 5 (strongly agree)

In general during the experiment
The creature made me more aware of my own breathing. (strongly disagree) 1 2 3 4 5 (strongly agree)
The creature made me more aware of my own heart rate. (strongly disagree) 1 2 3 4 5 (strongly agree)
I enjoyed interacting with the creature. (strongly disagree) 1 2 3 4 5 (strongly agree)

B.3.2 Data Tables

Table B.4: Table of results from the Experiment 2 questionnaire (1 = strongly disagree, 5 = strongly agree), n = 10.

Responses (1 2 3 4 5)   Statement
5 2 2 1       It was easy to recognize the creature mirroring my breathing.
1 1           I found the creature mirroring my breathing comforting (if noticed).
2 1 2         I found the creature mirroring my breathing disturbing (if noticed).
2 2 2 4       The creature's breathing made me more aware of my own breathing.
6 1 1 1 1     It was easy to recognize the creature mirroring my pulse.
1 1 8         I found the creature mirroring my pulse comforting.
1 2 2 5       I found the creature mirroring my pulse disturbing.
5 2 1 2       The creature's pulse made me more aware of my own heart rate.
1 8 1         I found the creature comfortable on my lap.
3 1 3 3       I was startled by the activation of the creature.
3 4 3         I found the creature's motion disturbing.
3 4 3         I found the noise of the creature distracting.
Table B.5: Results of two-tailed unequal-variance t-tests between series of interbeat intervals for each subject between all stages. 'Y' indicates a significant difference between the two stages.

Condition tested (subjects 1-9):
baseline - training: 0.108, 0.014, 0.000, 0.000, 0.000, 0.000, 0.000, 0.008, 0.018
baseline - task: 0.466, 0.001, 0.001, 0.012, 0.000, 0.103, 0.004, 0.000, 0.023
baseline - no task: 0.000, 0.040, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000
baseline - 70 bpm: 0.723, 0.146, 0.205, 0.778, 0.000, 0.965, 0.001, 0.571, 0.158
task - 70 bpm: 0.732, 0.057, 0.794, 0.421, 0.032, 0.581, 0.012, 0.415, 0.297
no task - 70 bpm: 0.840, 0.226, 0.042, 0.553, 0.221, 0.899, 0.003, 0.864, 0.436

Table B.6: Questionnaire results from the Experiment 2 post-experiment survey (1 = strongly disagree, 5 = strongly agree).

When asked to mirror the creature (responses 1 2 3 4 5):
1 4 4       I was able to easily mirror the creature's breathing
2 1 6       I was aware of the creature's pulse
5 4         I was comfortable with the creature on my lap
9           I was aware of my own breathing
6 3         I was aware of my own heart rate
4 4 1       I found the noise of the creature distracting

While sitting with the active creature (responses 1 2 3 4 5):
2 7         I was aware of the creature's breathing
1 1 1 6     I was aware of the creature's pulse
1 1 4 3     I noticed changes in the creature's breathing
6 1 2       I noticed changes in the creature's pulse
1 2 4 2     I was aware of my own breathing
3 4 2       I was aware of my own heart rate
1 4 4       I was comfortable with the creature on my lap

During the reading task (responses 1 2 3 4 5):
1 1 5 2     I was aware of the creature's breathing
1 4 3 1     I was aware of the creature's pulse
1 4 1 1 2   I noticed changes in the creature's breathing
4 4 1       I noticed changes in the creature's pulse
4 3 2       I was aware of my own breathing
6 2 1       I was aware of my own heart rate
1 2 5 1     I was comfortable with the creature on my lap
1 3 1 3 1   I found the creature's motion distracting

Overall (responses 1 2 3 4 5):
1 4 4       The creature made me more aware of my breathing
1 6 2       The creature made me more aware of my heart rate
2 7         I enjoyed interacting with the creature

B.3.3 Sample Data

Figure B.33: Heart rate acceleration for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.
Figure B.34: Normalized heart rate acceleration for a participant during Experiment 2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.
Black linesdelineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.0 200 400 600 800 1000 120000.10.20.30.40.50.60.70.80.91normalized skin conductancetime [s]normalized skin conductance for participant 1Figure B.39: Normalized skin conductance for a participant during Experiment 2. Blacklines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.173B.3. Experiment 20 200 400 600 800 1000 1200-6-4-20246x 10-3derivative of skin conductance [siemens/s]time [s]derivative of skin conductance for participant 1Figure B.40: Skin conductance derivative for a participant during Experiment 2. Blacklines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.0 200 400 600 800 1000 1200-0.6-0.4-0.200.20.40.60.8normalized skin conductance derivativetime [s]normalized skin conductance derivative for participant 1Figure B.41: Normalized skin conductance derivative for a participant during Experiment2. Black lines delineate experiment stages i, ii, iii, and iv as listed in Figure 4.14.174B.3. Experiment 20 200 400 600 800 1000 1200480500520540560580600620640emg [mV]time [s]emg for participant 1Figure B.42: EMG for a participant during Experiment 2. Black lines delineate experimentstages i, ii, iii, and iv as listed in Figure 4.14.0 200 400 600 800 1000 1200-0.05-0.04-0.03-0.02-0.0100.01normalized emgtime [s]normalized emg for participant 1Figure B.43: Normalized EMG for a participant during Experiment 2. Black lines delineateexperiment stages i, ii, iii, and iv as listed in Figure 4.14.175B.3. Experiment 20 200 400 600 800 1000 120028.52929.53030.531time [s]skin temperature [ºC]skin temperature for participant 1Figure B.44: Skin temperature for a participant during Experiment 2. Black lines delineateexperiment stages i, ii, iii, and iv as listed in Figure 4.14.0 200 400 600 800 1000 1200024681012breath lengths for participant 1breath length [s]time [s]Figure B.45: Breath lengths for a participant during Experiment 2. Black lines delineateexperiment stages i, ii, iii, and iv as listed in Figure 4.14.176B.3. Experiment 2B.3.4 Sample Comparisonsstandard deviation [s]participantStandard deviation of breath lengths for participants during experiment1 2 3 4 5 6 7 8 900.511.522.5baselinewhen asked to mirror creaturewhen sitting calmlywhen performing taskFigure B.46: Standard deviation of breath lengths for all participants during Experiment2.177B.3. Experiment 21 2 3 4 5 6 7 8 90510152025participantmean breath length [s]mean breath length for participants during experiment 2  slow trainingfast trainingslow no taskfast no taskslow task fast taskFigure B.47: Mean breath length for all participants during Experiment 2.1 2 3 4 5 6 7 8 955606570758085heart rate for all participantsparticipantmean heart rate [bpm]  calm imagesdisturbing images w/o creaturecalm imagesdisturbing images with creatureFigure B.48: Mean heart rate for participants during Experiment 2.178B.3. Experiment 21 2 3 4 5 6 7 8 90123456789participanthr sd [bpm]hr sd for all participants during experiment 2  baseline training task no taskFigure B.49: Mean heart rate standard deviation for participants during Experiment 2.HF % of heart rate variability during experimentparticipanthf %1 2 3 4 5 6 7 8 9010203040506070baselinewhen asked to mirror creaturewhen sitting calmlywhen performing taskFigure B.50: High frequency component of heart rate variability during Experiment 2 forall participants for all stages.179B.3. 
Figure B.51: Mean skin conductance for participants during Experiment 2.

Figure B.52: Mean derivative of skin conductance for participants during Experiment 2.

Figure B.53: Mean EMG for participants during Experiment 2.

Figure B.54: Mean skin temperature for participants during Experiment 2.

B.3.5 Participant Consent Form

Version 1.1 / December 2, 2009

THE UNIVERSITY OF BRITISH COLUMBIA, Department of Computer Science, 2366 Main Mall, Vancouver, B.C., Canada V6T 1Z4, tel: (604) 822-3061, fax: (604) 822-4231

(PARTICIPANT'S COPY CONSENT FORM)

Project Title: Investigation of haptic-affect loop through the haptic creature (UBC Ethics #B01-0470)

Principal Investigators: Dr. Karon MacLean, Department of Computer Science, 604-822-8169; Dr. Elizabeth Croft, Department of Mechanical Engineering, 604-822-6614

Student Investigator: Joseph P. Hall III, Department of Mechanical Engineering, jphiii@interchange.ubc.ca

The purpose of this study is to examine your reaction to interaction through touch with a robotic pet. You will be asked to hold and touch a small robot that may gently move. You will be asked to wear external (i.e. non-invasive) sensors that collect some basic physiological information such as heart rate, respiration rate, some muscle activity, and perspiration. Please tell the experimenter if you find the sensors uncomfortable and adjustments will be made. You will be asked to answer questions in a questionnaire as part of the experiment. If you are unsure about any instructions, do not hesitate to ask.

TIME COMMITMENT: 1/2-1 hour session

CONFIDENTIALITY: Your results will be confidential: you will not be identified by name in any study reports. Test results will be stored in a secure computer account accessible only to the experimenters.

You understand that the experimenters will ANSWER ANY QUESTIONS you have about the instructions or the procedures of this study. After participating, the experimenter will answer any other questions you have about this study. Your participation in this study is entirely voluntary and you may refuse to participate or withdraw from the study at any time without jeopardy. Your signature below indicates that you have received a copy of this consent form for your own records, and consent to participate in this study. If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Info Line in the UBC Office of Research Services at 604-822-8598.

B.4 Experiment 3

B.4.1 Sample Data

Figure B.55: Heart rate acceleration for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.
Figure B.56: Normalized heart rate acceleration for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.57: Heart rate for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.58: Normalized heart rate for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.59: Normalized heart rate standard deviation for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.60: Skin conductance response for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.61: Normalized skin conductance for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.62: Skin conductance derivative for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.63: Normalized skin conductance derivative for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.64: EMG for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.65: Normalized EMG for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.66: Skin temperature for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.

Figure B.67: Breath lengths for a participant during Experiment 3. Black lines delineate experiment stages as listed in Figure 4.22.
B.4.2 Sample Comparisons

Figure B.68: Mean heart rate standard deviation for participants during Experiment 3.

Figure B.69: Mean heart rate pnn50 for participants during Experiment 3.

Figure B.70: Mean skin conductance for participants during Experiment 3.

Figure B.71: Mean skin conductance standard deviation for participants during Experiment 3.

Figure B.72: Mean skin temperature for participants during Experiment 3.

Figure B.73: Mean skin temperature standard deviation for participants during Experiment 3.

Figure B.74: Mean breath length for participants during Experiment 3.

Figure B.75: Heart rate vlf% for participants during Experiment 3.

Figure B.76: Mean derivative of skin conductance standard deviation for participants during Experiment 3.

Figure B.77: Mean breath length standard deviation for participants during Experiment 3.
Figure B.78: Mean heart rate for participants during Experiment 3.

Figure B.79: Mean heart rate rms standard deviation for participants during Experiment 3.

Figure B.80: Mean derivative of skin conductance for participants during Experiment 3.

Figure B.81: Summary of comparisons made during Experiment 3. The stages are (i) introduction to creature (~90 s), (ii) activity without creature (~180 s), (iii) activity with creature inactive (~240 s), (iv) activity with creature "slow" (~240 s), and (v) activity with creature "fast" (~240 s). Comparisons 1-16 contrast individual stages and the groupings {ii-v} (activity), {iii-v} (creature presence), and {iv, v} (creature motion).

Table B.7: Summary of results from Experiment 3. Investigated columns in green, significant results in bold; see Figure B.81 for the comparisons. The p-values for each measure over comparisons 1-16 are:

measure               1     2     3     4     5     6     7     8     9     10    11    12    13    14    15    16
ibi mean              0.263 0.181 0.200 0.160 0.178 0.174 0.175 0.513 0.635 0.377 0.476 0.458 0.944 0.578 0.746 0.406
ibi sd                0.000 0.016 0.465 0.543 0.320 0.608 0.658 0.531 0.272 0.481 0.242 0.311 0.242 0.524 0.295 0.828
hr mean               0.308 0.168 0.185 0.157 0.168 0.156 0.161 0.308 0.428 0.366 0.312 0.354 0.893 0.805 0.912 0.642
hr sd                 0.000 0.003 0.004 0.002 0.004 0.005 0.002 0.916 0.156 0.893 0.037 0.091 0.070 0.985 0.097 0.275
hr skewness           0.859 0.349 0.703 0.575 0.533 0.656 0.753 0.232 0.556 0.494 0.529 0.631 0.293 0.853 0.232 0.725
hr rms ssd            0.001 0.029 0.371 0.512 0.229 0.473 0.531 0.830 0.575 0.684 0.529 0.596 0.500 0.662 0.540 0.988
hr pnn50              0.013 0.072 0.025 0.014 0.009 0.017 0.013 0.820 0.784 0.552 0.803 0.643 0.549 0.368 0.409 0.750
hr vlf %              0.019 0.064 0.165 0.037 0.004 0.001 0.003 0.267 0.648 0.943 0.001 0.003 0.630 0.247 0.053 0.398
hr lf %               0.098 0.142 0.232 0.399 0.268 0.217 0.275 0.932 0.134 0.275 0.115 0.182 0.091 0.222 0.139 0.553
hr mf %               0.051 0.185 0.941 0.624 0.634 0.894 0.706 0.228 0.246 0.324 0.276 0.303 0.295 0.361 0.347 0.425
hr hf %               0.015 0.010 0.591 0.720 0.623 0.987 0.900 0.601 0.618 0.460 0.501 0.497 0.405 0.389 0.390 0.408
scr norm mean         0.002 0.000 0.000 0.000 0.000 0.000 0.000 0.874 0.842 0.845 0.967 0.993 0.577 0.890 0.839 0.434
scr norm sd           0.057 0.286 0.034 0.695 0.584 0.456 0.357 0.271 0.451 0.156 0.141 0.162 0.242 0.476 0.738 0.098
dscr norm mean        0.195 0.067 0.336 0.198 0.155 0.159 0.245 0.443 0.917 0.899 0.816 0.984 0.208 0.359 0.240 0.657
dscr norm sd          0.434 0.028 0.006 0.012 0.010 0.005 0.006 0.270 0.011 0.023 0.011 0.008 0.095 0.214 0.101 0.553
emg norm mean         0.420 0.630 0.424 0.512 0.739 0.471 0.464 0.207 0.182 0.252 0.195 0.214 0.595 0.750 0.669 0.417
emg norm sd           0.053 0.011 0.009 0.007 0.067 0.014 0.010 0.148 0.047 0.052 0.242 0.078 0.406 0.283 0.745 0.916
skin temp. mean       0.069 0.001 0.001 0.000 0.001 0.000 0.000 0.002 0.002 0.001 0.001 0.001 0.032 0.008 0.011 0.870
skin temp. sd         0.019 0.814 0.890 0.722 0.000 0.002 0.016 0.056 0.080 0.344 0.032 0.308 0.856 0.763 0.091 0.719
resp rate mean        0.260 0.239 0.132 0.081 0.091 0.075 0.100 0.968 0.974 0.829 0.907 0.898 0.993 0.865 0.930 0.369
resp rate sd          0.039 0.021 0.029 0.009 0.450 0.171 0.017 0.664 0.861 0.337 0.099 0.778 0.650 0.298 0.520 0.331
breath lengths mean   0.261 0.130 0.096 0.101 0.070 0.092 0.092 0.886 0.597 0.222 0.743 0.632 0.311 0.335 0.414 0.496
breath lengths sd     0.255 0.251 0.202 0.118 0.155 0.129 0.121 0.096 0.185 0.099 0.114 0.110 0.443 0.116 0.182 0.339
scr mean              0.859 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.000 0.002 0.002 0.001 0.875
scr sd                0.740 0.933 0.516 0.594 0.000 0.002 0.045 0.994 0.433 0.542 0.000 0.017 0.240 0.468 0.005 0.138
dscr mean             0.029 0.043 0.054 0.034 0.029 0.032 0.038 0.672 0.937 0.659 0.716 0.773 0.720 0.995 0.847 0.560
dscr sd               0.020 0.680 0.115 0.352 0.496 0.192 0.161 0.003 0.001 0.000 0.000 0.000 0.113 0.449 0.154 0.356
B.4.3 Participant Consent Form

Version 1.2 - December 3, 2009

THE UNIVERSITY OF BRITISH COLUMBIA, Department of Computer Science, 2366 Main Mall, Vancouver, BC, Canada V6T 1Z4, Phone: (604) 822-3061, Fax: (604) 822-4231

PARTICIPANT & PARENT INFORMATION AND CONSENT FORM

Tamer: Touch-guided Anxiety Management via Engagement with a Robot Pet

Principal Investigator: Associate Professor Karon MacLean, PhD, Department of Computer Science, University of British Columbia, (604)-822-8169

Sponsor: Name(s) of industry sponsor or granting agency

INTRODUCTION
You (or your child) are being invited to take part in this research study because we feel that your participation and feedback will greatly assist us in developing anxiety-reducing robotic devices.

YOUR PARTICIPATION IS VOLUNTARY
Your participation is entirely voluntary, so it is up to you to decide whether or not to take part in this study. Before you decide, it is important for you to understand what the research involves. This consent form will tell you about the study, why the research is being done, what will happen to you during the study and the possible benefits, risks, and discomforts. If you wish to participate, you will be asked to sign this form. If you do decide to take part in this study, you are still free to withdraw at any time and without giving any reasons for your decision. If you do not wish to participate, you do not have to provide any reason for your decision not to participate. Please take time to read the following information carefully.

WHO IS CONDUCTING THE STUDY?
The study is being conducted/funded by the National Science and Engineering Research Council of Canada (NSERC). The Principal Investigator has received funds from this agency to compensate subjects for participating in this study. You are entitled to request any details concerning this compensation from the Principal Investigator.

BACKGROUND
This project's goal is to advance a novel tool and technique to help young children attain independent anxiety regulation skills. Engagement will be utilized to give children access to cognitive training by interacting with an expressive animatronic pet. This robot will be programmed to respond physically to a combination of a child's pattern of touch and biometrically sensed emotional state in a way that rewards patience and progress.

WHAT IS THE PURPOSE OF THE STUDY?
The purpose of this study is to examine the role of haptic (touch sense) feedback on anxiety levels. This study investigates your reaction to a small robotic creature that is touch-sensitive and can breathe, purr and stiffen its ears.

WHO CAN PARTICIPATE IN THE STUDY?
This study is open to children from ages 5-17, particularly those who may have been diagnosed with mild anxiety or learning disorders, as well as adult subjects between the ages of 17-50. We expect to enroll approximately 20 children and 10 adults in this experiment.

WHAT DOES THE STUDY INVOLVE?
You will be asked to wear external (i.e., non-invasive) sensors that collect some basic physiological information such as the heart rate, respiration rate, some muscle activity, and perspiration. We request that you tell the experimenter if you find the sensor positioning uncomfortable, and adjustments will be made. You will be invited to answer questions in two questionnaires as part of the experiment. The study will be viewed by the experimenters in a separate room via a webcam.
It will not be recorded. The time commitment required for this session will range from one to three hours.

Image 1 shows a photo of a child attached to the physiological sensors that will be used during these experiments. There are four primary sensors that will be used during these experiments:
a. Respiration Rate: A Velcro band is worn around the abdomen outside of the clothing. Expands and contracts with the abdomen to measure respiration rate, waveform, and amplitude.
b. Blood Volume Pulse: also known as a photoplethysmograph (PPG) sensor. A small black box attaches to the distal end of a finger with a Velcro strap. Measures heart rate.
c. Skin Conductance: Two electrodes attach to Velcro straps, each in turn attached to the distal end of two fingers on the same hand. Measures galvanic skin response (GSR), the electrical resistance of the skin.
d. ECG: Three electrodes attach to the upper right and left sides of the chest and the lower abdomen. Measures heart electrical activity.

Image 1: User demonstrating possible physiological sensors: respiration rate (a), blood volume pulse (b), skin conductance (c), ECG (d).

IF YOU DECIDE TO JOIN THIS STUDY: SPECIFIC PROCEDURES
After being fitted with the sensors, you will be invited to hold the creature in your lap. The creature may move during this experiment. You will then complete a questionnaire about your interaction during this experiment. If you are not sure about any instructions, do not hesitate to ask.

WHAT ARE THE POSSIBLE HARMS AND SIDE EFFECTS OF PARTICIPATING?
There are no expected harms or side effects from participating in this experiment. Nothing will be done to impose stress or anxiety on you. The biosensors that are worn are non-intrusive, and FDA-approved safe for medical uses.

WHAT ARE THE BENEFITS OF PARTICIPATING IN THIS STUDY?
No one knows whether or not you will benefit from this study. There may or may not be direct benefits to you from taking part in this study. We hope that the information learned from this study can be used in the future to benefit others.

WHAT HAPPENS IF I DECIDE TO WITHDRAW MY CONSENT TO PARTICIPATE?
Your participation in this research is entirely voluntary. You may withdraw from this study at any time. If you choose to enter the study and then withdraw at a later time, all data collected about you during your enrolment in the study will be retained for analysis. By law, this data cannot be destroyed.

WHAT WILL THE STUDY COST ME?
You are not expected to incur any personal expenses as a result of your participation in this study. You will be compensated $5 for each 1/2-hour study session.

WILL MY TAKING PART IN THIS STUDY BE KEPT CONFIDENTIAL?
Your confidentiality will be respected. No information that discloses your identity will be released or published without your specific consent to the disclosure. Research records identifying you may be inspected in the presence of the Investigator or his or her designate by representatives of Health Canada and the UBC Research Ethics Board for the purpose of monitoring the research. However, no records which identify you by name or initials will be allowed to leave the Investigators' offices.

WHO DO I CONTACT IF I HAVE QUESTIONS ABOUT THE STUDY DURING MY PARTICIPATION?
If you have any questions or desire further information about this study before or during participation, you can contact Karon Maclean at (604)-822-8169.
WHO DO I CONTACT IF I HAVE ANY QUESTIONS OR CONCERNS ABOUT MY RIGHTS AS A SUBJECT DURING THE STUDY?
If you have any concerns about your rights as a research subject and/or your experiences while participating in this study, contact the Research Subject Information Line in the University of British Columbia Office of Research Services by e-mail at RSIL@ors.ubc.ca or by phone at 604-822-8598.

B.4.4 Participant Assent Form

Version 1.0 - September 22, 2009

THE UNIVERSITY OF BRITISH COLUMBIA, Department of Computer Science, 2366 Main Mall, Vancouver, BC, Canada V6T 1Z4, Phone: (604) 822-3061, Fax: (604) 822-4231

SUBJECT ASSENT FORM

Tamer: Touch-guided Anxiety Management via Engagement with a Robot Pet

INVITATION
I am being invited to be part of a research study. A research study tries to find better treatments to help children like me. It is up to me if I want to be in this study. No one will make me be part of the study. Even if I agree now to be part of the study, I can change my mind later. No one will be mad at me if I choose not to be part of this study.

WHY ARE WE DOING THIS STUDY?
We are doing this study to investigate how a robot may help reduce my anxiety levels. We want to see my reactions to a robot that purrs, breathes, and moves on my lap.

WHAT WILL HAPPEN IN THIS STUDY?
During this experiment you will be asked to wear physiological sensors as shown in Image 1 on your hands and chest. These will allow us to record your heart rate, pulse, breathing rate, and skin conductance (how sweaty you are). If at any time these are uncomfortable please let us know, and we will adjust them for you. We are investigating your reaction to a small robotic creature that is touch-sensitive and can breathe, purr and stiffen its ears. You will be asked to hold the creature in your lap while you complete some schoolwork. The creature may move during this experiment.

WHO IS DOING THIS STUDY?
Karon Maclean and other investigators from the UBC Computer Science Department will be doing this study. They will answer any questions I have about this study. I can also call them at 604-822-8169, if I am having any problems or if there is an emergency and I cannot talk to my parents.

CAN ANYTHING BAD HAPPEN TO ME?
There is nothing in this study that should cause anything bad to happen to me.

Image 1: User demonstrating possible physiological sensors: respiration rate (a), blood volume pulse (b), skin conductance (c), ECG (d).

WHO WILL KNOW I AM IN THE STUDY?
Only the people who are involved in the study will know I am in it. When the study is finished, the investigators will write a report about what was learned. This report will not say my name or that I was in the study. My parents and I do not have to tell anyone I am in the study if we don't want to.

WHEN DO I HAVE TO DECIDE?
I have as much time as I want to decide to be part of the study. I have also been asked to discuss my decision with my parents. If I put my name at the end of this form, I agree to be in the study.

SUBJECT'S ASSENT TO PARTICIPATE IN RESEARCH
I have had the opportunity to read this consent form, to ask questions about my participation in this research, and to discuss my participation with my parents/guardians. All my questions have been answered. I understand that I may withdraw from this research at any time, and that this will not interfere with the availability to me of other health care. I have received a copy of this consent form.
I assent to participate in this study.

PRINTED NAME OF SUBJECT            SIGNATURE            DATE

B.5 Experiment Equipment

Figure B.82 shows the command and control scheme used during the experiments. During the experiments the host computer was an IBM Thinkpad T400P with an Intel Core 2 Duo T9400 processor and 2 gigabytes of RAM, running Windows XP. Communication between the sensors and the host computer was by USB. Touch data and hardware state were communicated from the Creature to the host computer by Bluetooth radio, and creature commands from the host computer to the Creature by the XBee wireless radio system.

Figure B.82: Diagram of the TAMER command and control scheme (user, physiological sensors, physio software, Creature, and creature display; data flows are physio data, touch data, hardware state, and creature commands). Arrows represent communications links between system components; dashed arrows identify the connections that are typically wireless.

Appendix C

Schematics

C.1 Radio Base Station Schematics

Figure C.1: The radio base station board.

Figure C.2: The radio base station schematic.
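The radio base station sits between the host PC and the Creature's XBee link: commands arrive from the host over the FT232RL USB-serial interface and are sent out over the XBee radio (the Creature's own data returns over a separate Bluetooth link, as described in Section B.5). A minimal sketch of this kind of serial-to-radio bridge, in the same Arduino style as Listing D.1, is given below; it is an illustration only, not the firmware that ran on this board, and the SoftwareSerial pin assignment is an assumption made for the example.

// Illustrative USB-serial to XBee bridge of the kind the base station performs.
// Not the actual base-station firmware; the SoftwareSerial pins (2, 3) are
// assumptions made for this example.
#include <SoftwareSerial.h>

SoftwareSerial xbee(2, 3);  // RX, TX (hypothetical pin assignment)

void setup() {
  Serial.begin(9600);  // FT232RL link to the host PC
  xbee.begin(9600);    // XBee link toward the Creature
}

void loop() {
  // Forward creature commands from the host out over the radio
  if (Serial.available() > 0) {
    int c = Serial.read();
    xbee.write((uint8_t)c);
  }
  // Pass anything received over the radio back up to the host
  if (xbee.available() > 0) {
    int c = xbee.read();
    Serial.write((uint8_t)c);
  }
}

The status LEDs, bar-graph display, and MAX7219 driver visible in the schematic would be driven in addition to this basic relaying behaviour.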
Part (Value): Device, Form Factor, Source Part No.

C1, C10, C11, C12, C13, C14 (100n): Ceramic Capacitor, .1" Through-hole, Digikey PCC1828CT-ND
C16 (10uF): Electrolytic Capacitor, Radial, Digikey P975-ND
C2, C3 (22p): Ceramic Capacitor, .1" Through-hole, Digikey 445-4763-ND
C4, C5 (100n): Ceramic Capacitor, .1" Through-hole, Digikey PCC1828CT-ND
C6, C7 (100u): Electrolytic Capacitor, Radial, Digikey P12924-ND
C8 (100n): Electrolytic Capacitor, Radial, Digikey PCC1828CT-ND
C9 (100n): Electrolytic Capacitor, .1" Through-hole, Digikey PCC1828CT-ND
D1: DIODE-1N4001, SparkFun, Digikey 1N4001FSCT-ND
F1 (500mA): Resettable Fuse, SMD, Digikey MF-MSMF030-2CT-ND
IC1: ATMEGA8, DIL28-3, SparkFun
IC3: FT232RL USB to Serial Converter, SSOP28DB, Digikey 768-1007-1-ND
IC4: MC33269D-5.0, DPACK
IC5: LM358D Low Dropout Op Amp, SO08, Digikey LM358DR2GOSCT-ND
J1: Power Jack, SparkFun
JP1, JP2: Front Header Pins, Digikey
L (WHT): Indicator Light, 1206 SMD, Digikey 160-1737-1-ND
L0 (YEL): LED, 5MM Radial, Digikey 365-1190-ND
L1 (WHT): LED, 5MM Radial, Digikey 67-1695-ND
LASSOC (GRN): LED, 5MM Radial, Digikey C503B-GCN-CY0C0791-ND
LED1, LED2, LED3: LED Bar Graph, Digikey 160-1068-ND
LPWR (ORG): Power Light, 1206 SMD, Digikey 350-2049-1-ND
LRGB: Tricolor LED, 5MM Radial
LRSSI (BLUE): LED, 5MM Radial, Digikey C503B-BAN-CY0C0461-ND
LRX, LTX (BLUE): LED-1206, 1206 SMD, Digikey
Q1: BC547B, TO92
Q2 (16MHz): Crystal Oscillator, HC49/S
R1 (10 kOhm): Resistor, AXIAL-0.3, Digikey P10.0KCACT-ND
R10, R11 (10 kOhm): Resistor, AXIAL-0.3, Digikey P10.0KCACT-ND
R12, R13, R14 (180 Ohm): Resistor, AXIAL-0.3, Digikey P180CACT-ND
R15 (68 Ohm): Resistor, AXIAL-0.3, Digikey
R16 (56 Ohm): Resistor, AXIAL-0.3, Digikey
R17 (15 kOhm): Resistor, AXIAL-0.3, Digikey P15.0KCACT-ND
R18 (10 kOhm): Resistor, AXIAL-0.3, Digikey P10.0KCACT-ND
R19, R2, R20 (100 Ohm): Resistor, AXIAL-0.3, Digikey P100CACT-ND
R21 (1 kOhm): Resistor, AXIAL-0.3, Digikey P1.00KCACT-ND
R3 (28 kOhm): Resistor, AXIAL-0.3, Digikey P28.0KCACT-ND
R4, R5 (180 Ohm): Resistor, AXIAL-0.3, Digikey P180CACT-ND
R6 (100 Ohm): Resistor, AXIAL-0.3, Digikey P100CACT-ND
R7 (180 Ohm): Resistor, AXIAL-0.3, Digikey P180CACT-ND
R8, R9 (1 kOhm): Resistor, AXIAL-0.3, Digikey P1.00KCACT-ND
RESET-EN: SJ jumper
S1: 6mm Tactile Switch, 6mm, Digikey SW793-ND
T1 (NDT2955): PMOS, SOT223
U$5: MAX7219CNG, DIL24-3
U$6: LTA-1000G
X1: USBPTH, USB-B-PTH
XB1: XBee Radio, Digikey XB24-AWI-001-ND
XBEE CSEL0, XBEE CSEL0S, XBEE CSEL1, XBEE CSEL1S: XBee RX/TX Jumper Pins, SparkFun
XBEE RESET1, XBEE RESET2: XBee Reset Jumper Pins, SparkFun

C.2 Creature Board Schematics

Figure C.3: The Creature board board.

Figure C.4: The Creature board schematic.

Part (Value): Name, Number

1W R (4.7k): RES 4.70K OHM .25W 1% 1206 SMD, RHM4.70KFRCT-ND
3VREG: IC LDO REG W/SD 3.3V SOT223-3, LT1129CST-3.3#PBF-ND
5VREG: IC LDO REG W/SD 5V SOT223-3, LT1129CST-5#PBF-ND
Bluetooth: Bluetooth SMD Module - Bluegiga, WRL-08771
C C1 (.1uF): CAP .10UF 50V CERAMIC X7R 10%, BC1084CT-ND
C C2 (.01uF): CAP .01UF 50V 10% CER RADIAL, 399-4148-ND
C C3 (10uF): CAP ELECT 10UF 25V KS RADIAL, P975-ND
C C4 (.01uF): CAP .01UF 50V 10% CER RADIAL, 399-4148-ND
C CA (10uF): CAP ELECT 10UF 25V KS RADIAL, P975-ND
C Q1: TRANS PNP PWR GP 7A 50V TO220AB, 2N6109GOS-ND
C R1 (1.4k): RES 1.40K OHM 1/4W 1% 1206 SMD, RHM1.40KFCT-ND
C R2 (150): RES 150K OHM 1/4W 1% 1206 SMD, RHM150KFRCT-ND
C R3 (68k): RES 68.0K OHM 1/4W 1% 1206 SMD, RHM68.0KFRCT-ND
C R4 (22k): RES 22.0K OHM 1/4W 1% 1206 SMD, RHM22.0KFRCT-ND
C RS (138): RES 137 OHM 1/4W 1% 1206 SMD, RHM137FCT-ND
C1, C10, C11, C12, C13, C14 (.1uF): CAP .10UF 50V CERAMIC X7R 10%, BC1084CT-ND
C2 (10uF): CAP ELECT 10UF 25V KS RADIAL, P975-ND
C21, C22 (.1uF): CAP .10UF 50V CERAMIC X7R 10%, BC1084CT-ND
C3, C4, C5 (10uF): CAP ELECT 10UF 25V KS RADIAL, P975-ND
C6, C7 (.1uF): CAP .1UF 25V CERAMIC X7R 0805, PCC1828CT-ND
C8, C9 (.1uF): CAP .10UF 50V CERAMIC X7R 10%, BC1084CT-ND
D1: DIODE GEN PURPOSE 50V 1A DO41, 1N4001FSCT-ND
DIGIPOT0 (100k): IC DGTL POT DUAL 256-TAP 14TSSOP, MAX5479EUD+-ND
DIGIPOT1, DIGIPOT2 (100k): IC DGTL POT DUAL 256-TAP 14TSSOP, MAX5479EUD+-ND
HB-STPR0, HB-STPR1: IC QUAD HALF-H DRVR 16-DIP, 296-9518-5-ND
HEAT-T0, HEAT-T1, HEAT-T2, HEAT-T3: MOSFET P-CH 12V 8.9A 8-SOIC, IRF7433PBFCT-ND
IC1: IC BATT FASTCHRG NICD/NIMH 16SOIC, MAX712CSE+-ND
LED1, LED2 (YEL/GRN): LED ALINGAP YW/GN CLEAR 1206 SMD, 350-2052-1-ND
LED3, LED4, LED5 (GRN): LED ALINGAP GREEN CLEAR 1206 SMD, 350-2053-1-ND
LED6 (WHT): LED WHITE YELLOW 260MCD 1206, 160-1737-1-ND
LED7 (GRN): LED ALINGAP GREEN CLEAR 1206 SMD, 350-2053-1-ND
LED8 (BLUE): LED INGAN BLUE CLEAR 1206 SMD, 350-2055-1-ND
LEDH0, LEDH1, LEDH2, LEDH3 (RED/ORG): LED ALINGAP RD/OR CLEAR 1206 SMD, 350-2048-1-ND
LEDRXB, LEDTXB (BLUE): LED INGAN BLUE CLEAR 1206 SMD, 350-2055-1-ND
MUX0, MUX1, MUX2, MUX3: IC MUX/DEMUX 1X16 24SOIC, 568-4591-5-ND
MUXR0, MUXR1, MUXR2, MUXR3 (1k): RES 1.00K OHM 1/4W 1% 1206 SMD, RHM1.00KFRCT-ND
OPAMP0, OPAMP1, OPAMP2: IC OPAMP LOW PWR DUAL 8-SOIC, 497-1591-1-ND
POWER JACK
Q1 (BC547B): TRANS NPN 45V 100MA TO-92, BC547BTACT-ND
R/A HEADR
RST (47k): RES 47.0K OHM 1/4W 1% 1206 SMD, RHM47.0KFRCT-ND
R1, R10 (180): RES 180 OHM 1/4W 1% 1206 SMD, RHM180FRCT-ND
R11, R12, R13 (150): RES 150 OHM 1/4W 1% 1206 SMD, RHM150FRCT-ND
R14 (100): RES 100 OHM 1/4W 1% 1206 SMD, RHM100FRCT-ND
MUXR4, MUXR5 (1k): RES 1.00K OHM 1/4W 1% 1206 SMD, RHM1.00KFRCT-ND
R17 (15k): RES 10.0K OHM 1/4W 1% 1206 SMD, RHM10.0KFRCT-ND
R18 (10k): RES 15.0K OHM 1/4W 1% 1206 SMD, RHM15.0KFRCT-ND
R19 (150): RES 150 OHM 1/4W 1% 1206 SMD, RHM150FRCT-ND
R2 (180): RES 180 OHM 1/4W 1% 1206 SMD, RHM180FRCT-ND
R20 (100): RES 100 OHM 1/4W 1% 1206 SMD, RHM100FRCT-ND
R21 (1k): RES 1.00K OHM 1/4W 1% 1206 SMD, RHM1.00KFRCT-ND
R3 (180): RES 180 OHM 1/4W 1% 1206 SMD, RHM180FRCT-ND
R4, R5 (1.5k): RES 1.50K OHM 1/4W 1% 1206 SMD, RHM1.50KFRCT-ND
R6 (180): RES 180 OHM 1/4W 1% 1206 SMD, RHM180FRCT-ND
R7, R8 (100): RES 100 OHM 1/4W 1% 1206 SMD, RHM100FRCT-ND
R9 (180): RES 180 OHM 1/4W 1% 1206 SMD, RHM180FRCT-ND
RVM1 (20k): RES 20.0K OHM 1/4W 1% 1206 SMD, RHM20.0KFRCT-ND
RVM2 (10k): RES 10.0K OHM 1/4W 1% 1206 SMD, RHM10.0KFRCT-ND
S1: SWITCH TACT 6MM 260GF H=4.3MM, SW793-ND
TEMPB2: IC THERM MICROLAN PROG-RES TO-92, DS18B20+PAR-ND
U$9: IC VOLT-LVL TRANSL 2BIT BI SM8, 296-21978-1-ND
VR HEAT, VR MOTORS, VR SERVO: Dimension Engineering 10W Adjustable Switching Regulator
XBEE: MODULE ZIGBEE 100MW W/CHIP ANT, XBP24-ACI-001-ND
XBPINS: 2mm 10pin XBee Socket
LEDTXX, LEDRXX (ORG): LED ALINGAP ORN CLEAR 1206 SMD, 350-2049-1-ND
R15, R16 (150): RES 150 OHM 1/4W 1% 1206 SMD, RHM150FRCT-ND
LEDRXO, LEDTXO (YEL): LED ALINGAP YLW CLEAR 1206 SMD, 350-2050-1-ND
R22, R23 (150): RES 150 OHM 1/4W 1% 1206 SMD, RHM150FRCT-ND

Appendix D

Code

Herein is the code used for the experiments in this thesis. The Creature accepts an incoming serial byte at 9600 bps from the XBee radio. If that byte is 0-252, the value is mapped to the breathing servo; if it is 253, a pulse is triggered. Heat, ears, and purr are all deactivated. The most recently received value is reported back over the Bluetooth serial port.

Table D.1: Haptic Creature communications protocol.

Input to Creature:
  f   Pulse out ten steps
  d   Pulse in ten steps
  a   Pulse out one step
  s   Pulse in one step
  r   Start reporting temperature sensor data
  t   Stop reporting temperature sensor data

Output from Creature:
  R.  Current respiration servo position
  T.  Output of breathing servo temperature sensor
  U.  Output of anterior temperature sensor
  V.  Output of electronics board temperature sensor
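The byte protocol above can be exercised from the transmitting side with only a few lines of code. The following is a minimal illustrative sketch of such a sender, written in the same Arduino style as Listing D.1 below; it is not the software used in the experiments, and the serial port, the 12-breaths-per-minute triangular breathing pattern, and the 70 bpm pulse rate are assumptions made for the example.

// Illustrative sender for the one-byte Creature protocol in Table D.1:
// values 0-252 set the breathing servo position, 253 triggers a pulse.
// Sketch for illustration only, not the experiment software; the fixed
// breathing and pulse rates below are assumptions.

const unsigned long BREATH_PERIOD_MS = 5000;  // 12 breaths per minute (assumed)
const unsigned long PULSE_PERIOD_MS  = 857;   // ~70 beats per minute (assumed)

unsigned long lastPulse = 0;

void setup() {
  Serial.begin(9600);  // link toward the Creature (XBee in the real system)
}

void loop() {
  unsigned long now = millis();

  // Map a triangular breathing waveform onto the 0-252 command range.
  unsigned long phase = now % BREATH_PERIOD_MS;
  unsigned long half  = BREATH_PERIOD_MS / 2;
  byte breath;
  if (phase < half) {
    breath = map(phase, 0, half, 0, 252);                 // inhale: servo out
  } else {
    breath = map(phase, half, BREATH_PERIOD_MS, 252, 0);  // exhale: servo back
  }
  Serial.write(breath);

  // Send the reserved pulse byte (253) once per simulated heartbeat.
  if (now - lastPulse >= PULSE_PERIOD_MS) {
    Serial.write((byte)253);
    lastPulse = now;
  }

  delay(50);  // modest command rate for the example
}

In the actual platform these bytes were generated by the physiological software on the host PC and carried to the Creature over the XBee link described in Section B.5.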
Listing D.1: Haptic Creature Mirroring Code

/**** Arduino code for Haptic Creature to allow mirroring of breathing and pulse ****
 * Joseph P. Hall III
 * 03/27/10
 *
 * Accepts incoming serial byte at 9600 bps
 * If that byte is 0-252, value mapped to breathing servo
 * If that byte is 253, pulse triggered
 * heat, ears, and purr off
 *
 * Use with new electronics board:
 *  - Sends data out via Bluetooth, in via XBee
 */

// Connection definitions, pretty much self explanatory
#define STEPPER_PIN1 48
#define STEPPER_PIN2 49
#define STEPPER_PIN3 50
#define STEPPER_PIN4 41
#define LIMIT_SWITCH_PIN 25
#define PULSE_LS 25

#define BREATH_PIN 23
#define SEAR_PIN 27
#define PEAR_PIN 29

#define PURR_ENABLE_PIN 7
#define PURR_DIR1_PIN 5
#define PURR_DIR2_PIN 6

// OneWire for temperature readings
#include <OneWire.h>
OneWire ds(53);  // start OneWire on pin 53
// OneWire and temperature processing variables
byte present = 0;   // 1 if sensors present
byte data[12];      // data read from sensors
byte addr[8];       // address of sensor from which to read
int HighByte, LowByte, TReading, SignBit, Tc, Tc_100, Whole, Fract;  // vars for converting data to degrees F
byte tempsense1[8] = {40, 136, 25, 15, 2, 0, 0, 15};  // address of temperature sensor in decimal, breathing servo
byte tempsense2[8] = {40, 81, 22, 2, 2, 0, 0, 175};   // address of temperature sensor in decimal, on board
byte tempsense3[8] = {40, 35, 65, 2, 2, 0, 0, 157};   // address of temperature sensor in decimal, creature anterior
boolean t2sflag = false;  // True when we should print a temperature reading
int t2s = 0;              // temperature to be sent over serial
byte senstoread = 0;      // the next temperature sensor to read, typically 1-3

// Stepper motor for pulse
#include <Stepper.h>
Stepper Pulse(200, STEPPER_PIN1, STEPPER_PIN2, STEPPER_PIN3, STEPPER_PIN4);  // 200 pulse per rotation stepper
int Pulse_dir = 1;             // Direction of next pulse step
boolean pulseflag = false;     // True when pulse command sent until pulse completed
byte pulsecount = 0;           // Number of steps into pulse we are
// Pulse limit switch is attached to pins 43 and 45, is high on depress
boolean pulse_ls_read = true;  // Pulse limit switch reading

// Servo variables
#include <Servo.h>
Servo Breathing;
Servo SEar;
Servo PEar;

// Timer library (just to use the temperature sensors)
#include <MsTimer2.h>

// loop index variables
int j = 0;
int i = 0;

void setup() {
  // Set timer to fifteen seconds
  MsTimer2::set(15000, readtemp);
  MsTimer2::start();

  // Serial communication channels
  Serial.begin(9600);   // USB
  Serial1.begin(9600);  // XBee
  Serial2.begin(9600);  // Bluetooth
  // Serial3.begin(9600);  // tail wire

  // Setup limit switch power and receiver
  pinMode(LIMIT_SWITCH_PIN, INPUT);
  digitalWrite(LIMIT_SWITCH_PIN, LOW);

  // Attach servos
  Breathing.attach(BREATH_PIN);
  SEar.attach(SEAR_PIN);
  PEar.attach(PEAR_PIN);

  // Set purring motor direction, turn purring motor off
  pinMode(PURR_DIR2_PIN, OUTPUT);
  pinMode(PURR_DIR1_PIN, OUTPUT);
  digitalWrite(PURR_DIR1_PIN, LOW);
  digitalWrite(PURR_DIR2_PIN, HIGH);
  pinMode(PURR_ENABLE_PIN, OUTPUT);
  analogWrite(PURR_ENABLE_PIN, 0);  // purr speed to 0

  // Initiate and zero stepper motor
  Pulse.setSpeed(5);  // Slow down the speed for initiation
  // Run until pulse limit switch depressed
  for (i = 0; i < 50; i++) {
    if (digitalRead(PULSE_LS) != 1) {
      Pulse.step(Pulse_dir);
      delay(50);  // wait for debounce
    }
  }
  // take two additional steps to be sure it is depressed
  Pulse.step(Pulse_dir);
  Pulse.step(Pulse_dir);

  // Restore speed
  Pulse.setSpeed(25);

  // deactivate heaters
  for (j = 10; j < 13; j++) {
    pinMode(j, OUTPUT);
    digitalWrite(j, HIGH);
  }

  // Zero servos
  Breathing.write(0);
  SEar.write(0);
  PEar.write(0);

  // Start temperature reading
  tempPreparetoRead();
}

byte inByte = 0;      // byte read from serial port
byte btarget = 0;     // breathing amplitude to achieve
byte boldtarget = 0;  // previous breathing amplitude to achieve

int bvalue = 50;      // filtered breathing amplitude to achieve
int millisold = 0;    // previous time respiration command received
int diff = 0;         // amount of time since previous respiration command was received
int interval = 10;    // milliseconds between respiration commands

int pulseinterval = 0;  // amount of time since previous pulse command was received
int pulseold = 0;       // time previous pulse command was received

void loop() {
  // Communication and control through USB port
  if (Serial.available() > 0) {
    inByte = Serial.read();
    switch (inByte) {
      case 'f':
        Pulse.step(10);
        break;
      case 'd':
        Pulse.step(-10);
        break;
      case 'a':
        Pulse.step(1);
        break;
      case 's':
        Pulse.step(-1);
        break;
      case 'r':
        // read tail sensor
        MsTimer2::start();
        break;
      case 't':
        MsTimer2::stop();
        break;
    }
    inByte = 0;
  }

  // Breathing and pulse commands through XBee radio
  if (Serial1.available() > 0) {
    boldtarget = btarget;  // store the previous commanded breathing servo value
    inByte = Serial1.read();
    Serial2.print("R.");   // report the new respiration command over Bluetooth
    Serial2.println(inByte, DEC);
    if (inByte == 253) {   // 253 = pulse command
      pulseinterval = millis() - pulseold;  // make sure we're not receiving these too quickly
      pulseold = millis();
      if (pulseinterval > 50) {
        pulseflag = true;
      } else {
        btarget = boldtarget;
      }
    }
    if (inByte < 253) {
      btarget = inByte;
    }
    millisold = millis();
  }

  // Now what we pass through every iteration:
  // Send proper command to breathing servo
  diff = millis() - millisold;
  // Updates are received every ~60 microseconds
  /* if (diff < interval) {
       bvalue = boldtarget + ((btarget - boldtarget) * diff) / interval;
     } else {
       bvalue = btarget;
     }
     if (bvalue < 0) {  // sanity check, this happened once or twice
       bvalue = 0;
     }
  */
  Breathing.writeMicroseconds(map(btarget, 0, 253, 950, 1400));  // scale command byte to servo pulse width

  // Pulse if necessary
  if (pulseflag == true) {
    if (pulsecount < 5) {          // To pulse go three steps forward
      Pulse.step(3 * Pulse_dir);
      pulsecount = 6;
    }
    else if (pulsecount > 5 && pulsecount < 10) {
      Pulse.step(-3 * Pulse_dir);  // Then three steps back
      pulsecount = 11;
    }
    else {
      pulseflag = false;
      pulsecount = 0;
    }
  }
}

/*****************************/
// To read temperature sensors first we select sensor, then wait for
// conversion, then read sensor. Some of this code from the Arduino OneWire guide.
void tempPreparetoRead() {
  // advance to the next sensor, wrapping back to 1 after the last one
  senstoread += 1;
  if (senstoread >= 4) {  // number of temperature sensors + 1
    senstoread = 1;
  }

  switch (senstoread) {
    case 1:
      for (i = 0; i < 8; i++) {  // copy the 8-byte ROM address
        addr[i] = tempsense1[i];
      }
      break;
    case 2:
      for (i = 0; i < 8; i++) {
        addr[i] = tempsense2[i];
      }
      break;
    case 3:
      for (i = 0; i < 8; i++) {
        addr[i] = tempsense3[i];
      }
      break;
  }

  ds.search(addr);
  // Send the command to read
  ds.reset();
  ds.select(addr);
  ds.write(0x44, 1);  // start temperature conversion
  MsTimer2::start();  // start timer for conversion
}

void readtemp() {
  MsTimer2::stop();
  present = ds.reset();
  ds.select(addr);
  ds.write(0xBE);     // read scratchpad

  for (i = 0; i < 9; i++) {
    data[i] = ds.read();
  }

  // Convert to Fahrenheit and send
  LowByte = data[0];
  HighByte = data[1];
  TReading = (HighByte << 8) + LowByte;
  SignBit = TReading & 0x8000;
  if (SignBit) {  // negative
    TReading = (TReading ^ 0xffff) + 1;  // 2's complement
  }
  Tc_100 = (6 * TReading) + TReading / 4;  // multiply by 6.25 to get Celsius * 100
  Tc = Tc_100;

  if (SignBit) {
    Tc = -1 * Tc;
  }
  t2sflag = true;
  t2s = Tc * 9 / 5 + 3200;  // Fahrenheit * 100

  // If temperature flag set, send the reading with its sensor tag.
  if (t2sflag) {
    if (senstoread == 1) {
      Serial2.print("T.");
    } else if (senstoread == 2) {
      Serial2.print("U.");
    } else if (senstoread == 3) {
      Serial2.print("V.");
    }
    Serial2.println(t2s);
    t2sflag = false;
  }

  tempPreparetoRead();
}
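A note on the values reported over Bluetooth: the DS18B20 temperature sensors return readings in steps of 1/16 degree Celsius, so the listing's Tc_100 = 6*TReading + TReading/4 is the reading in hundredths of a degree Celsius, and the transmitted value t2s = Tc*9/5 + 3200 is the temperature in hundredths of a degree Fahrenheit. A host program therefore only needs to strip the one-letter tag and divide by 100 for the temperature lines (R. lines carry the raw respiration servo command). A minimal, hypothetical parser for the R./T./U./V. lines is sketched below; it is an illustration only, not the software used in the experiments.

// Hypothetical parser for the Creature's Bluetooth output lines ("R.<n>", "T.<n>",
// "U.<n>", "V.<n>"; see Table D.1). Illustration only, not the experiment software.
#include <cstdio>
#include <cstdlib>
#include <cstring>

// Returns true and fills tag/value if the line matches the letter-dot-number form.
bool parseCreatureLine(const char *line, char *tag, long *value) {
  if (std::strlen(line) < 3 || line[1] != '.') {
    return false;
  }
  *tag = line[0];
  *value = std::strtol(line + 2, nullptr, 10);
  return true;
}

int main() {
  char tag;
  long value;
  // Example input: "T.7350" would be the breathing-servo sensor reporting 73.50 F.
  if (parseCreatureLine("T.7350", &tag, &value)) {
    std::printf("sensor %c: %.2f\n", tag, value / 100.0);
  }
  return 0;
}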
