EMPATHY: THE ROLE OF FACIAL EXPRESSION RECOGNITION

by

LANA DIANE SHYLA BESEL

B.A., University of Victoria, 1996
M.A., University of British Columbia, 2002

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY in THE FACULTY OF GRADUATE STUDIES (Psychology)

THE UNIVERSITY OF BRITISH COLUMBIA

November 2006

© Lana Diane Shyla Besel, 2006

ABSTRACT

This research examined whether people with higher dispositional empathy are better at recognizing facial expressions of emotion at faster presentation speeds. Facial expressions of emotion, taken from Pictures of Facial Affect (Ekman & Friesen, 1976), were presented at two different durations: 47 ms and 2008 ms. Participants were 135 undergraduate students. They identified the emotion displayed in the expression from a list of the basic emotions. The first part of this research explored connections between expression recognition and the common cognitive empathy/emotional empathy distinction. Two factors from the Interpersonal Reactivity Scale (IRS; Davis, 1983) measured self-reported tendencies to experience cognitive empathy and emotional empathy: Perspective Taking (IRS-PT) and Empathic Concern (IRS-EC), respectively. Results showed that emotional empathy significantly positively predicted performance at 47 ms but not at 2008 ms, and cognitive empathy did not significantly predict performance at either duration. The second part examined empathy deficits. The kinds of emotional empathy deficits that comprise psychopathy were measured by the Self-Report Psychopathy Scale (SRP-III; Paulhus, Hemphill, & Hare, in press). Cognitive empathy deficits were explored using the Empathy Quotient (EQ; Shaw et al., 2004). Results showed that the callous affect factor of the SRP (SRP-CA) was the only significant predictor at 47 ms, with higher callous affect scores associated with lower performance. SRP-CA is a deficit in emotional empathy, and thus these results match the first paper's results. At 2008 ms, the social skills factor of the EQ was significantly positively predictive, indicating that people with less social competence had more trouble recognizing facial expressions at longer presentation durations. Neither the total scores for the SRP nor the EQ were significant predictors of identification accuracy at 47 ms and 2008 ms. Together, the results suggest that a disposition to react emotionally to the emotions of others, and to remain other-focussed, provides a specific advantage for accurately recognizing briefly presented facial expressions, compared to people with lower dispositional emotional empathy.

TABLE OF CONTENTS

Abstract
Table of Contents
List of Tables
List of Figures
Acknowledgements
Dedication
CHAPTER 1: Overview
References
CHAPTER 2: Cognitive and emotional empathy: The role of facial expression recognition
Introduction
Method
Results
Discussion
Tables Chapter 2
References
CHAPTER 3: Empathy deficits: The role of facial expression recognition
Introduction
Method
Results
Discussion
Tables Chapter 3
References
CHAPTER 4: Summary
References
APPENDICES
Appendix A: Dependent Variable Distributions
Appendix B: Correlation Matrices
Appendix C: Scales
Appendix D: UBC Research Ethics Board's Certificates of Approval

LIST OF TABLES

Table 2.1. Mean scores and gender comparisons for expression identification accuracy at 47 ms and 2008 ms
Table 2.2. Mean scores and gender comparisons for cognitive and emotional empathy
Table 2.3. Zero-order correlations between social desirability, empathy, gender, and acculturation
Table 2.4. Zero-order correlations between expression identification accuracy, 47 ms control task, acculturation and gender
Table 2.5. Hierarchical regression analysis for 47 ms recognition accuracy
Table 2.6. Hierarchical regression analysis for 2008 ms recognition accuracy
Table 3.1. Mean scores and gender comparisons for expression identification accuracy at 47 ms and 2008 ms
Table 3.3. Zero-order correlations between social desirability and self-report psychopathy
Table 3.4. Zero-order correlations between social desirability and Empathy Quotient
Table 3.5. Hierarchical regression analysis for 47 ms recognition accuracy, with SRP and EQ total scores as predictors
Table 3.6. Hierarchical regression analysis for 2008 ms recognition accuracy, with SRP and EQ total scores as predictors
Table 3.7. Hierarchical regression analysis for 47 ms recognition accuracy, including SRP-CA but not EQ-ER as a predictor
Table 3.8. Hierarchical regression analysis for 2008 ms recognition accuracy
Table 3.9. Hierarchical regression analysis for the Manipulation factor of the SRP
Table 3.10. Identification accuracy means for each expression type at 47 ms and 2008 ms
Table 3.11. Comparison between groups with high and low Self-Report Psychopathy total scores (SRP-T) on recognition of specific types of expressions at 47 ms and 2008 ms
Table A.1. Kurtosis and skewness statistics

LIST OF FIGURES

Figure A.1. Distribution of the 47 ms variable
Figure A.2. Distribution of the 2008 ms variable

ACKNOWLEDGEMENTS

First and foremost, I would like to thank my advisor, Dr. John Yuille, for the freedom he gave me to pursue my research interests, for his patience, and for all I have learned from him over the years. I would also like to thank my committee members: Dr. Christoff, Dr. Dutton, Dr. Paulhus, and Dr. Linden. I would like to give a special thank you to Dr. Paulhus for his questions that got me to think through issues from different perspectives. Data collection and analysis were made quicker and easier with the help of Cynthia, Jeremy, Laura, Kevin and Han. I would also like to thank all of my friends and family who provided me with a lot of support during graduate school. In particular, I would like to thank my good friend and colleague, Dorothee, for her encouragement, understanding, and for always making time to give feedback. I thank Patrick for his love and support, and for making the last, stressful home stretch of graduate school a time of great happiness and peace. Finally, I would like to thank my parents, Dan and Shiela, my sister Esther, and my brother-in-law, Ted, for all of their help and encouragement through these years.

DEDICATION

To my family: my mother, father, sister and brother-in-law

CHAPTER 1: OVERVIEW

This research explored individual differences in facial expression recognition and its relationship to empathy. Empathy, defined as "the reactions of one individual to the observed experiences of another" (Davis, 1983, p. 113), is commonly divided into cognitive and emotional aspects (e.g., Davis, 1983; Mehrabian & Epstein, 1972).
The cognitive type refers to the ability to imaginatively understand another person's thoughts, feelings and actions. Emotional empathy, conversely, does not involve cognitive apprehension. Rather, it is characterized by visceral, automatic, thoughtless, instinctive, involuntary reactivity (Mehrabian & Epstein, 1972). This reactivity is usually similar to what the other person is feeling and may result in sympathy or concern for that person (Eisenberg et al., 1991). Despite being blurry in reality, the distinction between cognitive empathy and emotional empathy is meaningful, theoretically and practically, in terms of understanding empathy and empathy deficits (Baron-Cohen & Wheelwright, 2004). This research further defined this distinction by associating facial expression recognition accuracy with these dimensions. The main question is whether people with higher dispositional empathy are better at recognizing facial expressions, in particular expressions presented for brief periods of time.

Empathy plays a key role in our interpersonal interactions. Empathic behaviour entails accurately discerning the emotional states of others (e.g., Ickes, 1993). Therefore, investigating the relationship between facial recognition accuracy and empathy has important implications for understanding the underpinnings of empathy. In addition, examining accuracy for emotional facial expressions presented at different speeds helps to clarify how emotional stimuli are processed in a more general sense.

There are numerous factors that influence facial expression recognition accuracy, including expression intensity (Blairy, Herrera, & Hess, 1999), availability of context information (Ekman, 1982), type of emotion (e.g., Baron-Cohen, Jolliffe, Mortimore, & Robertson, 1997; Ekman & Friesen, 1976), and the length of time that an expression is viewed (Kirouac & Dore, 1984; Matsumoto et al., 2000; Ogawa & Suzuki, 1999). The present research compared intense basic emotional expressions presented briefly with intense basic emotional expressions presented for a longer duration, to examine whether people with higher dispositional empathy are better at recognizing facial expressions of emotion at faster presentation speeds. Facial expressions can be recognized at very brief presentation durations; one study found that between 36 ms and 48 ms, accurate emotional recognition capability improved greatly, with correct answers reaching nearly 50% for all expressions (Ogawa & Suzuki, 1999).

The first part of this research, described in Chapter 2, addressed a basic theoretical question: does emotional empathy, cognitive empathy, or neither relate to better emotional recognition at brief presentation durations, compared to longer durations? In other words, which aspects of the empathy construct relate to a greater capacity to glean information from a facial expression flashed quickly? On the one hand, accuracy in identifying the specific emotion in an expression may be higher in people who have a tendency to focus on and analyze the emotions and behaviour of others.
People who score higher on cognitive empathy, reportedly understanding the emotions and behaviours of others in an articulated way and concerning themselves with the perspectives of others, may discern basic expressions better, especially when expressions are presented for longer durations, because they have a more developed framework within which to understand and place these expressions. On the other hand, research indicates that physiological reactions occur to briefly presented facial expressions of emotion, and these reactions play an important role in emotional empathy, and potentially in discerning emotional stimuli. Greater emotional empathy has been linked to greater mimicry of facial expressions (Sonnby-Borgström, Jonsson, & Svensson, 2003). Mimicry, the tendency to make the same expression, or at least flinch the same muscles as the expression viewed, happens quickly, within 300 ms of exposure to another person's facial expressions (Wild, Erb, & Bartels, 2001). Mimicry also occurs to facial expressions presented for such short durations that they were not consciously perceived (Dimberg, Thunberg, & Elmehed, 2000). Therefore, there are tentative connections between early reactivity to expressions of emotion and greater dispositional emotional empathy, as shown by a few preliminary studies. Does this translate into better labelling of the specific emotion in a briefly presented facial expression? So far, there is no evidence that greater mimicry leads to greater decoding accuracy (Blairy, Herrera, & Hess, 1999). However, one study found that participants with higher self-reported levels of dispositional emotional empathy more accurately determined the emotional valence of briefly presented facial expressions (Martin, Berry, Dobranski, & van Horne, 1996). Consequently, greater emotional reactivity might translate into better emotional identifications. Conversely, perhaps emotional reactivity is linked to appropriate interpersonal responding, but this knowledge remains non-verbal and unarticulated. No studies, to my knowledge, have investigated the link between emotional recognition of the basic emotions at brief presentation durations and empathy, neither cognitive empathy nor emotional.

The second paper, described in Chapter 3, focuses on empathy deficits. The emotional-cognitive empathy distinction has practical utility for describing empathy deficits. As described by Blair (2005), being high on one dimension and low on the other creates descriptions that have potentially good explanatory power for disorders with empathy deficits as primary features. Psychopathy is characterized by callous affect, manipulation, erratic lifestyle and antisocial behaviour (Hare, 1991; Paulhus, Hemphill, & Hare, in press). Researchers have defined psychopathy as deficient emotional empathy but sufficient cognitive empathy (Blair, Jones, Clark, & Smith, 1997; Richell et al., 2003): awareness of the thoughts and feelings of others remains unimpaired. This deficit is believed to contribute to intentional cruelty and manipulation, because awareness is used callously for one's own gain. The present research explores the kinds of empathic deficits that comprise psychopathy in a non-clinical sample. In addition, deficits in cognitive empathy are explored, such as difficulty understanding interpersonal situations due to an inability to take the perspective of others.
Some research has investigated facial expression deficits in psychopathy, but no studies, to my knowledge, have presented expressions for brief durations under one second. Adult psychopaths do not show a general emotional detection deficit (Habel, Kuhn, Salloum, Devos, & Schneider, 2002; Kosson, Suchy, Mayer, & Libby, 2002), although studies show that children with psychopathic tendencies misjudge emotional facial expressions more frequently than children without psychopathic tendencies (Blair, Colledge, Murray, & Mitchell, 2001; Stevens, Charman, & Blair, 2001). In addition, research has shown that psychopathy is specifically associated with an impaired ability to process distress cues of others, such as fear and sadness (Blair et al., 2000; 2004). However, fear is often the most difficult expression to recognize (e.g., Ekman & Friesen, 1974; Kirouac & Dore, 1984), so this model has been criticized for not recognizing this potential confound.

In summary, the cognitive-emotional empathy distinction appears to capture some fundamental components of the empathy construct. This research defines these components further through their relationship to facial expression recognition. Part 1 explores how the cognitive and emotional subtypes of empathy relate to facial expression recognition at slow and fast presentation speeds, and Part 2 explores whether the same pattern of facial expression recognition capabilities is evident in the psychopathy construct, a disorder with profound empathy deficits.

The main hypotheses were:
a) emotional empathy is positively related to facial expression recognition accuracy at the short presentation duration
b) cognitive empathy is positively related to facial expression recognition accuracy at the long presentation duration
c) psychopathic-like empathy deficits are negatively related to facial expression recognition accuracy at the short presentation duration
d) manipulation is positively related to cognitive empathy, negatively related to emotional empathy, negatively related to accuracy at the short duration, and positively related to accuracy at the long duration

No recognition accuracy differences for fearful expressions were hypothesized between high and low scorers on the psychopathy scale.

References

Baron-Cohen, S., Jolliffe, T., Mortimore, C., & Robertson, M. (1997). Another advanced test of theory of mind: Evidence from very high-functioning adults with autism or Asperger Syndrome. Journal of Child Psychology and Psychiatry, 38, 813-822.
Baron-Cohen, S. & Wheelwright, S. (2004). The empathy quotient: An investigation of adults with Asperger Syndrome or High Functioning Autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163-175.
Blair, R.J.R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14, 698-718.
Blair, R.J.R., Jones, L., Clark, F. & Smith, M. (1997). The psychopathic individual: A lack of responsiveness to distress cues? Psychophysiology, 34, 192-198.
Blair, R.J.R., Mitchell, D.G.V., Peschardt, K.S., Colledge, E., Leonard, R.A., Shine, J.H., Murray, L.K. & Perrett, D.I. (2004). Reduced sensitivity to others' fearful expressions in psychopathic individuals. Personality and Individual Differences, 37, 1111-1122.
Blairy, S., Herrera, P. & Hess, U. (1999). Mimicry and the judgment of emotional facial expressions. Journal of Nonverbal Behaviour, 23(1), 5-41.
Davis, M.H. (1980). A multidimensional approach to individual differences in empathy. Catalog of Selected Documents in Psychology, 10(4), 1-17.
Davis, M.H. (1983). Measuring individual differences in empathy: Evidence for a multidimensional approach. Journal of Personality and Social Psychology, 44(1), 113-126.
Dimberg, U., Thunberg, M. & Elmehed, K. (2000). Unconscious facial reactions to emotional facial expressions. Psychological Science, 11(1), 86-89.
Ekman, P. (1982). Emotion in the human face. New York: Cambridge University Press.
Ekman, P. & Friesen, W. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17, 124-129.
Ekman, P. & Friesen, W.V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Habel, U., Kuhn, E., Salloum, J., Devos, H. & Schneider, F. (2002). Emotional processing in psychopathic personality. Aggressive Behaviour, 28, 394-400.
Hare, R.D. (1991). The Hare Psychopathy Checklist-Revised. Toronto, ON: Multi-Health Systems.
Ickes, W. (1993). Empathic accuracy. Journal of Personality, 61(4), 587-610.
Kirouac, G. & Dore, F.Y. (1984). Judgment of facial expressions of emotion as a function of exposure time. Perceptual and Motor Skills, 59, 147-150.
Kosson, D.S., Suchy, Y., Mayer, A.R. & Libby, J. (2002). Facial affect recognition in criminal psychopaths. Emotion, 2(4), 398-411.
Martin, R.A., Berry, G.E., Dobranski, T. & van Horne, M. (1996). Emotion perception threshold: Individual differences in emotional sensitivity. Journal of Research in Personality, 38, 290-305.
Matsumoto, D., LeRoux, J., Wilson-Cohn, C., Kooken, K., Ekman, P., Yrizarry, N., Loewinger, S., Uchida, H., Yee, A., Amo, L. & Goh, A. (2000). A new test to measure emotion recognition ability: Matsumoto and Ekman's Japanese and Caucasian Brief Affect Recognition Test (JACBART). Journal of Nonverbal Behavior, 24(3), 179-209.
Mehrabian, A., & Epstein, N. (1972). A measure of emotional empathy. Journal of Personality, 40(4), 525-543.
Merckelbach, H., van Hout, W., van den Hout, M.A. & Mersch, P.P. (1989). Psychological and subjective reactions of social phobics and normals to facial stimuli. Behaviour Research and Therapy, 27, 289-294.
Ogawa, T. & Suzuki, N. (1999). Response differentiation to facial expressions of emotion at increasing exposure duration. Perceptual and Motor Skills, 89, 557-563.
Paulhus, D.L., Hemphill, J.D., & Hare, R.D. (in press). Self-Report Psychopathy scale (SRP-III). Toronto/Buffalo: Multi-Health Systems.
Richell, R.A., Mitchell, D.G.V., Newman, C., Leonard, A., Baron-Cohen, S., & Blair, R.J.R. (2003). Theory of mind and psychopathy: Can psychopathic individuals read the 'language of the eyes'? Neuropsychologia, 41, 523-526.
Sonnby-Borgström, M., Jonsson, P., & Svensson, O. (2003). Emotional empathy as related to mimicry reactions at different levels of information processing. Journal of Nonverbal Behaviour, 27(1), 3-23.
Stevens, D., Charman, T., & Blair, R.J.R. (2001). Recognition of emotion in facial expressions and vocal tones in children with psychopathic tendencies. The Journal of Genetic Psychology, 162(2), 201-211.
Wild, B., Erb, M. & Bartels, M. (2001). Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: Quality, quantity, time course and gender differences. Psychiatry Research, 102, 109-124.
CHAPTER 2: COGNITIVE AND EMOTIONAL EMPATHY: THE ROLE OF FACIAL EXPRESSION RECOGNITION

Introduction

Empathy has been defined as "the reactions of one individual to the observed experiences of another" (Davis, 1983, p. 113). What basic processes underlie empathy? Is empathy related to better recognition and articulation of these observed emotional states? Part of behaving empathically involves accurately discerning the emotional states of others and responding appropriately (Ickes, 1993). However, the relationship between accurate recognition of emotions and empathy has not been fully examined empirically. This study investigated individual differences in facial expression recognition and empathy. The main purpose for researching these issues was to gain a better understanding of the underpinnings of empathy, a central aspect of interpersonal behaviour.

The empathy construct has typically been separated into two types: cognitive and emotional (Davis, 1983). The cognitive type refers to the ability to imaginatively understand another person's thoughts, feelings and actions. Emotional empathy, in contrast, involves minimal cognitive apprehension. Rather, it is characterized by visceral, automatic, thoughtless, instinctive, involuntary reactivity (Mehrabian & Epstein, 1972) that resembles what the other person is feeling, and may result in sympathy or concern for that person (Eisenberg et al., 1991). Thus, the key difference between these two types of empathy is that a corresponding emotional state is induced in one type and not necessarily in the other. Cognitive empathy maintains an articulated understanding, whereas emotional empathy may remain a non-verbal emotional reaction. In reality, the distinction between these two types is blurred, but the distinction itself remains a theoretically meaningful one (Baron-Cohen & Wheelwright, 2004). Psychopathy, a disorder with profound empathy deficits, exemplifies this distinction. Psychopaths are typically described as behaving callously, with little emotional empathy, but retaining a clear understanding of interpersonal cues and social situations, thus having cognitive understanding of the other person's perspective (Blair, 2005).

Facial expression recognition and processing emotion

The typically acknowledged basic emotions are happiness, sadness, anger, fear, surprise, disgust, and contempt (e.g., Ekman & Friesen, 1976). Some theorists, such as Ekman (1982; 1999), believe these basic emotions differ fundamentally from one another in terms of, for example, different physiological changes, different facial muscle activations and different causal events (e.g., threat, loss). These basic emotions have been found across various cultures, supporting the idea that they are fundamental and mitigating the likelihood that emotion is interpreted through socially constructed labels (e.g., Ekman & Friesen, 1976). Other theorists suggest that we are less capable of discriminating categories of emotion (i.e., anger, sadness, fear, disgust, happiness, surprise) and instead discriminate emotion in broad terms, for example valence (positive or negative) and arousal intensity level (e.g., Lang, 1995; Russell & Bullock, 1986). These theorists believe the categories of emotion are mental representations and the boundaries between emotions are fuzzy: for example, the boundary between fear and surprise is often confused.
The stages of processing emotional stimuli can largely be broken down into the following: 1) detection of emotionally salient stimuli, 2) production of an emotional response, 3) interpretation of the incited emotion, and 4) emotion regulation (Phillips, Drevets, Rauch, & Lane, 2003). An important aspect of processing emotion is its rapid, automatic functioning. In fact, researchers largely agree that there are two main levels of interpreting emotional stimuli (Buck, 1984; Buck & Ginsburg, 1997; LeDoux, 1996; Lazarus, 1991; Ohman, 1993), although they give these levels different names, such as automatic vs. controlled (Ohman, 1993) and knowledge-by-acquaintance vs. knowledge-by-description (Buck, 1984). The first type generates meaning rapidly, automatically, and without volitional control (Lazarus, 1991). It is direct, immediate, self-evident and cannot be subjected to logical analysis, instead being a gut feeling. The other type is deliberate, voluntary and propositional (Buck & Ginsburg, 1997). In terms of processing emotional stimuli, these two processes can be explained neurologically by two amygdala routes, one quick and one longer (LeDoux, 1996). The quick route runs directly from the thalamus to the amygdala. It is automatic in that it detects emotion without cortical input, in other words, without conscious thought. The other pathway runs from the thalamus to the cortex to the amygdala (LeDoux, 1996). It is therefore slower and is characterized by a more detailed representation of the stimuli. Facial expressions are emotional stimuli, and these stages therefore apply to processing facial expressions. In particular, the rapidity with which facial expressions are detected, responded to and interpreted is reviewed below. Next, the connections between empathy and facial expression reactivity and interpretation are discussed.

Facial expressions: rapid reactivity and interpretation

First, it is evident that detection of salient emotional information from faces can happen quickly, with little conscious awareness. Whalen et al. (1998) presented participants with a fearful or happy face for too short a time for the face to be consciously perceived. These target faces were presented for 33 ms, followed by a masking neutral face for 167 ms. The neutral face acted as a masking stimulus, overriding the briefly presented emotionally expressive faces, such that the participants only reported seeing the neutral faces. Using functional magnetic resonance imaging (fMRI), they found that amygdala activity was significantly higher to fearful faces than to happy faces.

Second, responsiveness to the emotions of others, as measured by physiological reactivity, also happens rapidly, often without our awareness. Eisenberg and colleagues (1990; 1994) have conducted a series of studies examining physiological arousal to empathy-inducing stimuli, such as films. Exposure to these stimuli induces higher levels of physiological reactivity, including changes in heart rate and skin conductance, in those participants with higher levels of empathic concern. Mimicry is another physiological reaction to the emotions of others. Physiological measures show that mimicry of the expressions of others regularly occurs without our awareness, and this mimicking process happens early in emotional processing, within 300 ms of exposure to another person's facial expressions (Wild, Erb, & Bartels, 2001). In addition, even brief, unconscious exposure to facial expressions elicits mimicking.
Dimberg, Thunberg, and Elmehed (2000) recorded facial muscle movement with electromyography (EMG) after participants were briefly exposed to masked facial expressions, presented too briefly for conscious awareness. They found that happy faces activated the zygomatic muscle and angry faces activated the corrugator supercilii muscle. The zygomatic muscle elevates the lips to form a smile, whereas the corrugator supercilii muscle knits the eyebrow in negative facial expressions. Thus, EMG recordings indicate that participants displayed emotional mimicry even to the unconsciously presented facial expressions. Sonnby-Borgström (2002) found that below 17 ms, participants did not mimic at all, but between 17 ms and 40 ms, where masked stimuli are detected but not consciously processed, mimicking did occur.

Third, it is evident that fairly accurate interpretations of the basic emotions occur at very brief presentation durations (Kirouac & Dore, 1984; Matsumoto et al., 2000; Ogawa & Suzuki, 1999). One study found that, between 36 ms and 48 ms, accurate recognition of expression type improves greatly, with correct answers reaching nearly 50% for all expressions (Ogawa & Suzuki, 1999). Another study found that at presentation durations of 17 ms, people are unaware of expressions; at 56 ms, expressions can be recognized with difficulty; and at 2350 ms, expressions can be clearly identified (Sonnby-Borgström, 2002). Kirouac and Dore (1984) found somewhat higher accuracy rates, ranging from 26% to 59% at 20 ms and from 70% to 88% at 50 ms. It is generally agreed that at 75 ms, facial expression recognition is fairly reliable (Kirouac & Dore, 1984; Sonnby-Borgström, 2002). However, there is variability in accuracy rates across expression types: typically happy faces are easiest, and anger and fear are hardest (Ekman & Friesen, 1976; Kirouac & Dore, 1984). The term microexpression is used to describe very brief expressions of emotion, typically concealed emotion that has leaked out, such as when telling a lie (Ekman & O'Sullivan).

Empathy and facial expressions: reactivity and interpretation

The emotional processing stages discussed above can also be applied to empathy: to be fully empathic, the emotions of another person need to be detected, the same or a related emotion incited within the self, and these emotions need to be interpreted and regulated. For empathic emotion, the key aspect is detecting, experiencing and interpreting the same, or a related, emotion. In terms of producing the same emotion, mimicry has long been assumed to be a source of vicarious arousal (e.g., Lipps, 1907), and there has been recent renewed interest in the idea that emotional reactivity caused by mimicry underlies empathy (Hatfield et al., 1994; Meltzoff & Moore, 1977; Preston & de Waal, 2002). Lipps (1907) believed that empathy was based on automatic mimicry of the other person's behaviour, including facial expressions. Lipps proposed that when we witness a facial expression, we mimic that reaction, and via a feedback process we ourselves have an affective reaction that corresponds to the state of the person we are interacting with. Lipps believed that this vicarious internal state helps us to understand the emotional state of the other person, forming the basis for empathy. Mimicry appears to be innate: not long after birth, infants mimic the facial expressions of others (Meltzoff & Moore, 1977; Termine & Izard, 1988).
For example, at ten weeks of age, infants already roughly imitate their mothers' facial expressions of happiness and anger (Haviland & Lelwica, 1987). Some research indicates that individuals who show more mimicking behaviour to pre-consciously presented emotional stimuli report having more emotional empathy. Sonnby-Borgström, Jonsson, and Svensson (2003) found that at 56 ms, but not at 17 ms or 2350 ms, people scoring higher on a self-report emotional empathy scale showed greater mimicry than those scoring lower. Low empathy scores were not associated with mimicry at any presentation duration. Another study compared facial expression mimicry between Duchenne smiles, which are felt smiles that alter the muscles around the eyes, and non-Duchenne smiles, which only alter the cheek muscles and appear fake (Surakka & Hietanen, 1998). Compared to neutral expressions, there was more mimicking in response to the Duchenne smiles than to the non-Duchenne smiles. Whereas higher self-reported empathy scores were related to more self-reported interest and pleasure in reaction to the Duchenne smiles, empathy scores were uncorrelated with EMG activity. Another study compared brain activations in couples while they watched their partner receive a painful shock and while they themselves received one. Results indicated that certain parts of the brain were activated both while viewing the partner's pain and while experiencing the pain themselves. These areas were the bilateral anterior insula, anterior cingulate cortex, brainstem and cerebellum. Moreover, people who scored higher on a self-report emotional empathy scale had more activation in those areas that are also part of the limbic system: the insula and anterior cingulate cortex (Singer, Seymour, O'Doherty, Kaube, Dolan, & Frith, 2004). This finding was explained in terms of the potential link between emotional mimicry and empathy.

Less is known about the connection between emotional recognition and empathy. There does not seem to be a direct link between physiological reactivity and recognition. For example, the research linking mimicry with emotional expression recognition, and empathy, is inconclusive. One study morphed intense facial expressions into varying intensity levels, from 20% to 100%, and had the participants rate each expression on each of the basic emotions (Blairy, Herrera, & Hess, 1999). Muscle activity was recorded and subjective emotional experiences were rated. The study did not find a link between decoding accuracy and mimicry. However, the quality of the stimuli may have affected the findings. The low-intensity stimuli were unnaturally formed by taking the beginnings of high-intensity expressions. Perhaps naturally occurring, low-intensity emotions are not simply intense expressions downgraded, but rather show in only one part of the face. If so, morphing intense expressions into subtle ones in this way may not elicit normal mimicking behaviour. It is also possible that mimicry more readily occurs to intense, but not weak, expressions, or to dynamic stimuli instead of still ones.

There is, however, a tentative link between an emotionally empathic disposition and recognition accuracy for briefly presented expressions. Martin et al. (1996) presented pictures of facial expressions for durations beginning at 50 ms, with the duration decreasing if participants were correct in their judgements and increasing if participants were incorrect. The participants were only required to judge the valence of the face (i.e., negative or positive emotion).
They found that individuals who were more accurate in detecting the valence of the expression at lower thresholds scored higher on an emotional empathy self-report scale. Likewise, other studies have found links in preschool-aged children between accurate recognition of facial expressions and prosocial behaviour (Nowicki & Mitchell, 1998; Walden & Field, 1982). No research I am aware of has focussed on the recognition of basic emotion expressions presented for under one second, and empathy.

Thus, research shows that, even at very brief presentation durations, emotional expressions are reacted to physiologically and can be correctly interpreted. Obviously, if a stimulus is presented for longer, a physiological reaction still occurs, but there is more time for cognitive interpretation. The main focus of this research is to investigate individual differences in accuracy at a brief presentation duration compared to a longer one. At the brief duration, participants had to rely more on their physiological responses and automatic interpretations; does accuracy at this duration relate to empathy?

Where does cognitive empathy fit? Very little research has been done on cognitive empathy and facial expression interpretation or reactivity. Perhaps articulation of specific emotional states would be aided by a disposition towards higher cognitive empathy. It is possible that people who score higher on cognitive empathy, who report understanding the emotions and behaviours of others in an articulated way and who concern themselves with the perspectives of others, discern basic expressions better, even when there is limited time to view them, because they have a more developed framework within which to understand and place these expressions. It is reasonable to surmise that, especially when expressions are presented for longer durations, individuals with higher cognitive empathy would perform better because there is more opportunity for cognitive interpretation. Since the basic levels of emotional processing are mainly composed of physiological reactivity and automatic interpretation, perhaps people who report higher emotional empathy, and who therefore have stronger emotional reactivity to the emotions of others, would perform better, particularly at short durations.

This study presented facial expressions to participants at short and long presentation durations. Participants completed empathy questionnaires, and the relationships between cognitive empathy, emotional empathy and accuracy of expression recognition were explored. It was hypothesized that individuals with more emotional empathy would be more accurate at the short duration, while cognitive empathy would be an advantage at the long duration.

Method

Participants

The total number of participants was 135 (98 females, 37 males), after the removal of one participant because of outlier data. They were recruited from the psychology department at the University of British Columbia. Ninety-three percent of the subjects were between the ages of 18 and 23. All subjects were under age 35. English was not the first language of 55% of the sample, and 47% were of East Asian descent. All participants were university students and therefore their English language skills were presumed to be sufficient.

Procedure

After providing written informed consent, participants were seated at a desktop computer in a quiet room by themselves. The computer had a Pentium 3, 866 MHz processor. The monitor size was 17", with a resolution of 1024x768 and 32-bit colour.
The refresh rate was 85 Hz. The experiment consisted of one facial expression recognition task presenting stimuli at 47 ms; one facial expression recognition task presenting stimuli at 2008 ms; one facial-features judgement task presenting faces at 47 ms, which was used to control for individual differences in processing information at 47 ms; and several questionnaires. All tasks and questionnaires were completed on the computer, in random order. Before each task and questionnaire, instructions appeared on the computer screen. Subjects read through the instructions and pressed the spacebar to begin the task. The number keys were used to answer. The instructions for the two facial recognition tasks were as follows:

YOUR TASK WILL BE TO JUDGE THE EMOTION OF THE FACIAL EXPRESSION IN THE PHOTOGRAPH THAT IS PRESENTED TO YOU. The photographs will be presented for a brief amount of time. If you are unsure, make your best guess. You will see 3 things for each question, presented one after the other: 1) a "+" sign to focus your attention on the centre of the screen 2) a photo of a face 3) a list of possible answer choices with the corresponding number key for each answer. After you give your answer, the next question comes up immediately. Therefore, if you want a break, leave the answer choices on the screen, and answer only when ready to continue. You may take as long as you wish to answer. Press the spacebar to continue.

Measures

Facial expression recognition tasks

Two facial expression recognition tasks, differing in presentation duration, were administered to each participant. Ogawa and Suzuki (1999) found that between around 36 ms and 48 ms, accurate emotional recognition capability improved greatly, with correct answers reaching nearly 50% for all expressions. Another study found that at 10 ms, accuracy ranged from 16% to 20%; at 20 ms, from 26% to 59%; at 30 ms, from 46% to 80%; at 40 ms, from 56% to 80%; and at 50 ms, from 70% to 88% (Kirouac & Dore, 1984). The present study presented the facial expressions for 47 ms. This duration was at the tail end of the improvement threshold Ogawa and Suzuki (1999) suggested, and similar to the 56 ms presentation time employed in Sonnby-Borgström's (2002) emotional mimicry study, which was found to relate to emotional empathy. This study explored individual differences around this hypothesized threshold. The faces were not masked because restricting the analysis to a preconscious, automatic level of processing was not desired in this study. Rather, this study intended to measure individual differences in how much emotional information could be gleaned from briefly presented expressions, and how this information would be interpreted. The controlled, conscious facial expression recognition task presented the expressions for 2008 ms. This presentation time was long enough for a meaningful contrast in interpretation time with 47 ms. It should be noted that the computer's refresh rate determines the exact presentation duration, which is why the precise durations of 47 ms and 2008 ms are not rounded to 50 ms and 2000 ms. In addition, since there was no mask to cut off processing of the stimuli, the effective processing time was likely slightly longer than 47 ms and 2008 ms.
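As a concrete illustration of this refresh-rate constraint, a stimulus can only be displayed for a whole number of refresh cycles, so nominal targets such as 50 ms and 2000 ms are quantized to nearby achievable values. The short sketch below is not the software used in this study; the nominal targets and the assumption of an exactly 85 Hz refresh are illustrative only.

    # Illustrative sketch (not the thesis's experiment code): how a monitor's
    # refresh rate quantizes presentation durations to whole frames.
    def achievable_duration(nominal_ms: float, refresh_hz: float):
        """Return (frame count, actual duration in ms) closest to a nominal target."""
        frame_ms = 1000.0 / refresh_hz          # duration of one refresh cycle
        frames = max(1, round(nominal_ms / frame_ms))
        return frames, frames * frame_ms

    for target in (50.0, 2000.0):
        frames, actual = achievable_duration(target, refresh_hz=85.0)
        print(f"target {target:6.0f} ms -> {frames:3d} frames = {actual:7.2f} ms")

    # At exactly 85 Hz one frame lasts about 11.76 ms, so a 50 ms target becomes
    # 4 frames (about 47 ms); small deviations in the measured refresh rate, or a
    # slightly different frame count, could account for a long duration of
    # 2008 ms rather than exactly 2000 ms.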
At both presentation durations, participants were presented with facial expressions of adult male and female Caucasian faces from the well-used Pictures of Facial Affect photographs (Ekman & Friesen, 1976). These faces showed intense facial expressions depicting anger, sadness, happiness, disgust, surprise, or fear. Neutral faces were also presented. Reliability data on the expression discrimination of these faces were used to ensure that the expressions in both tasks were of equal discrimination difficulty. No expressions were presented twice in the entire study. At 47 ms, there were 21 male faces and 21 female faces; at 2008 ms, there were 22 female faces and 20 male faces. Each of the seven emotions was presented six times, for a total of 42 trials. There were also 5 practice trials to accustom the participants to the task, and these were not calculated into the final score. Therefore, in total, each facial expression recognition task consisted of 47 trials, with a possible total score of 42. Accuracy was scored as the number of expressions answered correctly out of 42.

47 ms control task

To control for individual differences in general perceptual processing speed, as well as other differences such as problems maintaining attention, another task involving facial judgements at 47 ms was administered. Participants judged whether the face presented had upside-down facial features or right-side-up features. These altered photos were also taken from the Pictures of Facial Affect photographs (Ekman & Friesen, 1976), but were not the same as the photos used in the other tasks.

Questionnaires

All questionnaires are included in Appendix C. The Interpersonal Reactivity Scale (Davis, 1980; 1983) is a commonly used measure of individual differences in empathy. It has four factors. The Perspective Taking subscale (IRS-PT) measures the tendency to spontaneously adopt the point of view of others and is considered a good measure of cognitive empathy. The subscale has 7 items and has been found to correlate with measures of high interpersonal functioning and an "other" orientation. The Empathic Concern subscale (IRS-EC) consists of 7 items and measures emotional empathy. The scale is measured on a 5-point Likert scale. The other two factors are the 7-item Personal Distress subscale (IRS-PD), measuring discomfort and anxiety when witnessing the negative experiences of others, and the 7-item Fantasy subscale (IRS-F), measuring the tendency to identify strongly with fictitious characters. The scale has good test-retest reliability and internal reliability (Davis, 1980) and good validity, as indicated by correlations between the IRS subscales and other self-report measures of empathy and interpersonal functioning (Davis, 1983) and physiological indicators of empathic responding (Eisenberg et al., 1988; Fabes, Eisenberg, & Eisenbud, 1993). Davis (1983) found reliability coefficients for the cognitive empathy and emotional empathy factors of IRS-PT α=.77 and IRS-EC α=.76. In this study, the reliability coefficients were IRS-EC α=.62 and IRS-PT α=.74.

The Balanced Inventory of Desirable Responding (Paulhus, 1991) is a 40-item inventory with two 20-item subscales: Self-Deceptive Enhancement (BIDR-SDE) (α=.75) measures the tendency to give positively biased but honestly held self-descriptions, and Impression Management (BIDR-IM) (α=.86) measures the tendency to deny socially undesirable behaviour and to present favourable self-descriptions to others. The scale has good concurrent, convergent, and discriminant validity (Paulhus, 1991). In this study, the reliability coefficients were BIDR-SDE α=.50 and BIDR-IM α=.71.
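The internal-consistency values reported for these scales (and for the acculturation measure below) are presumably Cronbach's α; the thesis does not name the coefficient, so the standard formulation for a k-item scale, with item variances and total-score variance, is assumed here for reference:

    % Cronbach's alpha (standard form; assumed, not stated in the thesis).
    % sigma_i^2 is the variance of item i, sigma_X^2 the variance of the total score.
    \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma_i^{2}}{\sigma_X^{2}}\right)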
The Vancouver Acculturation Scale (Ryder, Alden, & Paulhus, 2000) is a bidimensional measure of acculturation. There are two subscales, one measuring identification with mainstream North American culture (VIA-MA) (α=.75) and the other measuring identification with one's heritage culture (VIA-HI) (α=.79). The scale shows good concurrent validity with other measures of acculturation, with length of time spent in a new culture, and with generational status. The scale was used in this study to control for possible cultural differences in interpersonal and emotional behaviour. In this study, the reliability coefficients were VIA-MA α=.85 and VIA-HI α=.88.

Results

Descriptive statistics and preliminary analyses

Appendix A shows the distribution graphs for the 47 ms and 2008 ms variables, and kurtosis and skewness statistics, which are at acceptable levels. Appendix B shows the correlations between all of the main variables. Table 2.1 shows the mean scores for expression identification accuracy at 47 ms and 2008 ms for the total sample and by gender, and also compares females and males on these variables using t-tests. There were no significant gender differences on either of the facial expression recognition tasks, although females slightly outperformed males on both. Table 2.2 shows the mean scores for emotional empathy (IRS-EC) and cognitive empathy (IRS-PT) for the total sample and by gender, and compares males and females on these empathy self-report measures. Females reported significantly higher IRS-EC scores, t(133)=1.97, p<.052, d=.37, but there were no significant gender differences on the IRS-PT, t(133)=-.943, n.s., d=.18. There were significant gender differences on the Personal Distress subscale (IRS-PD), t(133)=4.25, p<.000, d=.83, and the Fantasy subscale (IRS-F), t(133)=2.65, p<.009, d=.49, with females scoring higher than males.
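The effect sizes accompanying these gender comparisons are reported as d, presumably Cohen's d in its conventional pooled-standard-deviation form (the exact formula is not stated in the thesis):

    % Cohen's d for a two-group comparison (conventional pooled-SD form; assumed)
    d = \frac{\bar{X}_{1} - \bar{X}_{2}}{s_{p}},
    \qquad
    s_{p} = \sqrt{\frac{(n_{1}-1)s_{1}^{2} + (n_{2}-1)s_{2}^{2}}{n_{1}+n_{2}-2}}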
Table 2.3 lists the zero-order correlations between social desirability, empathy, the expression tasks, and acculturation. Empathic behaviour is socially desirable, and therefore subjects may wish to appear more empathic, or may view themselves as more empathic than they actually are. These two aspects of social desirability form the factors of the Balanced Inventory of Desirable Responding (BIDR): Impression Management (BIDR-IM) and Self-Deceptive Enhancement (BIDR-SDE). However, people who care about how others view them and about social convention may also be more inclined to have true empathy for others. Therefore, controlling for social desirability when investigating constructs such as empathy is not recommended, because part of the essence of the construct is removed. It is recommended instead to report the correlations between social desirability and the variables. As shown in Table 2.3, the BIDR was significantly correlated only with the empathy variables and not with the acculturation variables. BIDR-IM was positively correlated with IRS-PT, r=.30, p<.000, and with IRS-EC, r=.21, p<.017. BIDR-SDE was positively correlated with IRS-PT, r=.27, p<.001, but not with IRS-EC.

Table 2.4 lists the correlations between the expression identification tasks, the 47 ms control task, acculturation and gender. There was a weak relationship between the 47 ms control task and expression recognition performance at 47 ms, and it was not significant, r=.14, p>.120. The 47 ms control task did not come close to correlating with any other variable in the study, except for VIA-MA, with which there was a significant positive correlation, r=.22, p<.011. Thus, the less acculturated subjects had more difficulty making correct judgments at 47 ms, and this difficulty was not just related to facial expression recognition. The 47 ms control task was not correlated with heritage acculturation, r=.06, n.s., or with expression recognition at 2008 ms, r=.07, n.s. While VIA-HI was uncorrelated with the facial expression recognition variables, VIA-MA was correlated with them. The less one identified with North American culture, the less well one recognized emotion at 47 ms, r=.27, p<.002, and at 2008 ms, although the relationship at 2008 ms was not significant, r=.14, p<.103. In terms of empathy, VIA-MA was significantly negatively correlated with IRS-PT, r=-.17, p<.044, and negatively but not significantly correlated with IRS-EC, r=-.03, n.s. VIA-HI was not significantly correlated with IRS-EC (r=.001, n.s.), IRS-PT (r=.01, n.s.), 47 ms accuracy (r=-.13, n.s.) or 2008 ms accuracy (r=-.07, n.s.).

To determine the influence of acculturation and gender on facial expression recognition, a multivariate analysis of covariance (MANCOVA) was computed. The two dependent variables were the 47 ms recognition task and the 2008 ms recognition task. A median split separated mainstream acculturation into a high group and a low group. Gender and mainstream acculturation were the two factors, and the 47 ms control task was a covariate. There was a main effect of acculturation at both 47 ms, F(1,130)=6.47, p<.012, and 2008 ms, F(1,130)=4.37, p<.039. In both cases, the less mainstream-acculturated participants received lower expression recognition scores. The interaction between gender and acculturation was not significant. Gender approached significance at 2008 ms, F(1,130)=2.94, n.s., and was not significant at 47 ms, F(1,130)=1.78, n.s.

Analyses of main hypotheses

The main hypotheses concerned the relationships between cognitive empathy, emotional empathy and facial expression recognition ability at different speeds. Gender and culture correlated with facial expression recognition. Thus, linear regression analyses determined how much emotional empathy and cognitive empathy related to expression recognition beyond gender and culture. Three regressions were performed. One regression, shown in Table 2.5, predicted accurate recognition at 47 ms with IRS-EC and IRS-PT as the predictors of interest, along with the remaining factors of the IRS, IRS-PD and IRS-F. Another regression, shown in Table 2.6, predicted accurate recognition at 2008 ms with the same predictors as the regression for 47 ms. For both regressions, the control variables were entered as initial predictors, with the empathy variables predicting beyond them. These control variables were: gender, VIA-MA, the VIA-MA x gender interaction, VIA-HI, the VIA-HI x gender interaction, and the 47 ms control task.

As shown in Table 2.5, of the empathy variables, IRS-EC was the only significant predictor of accurate expression recognition at 47 ms (β=.19, p<.043). IRS-PT, the other main variable of interest, was not a predictor (β=.07, n.s.). The other empathy variables on the IRS were not significant predictors either. In addition, VIA-MA was a significant predictor (β=.29, p<.005). The total variance predicted in recognition scores at the automatic processing level in this model was 19% (R²=.19).
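The hierarchical structure of these analyses, a control block entered first and the IRS subscales added second, with the empathy contribution read off the increment in explained variance, can be illustrated with a brief sketch. This is not the analysis code used in the thesis; the data file, column names, and the use of pandas/statsmodels are assumptions for illustration only.

    # Illustrative sketch of the two-block (hierarchical) regression described
    # above; not the original analysis code. Column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("expression_study.csv")   # hypothetical data file

    # Block 1: control variables (gender, acculturation, their interactions,
    # and the 47 ms control task).
    controls = ("gender + via_ma + via_ma:gender + via_hi + via_hi:gender"
                " + control_47ms")
    m1 = smf.ols(f"accuracy_47ms ~ {controls}", data=df).fit()

    # Block 2: add the four IRS subscales; the quantity of interest is the
    # increment in R-squared over the control-only model.
    m2 = smf.ols(f"accuracy_47ms ~ {controls} + irs_ec + irs_pt + irs_pd + irs_f",
                 data=df).fit()

    print(f"R2, controls only: {m1.rsquared:.3f}")
    print(f"R2, with empathy : {m2.rsquared:.3f}")
    print(f"Delta R2         : {m2.rsquared - m1.rsquared:.3f}")
    # Note: m2.summary() reports unstandardized coefficients; standardized betas
    # like those in Tables 2.5 and 2.6 would require z-scoring the variables first.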
Table 2.6 shows the results of the regression at the 2008 ms level. In contrast to 47 ms, this model predicts only 12% of the variance (R²=.12). As well, the results differ: no empathy variables significantly predicted accuracy at 2008 ms. The only significant predictor was VIA-MA, β=.23, p<.031. The interaction between VIA-MA and gender approached significance (β=.18, p<.074). Gender, IRS-EC, IRS-PT, IRS-PD and IRS-F were not significant predictors.

A MANCOVA was performed, predicting accuracy scores at 47 ms and 2008 ms in one analysis, using high and low scores on the IRS-EC (emotional empathy) as the independent variable. Accuracy at 47 ms and accuracy at 2008 ms were entered as dependent variables. The control variables were gender, VIA-MA, the VIA-MA x gender interaction, VIA-HI, the VIA-HI x gender interaction, and the 47 ms control task. The overall MANCOVA was significant according to Pillai's Trace, F(2,126)=3.03, p<.05. The univariate ANCOVA was not significant for 2008 ms, F=2.60, p>.109. Differences between the high and low emotional empathy groups were significant at 47 ms, F(1,127)=5.29, p<.023. The high emotional empathy group performed better than the low emotional empathy group at 47 ms, which is consistent with the regression analyses.

Discussion

The main result of the present study was that emotional empathy significantly predicted accurate facial expression recognition at 47 ms, but not at 2008 ms. A self-reported predisposition towards emotional empathy was associated with better decoding of expression type when little time was given to process the expressions. Cognitive empathy, the self-reported tendency to focus on the perspective of others, was not significantly related to recognition accuracy at either 47 ms or 2008 ms, although it approached significance at 2008 ms. Previous studies have shown individual differences in emotional reactivity to briefly presented facial expressions (Dimberg et al., 2000; Whalen et al., 1998). Previous studies have also shown a relationship between valence judgments of preconsciously presented facial expressions and emotional empathy (Martin et al., 1996). This is the first study to find that accurately discriminating the basic emotions at brief presentation durations relates to a disposition towards emotional empathy.

It is important to note that some studies have shown that subjects' self-ratings of their cognitive empathy ability do not relate to actual cognitive empathic accuracy; people lack insight into their empathic accuracy (e.g., Davis & Kraus, 1997; Ickes et al., 2000). A typical way to measure cognitive empathic ability has been to videotape two subjects interacting. The subjects then watch the videotape twice: first to stop the tape at self-designated points to record their actual thoughts and feelings, and the second time to infer what their interaction partner's thoughts and feelings were at the partner's self-designated stop-points. Since accuracy on this cognitive empathy task does not tend to relate to self-ratings on cognitive empathy scales, it is possible that in this study the cognitive empathy variable was not related to emotional identification because the measure could not select those subjects who were actually superior in cognitive empathy. Conversely, self-rating one's own emotional empathy is less affected by lack of insight: it seems that people are more aware of whether they readily react emotionally to the emotions of others.
Some studies show that subjects who report more emotional empathy do in fact exhibit more physiological reactivity, such as upset facial expressions when exposed to a stimulus designed to induce sympathy (e.g., Eisenberg et al., 1988; Fabes, Eisenberg, & Eisenbud, 1993), greater skin conductance when viewing someone being shocked (Mehrabian, Young, & Sato, 1988), and higher brain activation when observing someone in pain (Singer, Seymour, O'Doherty, Kaube, Dolan, & Frith, 2004). Thus, these findings imply that, at least for cognitive empathy, people deceive themselves into believing they are more skilled in reading the behaviour of others than they actually are. In line with this interpretation, in this study cognitive empathy was significantly correlated with the factor of the social desirability scale that measures self-deceptive enhancement. Conversely, emotional empathy was not correlated with this factor.

Emotional reactivity to the emotions of others has specific relevance for the accurate identification of the emotions of others. These results suggest that when expressions are briefly presented, emotional reactivity is a benefit for interpretation. Put another way, this finding implies that people with greater ability to process emotional information at the rapid, automatic level react more to the emotions of others, as indicated by higher dispositional empathy. At controlled, more deliberate levels of interpretation, no such effect is found. The advantage that the emotionally reactive, good automatic-level processors have disappears when the expressions are presented for longer, allowing more interpretation time. This implies that perceiving emotion is multi-faceted and that low insight from emotional sources may be compensated for using other methods (i.e., cognitive interpretation), but this compensation does not work as well with briefly presented emotional stimuli, perhaps because emotional processing systems work more rapidly. Thus, people with lower emotional reactivity show lower performance at 47 ms but perform comparatively better at 2008 ms. Alternatively, these findings might not mean that the less emotionally empathic catch up, but that the more emotionally empathic fall behind: the more emotionally empathic might perform better when only a gut-level recognition response is required, and when more time is allowed, cognitive mechanisms may actually spoil their instinct and advantage.

It is important to point out that the total amount of predicted variance in performance at the two presentation durations, 47 ms and 2008 ms, was small for both, but much smaller at 2008 ms than at 47 ms. This difference was not due to a ceiling effect at 2008 ms, as shown in Figures A.1 and A.2 in the Appendix. Thus, at 2008 ms, variables other than those included in this study have greater explanatory power; this finding is unsurprising, since 47 ms tests more basic emotional processing. The small amount of predicted variance is also expected, given the complex, multi-faceted nature of emotional expression recognition.

It is also important to point out that the more emotionally empathic were better at recognizing the specific types of expressions, not simply better at distinguishing emotional valence, as in the study by Martin et al. (1996). When surprise, happy and neutral expressions were taken out of the analyses, leaving emotional recognition of just the negative expressions, the significance level changed only marginally.
These findings are unlikely due to more conscientious participation by the empathic participants, since this advantage was only evident at 47 ms. If the less empathic participants performed the study less diligently, their performance would have been significantly lower at 2008 ms in addition to 47 ms. These findings are also not explained by a greater interpersonal interest in understanding others, resulting in better attention paid to the faces, since participants reporting high levels o f cognitive empathy did not show this advantage. It is also not likely that the more empathic participants have more knowledge about emotional concepts and were better able to readily label the expressions, whereas others lacking discriminating ability would perform better i f they were taught the labels; a previous study found that better simple valence distinctions (i.e. limited analysis and labelling) at brief presentation durations also predicted emotional empathy (Martin et al., 1996). Rather, the results suggest a connection between heightened emotional reactivity to the emotions of others and heightened accuracy to emotional expression decoding to expressions presented briefly. What might underlie this emotional reactivity? There are several possibilities. Perhaps individuals with greater emotional empathy have heightened awareness to all emotional stimuli, enabling more efficient processing of emotional stimuli. Perhaps emotional empathy is a quality that some childhood environments foster, and over time, a person-focussed vigilance translates into a superior emotional decoding ability. Perhaps some people innately mimic the expressions of others more readily and this automatic process enables better expression recognition. This mimicking process happens within 31 300 ms. of exposure to another person's facial expressions (Wild, Erb, & Bartels, 2001), happens in response to facial expressions that are not consciously perceived (Dimberg et al., 2000), and has been found to relate to emotional empathy (Sonny-Borgstrom et al., 2003). Perhaps all three play a role: individuals who have an emotionally reactive temperament and who grow up in an environment that encourages interpersonal observation and empathy, tend to mimic more and together these factors lead to superior emotional recognition at the basic processing levels. The present study cannot begin to answer these complex questions. What is clear from this study is that people who report more emotional empathy seem better able to gather more emotional information in a limited amount of time, from facial expressions. Mainstream acculturation was a highly significant predictor of performance at 47 ms and borderline significant at 2008 ms. Heritage identification was not a significant predictor at either level. These large cultural effects are consistent with research findings. For example, Elfenbeim and Ambady (2003) compared expression recognition in Chinese students in China, Chinese students living in America for an average of 2.4 years, Chinese Americans and Americans of non-Chinese ancestry. They found that the Chinese students l iving in the U S could judge the facial expressions of Americans better than could Chinese students in China. Chinese and Americans living in their own culture responded more quickly to expressions posed by members o f their own culture. 
The results of the present study fit with Elfenbeim and Ambady's (2003) findings: the present study found that accurate facial expression recognition was strongly related to acculturation in the mainstream culture, not identification with the heritage culture. It is important to point out that research indicates this cultural bias is not a Western/Non-32 Western distinction, but rather largely cultural. For instance, Biehl et al. (1997) found that Hungarians recognized the expressions o f other Hungarians better, Poles recognized Poles better and Vietnamese recognized Vietnamese better. Research points to several explanations for these cultural differences. Theories suggest that there are subtle differences in how cultures express their emotions (Elfenbeim & Ambady, 2003), in the meanings and associations of emotional labels (Biehl et al., 1997), including how intense expressions are reported to be (e.g. Matsumoto & Ekman, 1989), and in facial physiognomy (Matsumoto, 1989). Matsumoto (1989) found that differences in facial physiognomy between American faces and Japanese faces likely resulted in different facial expression judgements. He investigated the effect of upper eyelid raises in Japanese and Americans on judgements of expression clarity and intensity. Since the upper eyelid is more easily perceived in American faces, raising it had more of an impact on emotion judgments in American faces than in Japanese faces. In particular, eyelid raises in the American faces increased intensity ratings for fear and angry expressions, and increased clarity for angry expressions. Thus, there are a variety of contributing factors to lower expression recognition performance due to cultural differences. In light of these issues, it is not surprising that the present study found more pronounced recognition difficulties in the less acculturated subjects at the more difficult 47 ms task. Another important aspect to consider is experiment participation anxiety. A reason to believe that state anxiety may have affected the performance o f the less acculturated participants was the strong positive correlation between V I A - M A and the 47 ms control task, indicating that the less acculturated had 33 difficulty answering quickly presented stimuli, regardless of the type of stimuli. Although there are other possibilities, anxiety is one explanation. The effect of acculturation is an interesting one and it requires further exploration. It is also possible that people recognize expressions better within their own ethnic group regardless o f their degree of exposure to other ethnic groups. Thus, the results could, in part, reflect differences in the ability to recognize emotional expressions from the same vs. a different ethnic group, or could be due to the amount of experience interacting with an ethnic group different than one's own, or some combination of these, and other unknown factors. The present research does not allow the influences of ethnicity and cultural exposure to be teased apart, since the East Asian participants were from different parts of Asia , and the precise length of time spent in North America was not obtained. Gender differences on the expression recognition tasks were not significant, but females performed slightly better than males at both 47 ms and 2008 ms. In terms of gender and emotional recognition tasks, findings are mixed, with females typically slightly outperforming males at nonverbal cue decoding (Hall, 1978). 
A few studies have compared gender differences in expression recognition accuracy at different presentation speeds. One study presented facial expressions at 70 ms, 130 ms or 200 ms and found that females were better than males at judging the facial expressions, but only when the task required the participants to rate each expression on several emotions rather than provide a single category judgment (Hall & Matsumoto, 2004). When the participants were required to provide a single category judgment, there were no gender differences. Kirouac and Dore (1984) presented faces for brief durations, from 10 ms to 50 ms, and found that, for almost every emotion and every duration, women were more accurate than men. However, these results were not significant, perhaps due to the small sample size. Another study morphed expressions to varying degrees of intensity, from neutral to 100%, and participants labelled the emotion when they first perceived it on the face (Montagne et al., 2005). Males made more mistakes labelling sadness and surprise and required more intense expressions to perceive anger and disgust. A study in which subjects rated their emotional responses to facial expressions, and these ratings were compared to correct classifications of the expressions, found that females classified the expressions in accordance with their own felt emotion more often than males did (Thayer & Johnsen, 2002). Similarly, Dimberg and Lundquist (1990) found that females had more pronounced facial EMG reactions to facial expressions, particularly for happy expressions. Together, these results suggest that females are more reactive to facial expressions than males, but are only negligibly superior at emotional recognition tasks. A meta-analysis on empathy and gender (Lennon & Eisenberg, 1987) found that gender differences were largest in self-report studies, possibly due to sex-role stereotypes, moderate for observational methods such as facial reactions, and small for physiological methods. Likewise, other research indicates that greater empathy in females compared to males tends to reflect differential motivation rather than differential ability (Ickes, Gesn, & Graham, 2000). The gender findings in this study are largely in agreement with the literature: there were large gender differences on the self-report measures of empathy, but the differences on the expression tasks were not significant.

This study has several limitations. First, it relied on self-report. Relying on participants' own reports of their empathic behaviour is problematic, limited by, for example, denial, lack of a basis for comparison, socially desirable responding, and limited self-awareness. Second, this study did not include a measure of state mood. High state anxiety, for example, may have lowered performance in some participants, and these differences were not controlled. Third, the participants were not given examples of the emotional expressions beforehand, so it is possible that some participants lacked knowledge about the labels to apply to the expressions. However, considering that all participants were university students, it is reasonable to assume that they knew the meanings of the major emotional expressions. Fourth, given the high proportion of Asian participants, using a balanced design with an equal number of Caucasian and Asian faces, such as the Japanese and Caucasian Facial Expressions of Emotion (JACFEE; Matsumoto & Ekman, 1988), may have lowered the strong acculturation effects.
Future directions could explore connections between emotional contagion, empathy and emotional recognition. Emotional contagion is defined as "the tendency to automatically mimic and synchronize facial expressions, vocalizations, postures, and movements with those of another person, and consequently, to converge emotionally" (Hatfield et al., 1992, pp. 153-154). It is unclear from the results whether it is emotional reactivity per se that allows greater emotional information to be gathered, in a limited amount of time, from facial expressions. It could also be the caring component of emotional empathy: being concerned about others, motivated to understand, and receptive to vicariously experiencing the emotions of others. In addition, it would be interesting to explore the limits of this connection. Do children show this tendency? Can training greatly change accuracy scores at 47 ms? Does this effect show with subtle expressions, or only intense ones? Does this effect exist only for basic emotional expressions, or can it also be found for expressions that show thought, such as interested, bored or flirtatious expressions, as tested in the Eyes test (Baron-Cohen, Jolliffe, Mortimore, & Robertson, 1997)? In conclusion, these findings imply that there are large individual differences in emotional processing, and that individuals who report more emotional empathy have an enhanced ability to glean emotional information from expressions in a limited amount of time.

Tables Chapter 2

Table 2.1. Mean scores and gender comparisons for expression identification accuracy at 47 ms and 2008 ms (N=135)
                                                    Total              Females (n=98)     Males (n=37)       t (df=133)   p
Identification accuracy at 47 ms (max score 42)     27.77 (4.24) 66%   27.99 (4.00) 67%   27.19 (4.84) 65%   .98          .330
Identification accuracy at 2008 ms (max score 42)   34.0 (3.74) 81%    34.26 (3.72) 82%   33.22 (3.76) 79%   1.46         .148
Note. Mean scores are listed and standard deviations are in brackets. The t-tests compare males and females.
*p<.05. **p<.01. ***p<.001

Table 2.2. Mean scores and gender comparisons for cognitive and emotional empathy (N=135)
                                           Total          Females (n=98)   Males (n=37)   t (df=133)   p
Perspective Taking IRS-PT (max score 35)   24.59 (3.96)   24.39 (3.78)     25.11 (4.34)   -.94         .347
Empathic Concern IRS-EC (max score 35)     26.87 (3.18)   27.19 (2.91)     25.95 (3.69)   1.97         .052*
Personal Distress IRS-PD (max score 35)    20.71 (4.26)   21.61 (4.06)     18.32 (3.87)   4.25         .000***
Fantasy IRS-F (max score 35)               25.93 (4.26)   26.51 (3.92)     24.38 (4.77)   2.65         .009**
Note. IRS-PT = Interpersonal Reactivity Scale, Perspective Taking factor; IRS-EC = Interpersonal Reactivity Scale, Empathic Concern factor; IRS-PD = Interpersonal Reactivity Scale, Personal Distress factor; IRS-F = Interpersonal Reactivity Scale, Fantasy scale. Mean scores are listed and standard deviations are in brackets. The t-tests compare males and females.
*p<.05. **p<.01. ***p<.001

Table 2.3. Zero-order Correlations between Social Desirability, Empathy, Gender, and Acculturation (N=135)
1. Self-Deceptive Enhancement (BIDR-SDE, α=.49)
2. Impression Management (BIDR-IM, α=.71)
3. Perspective Taking (IRS-PT, α=.74)
4. Empathic Concern (IRS-EC, α=.62)
5. Mainstream Acculturation (VIA-MA, α=.85)
6. Heritage Identification (VIA-HI, α=.87)
7. 47 ms accuracy (α=.68)
8. 2008 ms accuracy (α=.64)
      2       3       4       5       6       7       8
1     .31**   .27**   .14     .04     -.02    .09     -.09
2             .30**   .21*    -.17    -.03    .09     .04
3                     .25**   -.17*   .01     .09     .12
4                             -.03    .001    .22**   .13
5                                     -.03    .27**   .14
6                                             -.13    -.07
7                                                     .39**
Note. BIDR-SDE = Balanced Inventory of Desirable Responding, Self-Deceptive Enhancement subscale; BIDR-IM = Balanced Inventory of Desirable Responding, Impression Management subscale; IRS-PT = Interpersonal Reactivity Scale, Perspective Taking factor; IRS-EC = Interpersonal Reactivity Scale, Empathic Concern factor; VIA-MA = Vancouver Index of Acculturation, Mainstream Acculturation subscale; VIA-HI = Vancouver Index of Acculturation, Heritage Identification subscale.
*p<.05. **p<.01.

Table 2.4. Zero-order Correlations between expression identification accuracy, 47 ms control task, acculturation and gender (N=135)
1. Mainstream Acculturation (VIA-MA, α=.86)
2. Heritage Identification (VIA-HI, α=.88)
3. Gender
4. 47 ms accuracy (α=.68)
5. 2008 ms accuracy (α=.64)
6. 47 ms control task (α=.71)
      2       3       4       5       6
1     -.03    .11     .27**   .14*    .22**
2             .12     -.13    -.07    .06
3                     -.08    -.13    .10
4                             .39**   .14
5                                     .07
Note. VIA-MA = Vancouver Index of Acculturation, Mainstream Acculturation subscale; VIA-HI = Vancouver Index of Acculturation, Heritage Identification subscale.
*p<.05. **p<.01.

Table 2.5. Hierarchical Regression Analysis for 47 ms recognition accuracy (N=135)
Predictor                                        Coefficient (beta)   t value   p        R² (group)   ΔR²
Control variables                                                                         .12          .12
  Gender                                         -.13                 -1.45     .149
  Mainstream Acculturation (VIA-MA)               .29                  2.88     .005**
  Mainstream Acculturation (VIA-MA) x Gender      .11                  1.11     .270
  Heritage Identification (VIA-HI)               -.04                  -.31     .759
  Heritage Identification (VIA-HI) x Gender       .11                   .95     .343
  47 ms control task                              .08                   .97     .337
Empathy variables                                                                         .19          .07
  Perspective Taking (IRS-PT)                     .07                   .73     .464
  Empathic Concern (IRS-EC)                       .19                  2.04     .043*
  Personal Distress (IRS-PD)                     -.12                 -1.33     .185
  Fantasy (IRS-F)                                 .08                   .87     .385
Note. Coefficients are values at last entry in the hierarchical regression. R² values are cumulative and represent variance explained for the group(s) of variables entered. IRS-PT = Interpersonal Reactivity Scale, Perspective Taking factor; IRS-EC = Interpersonal Reactivity Scale, Empathic Concern factor; IRS-PD = Interpersonal Reactivity Scale, Personal Distress factor; IRS-F = Interpersonal Reactivity Scale, Fantasy scale; VIA-MA = Vancouver Index of Acculturation, Mainstream Acculturation subscale; VIA-HI = Vancouver Index of Acculturation, Heritage Identification subscale.
*p<.05. **p<.01. ***p<.001

Table 2.6. Hierarchical Regression Analysis for 2008 ms recognition accuracy (N=135)
Predictor                                        Coefficient (beta)   t value   p        R² (group)   ΔR²
Control variables                                                                         .07          .07
  Gender                                         -.16                 -1.66     .099
  Mainstream Acculturation (VIA-MA)               .23                  2.19     .031*
  Mainstream Acculturation (VIA-MA) x Gender      .18                  1.80     .074
  Heritage Identification (VIA-HI)               -.10                  -.86     .394
  Heritage Identification (VIA-HI) x Gender      -.08                  -.64     .522
  47 ms control task                              .05                   .58     .560
Empathy variables                                                                         .12          .06
  Perspective Taking (IRS-PT)                     .118                 1.29     .199
  Empathic Concern (IRS-EC)                       .08                   .85     .396
  Personal Distress (IRS-PD)                     -.10                 -1.06     .293
  Fantasy (IRS-F)                                 .12                  1.25     .214
Note. Coefficients are values at last entry in the hierarchical regression. R² values are cumulative and represent variance explained for the group(s) of variables entered. IRS-PT = Interpersonal Reactivity Scale, Perspective Taking factor; IRS-EC = Interpersonal Reactivity Scale, Empathic Concern factor; IRS-PD = Interpersonal Reactivity Scale, Personal Distress factor; IRS-F = Interpersonal Reactivity Scale, Fantasy scale; VIA-MA = Vancouver Index of Acculturation, Mainstream Acculturation subscale; VIA-HI = Vancouver Index of Acculturation, Heritage Identification subscale.
*p<.05. **p<.01. ***p<.001

References

Baron-Cohen, S. & Wheelwright, S. (2004).
The empathy quotient: A n investigation of adults with Asperger Syndrome or High Functioning Autism, and normal sex difference. Journal of Autism and Developmental Disorders, 34(2), 163-175. Baron-Cohen, S., Jolliffe, T., Mortimore, C , & Robertson, M . (1997). Another advanced test of theory of mind: Evidence from very high-functioning adults with autism or Aspergers Syndrome. Journal of Child Psychology and Psychiatry, 38, 813-822. Biehl , M . , Matsumoto, D . , Ekman, P., Hearn, V . , Heider, K . , Kudoh, T. & Ton, V . (1997). Matsumoto and Ekman's Japanese and Caucasian Facial Expressions of emotion ( J A C F E E ) : Reliability data and cross-national differences. Journal of Nonverbal Behaviour, 27(1), 3-21. Blair , R.J.R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study o f typical and psychiatric populations. Consciousness and Cognition, 14, 698-718. Blairy, S., Herrera, P. & Hess, U . (1999). Mimic ry and the judgment of emotional facial expressions. Journal oj'Nonverbal Behaviour, 23(1), 5-41. Buck, R. (1984). The communication of emotion. New York: Guilford Press. Buck, R. & Ginsburg, R. (1997). Communicative genes and the evolution of empathy. In W . Ickes (Ed.), Empathic Accuracy (pp. 17-43). New York: Guilford Press. Capella J .N. (1993). The facial feedback hypothesis in human interaction: Review and speculation. Journal of Language and Social Psychology, 12(\&2), 13-29. 43 Davis, M . H . (1983). Measuring individual differences in empathy: Evidence for a multidimensional approach. Journal of Personality and Social Psychology, 44(1), 113-126. Davis, M . H . (1980). A multidimensional approach to individual differences in empathy. Catalog of 'Selected Documents in Psychology, 10(4), 1-17'. Davis, M . H . & Kraus, L . A . (1997). Personality and empathic accuracy. In W. Ickes (Ed.). Empathic accuracy (pp. 144-168). New York: Guilford Press. Dimberg, U . & Lundquist, L . (1990). Gender differences in facial reactions to facial expressions. Biological Psychology, JO, 151-159. Dimberg, U . , Thunberg, M . & Elmehed, K . (2000). Unconscious facial reactions to emotional facial expressions. Psychological Science, 11(1), 86-89. Eisenberg & Fabes (1990). Empathy: Conceptualization, assessment, and relation to prosocial behaviour. Motivation and Emotion, 14, 131-149. Eisenberg, N . , Fabes, R. , Schaller, M . , Mi l le r , P. Carlo, G . , Pouline, R. Shea, C , & Shell, R. (1991). Personality and socialization correlates of vicarious responding. Journal of Personality and Social Psychology, 61(3), 459-470. Eisenberg, N . , Fabes, R . A . , Murphy, B . , Karbon, M . , Maszk, P., Smith, M . , O 'Boyle , C , & Suh, K . (1994). The relations of children's dispositional and empathy-related responding to their emotionality, regulation and social functioning. Developmental Psychology, 32, 195-209. Eisenberg, N . , Schaller, M . , Fabes, R . A . , Bustamante, D . , Mathy, R., Shell, R., & Rhodes, K . (1988). The differentiation of personal distress and sympathy in children and adults. Developmental Psychology, 24, 766-775. 44 Ekman, R, & Friesen, W . V . (1976). Pictures of facial affect. Palo Al to , C A . : Consulting Psychologists Press. Ekman, P. & O'Sullivan, M . (1991). Who can catch a liar? American Psychologist, 46(9), 913-920. Elfenbeim, H . A . & Ambady, H . (2003). When.familiarity breeds accuracy: Cultural exposure and facial emotion recognition. Journal of Personality and Social Psychology, 85(2), 276-290. Fabes, R . A . , Eisenberg, N . 
, & Eisenbud, L . (1993). Behavioural and physiological correlates of children's reactions to others in distress. Developmental Psychology, 29(4), 655-663. Hal l , J A . (1978). Gender effects in decoding nonverbal cues. Psychological Bulletin, 85, 845-857. Hal l , J .A. & Matsumoto, D . (2004). Gender differences in judgments of multiple emotions from facial expressions. Emotion, 4(2), 201-206. Haviland, J. M . , & Lelwica, M . (1987). The induced affect response: 10-week-old infants' responses to three emotion expressions. Developmental Psychology, 23(1), 97-104. Hatfield, E . , Cacioppo, J.T., & Rapson, R . L . (1994). Emotional Contagion. New York: Cambridge University Press. Ickes, W . (1993). Empathic accuracy. Journal of Personality, 61(4), 587-610. Ickes, W. , Buysse, A . , Pham, H . , Rivers, K . , Erickson, J.R., Hancock, M . , Kelleher, J., & Gesn, P.R. (2000). On the difficulty of distinguishing "good" and "poor" 45 perceivers: A social relations analysis of empathic accuracy data. Personal Relationships, 7, 219-234. Ickes, W. , Gesn, P.R., Graham, T. (2000). Gender differences in empathic accuracy: Differential ability or differential motivation. Personal Relationships, 7, 95-109. Kirouac, G . & Dore, F . Y . (1984). Judgment of facial expressions of emotion as a function of exposure time. Perceptual and Motor Skills, 59, 147-150. Lang, P.J. (1995). The emotion probe: Studies of motivation and attention. American-Psychologist, 50, 372-385. Lazarus, R.S . (1991). Progress on a cognitive-motivational-relational theory of emotion. American Psychologist, 46($), 819-834. LeDoux, J. (1996). The emotional brain. New York: Simon & Schuster. Lennon, R. & Eisenberg, N . (1987). Gender and age differences in empathy and sympathy. In N . Eisenberg & J. Strayer (Eds.), Empathy and its development (pp. 195-217). New York: Cambridge University Press. Lipps, T. (1903). X I V . Kapitel: Die einfuhlung. In Leitfaden der psychologie [Guide to psychology] (pp. 187-201). Leipzig: Verlag von Wilhelm Engelmann. Martin R . A . , Berry, G.E . , Dobranski, T. & van Home, M . (1996). Emotion perception threshold: individual differences in emotional sensitivity. Journal of Research in Personality, 38, 290-305. Matsumoto, D . & Ekman, P. (1989). American-Japanese cultural differences in judgments of facial expressions of emotion. Motivation and Emotion, 13, 143-157. 46 Matsumoto, D . (1987). The role of facial response in the experience of emotion: More methodological problems and a meta-analysis, Journal of Personality and Social Psychology, 52(4), 769-774. Matsumoto, D . (1989). Face, culture, and judgements of anger and fear: Do the eyes have it? Journal of Nonverbal Behaviour, 13(3), 171-188.\ Matsumoto, D . , LeRoux, J., Wilson-Cohn, C , Kooken, K . , Ekman, P., Yrizarry, N . , Loewinger, S., Uchida, H . , Yee, A . , Amo, L . & Goh, A . (2000). A new test to measure emotion recognition ability: Matsumoto and Ekman's Japanese and Caucasian Br ief Affect Recognition Tes t ( JACBART) . Journal of Nonverbal Behavior, 24(3), 179-209. Mehrabian, A . , & Epstein, N . (1972). A measure of emotional empathy. Journal of Personality, 40(4), 525-543. Meltzoff, A . N . , & Moore, M . K . (1977). Imitation of facial and manual gestures by human neonates. Science, 198, 75-78. Montagne, B . , Kessels, R .P .C . , Frigerio, E . , de Hann, E .H.F . , & Perrett, E.I. (2005). Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cognitive Processes, 6, 135-141. 
Nowicki, S. & Mitchell, J. (1998). Accuracy in identifying affect in child and adult faces and voices and social competence in preschool children. Genetic, Social, and General Psychology Monographs, 124, 39-59.
Ogawa, T. & Suzuki, N. (1999). Response differentiation to facial expressions of emotion at increasing exposure duration. Perceptual and Motor Skills, 89, 557-563.
Ohman, A. (1993). Fear and anxiety as emotional phenomena: Clinical phenomenology, evolutionary perspectives and information processing mechanisms. In M. Lewis & J.M. Haviland (Eds.), Handbook of emotion (pp. 511-536). New York: The Guilford Press.
Mehrabian, A., Young, & Sato (1988). Emotional empathy and associated individual differences. Current Psychology: Research and Reviews, 7(3), 221-240.
Paulhus, D.L. (1991). Measurement and control of response bias. In J.P. Robinson, P.R. Shaver, & L. Wrightsman (Eds.), Measures of personality and social-psychological attitudes (pp. 17-59). San Diego, CA: Academic Press.
Phillips, M.L., Drevets, W.C., Rauch, S.L., & Lane, R. (2003). Neurobiology of emotion perception I: The neural basis of normal emotion perception. Biological Psychiatry, 54, 504-514.
Preston, S.D. & de Waal, F.B.M. (2002). Empathy: Its ultimate and proximate bases. Behavioural and Brain Sciences, 25, 1-72.
Russell, J.A. & Bullock, M. (1986). Fuzzy concepts and the perception of emotion in facial expressions. Social Cognition, 4(3), 309-341.
Ryder, A.G., Alden, L.A., & Paulhus, D.L. (2000). Is acculturation unidimensional or bidimensional? A head-to-head comparison in the prediction of personality, self-identity, and adjustment. Journal of Personality and Social Psychology, 79, 49-65.
Singer, T., Seymour, B., O'Doherty, J., Kaube, H., Dolan, R.J. & Frith, C.D. (2004). Empathy for pain involves the affective but not sensory components of pain. Science, 303, 1157-1162.
Sonnby-Borgstrom, M. (2002). Automatic mimicry reactions as related to differences in emotional empathy. Scandinavian Journal of Psychology, 43, 433-443.
Sonnby-Borgstrom, M., Jonsson, P., & Svensson, O. (2003). Emotional empathy as related to mimicry reactions at different levels of information processing. Journal of Nonverbal Behaviour, 27(1), 3-23.
Surakka, V. & Hietanen, J.K. (1998). Facial and emotional reactions to Duchenne and non-Duchenne smiles. International Journal of Psychophysiology, 29, 23-33.
Termine, N.T. & Izard, C.E. (1988). Infants' responses to their mothers' expressions of joy and sadness. Developmental Psychology, 24, 223-229.
Thayer, J.F. & Johnsen, B.G. (2000). Sex differences in judgement of facial affect: A multivariate analysis of recognition errors. Scandinavian Journal of Psychology, 41, 243-246.
Walden, T.A. & Field, T.M. (1982). Discrimination of facial expressions by preschool children. Child Development, 53, 1312-1319.
Whalen, P.J., Rauch, S.L., Etcoff, N.L., McInerney, S.C., Lee, M.B. & Jenike, M.A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. The Journal of Neuroscience, 18(1), 411-418.
Wild, B., Erb, M. & Bartels, M. (2001). Are emotions contagious? Evoked emotions while viewing emotionally expressive faces: Quality, quantity, time course and gender differences. Psychiatry Research, 102, 109-124.
49 C H A P T E R 3: E M P A T H Y D E F I C I T S : T H E R O L E OF F A C I A L E X P R E S S I O N R E C O G N I T I O N Introduction This research investigated dispositional empathy deficits and individual differences in facial expression recognition. Accurately identifying the emotional states of others is an essential part of empathic behaviour (Ickes, 1991). Researching factors that relate to empathy deficits may provide help uncover the etiology of empathy-related disorders. Empathy can be defined generally as "the reactions of one individual to the observed experiences of another" (Davis, 1983, p. 113). This recognition involves different degrees o f physiological reactivity and cognitive awareness. The empathy construct has typically been separated into two types: cognitive and emotional (Davis, 1983). The cognitive type refers to the ability to imaginatively understand another person's thoughts, feelings and actions. Emotional empathy, on the other hand, does not involve cognitive apprehension. Rather, it is characterized by visceral, automatic, thoughtless, instinctive, involuntary reactivity (Mehrabian & Epstein, 1972). This cognitive-emotional distinction may be simplistic, however it has utility for describing behaviour and, in particular, deficits in empathy. A s described by Blair (2005), being high on one dimension and low on the other creates descriptions that have potentially good explanatory power. Being deficient in emotional empathy entails lacking normal emotional responding to others, resulting in callous affect. This deficit, combined with normal cognitive empathy, or normal awareness of the thoughts and intentions of 50 others, may correspond to intentional cruelty and manipulative tendencies because this awareness is being callously used for one's own gain. Conversely, being deficient in cognitive empathy entails not understanding social situations and inaccurately reading interpersonal cues. This tendency, combined with normal emotional empathy, may also result in cruel behaviour, but these effects are largely unintentional and are instead caused by low awareness. There are reasons to believe that emotional processing might differ in people with low empathy. What does emotional processing consist of? Importantly, researchers largely agree that emotion can be processed rapidly, without purposeful cognitive interpretation, rather remaining on a "gut feeling" level (Buck, 1984; Buck & Ginsburg, 1997; Ledoux, 1996; Lazarus, 1991; Ohman, 1993). In fact, research shows that even unconsciously perceived facial expressions are reacted to, such as through mimicry and skin conductance responses (e.g., Dimberg, Thunberg, & Elmeged, 2000; Whalen et al., 1998). Also , it is evident that fairly accurate interpretations of the basic emotions occur at very brief presentation duration rates; by 75 ms, facial expression recognition is quite accurate (Kirouac & Dore, 1984; Matsumoto et al., 2000; Ogawa & Suzuki, 1999; Sonny-Borgstrom, 2002). In terms of empathy, research indicates that dispositional empathy relates to more reactivity to facial expressions and better recognition of them. Some studies show that more mimicking to briefly presented emotional stimuli relates to higher levels of emotional empathy (Sonnby-Borgstrom, Jonsson, & Svensson, 2003). Another study found a link between recognition accuracy to briefly presented expressions and emotional empathy (Martin et al., 1996). 
Likewise, other studies have found links in preschool-aged 51 children between accurate recognition of facial expressions and prosocial behaviour (Nowicki & Mitchel l , 1998; Walden & Field, 1982). There have been no studies; to my knowledge; that have investigated the relationship of empathy deficits to emotional recognition of facial expressions presented for brief durations. Psychopathy, emotional empathy deficits and emotional recognition A s suggested by Blai r (2005), the disorder that typifies deficient emotional empathy and normal cognitive empathy is psychopathy. Psychopathy, as diagnosed by the Psychopathy Checklist-Revised ( P C L - R ; Hare, 1991) is characterized by low empathy and manipulativeness. The P C L - R consists of two factors: an affective factor, measuring, for example, manipulativeness, low empathy and grandiosity; and a behavioural factor, measuring traits, including impulsivity and sensation seeking. Research clearly shows that psychopaths have deficient emotional reactivity. For example, Aniskiewicz (1979) investigated the physiological responses of incarcerated psychopaths while they observed a confederate supposedly being electrically shocked. Psychopaths had little change in their electrodermal skin conductance compared to the control group. Skin conductance is a measure of physiological arousal and therefore, it can be inferred that the psychopath experienced less anxiety, emotional arousal and empathy to the supposed pain of another person. If psychopaths have low emotional reactivity, do they also have a deficient ability to analyze interpersonal situations accurately, or can sufficient cognitive empathy coexist with deficient emotional empathy? A s mentioned above, a manipulative, machiavellian interpersonal style is a key part of psychopathy, in which people are intentionally used for personal gain. Being tuned to the 52 emotions of others improves the ability to manipulate effectively and callousness would also make manipulation easier. Thus, at least on the surface, psychopathy depicts deficient emotional empathy and sufficient cognitive empathy. What might this mean for recognizing emotion in facial expressions? Adult psychopaths do not show a general emotional detection deficit (Habel, Kuhn, Salloum, Devos, & Schneider, 2002; Kosson, Suchy, Mayer, & Libby, 2002). In fact, Habel, Kuhn, Salloum, Devos and Schneider (2002) found that psychopaths who scored highest on the affective/interpersonal factor (i.e. the tendency to manipulate, low empathy, callousness) were better able to discriminate facial expressions presented. Conversely, results consistently demonstrate that children with psychopathic tendencies misjudge emotional facial expressions more frequently than children without psychopathic tendencies (Blair, 1999; Blair, Colledge, Murray, & Mitchel l , 2001; Stevens, Charman & Blair, 2001). Likewise, there is an association between prosocial behaviour and facial expression recognition in children: preschool-aged children who can accurately identify and interpret facial expressions tend to behave prosocially (Nowicki & Mitchel l , 1998; Walden & Field, 1982). However, no study, to my knowledge, has presented facial expressions to psychopaths for less than one second. Thus, low empathy may be associated with deficient emotional perception to stimuli presented at quick speeds that do not allow higher level conscious processing. A method used to assess theory of mind/cognitive empathy skills found psychopaths unimpaired (Richell et al., 2003). 
This test, called the Eyes test (Baron-Cohen, Jollife, Mortimore, & Robertson, 1997) presents pictures of just the eyes and the objective is to choose an adjective from a list, that describes the person's state of mind 53 (e.g., sarcastic, preoccupied, stern, relaxed). This kind of test does not assess one's emotional awareness, but rather one's awareness o f subtle, social knowledge. Blair (2002) has proposed a model to explain the empathy deficits in psychopathy, called the Violence Inhibition Model (VIM) . This model proposes that psychopaths lack normal responding to specific emotions: sadness and fear. Observing that psychopaths experience less autonomic activity while viewing sad or fearful faces than they do to threat (Blair, Jones, Clark, & Smith, 1997; Aniskiewicz, 1979), Blair (2000) proposed that inadequate responding to sadness and fear causes violent behaviour: normally functioning individuals witness another person's negative emotions, such as sadness or fear and experience vicarious arousal but psychopaths have a fundamental deficit preventing vicarious emotion. According to the model, normal empathy and moral socialization are formed by vicariously feeling the-pain one causes in others, which is experienced as aversive. This aversive arousal becomes classically conditioned to one's poor behaviour preventing it in the future. Psychopaths lack the initial arousal and i f they do not learn the connection between the person's pain and their behaviour, they do not become morally socialized and remain unempathic. Although several studies show this deficient recognition of fear (e.g. Blai r et al., 2004), these have been criticized because fear is typically the most difficult emotion to recognize (e.g. Ekman & Friesen, 1976; Kirouac & Dore, 1984) and therefore the results may be due to this. In addition, others have not found a fear recognition deficit in psychopaths, some finding a deficit in disgust instead (Kosson, Suchy, Mayer, & Libby, 2002). It is unclear what the basis of the V I M theory is and why fearful expressions would be recognized better: what is the mechanism by which we experience vicarious 54 arousal to the emotions of others? The facial feedback hypothesis is one potential explanation (see Capella, 1993; Lipps, 1907), and there is preliminary research support linking expression mimicry to emotional empathy (Sonnby-Borgstrom, Jonsson, & Svensson, 2003). However, mimicry occurs for other emotions too, such as anger and happiness. Maybe vicarious arousal to fear does cause an aversive reaction that helps shape moral behaviour, but it is unclear why fearful expressions would be recognized less well than the other emotions. In summary, the cognitive-emotional empathy distinction appears to capture some fundamental components of the empathy construct. Psychopathic-like empathy deficits can be defined as low emotional empathy and unimpaired cognitive empathy. The relationship between these kinds of deficits and emotional recognition in briefly presented faces has not been investigated. Preliminary research suggests that brief presentations of emotional stimuli might be particularly relevant to emotional empathy (e.g. Martin et al., 1996). It is unknown whether making specific emotional categorizations at brief presentation rates is relevant mainly to emotional empathy, mainly to cognitive empathy, to neither, or to both. 
The goal of this research was to understand the role of emotional recognition of facial expressions, presented at different speeds, to empathy deficits. This research used a student sample, and a measure suitable for sub-clinical investigations of psychopathy. Research suggests that measuring dimensional psychopathic characteristics in a noncriminal sample can determine valuable distinctions, in terms of providing variation on traits that approximate less severe versions psychopathic traits (Forth, Brown, Hart, & Hare, 1996). In addition, a comprehensive 55 measure of cognitive empathy deficits was administered. Two emotional facial expression recognition tasks with different presentation duration rates were given. It was hypothesized that scores on the Self-Report Psychopathy Scale (SRP; Paulhus, Hemphill , & Hare, in press) would be negatively related with emotional recognition ability at the 47 ms presentation duration rate. It was further hypothesized that the manipulation factor of the SRP (SRP-M) would be related with the cognitive empathy factor of the E Q (EQ-CE) , positively related with the callous affect factor of the SRP (SRP-CA) , positively related with expression recognition performance at 2008 ms, and negatively related with expression recognition at 47 ms. No recognition accuracy differences between for fearful expressions were hypothesized between high and low scorers on total SRP scores. Method Participants The total number of participants was 135 (98 females, 37 males), after the removal of one participant because o f outlier data. They were recruited from the psychology department at the University of British Columbia. Ninety-three percent of the subjects were between the ages of 18 and 23. A l l subjects were under age 35. English was not the first language in 55% of the sample and 47%, were of East Asian descent. A l l participants were university students and therefore their English language skills were presumed to be sufficient. Procedure . 56 After providing written informed consent, participants were seated at a desktop computer in a quiet room by themselves. The computer had a Pentium 3 866 Mhz processor. The monitor size was 17" and had a resolution of 1024x768 with 32 bit colour. The refresh rate was 85Hz. The experiment consisted of one facial expression recognition task presenting stimuli at 47 ms; one facial expression recognition task presenting stimuli at 2008 ms; one facial features judgement task presenting faces at 47 ms that was used to control for individual differences in processing information at 47 ms; and several questionnaires. All tasks and questionnaires were completed on the computer, in random order. Before each task and questionnaire, instructions appeared on the computer screen. Subjects read through the instructions and pressed the spacebar to begin the task. The instructions for the two facial recognition tasks were as follows: YOUR TASK WILL BE TO JUDGE THE EMOTION OF THE FACIAL EXPRESSION IN THE PHOTOGRAPH THAT IS PRESENTED TO YOU. The photographs will be presented for a brief amount of time. If you are unsure, make your best guess. You will see 3 things for each question, presented one after the other: 1) a "+" sign to focus your attention on the centre of the screen 2) a photo of a face 3) a list of possible answer choices with the corresponding number key for each answer. After you give your answer, the next question comes up immediately. 
Therefore, if you want to break, leave the answer choices on the screen, and answer only when ready to continue. You may take as long as you wish to answer. Press the spacebar to continue.

Measures

Facial expression recognition tasks

Two facial expression recognition tasks, differing in presentation duration, were administered to each participant. Ogawa and Suzuki (1999) found that between approximately 36 ms and 48 ms, emotion recognition accuracy improved greatly, with correct answers reaching nearly 50% for all expressions. Another study found that at 10 ms, accuracy ranged from 16% to 20%; at 20 ms, from 26% to 59%; at 30 ms, from 46% to 80%; at 40 ms, from 56% to 80%; and at 50 ms, from 70% to 88% (Kirouac & Dore, 1984). The present study presented the facial expressions for 47 ms. This duration was at the tail end of the improvement threshold suggested by Ogawa and Suzuki (1999) and similar to the 56 ms presentation time employed by Sonnby-Borgstrom (2002), which was found to relate to emotional empathy. This study explored individual differences at this hypothesized threshold. The faces were not masked because restricting the analysis to a preconscious, automatic level of processing was not desired in this study. Rather, this study intended to measure individual differences in how much emotional information could be gleaned from briefly presented expressions, and how this information would be interpreted. The controlled, conscious facial expression recognition task presented the expressions for 2008 ms. This presentation time was long enough for a meaningful contrast in interpretation time with 47 ms. It should be noted that the monitor's refresh rate determines the exact presentation duration, which is why the precise durations of 47 ms and 2008 ms are not rounded to 50 ms and 2000 ms. In addition, since there was no mask to stop processing of the stimuli, the effective processing time was likely slightly longer than 47 ms and 2008 ms.

At both presentation durations, participants were presented with facial expressions of adult male and female Caucasian faces from the well-used Pictures of Facial Affect photographs (Ekman & Friesen, 1976). These faces showed intense facial expressions depicting anger, sadness, happiness, disgust, surprise, or fear. Neutral faces were also presented. Reliability data on the expression discrimination of these faces were used to ensure that the expressions in both tasks were of equal discrimination difficulty. No expression photograph was presented twice in the entire study. At 47 ms, there were 21 male faces and 21 female faces; at 2008 ms, there were 22 female faces and 20 male faces. Each of the seven expressions was presented six times, for a total of 42 trials. There were also 5 practice trials to accustom participants to the task; these were not counted toward the final score. Therefore, each facial expression recognition task consisted of 47 trials in total, with a possible score of 42. Accuracy was scored as the number of expressions identified correctly, out of 42.

47 ms control task

To control for individual differences in general perceptual processing speed, as well as other differences such as problems maintaining attention, another task involving facial judgements at 47 ms was administered. Participants judged whether the face presented had upside-down or right-side-up facial features.
These altered photos were also taken from the Pictures of Facial Affect photographs (Ekman & Friesen, 1976), but were not the same as the photos used in the other tasks.

Questionnaires

All questionnaires are included in Appendix C. The following empathy questionnaires were part of this study: the Self-Report Psychopathy Scale (SRP; Paulhus, Hemphill, & Hare, in press) and the Empathy Quotient (EQ; Shaw, Baker, Baron-Cohen, & David, 2004). Acculturation was controlled for with the Vancouver Acculturation Scale (Ryder, Alden, & Paulhus, 2000). Socially desirable responding was measured with the Balanced Inventory of Desirable Responding (Paulhus, 1991).

The Self-Report Psychopathy Scale (Paulhus, Hemphill, & Hare, in press) has 60 items and is intended to measure psychopathic characteristics in sub-clinical, normal populations. The SRP does not replicate the structure of the Psychopathy Checklist-Revised (Hare, 1991), the main clinical psychopathy measure, which has 2 factors. The current SRP version has 4 factors: callous affect (α=.74), interpersonal manipulation (α=.76), erratic/impulsive lifestyle (α=.67) and antisocial behaviour (α=.81) (Williams, Nathanson, & Paulhus, 2003). It has good construct validity, for example, correlating with violent and antisocial preferences (Williams, McAndrew, Learn, Harms, & Paulhus, 2001) and with behavioural measures of cheating on scholastic tests (Nathanson, Paulhus, & Williams, 2006). The scale correlates with measures of narcissism and Machiavellianism (Paulhus & Williams, 2002) and with another measure of psychopathy, the Psychopathy Checklist-Revised: Screening Version (PCL-R:SV; Hart, Cox, & Hare, 1995), which has been validated against the PCL clinical version and in forensic populations (Hart, Hare & Forth, 1994). In this study, the reliability coefficients were SRP-CA α=.71, SRP-M α=.78, SRP-EB α=.72 and SRP-AB α=.79.

The Empathy Quotient (EQ; Shaw, Baker, Baron-Cohen, & David, 2004) has 60 items. This scale focuses on different forms of cognitive empathy, emphasizing social and interpersonal understanding. Validated on people with Asperger's Syndrome and high-functioning autism, it is intended to measure obliviousness to interpersonal cues rather than low emotional reactivity. There are 40 empathy questions and 20 filler questions designed to distract the participant from the purpose of the questions. The scale is rated on a 5-point Likert scale, but it is scored by giving 1 point for mild agreement/disagreement and 2 points for strong agreement/disagreement. Early papers on the EQ separated the scale into three factors: 1) cognitive empathy (EQ-CE), which primarily measures accurately reading the behaviour and emotions of others (this sub-scale does not correlate with the tendency to take another's perspective, as measured by the Interpersonal Reactivity Scale; IRS; Davis, 1983); 2) emotional reactivity (EQ-ER), which measures insensitivity to the feelings of others due to being unaware of them; and 3) social skills (EQ-SS), which measures knowledge about social rules and conventions (Shaw, Baker, Baron-Cohen, & David, 2004). Later papers have abandoned the three-factor idea (Baron-Cohen & Wheelwright, 2004), while other researchers have found statistical reasons to maintain the three factors (Muncer & Ling, 2006). The current paper maintains the three sub-scale divisions, although acknowledging that there is some overlap between the items.
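To make this scoring rule concrete, a minimal sketch is given below. It is an illustration only, not the scoring program used for this study; the item numbers, response coding, and keying directions shown are hypothetical placeholders, and filler items are simply omitted from the scoring key.

# Illustrative sketch of the EQ scoring rule described above (hypothetical item keying).
# Responses are assumed coded 1 = strongly agree ... 5 = strongly disagree.
# A scored item earns 2 points for a strong response in the keyed direction,
# 1 point for a mild response in that direction, and 0 otherwise; fillers earn nothing.

def score_eq_item(response: int, keyed_agree: bool) -> int:
    """Return 0, 1, or 2 points for one scored EQ item."""
    if keyed_agree:            # empathic answer lies on the "agree" side
        return {1: 2, 2: 1}.get(response, 0)
    else:                      # empathic answer lies on the "disagree" side
        return {5: 2, 4: 1}.get(response, 0)

def score_eq(responses: dict[int, int], key: dict[int, bool]) -> int:
    """Sum points over the scored items; items absent from `key` are fillers."""
    return sum(score_eq_item(responses[item], keyed)
               for item, keyed in key.items() if item in responses)

# Hypothetical example: item 1 keyed "agree", item 2 keyed "disagree", item 3 a filler.
example_key = {1: True, 2: False}
example_responses = {1: 1, 2: 4, 3: 3}
print(score_eq(example_responses, example_key))  # -> 3 (2 points + 1 point)

Under this rule, the 40 scored items yield a maximum possible score of 80.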
In this study, the reliability coefficients were EQ-CE α=.83, EQ-ER α=.71 and EQ-SS α=.61.

The Balanced Inventory of Desirable Responding (Paulhus, 1991) is a 40-item inventory with two 20-item subscales: self-deceptive enhancement (BIDR-SDE; α=.75), which measures the tendency to give positively biased but honestly held self-descriptions, and impression management (BIDR-IM; α=.86), which measures the tendency to deny socially undesirable behaviour and give favourable self-descriptions to others. The scale has good concurrent, convergent, and discriminant validity (Paulhus, 1991). In this study, the reliability coefficients were BIDR-SDE α=.50 and BIDR-IM α=.71.

The Vancouver Acculturation Scale (Ryder, Alden, & Paulhus, 2000) is a bidimensional measure of acculturation. There are two subscales, one measuring identification with the mainstream North American culture (VIA-MA; α=.75) and the other measuring identification with one's heritage culture (VIA-HI; α=.79). The scale shows good concurrent validity with other measures of acculturation, with length of time spent in a new culture, and with generational status. The scale was used in this study to control for possible cultural differences in interpersonal and emotional behaviour. In this study, the reliability coefficients were VIA-MA α=.85 and VIA-HI α=.88.

Results

Descriptive statistics

Appendix A shows the distribution graphs for 47 ms and 2008 ms, and kurtosis and skewness statistics, which are at acceptable levels. Appendix B shows the correlations between all of the main variables. Table 3.1 shows the mean scores for expression identification accuracy at 47 ms and 2008 ms for the total sample and by gender, and compares females and males on these variables using t-tests. There were no significant gender differences on either of the facial expression recognition tasks, although females slightly outperformed males on both. Table 3.2 shows that males scored significantly higher than females on the callous affect factor of the SRP (SRP-CA), t(133)=-4.72, p<.001, d=.90, on the erratic behaviour factor of the SRP (SRP-EB), t(133)=-2.55, p<.012, d=.48, on the antisocial behaviour factor of the SRP (SRP-AB), t(133)=-2.18, p<.031, d=.42, and on the total SRP score (SRP-T), t(133)=-3.46, p<.001, d=.64. The genders did not differ on the manipulation factor of the SRP (SRP-M), t(133)=-1.38, n.s., d=.28. The genders also did not differ on the cognitive empathy factor of the EQ (EQ-CE), t(133)=1.33, n.s., d=.26, nor on the social skills factor of the EQ (EQ-SS), t(133)=.47, n.s., d=.09. They did differ on the emotional reactivity factor of the EQ (EQ-ER), t(133)=3.52, p<.001, d=.69, and on the EQ total score (EQ-T), t(133)=3.10, p<.002, d=.61.

Subjects may wish to appear more empathic and socially skilled, or may view themselves in a more positive light than they actually are. These two aspects of social desirability form the factors of the Balanced Inventory of Desirable Responding (BIDR): Impression Management (IM) and Self-Deceptive Enhancement (SDE). However, people who care about how others view them and about social convention may also be more inclined to have true empathy for others. Therefore, controlling for social desirability when investigating constructs such as empathy is not recommended, because part of the essence of the construct is removed. It is recommended instead to report the correlations between social desirability and the variables.
These correlations are reported in Table 3.3 and Table 3.4. The BIDR-SDE was not significantly correlated with any of the SRP variables, but was correlated with all of the EQ variables. The BIDR-IM showed the opposite pattern, correlating with all of the SRP variables and only with EQ-ER and EQ-T.

There was a weak, non-significant relationship between the 47 ms control task and expression recognition performance at 47 ms, r=.14, p>.120. The 47 ms control task did not come close to correlating with any other variable in the study, except for VIA-MA, with which there was a significant positive correlation, r=.22, p<.011. Thus, the less acculturated subjects had more difficulty making correct judgments at 47 ms, and this difficulty was not restricted to facial expression recognition. The 47 ms control task was not correlated with heritage acculturation, r=.06, n.s., or with expression recognition at 2008 ms, r=.07, n.s.

The Vancouver Index of Acculturation has two factors, one measuring identification with North American mainstream culture (VIA-MA) and the other measuring identification with one's heritage culture (VIA-HI). While VIA-HI was uncorrelated with the facial expression recognition variables, VIA-MA was correlated with them. The less participants identified with North American culture, the less well they recognized emotion at 47 ms, r=.27, p<.002; the relationship at 2008 ms was in the same direction but not quite significant, r=.14, p<.103. Neither VIA-MA nor VIA-HI was significantly correlated with any factor of the SRP or EQ, with the exception of a correlation between mainstream acculturation and the social skills factor of the EQ (EQ-SS; r=.27, p<.002). The more acculturated participants were, the more socially skilled they reported being.

Analyses of main hypotheses

First, SRP-T and EQ-T were entered into regression equations to predict identification accuracy at 47 ms and 2008 ms. Gender, VIA-MA, the VIA-MA x gender interaction, VIA-HI, the VIA-HI x gender interaction, and the 47 ms control task were also entered. The results of these regressions are shown in Table 3.5 and Table 3.6. SRP-T and EQ-T were not significant predictors at 47 ms or at 2008 ms.

Next, the different factors of the SRP and EQ were entered into hierarchical regressions to assess their individual predictive power on the facial expression recognition tasks. One regression predicted accurate recognition at 47 ms and the other predicted accurate recognition at 2008 ms. The predictor variables for both regressions were the various empathy deficit factors, gender, acculturation (VIA-HI and VIA-MA) and the 47 ms control task. The results of the regressions are summarized in Table 3.7, Table 3.8, Table 3.9 and Table 3.10.

Table 3.7 shows the results of the regression at 47 ms. VIA-MA was a significant predictor (β=.26, p<.011). The only empathy deficit that significantly predicted performance on the facial expression recognition task at 47 ms was SRP-CA (β=-.25, p<.043). Callous affect was negatively related to performance on the task. Being able to read others well, as measured by the cognitive empathy factor of the EQ, and understanding social situations, as measured by the social skills factor of the EQ, did not predict performance.
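The two-step structure of these hierarchical regressions, with a block of control variables entered first and the block of empathy-deficit factors entered second, and the change in R² attributed to the second block, can be sketched as follows. This is an illustrative reconstruction using the Python statsmodels library with hypothetical column names; it is not the software or code actually used for the analyses reported in Tables 3.5 to 3.8, and the exact composition of each block is assumed from the description above.

# Minimal sketch of the two-block hierarchical regression described above.
# Column names in `df` are hypothetical placeholders; gender and the interaction
# terms are assumed to be numerically coded columns.
import pandas as pd
import statsmodels.api as sm

def block_r2(df: pd.DataFrame, outcome: str, predictors: list[str]) -> float:
    """Fit an OLS model with the given predictors and return its R-squared."""
    X = sm.add_constant(df[predictors])
    return sm.OLS(df[outcome], X).fit().rsquared

control_block = ["gender", "via_ma", "via_ma_x_gender",
                 "via_hi", "via_hi_x_gender", "control_47ms"]
empathy_block = ["srp_ca", "srp_m", "srp_eb", "srp_ab",
                 "eq_ce", "eq_er", "eq_ss"]

def hierarchical_summary(df: pd.DataFrame, outcome: str) -> dict[str, float]:
    r2_step1 = block_r2(df, outcome, control_block)                  # controls only
    r2_step2 = block_r2(df, outcome, control_block + empathy_block)  # controls + empathy
    return {"R2_controls": r2_step1,
            "R2_full": r2_step2,
            "delta_R2_empathy": r2_step2 - r2_step1}

# Usage (assuming a data frame `data` with the columns above):
# print(hierarchical_summary(data, "accuracy_47ms"))
# print(hierarchical_summary(data, "accuracy_2008ms"))

The delta_R2_empathy value corresponds to the incremental variance attributed to the empathy block in the regression tables.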
Table 3.8 shows the results of the regression at 2008 ms. The VIA-MA x gender interaction was almost a significant predictor (β=-.19, p<.062). The only empathy deficit that significantly predicted performance at 2008 ms was EQ-SS (β=.28, p<.005). The higher one's social skills, the higher one's recognition scores at 2008 ms.

The manipulation factor of the SRP (SRP-M) was explored in more detail, as shown in Table 3.9. First, a regression analysis predicting manipulation from subtypes of the cognitive and emotional empathy measures was performed. SRP-M was positively predicted by EQ-CE (β=.26, p<.001), meaning that manipulation is associated with believing one can read others well, and by callous affect, SRP-CA (β=.29, p<.001). Second, using multivariate analysis of covariance (MANCOVA), expression recognition performance at 47 ms and 2008 ms was analyzed, comparing three groups that differed in their level of manipulation. Gender, acculturation and the perceptual control task were entered as covariates. The overall MANCOVA was significant, using Roy's Largest Root, F(2,125)=3.47, p<.034. The univariate ANCOVA showed significance for 2008 ms, F(2,126)=3.26, p<.042, but not for 47 ms, F(2,126)=.08, p<.925. Follow-up LSD multiple comparisons showed that group 1, the least manipulative group, performed significantly better on the 2008 ms expression recognition task than group 3, the most manipulative group.

Lastly, specific emotion types were analyzed. The means and SDs for each emotion are listed in Table 3.10. Recognition of fearful expressions was analyzed in relation to total SRP scores, in addition to sad, angry and disgusted expressions; these findings are listed in Table 3.11. Total SRP scores were separated into high and low groups and a MANCOVA was performed. The overall MANCOVA was not significant. However, it is notable that in the univariate ANCOVA, the only expression that came close to significance was fear at 2008 ms, F(1,127)=3.29, p<.072. Interestingly, in this study fear was not the most difficult expression to recognize at either 2008 ms or 47 ms, as shown in Table 3.12.

Discussion

The results of this study largely support the hypotheses. In a sample of normal subjects, only one factor of the two empathy deficit constructs predicted facial expression recognition performance at 47 ms: the callous affect factor of the Self-Report Psychopathy Scale (SRP-CA). Erratic behaviour, manipulation, antisocial behaviour, social skill deficits and cognitive empathy deficits did not significantly relate to performance at this brief presentation rate. The more callous participants were, the less accurately they recognized facial expressions presented for 47 ms. Interestingly, this relationship was not present when faces were presented for 2008 ms. At this longer presentation duration, the only factor that predicted performance was social skills. People with lower social skills, as measured by the EQ, had more difficulty correctly recognizing facial expressions at this level. It was not the case that the task was overly easy at 2008 ms such that only those subjects with extremely deficient interpersonal understanding performed poorly; there is no evidence of a ceiling effect.

Interpreting emotional stimuli can be a rapid, automatic process that results from a more instinctive, gut-level interpretation that cannot be logically analyzed (Buck, 1984; Buck & Ginsburg, 1997; LeDoux, 1996; Lazarus, 1991; Ohman, 1993).
Discussion

The results of this study largely support the hypotheses. In a sample of normal participants, only one factor of the two empathy deficit constructs predicted facial expression recognition performance at the brief presentation duration: callous affect from the self-report psychopathy scale (SRP-CA). Erratic behaviour, manipulation, antisocial behaviour, social skill deficits and cognitive empathy deficits did not significantly relate to performance at this brief presentation rate. The more callous people are, the less able they are to accurately recognize facial expressions presented for 47 ms. Interestingly, this relationship is not present when faces are presented for 2008 ms. At this longer presentation duration, the only factor that predicts performance is social skills: people with lower social skills, as measured by the EQ, have more difficulty correctly recognizing facial expressions. It was not the case that the task was overly easy at 2008 ms and that only those participants with extremely deficient interpersonal understanding performed poorly; there is no evidence of a ceiling effect.

Interpreting emotional stimuli can be a rapid, automatic process that results from a more instinctive, gut-level interpretation that cannot be logically analyzed (Buck, 1984; Buck & Ginsburg, 1997; LeDoux, 1996; Lazarus, 1991; Ohman, 1993). Or, emotions can be processed at a higher level, under cognitive control, in a manner that is both deliberate and propositional (Buck, 1984; Buck & Ginsburg, 1997; LeDoux, 1996; Lazarus, 1991; Ohman, 1993). These results suggest that people who are callous are less able to use the rapid, automatic processing system to interpret emotional information. Conversely, people with greater emotional reactivity and warmth towards others proficiently interpret briefly presented emotional information.

Total SRP and total EQ scores did not predict identification accuracy at 47 ms or at 2008 ms. This suggests that the psychopathy construct as a whole does not relate to emotion recognition impairments, at least in a subclinical sample; only the callous affect component is related, and only at 47 ms. Cognitive empathy deficits in general are likewise unrelated to identification accuracy; only social skill deficits are, at 2008 ms. This finding makes sense: deficits at this longer presentation time imply more fundamental problems that might translate into interpersonal difficulties.

The manipulation factor of the SRP scale was analyzed separately. Hypothetically, manipulation typifies the cognitive-emotional empathy distinction: effective manipulation requires complex interpersonal understanding (high cognitive empathy) and a lack of concern for others (low emotional empathy). In line with hypotheses, manipulation was predicted by high callous affect (SRP-CA) and a heightened self-reported ability to read the emotions and thoughts of others (EQ-CE). However, the results for the manipulation factor on the emotion recognition tasks were not in line with predictions. Contrary to expectations, the high manipulation group did not differ from the low manipulation group in performance at 47 ms, and at 2008 ms the high manipulation group actually performed worse than the low manipulation group. The meaning of these results is difficult to explain. Perhaps these findings are due to self-report biases. For example, people reporting manipulative tendencies may believe that they are more astute in interpersonal situations than they actually are, and may value manipulative and callous interactions. Manipulative behaviour may be better explained by selfish narcissism rather than by actual emotional processing deficits.

Current theory and research findings indicate that there may be specific processing deficits for fear in psychopathy (Blair et al., 2000; 2004). Blair hypothesized that empathy forms from vicariously experiencing the aversive emotions of others, such as fear and sadness, and that experiencing these aversive emotions vicariously prevents us from causing these states in others. Although the difference was not quite significant, higher total SRP scores were associated with lower recognition of fear at 2008 ms. There were no other expression-type differences that approached significance at either 47 ms or 2008 ms. Fear has often been noted as the most difficult expression to recognize (Ekman & Friesen, 1976); however, this study did not find that. As shown in Table 3.10, fear was not the most difficult expression to recognize either at 2008 ms or at 47 ms. These results provide tentative support for the link between psychopathy and deficient decoding of fear in facial expressions.

There were large cultural effects and moderate gender effects in this study. Only Caucasian faces were shown, yet a large proportion of the sample were Asian participants who reported low levels of acculturation into mainstream, North American culture.
This is problematic, since studies show that there are significant cultural differences in the meanings and associations of the emotion labels (Biehl et al., 1997), in how intense expressions are reported to be (e.g., Matsumoto & Ekman, 1989), and in facial physiognomy (Matsumoto, 1989). The effects of culture were mostly found at the more difficult, quick presentation rate of 47 ms, and less at 2008 ms. This finding may reflect the increased difficulty of recognizing facial expressions from a culture with which participants are less familiar. In addition, the control task at 47 ms was significantly correlated with mainstream acculturation, which implies that the less acculturated participants may have had more experiment anxiety, causing lower scores on a task that required pressurized attention. The effect of acculturation is an interesting one and requires further exploration. It is also possible that people recognize expressions better within their own ethnic group regardless of their degree of exposure to other ethnic groups. Thus, the results could, in part, reflect differences in the ability to recognize emotional expressions from the same vs. a different ethnic group, or could be due to the amount of experience interacting with an ethnic group different from one's own, or some combination of these and other unknown factors. The present research does not allow the influences of ethnicity and cultural exposure to be teased apart, since the East Asian participants were from different parts of Asia and the precise length of time spent in North America was not obtained.

There were significant gender differences, as reported in Table 3.2, with males reporting more callous affect, erratic behaviour and antisocial behaviour, and less emotional reactivity. There were no significant gender differences in manipulation, cognitive empathy or social skills. There were no significant gender differences on either of the expression recognition tasks, although females performed slightly better. These large gender differences on self-reported empathy and minimal differences on emotional task performance are consistent with the literature (Dimberg & Lundquist, 1990; Hall, 1978; Hall & Matsumoto, 2004; Kirouac & Dore, 1984; Lennon & Eisenberg, 1987; Montagne et al., 2005; Thayer & Johnsen, 2002).

There are several limitations to this study. A state mood scale was not included, but could have been used to control for the effects of current mood on expression recognition. A second major limitation is the self-report nature of the measures of empathy. As the correlations with social desirability indicate, participants were likely not reporting their true empathic behaviour and were responding in a socially desirable way. Moreover, participants may have too little self-awareness, or lack a point of comparison, to report validly on empathy-related questions. In addition, given the large proportion of Asian participants in this study with low levels of acculturation into mainstream North American culture, the large culture effect found here could have been mitigated somewhat by using equal numbers of Caucasian and Asian faces as stimuli, such as the Japanese and Caucasian Facial Expressions of Emotion (JACFEE; Matsumoto & Ekman, 1988).

Future research might replicate the study with actual psychopaths. Also, focussing on refined subtypes of emotional empathy and cognitive empathy deficits might be informative. First, a subgroup of low emotional reactivity may include people who deny their emotions.
Second, a subgroup of low cognitive empathy may be characterized by an inability to describe emotions, otherwise known as alexithymia: emotions are experienced normally and social interactions are appropriate, but emotional language is missing. These subtypes could be investigated in terms of their recognition of basic, intense, briefly presented expressions, as in this study. Exploring other types of emotional recognition, such as measuring awareness of states of mind with the Eyes Test (Baron-Cohen, Jolliffe, Mortimore, & Robertson, 1997), or subtle emotion with the Subtle Expressions Training Task (SETT; Ekman, 2004), would also be useful.

This research focussed on the kinds of empathy deficits found in psychopathy, measured in a non-clinical sample, as well as on cognitive empathy deficits. In conclusion, the link between callous affect and a lower ability to identify briefly presented facial expressions of emotion is interesting, especially because this effect is not found when the expressions are presented for a longer duration. The meaning of these findings is unclear, however, and more research is required.

Tables Chapter 3

Table 3.1. Mean scores and gender comparisons for expression identification accuracy at 47 ms and 2008 ms (N=135)

Identification accuracy at 47 ms (max score 42): Total 27.77 (4.24), 66%; Females (n=98) 27.99 (4.00), 67%; Males (n=37) 27.19 (4.84), 65%; t(133) = .98, p = .330
Identification accuracy at 2008 ms (max score 42): Total 34.00 (3.74), 81%; Females 34.26 (3.72), 82%; Males 33.22 (3.76), 79%; t(133) = 1.46, p = .148

Note. Mean scores are listed and standard deviations are in brackets; t values compare males vs. females (df = 133). *p<.05. **p<.01. ***p<.001

Table 3.2. Mean scores and gender comparisons for Self-Report Psychopathy (SRP) and Empathy Quotient (EQ) (N=135)

Callous Affect (SRP-CA): Total 38.11 (7.26); Females (n=98) 36.43 (6.63); Males (n=37) 42.57 (7.03); t(133) = -4.72, p = .000***
Manipulation (SRP-M): Total 42.60 (8.13); Females 42.01 (8.00); Males 44.32 (8.35); t(133) = -1.38, p = .171
Erratic Behaviour (SRP-EB): Total 41.62 (7.52); Females 40.62 (7.13); Males 44.24 (8.00); t(133) = -2.55, p = .012*
Antisocial Behaviour (SRP-AB): Total 32.17 (9.70); Females 31.07 (9.55); Males 35.10 (9.55); t(133) = -2.18, p = .031*
Self-Report Psychopathy total (SRP-T): Total 122.33 (18.55); Females 119.06 (16.85); Males 131.00 (20.22); t(133) = -3.46, p = .001***
Cognitive Empathy (EQ-CE): Total 9.08 (4.06); Females 9.37 (4.13); Males 8.32 (3.81); t(133) = 1.33, p = .184
Emotional Reactivity (EQ-ER): Total 9.27 (3.76); Females 9.94 (3.70); Males 7.49 (3.37); t(133) = 3.52, p = .001***
Social Skills (EQ-SS): Total 4.42 (2.32); Females 4.48 (2.40); Males 4.27 (2.11); t(133) = .466, p = .642
Empathy Quotient total (EQ-T): Total 31.39 (9.31); Females 32.86 (9.33); Males 27.49 (8.18); t(133) = 3.10, p = .002**

Note. SRP-CA = Self-Report Psychopathy Scale, Callous Affect subscale; SRP-M = Self-Report Psychopathy Scale, Manipulative subscale; SRP-EB = Self-Report Psychopathy Scale, Erratic Behaviour subscale; SRP-AB = Self-Report Psychopathy Scale, Antisocial Behaviour subscale; SRP-T = Self-Report Psychopathy Scale, Total score; EQ-CE = Empathy Quotient, Cognitive Empathy subscale; EQ-ER = Empathy Quotient, Emotional Reactivity subscale; EQ-SS = Empathy Quotient, Social Skills subscale; EQ-T = Empathy Quotient, total score. Standard deviations are in brackets; t values compare males vs. females (df = 133). *p<.05. **p<.01. ***p<.001

Table 3.3.
Zero-order Correlations between Social Desirability and Self-report Psychopathy (N=135) 1 2 3 4 5 6 7 Self- Impression Self-Report Erratic Callous Manipulation Antisocial deceptive Management Psychopathy Behaviour Affect S R P - M Behaviour enhancement B I D R - P M total score S R P - E B SRP- oc=78 S R P - A B B I D R - S D E oc=72 SRP-T a=.72 C A a=.79 a=.49 a=.86 a=. 71 1 — 31 ** -.00 -.06 -.04 .09 -.06 2 — -.50** -.45** -31** -.45** ' . 44** 3 - §3** .76** .84** .59** 4 - .45** .57** .50** 5 - .43** 3 9 * * 6 - .52** Note. BIDR-SDE = Balanced Inventory of Social Desirable Responding-Self-Deception subscale; BIDR-IM = Balanced Inventory of Social Desirable Responding-Impression Management subscale; SRP-CA = Self-Report Psychopathy Scale, Callous Affect subscale; SRP-M = Self-Report Psychopathy Scale, Manipulative subscale; SRP-EB = Self-Report Psychopathy Scale, Erratic Behaviour subscale; SRP-AB = Self-Report Psychopathy Scale, Antisocial Behaviour subscale; SRP-T = Self-Report Psychopathy Scale, Total score; Standard deviations are in brackets *p<.05. **p<.01. Table 3.4. Zero-order Correlations between Social Desirability and Empathy Quotient (N=135) 1 2 3 4 5 6 Self- Impression Social Cognitive Emotional Empathy deceptive Management Skills Empathy Reactivity Quotient enhancement B I D R - P M EQ-SS C E - E Q E Q - E R Total B I D R - S D E a=.71 a=.61 a=.83 ot=71 E Q - T , a=.49 a=.83 1 — .31** .22** .31** .26** .36** 2 — .01 .15 29** 29** 3 - 3 1 * * 32** .57** 4 - 44** 74** 5 - g3** Note. BIDR-SDE = Balanced Inventory of Social Desirable Responding-Self-Deception subscale; BIDR-IM = Balanced Inventory of Social Desirable Responding-Impression Management subscale; EQ-CE = Empathy Quotient, Cognitive Empathy subscale; EQ-ER = Empathy Quotient, Emotional Reactivity subscale; EQ-SS = Empathy Quotient, Social Skills subscale; EQ-T = Empathy Quotient, total score. *p<.05. **p<.01. 75 Table 3.5. Hierarchical Regression Analysis 47 ms recognition accuracy, with SRP and E Q total scores as predictors (N=135) Predictor Coefficient (beta) t value P R2 (group) A R 2 Control Variables Gender -.09 -.92 .359 .12 .12 VIA-M .28 2.81 .006** VIA-M X Gender ,08 .81 .422 VIA-H -.06 -.50 .618 VIA-H X Gender .10 .89 ' .373 47 ms. control task .10 1.20 .231 Empathy Deficit Variables SRP-T -.03 -.30 .766 .14 .02 EQ-T .14 1.61 .111 Note. SRP-T= Self-Report Psychopathy Scale, Total Score, EQ-T = Empathy Quotient, Total Score; VIA-MA = Vancouver Acculturation Index-Mainstream Acculturation subscale; VIA-HI = Vancouver Acculturation Index - Heritage Identification subscale Coefficients are values at last entry in hierarchical regression. R 2 values are cumulative and represent variance explained for the group(s) of variables entered. *p<.05. **p<.01. ***p<.001 Table 3.6. Hierarchical Regression Analysis 2008 ms recognition accuracy, with SRP and E Q total scores as predictors (N=135) Predictor Coefficient (beta) t value P R2 (group) A R 2 Control Variables Gender -.06 -.64 .521 .02 .07 VIA-M .22 2.17 .032* VIA-M X Gender .18 1.79 .076 VIA-H -.16 -1.39 .169 VIA-H X Gender -.12 -1.01 .313 47 ms. control task .08 .89 .375 Empathy Deficit Variables SRP-T -.15 -1.64 .104 .12 .05 EQ-T .16 1.76 .080 Note. SRP-T= Self-Report Psychopathy Scale, Total Score, EQ-T = Empathy Quotient, Total Score; VIA-MA = Vancouver Acculturation Index-Mainstream Acculturation subscale; VIA-HI = Vancouver Acculturation Index - Heritage Identification subscale Coefficients are values at last entry in hierarchical regression. 
R 2 values are cumulative and represent variance explained for the group(s) of variables entered. *p<.05. **p<.01. ***p<.001 76 Table 3.7. Hierarchical Regression Analysis 47 ms recognition accuracy, including SRP-C A but not E Q - E R as a predictor (N=135) Predictor Coefficient (beta) t value P R2 (group) A R 2 Control Variables Gender -.07 -.68 .496 .12 .12 VIA-M .26 2.59 .011** VIA-M X Gender .12 1.20 .231 VIA-H -.05 -.41 .680 VIA-H X Gender .14 1.15 .254 47 ms. control task .12 1.39 .168 Empathy Deficit Variables SRP-CA -.25 -2.05 .043* .19 .07 SRP-EB .06 .50 .615 SRP-M .02 .12 .904 SRP-AB .09 .83 .407 EQ-CE .06 .60 .547 EQ-SS .03 .28 .782 EQ-GE .003 .03 .974 Note. SRP-CA = Self-Report Psychopathy Scale, Callous Affect subscale; SRP-M = Self-Report Psychopathy Scale, Manipulative subscale; SRP-EB = Self-Report Psychopathy Scale, Erratic Behaviour subscale; SRP-AB = Self-Report Psychopathy Scale, Antisocial Behaviour subscale; SRP-T = Self-Report Psychopathy Scale, Total score; EQ-CE = Empathy Quotient, Cognitive Empathy subscale; EQ-ER = Empathy Quotient, Emotional Reactivity subscale; EQ-SS = Empathy Quotient, Social Skills subscale; EQ-T = Empathy Quotient, total score; VIA-MA = Vancouver Acculturation Index-Mainstream Acculturation subscale; VIA-HI = Vancouver Acculturation Index - Heritage Identification subscale Coefficients are values at last entry in hierarchical regression. R2 values are cumulative and represent variance explained for the group(s) of variables entered. *p<.05. **p<.01. ***p<.001 77 Table 3.8. Hierarchical Regression Analysis for 2008 ms recognition accuracy. (N=135) Predictor Coefficient (beta) t value P R2 (group) A R 2 Control Variables Gender -.10 -.99 .323 .04 .04 VIA-M .17 1.65 .100 VIA-M X Gender .19 1.88 .062 VIA-H -.11 -.90 .373 VIA-H X Gender -.07 -.56 .574 47 ms. control task .07 .79 .433 Empathy Deficit Variables SRP-CA -.10 -.82 .413 .51 .47 SRP-EB .05 .48 .636 SRP-M -.09 -.78 .440 SRP-AB -.06 -.58 .566 EQ-CE -.08 -.82 .416 EQ-SS .28 2.85 .005** Note. SRP-CA = Self-Report Psychopathy Scale, Callous Affect subscale; SRP-M = Self-Report Psychopathy Scale, Manipulative subscale; SRP-EB = Self-Report Psychopathy Scale, Erratic Behaviour subscale; SRP-AB = Self-Report Psychopathy Scale, Antisocial Behaviour subscale; SRP-T = Self-Report Psychopathy Scale, Total score; EQ-CE = Empathy Quotient, Cognitive Empathy subscale; EQ-ER = Empathy Quotient, Emotional Reactivity subscale; EQ-SS = Empathy Quotient, Social Skills subscale; EQ-T = Empathy Quotient, total score; VIA-MA = Vancouver Acculturation Index-Mainstream Acculturation subscale; VIA-HI = Vancouver Acculturation Index - Heritage Identification subscale Coefficients are values at last entry in hierarchical regression. R 2 values are cumulative and represent variance explained for the group(s) of variables entered. *p<.05. **p<.01. ***p<.001 78 Table 3.9. Hierarchical Regression Analysis for Manipulation factor of the SRP (N=135) Predictor Coefficient (beta) t value P R (group) A R 2 Control Gender -.08 -1.08 .284 .04 .04 VIA-M -.07 -.86 .393 VIA-M X Gender -.09 -1.13 .260 VIA-H .01 .05 .958 VIA-H X Gender -.03 -.27 .786 47 ms. control task .03 .51 .608 Empathy Deficit Variables SRP-ER .28 3.31 .001*** .51 .47 SRP-AB .28 3.60 .000*** EQ-SS .10 1.38 .169 EQ-CE .26 3.66 .000*** SRP-CA .29 3.29 .001*** Note. 
SRP-CA = Self-Report Psychopathy Scale, Callous Affect subscale; SRP-M = Self-Report Psychopathy Scale, Manipulative subscale; SRP-EB = Self-Report Psychopathy Scale, Erratic Behaviour subscale; SRP-AB = Self-Report Psychopathy Scale, Antisocial Behaviour subscale; SRP-T = Self-Report Psychopathy Scale, Total score; EQ-CE = Empathy Quotient, Cognitive Empathy subscale; EQ-ER = Empathy Quotient, Emotional Reactivity subscale; EQ-SS = Empathy Quotient, Social Skills subscale; EQ-T = Empathy Quotient, total score; VIA-MA = Vancouver Acculturation Index-Mainstream Acculturation subscale; VIA-HI = Vancouver Acculturation Index - Heritage Identification subscale Coefficients are values at last entry in hierarchical regression. R 2 values are cumulative and represent variance explained for the group(s) of variables entered. *p<.05. **p<.01. ***p<.001 Table 3.10. Identification accuracy means for each expression type at 47 ms and 2008 ms (N=135) Happy Sad Angry Fear Surprise ' Disgust Neutral 47 ms identification accuracy means (max. score 7) 5.73 (63) 3.50 (1.32) 2.77 (1.36) 3.24 (1.47) 5.10 (1.00) 2.93 (1.50) 4.45 (1.30) 2008 ms identification accuracy means (max. score 7) 5.64 (.58) 4.62 (1.19) 4.86 (1.09) 4.24 (1.50) 5.34 (922) 4.13 (1.29) 5.15 (1.05) Note. Mean values for each expression are listed. Each expression was presented 7 times, thus the maximum score is 7 Standard deviations are in brackets 79 Table 3.11. Comparison between groups with high and low Self-Report Psychopathy total scores (SRP-T) on recognition of specific types of expressions at 47 ms and 2008 ms (N=135) 47 ms 2008 ms Sad F(l,127)=.09, n.s. F(l,127)=47, n.s. Angry F(l,127)=.22, n.s. F(l,127)=.18,n.s. Fear F(l,127)=1.13; n.s. F(l,127)= 3.29, p<.072 Surprise F(l,127)=.001,n.s. F(l,127)=.21,n.s. Disgust F(l,127)=24, n.s. F(l,127)=.25,n.s. Note. These results were obtained with MANCOVA 80 References Aniskiewicz, A . S . (1979). Autonomic components of vicarious conditioning and psychopathy. Journal of Clinical Psychology, 35, 60-67. Baron-Cohen, S., Jolliffe, T., Mortimore, C , & Robertson, M . (1997). Another advanced test o f theory o f mind: Evidence from very high-functioning adults with autism or Aspergers Syndrome. Journal of Child Psychology and Psychiatry, 38, 813-822. Baron-Cohen, S., Wheelwright, S., H i l l , J., Raste, Y . & Plumb, I. (2001). The "reading the mind in the eyes" test revised version: A study with normal adults, and adults with asperger syndrome or high-functioning autism. Journal of Clinical Psychology, 42(2), 241-251. Biehl , M . , Matsumoto, D . , Ekman, P., Hearn, V . , Heider, K . , Kudoh, T. & Ton, V . (1997). Matsumoto and Ekman's Japanese and Caucasian Facial Expressions of emotion ( J A C F E E ) : Reliability data and cross-national differences. Journal of Nonverbal Behaviour, 27(1), 3-21 Blair, R J . R . (1999). Responsiveness to distress cues in the child with psychopathic tendencies. Personality and individual Differences, 27, 135-145. Blair, R J . R . (2005). Responding to the emotions of others: Dissociating forms o f empathy through the study o f typical and psychiatric populations. Consciousness and Cognition, 14, 698-718. Blair , R.J.R., Colledge, E . , & Mitchell , D . G . V . (2001). Somatic markers and response reversal: Is there orbitofrontal cortex dysfunction in boys with psychopathic tendencies? Journal of Abnormal Child Psychology, 29(i6), 499-512. 81 Blair, R.J.R., Jones, L . , Clark, F. & Smith, M . (1997). 
The psychopathic individual: a lack o f responsiveness to distress cues? Psychophysiology, 34, 192-198. Blair , R.J.R., Mitchel l , D . G . V . , Peschardt, K . S . , Colledge, E . , Leonard, R . A . , Shine, J .H. , Murray, L . K . & Perrett, D .L (2004). Reduced sensitivity to others' fearful expressions in psychopathic individuals. Personality and Individual Differences, 37,1111-1122. Buck, R. (1984). The communication of emotion. New York: Guilford Press. Buck, R. & Ginsburg, R. (1997). Communicative genes and the evolution of empathy. In W . Ickes (Ed.), Empathic Accuracy (pp. 17-43). New York: Guilford Press. Cappella J .N. (1993). The facial feedback hypothesis in human interaction: Review and speculation, Journal of Language and Social Psychology, 12(\&2), 13-29. Davis, M . H . (1980). A multidimensional approach to individual differences in empathy. Catalog of Selected Documents in Psychology, 10(4), 1-17. Davis, M . H . (1983). Measuring individual differences in empathy: Evidence for a multidimensional approach. Journal of Personality and Social Psychology, 44(\), 113-126. Dimberg, U . & Lundquist, L . (1990). Gender differences in facial reactions to facial expressions. Biological Psychology, 30, 151-159. Dimberg, U . , Thunberg, M . & Elmehed, K . (2000). Unconscious facial reactions to emotional facial expressions. Psychological Science, 11(1), 86-89. Ekman, P. (2004). Group M E T T / S E T T Training/Research C D . Ekman, P., & Friesen, W . V . (1976). Pictures of facial affect. Palo Al to , C A . : Consulting Psychologists Press. 82 Eisenberg, N . , Fabes, R . A . , Murphy, B . , Karbon, M . , Maszk, P., Smith, M . , O 'Boyle , C , & Suh, K . (1994). The relations of emotionality and regulation to dispositional and situational empathy-related responding. Journal of Personality and Social Psychology, 68(4), 776-797. Elfenbein, H . A . & Ambady, H . (2003). When familiarity breeds accuracy: Cultural exposure and facial emotion recognition. Journal of Personality and Social Psychology, 85(2), 276-290. Forth, A . , Brown, Shelley, L . , Hart, S.D., & Hare, R . D . (1996). The assessment o f psychopathy in male and female noncriminals: Reliability and validity, Personality and Individual Differences, 20(5), 531 -543. Habel, U . , Kuhn, E . , Salloum, J., Devos, H . & Schneider, F. (2002). Emotional processing in psychopathic personality. Aggressive Behaviour, 28, 394-400. Ha l l , J .A . (1978). Gender effects in decoding nonverbal cues. Psychological Bulletin, 85, 845-857. Hal l , J .A. & Matsumoto, D . (2004). Gender differences in judgments of multiple emotions from facial expressions. Emotion, 4(2), 201-206. Hare, R . D . (1991). The Hare Psychopathy Checklist-Revised. Toronto, O N : Mutli-Health Systems. Hart, S.D., Cox, D . N . & Hare, R . D . (1995). Manual for the Screening Version of the Hare Psychopathy Checklist-Revised (PCL-R.SV). Toronto: Multi-Health Systems. Hart, S.D., Forth, A . E . , & Hare, R . D . (1991). Assessing psychopathy in male criminals using the M C M I - I I . Journal of Personality Disorders, 5, 318-327. 83 Kirouac, G . & Dore, F . Y . (1984). Judgment of facial expressions of emotion as a function of exposure time. Perceptual and Motor Skills, 59, 147-150. Kosson, D.S. , Suchy, Y . , Mayer, A . R . & Libby, J. (2002). Facial affect recognition in criminal psychopaths, Emotion, 2(4), 398-411. Lawrence, E.J . , Shaw, P., Baker, D . , Baron-Cohen, S., & David, A . S . (2004). Measuring empathy: Reliability and validity o f the Empathy Quotient. 
Psychological Medicine, 34, 911-924. Lazarus, R.S . (1991). Progress on a cognitive-motivational-relational theory of emotion. American Psychologist, 46(8), 819-834. Lipps, T. (1903). X I V . Kapitel: Die einfuhlung. In Leitfaden derpsychologie [Guide to psychology] (pp. 187-201). Leipzig: Verlag von Wilhelm Engelmann. Lazarus, R .S . (1991). Progress on a cognitive-motivational-relational theory o f emotion. American Psychologist, 46(S), 819-83.4. LeDoux, J. (1996). The emotional brain. New York: Simon & Schuster. Lennon, R. & Eisenberg, N . (1987). Gender and age differences in empathy and sympathy. In N . Eisenberg & J. Strayer (Eds.), Empathy and its development (pp. 195-217). New York: Cambridge University Press. Martin R . A . , Berry, G .E . , Dobranski, T. & van Home, M . (1996). Emotion perception threshold: individual differences in emotional sensitivity. Journal of Research in Personality, 38, 290-305. Matsumoto, D . & Ekman, P. (1989). American-Japanese cultural differences in judgments of facial expressions of emotion. Motivation and Emotion, 13, 143-157. 84 Matsumoto, D . (1989). Face, culture, and judgements of anger and fear: Do the eyes have if! Journal of Nonverbal Behaviour, 13(3), 171-188. Matsumoto, D . , LeRoux, J., Wilson-Cohn, C , Kooken, K . , Ekman, P., Yrizarry, N . , Loewinger, S., Uchida, H . , Yee, A . , Amo, L . & Goh, A . (2000). A new test to measure emotion recognition ability: Matsumoto arid Ekman's Japanese and Caucasian Br ie f Affect Recognition Test ( J A C B A R T ) . Journal of Nonverbal Behavior, 24(3), 179-209. Mehrabian, A . , & Epstein, N . (1972). A measure of emotional empathy. Journal of Personality, 40(4), 525-543. Montagne, B . , Kessels, R .P .C. , Frigerio, E. , de Harm, E .H.F . , & Perrett, E.I. (2005). Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cognitive Processes, 5, 135-141. Muncer, S.J. & Ling , J. (2006). Psychometric analysis of the empathy quotient (EQ) scale. Personality and Individual Differences, 40, 1111-1119. Nathanson, C , Paulhus, D . L . , & Will iams, K . M . (2006). Predictors of a behavioral measure of scholastic cheating: Personality and competence but not demographics. Contemporary Educational Psychology, 31, 97-122. Nowick i , S. & Mitchell , J. (1998). Accuracy in identifying affect in child and adult faces and voices and social competence in preschool children, Generic, Social, and General Psychology Monographs, 124, 39-59. Ogawa, T. & Suzuki, N . (1999). Response differentiation to facial expressions of emotion at increasing exposure duration. Perceptual and Motor Skills, 89, 557-563. 85 Ohman, A . (1993). Fear and anxiety as emotional phenomena: Clinical phenomemology, evolutionary perspectives and information processing mechanisms. In M . Lewis & J . M . Haviland (Eds.), Handbook of emotion (pp. 511-536). New York: The Guilford Press. Paulhus, D . L . (1991). Measurement and control of response bias. In J.P. Robinson, P.R. Shaver, & L . Wrightsman (Eds.), Measures of personality and social-psychological attitudes (pp. 17-59). San Diego, C A : Academic Press. Paulhus, D . L . , Hemphill , J.D., & Hare, R . D . (in press). Self-Report Psychopathy scale (SRP-III). Toronto/Buffalo: Multi-Health Systems. Richell , R . A . , Mitchel l , D . G . V . , Newman, C , Leonard, A . , Baron-Cohen, S., & Blair , R J . R . (2002). Theory o f mind and psychopathy: Can psychopathic individuals read the 'language of the eyes?' Neuropsychologia, 4f 523-526. 
Ryder, A . G . , Alden, L . A . , & Paulhus, D . L . (2000). Is acculturation unidimensional or bidimensional?: A head-to-head comparison in the prediction of personality, self-identity, and adjustment. Journal of Personality and Social Psychology, 79, 49-65. Sonnby-Borgstrom, M . , Jonsson, P., & Svensson, O. (2003). Emotional empathy as related to mimicry reactions at different levels of information processing, Journal of Nonverbal Behaviour, 27(1), 3-23. Sonny-Borgstrom, M . (2002). Automatic mimicry reactions as related to differences in emotional empathy. Scandinavian Journal of Psychology, 43, 433-443. 86 Stevens, D., Charman, T., & Blair, R.J.R. (2001). Recognition of emotion in facial expressions and vocal tones in children with psychopathic tendencies. The Journal of Genetic Psychology, 162(2), 201-211. Thayer, J.F. & Johnsen, B.G. (2000). Sex differences in judgement of facial affect: A multivariate analysis of recognition errors. Scandinavian Journal of Psychology, 41, 243-246. Walden, T.A. & Field, T.M. (1982). Discrimination of facial expressions by preschool children. Child Development, -53, 1312-1319. Whalen, P.J., Rauch, S.L., Etcoff, N.L., Mcinerney, S.C., Lee. M.B. & Jenike, M.A. (1998). Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. The Journal of Neuroscience, 18(1), 411-418. Williams, K.M. & Paulhus, D.L. (2004). Factor structure of the Self-Report Psychopathy scale (SRP-II) in non-forensic samples. Personality and Individual Differences, 57(2004) 765-778. Williams, K. M., Nathanson, C.,,Paulhus, D. L. & (2003). Structure and validity of the Self-Report Psychopathy scale-Ill in Normal Populations'. Poster presented at the meeting of the American Psychological Association, Toronto, Canada Williams, K.M., McAndrew, A., Learn, T., Harms, P.D., & Paulhus, D.L. (2001). The Dark Triad returns: Antisocial behaviour and entertainment preferences among narcissists, Machivellians, and psychopaths. Poster presented at the meeting of the American Psychological Association, San Francisco. 87 C H A P T E R 4: S U M M A R Y This research suggests a connection between heightened emotional reactivity to the emotions of others and more accurate expression decoding at brief presentation durations. This relationship does not hold for expressions presented for longer durations. This effect was shown in two different ways. First, a self-reported predisposition towards emotional empathy was associated with better decoding of expression type when only 47 ms was given to recognize the expressions. This effect was not found at 2008 ms. Second, analyses on various types of empathic deficits that comprise empathy-related disorders, found that only callous affect scores predicted performance at 47 ms; manipulation, erratic behaviour, antisocial behaviour, deficient cognitive empathy (low ability to read people), and low social skills were not associated. Higher callous affect scores were associated with lower ability to recognize expressions at 47 ms. These findings are unlikely due to more conscientious participation by the empathic participants, since this advantage was only evident at 47 ms. If the less empathic participants performed the study less diligently, their performance would have been significantly lower at 2008 ms in addition to 47 ms. 
These findings are also not explained by a greater interpersonal interest in understanding others resulting in better attention being paid to the faces, since participants reporting high levels of cognitive empathy did not show this advantage. It is also not likely that the more empathic participants were simply more knowledgeable about emotion concepts and better able to apply the labels, whereas others, lacking this ability, would have performed better if they had been taught the labels, for two reasons: 1) a previous study found that simple valence distinctions at brief presentation durations were also related to emotional empathy (Martin et al., 1996); and 2) EQ-CE, a measure of reported ability to read the emotions and behaviour of others, is not a significant predictor of expression identification at 47 ms.

A second, less significant but consistent result in the present study was a connection between a particular kind of cognitive empathy and heightened accuracy of emotional expression recognition at the 2008 ms presentation duration. The only deficit that predicted performance at 2008 ms was social skills; people with lower social skills had more difficulty correctly recognizing the facial expressions at this duration. Social skill deficits are one of the effects of deficient cognitive empathy: not being able to take the perspective of others leads to inappropriate behaviour in social situations. These results are not surprising: expression recognition at 2008 ms should be relatively straightforward, and low performers who have difficulty recognizing these intense, basic emotions would also be expected to have difficulty in social situations because of an inability to read interpersonal cues.

Why might emotional empathy levels be related to the decoding of facial expressions? There are several possibilities: an emotionally reactive temperament might produce both efficient processing of emotional stimuli in general and an empathic disposition; a childhood environment might foster this kind of interpersonal awareness and concern; or some people might innately mimic the expressions of others more readily, and this automatic process enables expression recognition and incites emotional responding.

Referring back to the parts of emotional processing (detection of salient emotional stimuli, production of emotion, interpretation of emotion, emotion regulation) and the levels of interpretation (automatic vs. controlled), these results suggest that when expressions are briefly presented, emotional reactivity is a benefit. Moreover, people who are dispositionally more emotionally reactive - more active at the emotion production phase - have an advantage in interpreting emotion. People with a greater ability to process emotional information at the rapid, automatic level react more to the emotions of others, as indicated by higher dispositional empathy. At controlled, more deliberate levels of interpretation, no such effect is found; this advantage disappears when the expressions are presented for longer, allowing more interpretation time. This might imply that perceiving emotion is multi-faceted and that low insight from emotional sources may be compensated for using other methods (i.e., cognitive interpretation), but this compensation does not work as well with briefly presented emotional stimuli, presumably because emotional processing systems work more rapidly. Thus, people with lower emotional reactivity show lower performance at 47 ms but perform comparatively better at 2008 ms.
Or, these findings might not mean that the less emotionally empathic catch up, but rather that the more emotionally empathic fall behind: the more emotionally empathic might perform better when only a gut-level recognition response is required, whereas when more time is allowed, cognitive mechanisms actually spoil their instinct and advantage.

Several hypotheses were not supported. First, as hypothesized, the present study found that manipulation was related to self-reported callous affect and a self-reported ability to read the behaviour of others. However, contrary to expectations, manipulation was not associated with the expression recognition profile of low emotional empathy/high cognitive empathy; highly manipulative participants did not show low performance at 47 ms and high performance at 2008 ms. Instead, the high manipulation group did not differ from the low manipulation group in performance at 47 ms, and at 2008 ms the high manipulation group actually performed worse than the low manipulation group. Possibly what underlies manipulative behaviour is not a primary deficit in emotional reactivity and emotional recognition that results in callous affect and low emotional empathy, but rather a narcissistic predisposition in which selfish motives and callous behaviour are valued.

Second, current theories hypothesize deficits in recognizing fear in psychopathy. Observing that psychopaths show less autonomic activity while viewing sad or fearful faces (Aniskiewicz, 1979) and recognize fear less well (e.g., Blair et al., 1997), Blair (2000) proposed that inadequate responding to sadness and fear causes low empathy. In this study, the only expression that came close to being significantly different between the high and low SRP groups was fear at 2008 ms. Interestingly, although fear has often been noted as the most difficult expression to recognize (Ekman & Friesen, 1976), in this study it was not.

Nearly half of the participants in the present study were of East Asian descent, and the Vancouver Index of Acculturation (VIA) showed a low rate of acculturation into mainstream, North American culture in this sample. Lower acculturation into mainstream, North American culture was significantly related to lower performance on the 47 ms expression recognition task. Research has shown that expressions are easier to judge within one's own culture and that the ability to decode expressions from another culture increases with the amount of exposure to that culture (e.g., Elfenbein & Ambady, 2003). Reasons why decoding the expressions of our own culture is easier include subtle differences in the meanings and associations of the emotion labels (Biehl et al., 1997), in how intense expressions are reported to be (e.g., Matsumoto & Ekman, 1989), and in facial physiognomy (Matsumoto, 1989). For example, Matsumoto (1989) found that facial physiognomy differences between American faces and Japanese faces influenced judgements of emotion: upper eyelid raises in American faces produced higher ratings of intensity for angry and fearful faces, and increased the clarity of angry expressions, presumably because the upper eyelid is more visible. With these differences, it is not surprising that a strong positive correlation was found between accuracy rates at 47 ms and level of acculturation into the mainstream culture.

Gender differences on the expression recognition tasks in this study were not significant, although they approached significance, with females performing slightly better than males at both 47 ms and 2008 ms.
These findings match previous research: females typically are better at decoding nonverbal cues, but the differences are generally not substantial (e.g., Hall, 1978; Hall, Carter, & Horgan, 2000; Hall & Matsumoto, 2004; Montagne et al., 2005; Kirouac & Dore, 1984; Thayer & Johnsen, 2002). There were significant gender differences on the self-report empathy scales, with females reporting significantly more emotional empathy but no differences in cognitive empathy. This fits with previous research: a meta-analysis found gender differences in empathy to be largest in self-report studies, possibly due to sex-role stereotypes, moderate for observational methods such as facial reactions, and small for physiological methods (Eisenberg & Fabes, 1998).

This study assessed empathy by self-report, and this is a substantial limitation. Relying on participants' own reports of their empathic behaviour is problematic, being limited by, for example, denial, lack of a point of comparison, socially desirable responding, and minimal self-awareness. Second, this study did not include a measure of state mood. State mood may have affected how participants perceived the emotion in the facial expressions. High state anxiety may have lowered performance in some people. The 47 ms control task likely controlled for this problem somewhat, since anxious responding would also be reflected in scores on that task. Third, the participants were not given examples of the different emotions beforehand, so it is possible that some participants lacked knowledge of the labels to apply to the expressions. However, considering that all participants were university students, it is reasonable to assume that they knew the meanings of the major emotional expressions. Fourth, given the large proportion of Asian participants, using a balanced design with equal numbers of Caucasian and Asian faces, such as the Japanese and Caucasian Facial Expressions of Emotion (JACFEE; Matsumoto & Ekman, 1988), may have lowered the strong acculturation effects. Fifth, this research may lack ecological validity and may generalize only to the recognition of posed photographs. For example, perhaps people with high personal distress have extremely impaired facial expression recognition when they are in a stressful, chaotic environment. Or, perhaps when face-to-face, people high in personal distress are superior in their recognition. Related to this, maybe there is something unique about face-to-face interactions such that people decode facial expressions in a different manner.

These findings have implications for understanding the underlying basis of empathy. Such knowledge of how empathy functions can potentially help foster empathy development in people. Future directions could explore the limits of the connection between emotional empathy and greater recognition accuracy of briefly presented expressions. Is this phenomenon more or less pronounced in children? Can training significantly improve accuracy at brief presentation rates? Does this effect show with subtle expressions too, or only intense ones? Future directions could also explore the connections between emotional contagion, empathy and emotional recognition. Emotional contagion is defined as "the tendency to automatically mimic and synchronize facial expressions, vocalizations, postures, and movements with those of another person, and consequently, to converge emotionally" (Hatfield et al., 1992, pp. 153-154).
It is unclear from the results exactly what part of emotional empathy is key to improved recognition: is it physiological/emotional reactivity, such as mimicry, that allows more emotional information to be gathered in a limited amount of time? Or does it result from a longstanding receptivity to the emotions of others, and a motivation to be concerned? This research contributes a connecting piece to the empathy, expression recognition and emotional reactivity constructs. However, the connection cannot be readily explained with our current knowledge, raising the need for more research into these questions.

References

Aniskiewicz, A.S. (1979). Autonomic components of vicarious conditioning and psychopathy. Journal of Clinical Psychology, 35, 60-67.

Biehl, M., Matsumoto, D., Ekman, P., Hearn, V., Heider, K., Kudoh, T. & Ton, V. (1997). Matsumoto and Ekman's Japanese and Caucasian Facial Expressions of Emotion (JACFEE): Reliability data and cross-national differences. Journal of Nonverbal Behaviour, 27(1), 3-21.

Blair, R.J.R. (2005). Responding to the emotions of others: Dissociating forms of empathy through the study of typical and psychiatric populations. Consciousness and Cognition, 14, 698-718.

Blair, R.J.R., Jones, L., Clark, F. & Smith, M. (1997). The psychopathic individual: A lack of responsiveness to distress cues? Psychophysiology, 34, 192-198.

Blair, R.J.R., Mitchell, D.G.V., Peschardt, K.S., Colledge, E., Leonard, R.A., Shine, J.H., Murray, L.K. & Perrett, D.I. (2004). Reduced sensitivity to others' fearful expressions in psychopathic individuals. Personality and Individual Differences, 37, 1111-1122.

Davis, M.H. (1980). A multidimensional approach to individual differences in empathy. Catalog of Selected Documents in Psychology, 10(4), 1-17.

Davis, M.H. (1983). Measuring individual differences in empathy: Evidence for a multidimensional approach. Journal of Personality and Social Psychology, 44(1), 113-126.

Eisenberg, N. & Fabes, R.A. (1990). Empathy: Conceptualization, assessment, and relation to prosocial behaviour. Motivation and Emotion, 14, 131-149.

Ekman, P. (1982). Emotion in the human face. New York: Cambridge University Press.

Ekman, P. (1992). An argument for basic emotions. Cognition and Emotion, 6, 169-200.

Ekman, P., & Friesen, W.V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.

Elfenbein, H.A. & Ambady, N. (2003). When familiarity breeds accuracy: Cultural exposure and facial emotion recognition. Journal of Personality and Social Psychology, 85(2), 276-290.

Habel, U., Kuhn, E., Salloum, J., Devos, H. & Schneider, F. (2002). Emotional processing in psychopathic personality. Aggressive Behaviour, 28, 394-400.

Hall, J.A. (1978). Gender effects in decoding nonverbal cues. Psychological Bulletin, 85, 845-857.

Hall, J.A. & Matsumoto, D. (2004). Gender differences in judgments of multiple emotions from facial expressions. Emotion, 4(2), 201-206.

Hare, R.D. (1991). The Hare Psychopathy Checklist-Revised. Toronto, ON: Multi-Health Systems.

Hatfield, E., Cacioppo, J.T., & Rapson, R.L. (1994). Emotional contagion. New York: Cambridge University Press.

Kirouac, G. & Dore, F.Y. (1984). Judgment of facial expressions of emotion as a function of exposure time. Perceptual and Motor Skills, 59, 147-150.

Martin, R.A., Berry, G.E., Dobranski, T. & van Horne, M. (1996). Emotion perception threshold: Individual differences in emotional sensitivity.
Journal of Research in Personality, 38, 290-305. 96 Matsumoto, D . & Ekman, P. (1989). American-Japanese cultural differences in judgments of facial expressions of emotion. Motivation and Emotion, 13, 143-157. Matsumoto, D . (1989). Face, culture, and judgements of anger and fear: Do the eyes have it? Journal of Nonverbal Behaviour, 13(3), 171-188. Mehrabian, A . , & Epstein, N . (1972). A measure of emotional empathy. Journal of Personality, 40(4), 525-543. Montagne, B . , Kessels, R .P .C. , Frigerio, E . , de Harm, E .H.F . , & Perrett, E.I. (2005). Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cognitive Processes, 6, 135-141. Phillips, M . L . , Drevets, W . C . , Rauch, S.L., & Lane, R. (2003). Neurobiology o f Emotion Perception I: the Neural Basis of Normal Emotion Perception. Biological Psychiatry, 54, 504-514. Sonny-Borgstrom, M . , Jonsson, P., & Svensson, O. (2003). Emotional empathy as related to mimicry reactions at different levels of information processing. Journal of Nonverbal Behaviour, 27(1), 3-23. Stevens, D . , Charman, T., & Blair , R.J.R. (2001). Recognition of emotion in facial Is the basic level in the eye of the beholder? Cognitive Psychology, 23, 457-482 Thayer, J.F. & Johnsen, B . G . (2000). Sex differences in judgement of facial affect: A multivariate analysis of recognition errors. Scandinavian Journal of Psychology, 41, 243-246. 97 A P P E N D I C E S Appendix A : Dependent Variable Distributions Table A . 1: Kurtosis and skewness statistics Kurtosis Skewness Variable 47 ms Raw Score: -.26 Z score: 0.64 Raw Score: -.21 Z score: -0.99 2008 ms Raw Score: .43 Z score: 1.04 Raw Score:-.52 Z score: -2.50 Figure A . l Distribution of 47 ms variable 40 17.5 22.5 27.5 32.5 37.5 20.0 25.0 30.0 35.0 Distribution of 47 ms accuracy scores 98 Figure A .2 Distribution of 2008 ms variable 50 40 22.5 27.5 32.5 37.5 42.5 25.0 30.0 35.0 40.0 Distribution of 2008 ms accuracy scores Appendix B : Correlation Matrices 2008 47 SRP- BIDR- BIDR- GEN- IRS- IRS- SRP- SRP-ms ms AB SDE IM DER PT EC TOT EB 2008 ms 1 39** -.13 -.09 .05 -.13 .12 ' .13 -.18* -.09 • 47 ms 39** 1 .03 .09 .09 -.08 .09 22** -.08 -.00 SRP-AB -.13 .03 1 -.06 _ 44** .19* _ 22** -.24** .59** .50** BIDR-SDE -.09 .09 -.06 1 .31** -.03 27** .14 .00 -.06 BIDR-IM .04 .09 . 44** .31** 1 -.08 .30** .21* -.50** -.45** GENDER -.13 -.08 .19* -.03 -.08 1 .08 -.17 29** .22* IRS-PT .12 .09 -.22** 27** 30** .08 1 .25** -.30** -.24** IRS-EC .13 22** -.24** .14 .21* -.17 .25** 1 -.36** -.19* SRP-T -.18* -.08 .59** .00 -.50** 29** -.30** -.36** 1 .83** SRP-EB -.09 -.00 .50** -.06 -.45** .22* -.24** -.18* g3** 1 SRP-CA -.21* -.20* 39** -.04 _ 31 ** 3g** -.25** -.58** .76** 445** SRP-M -.14 .00 .52** .09 -.45** .19 -.25** -.14 .84** .56** VIA-HI -.07 -.13 -.14 -.02 -.03 .12 .01 .00 -.08 -.05 VIA-MA .14 27** .12 .04 -.17* .12 -.17* -.03 .06 .12 EQ-SS 29** .18* -.01 .22** .01 -.04 .02 .15 -.12 -.06 EQ-CE .02 .13 -.11 .31** .15 -.12 22* .16 .02 .05 EQ-T .21* .19* -.25** .36** 29** -.26** .30** 44** _ 32** -.18* 47 ms .07 .14 -.02 .06 .12 .10 .06 .03 .01 -.03 control IRS-PD -.09 -.13 -.25** -.11 .13 _ 35** -.05 .15 -.15 -.11 IRS-F .17* .17 -.08 .14 -.13 _ 22** .19* 39** .08 .16 Note. 
BIDR-SDE = Balanced Inventory of Social Desirable Responding-Self-Deception subscale; BIDR-IM = Balanced Inventory of Social Desirable Responding-Impression Management subscale; IRS-PT = Interpersonal Reactivity Scale, Perspective Taking factor; IRS-EC = Interpersonal Reactivity Scale, Empathic Concern factor; IRS-PD = Interpersonal Reactivity Scale, Personal Distress; IRS-F = Interpersonal Reactivity Scale, Fantasy; VIA-MA = Vancouver Acculturation Index-Mainstream Acculturation subscale; VIA-HI = Vancouver Acculturation Index - Heritage Identification subscale SRP-CA = Self-Report Psychopathy Scale, Callous Affect subscale; SRP-M = Self-Report Psychopathy Scale, Manipulative subscale; SRP-EB = Self-Report Psychopathy Scale, Erratic Behaviour subscale; SRP-AB = Self-Report Psychopathy Scale, Antisocial Behaviour subscale; SRP-T = Self-Report Psychopathy Scale, Total score; EQ-CE = Empathy Quotient, Cognitive Empathy subscale; EQ-ER = Empathy Quotient, Emotional Reactivity subscale; EQ-SS = Empathy Quotient, Social Skills subscale; EQ-T = Empathy Quotient, total score *p<.05. **p<.01. 100 Correlation Matrices cont'd SRP- SRP- VIA- VIA- EQ- EQ- EQ-T 47 ms IRS- IRS-F CA M HI MA SS CE control PD 2008 ms -.21* -.14 -.07 .14 29** .02 .21* .07 -.09 .17* 47 ms -.20* .00 -.13 .27** .18* .13 .19* .14 -.13 .17 SRP-AB .39** .52** -.14 .12 -.01 -.11 -.25** -.02 -.25** -.08 BIDR-SDE -.04 .09 -.02 .04 .22** .31** .36** .06 -.11 .14 BIDR-IM -31** -.46** -.03 -.17* .01 .15 29** .12 .13 . -.12 Gender .38** .12 .12 .11 -.04 -.12 -.26** .10 -.35** -.22** IRS=PT -.25** -.25** .01 -.17* .02 .22* 30** .06 -.05 .19* IRS-EC -.58** -.14 .00 -.03 .15 .16 44** .03 .15 39** SRP-T .76** g4** -.08 .06 -12 .02 _ 32** .01 -.15 .08 SRP-EB .45** .56** -.05 .12 -.06 .05 -.18* -.03 -.11 .16 SRP-CA 1 .43** -.10 , -.05 -31** -.22** -.58** .03 -.13 . 24** SRP-M .43** 1 -.05 .07 .06 .20* -.05 .02 -.13 .24** VIA-HI -.10 -.05 1 -.03 -.07 .09 .01 .06 .13 .04 VIA-MA -.05 .07 -.03 1 27** .03 .08 .22* l 27** .03 EQ-SS -.31** .06 -.07 .27** 1 .31** .57** .06 -.25** .05 EQ-CE _ 22** .10* .09 .03 31** 1 .75** -.07 .02 32** EQ-T -.58** -.05 .01 .08 .57** 7^ ** 1 -.04 .01 .33** 47 ms .03 '.02 .06 .22* .06 -.07 -.04 1 -.08 .06 control IRS-PD -.13 -.13 .13 _ 27** -.25** .02 .01 -.08 1 .19* IRS-F -.24** 24** .04 .03 .05 32** 33** .06 .19* 1 Note. BIDR-SDE = Balanced Inventory of Social Desirable Responding-Self-Deception subscale; BIDR-IM = Balanced Inventory of Social Desirable Responding-Impression Management subscale; IRS-PT = Interpersonal Reactivity Scale, Perspective Taking factor; IRS-EC = Interpersonal Reactivity Scale, Empathic Concern factor; IRS-PD = Interpersonal Reactivity Scale, Personal Distress; IRS-F = Interpersonal Reactivity Scale, Fantasy; VIA-MA = Vancouver Acculturation Index-Mainstream Acculturation subscale; VIA-HI = Vancouver Acculturation Index - Heritage Identification subscale SRP-CA = Self-Report Psychopathy Scale, Callous Affect subscale; SRP-M = Self-Report Psychopathy Scale, Manipulative subscale; SRP-EB = Self-Report Psychopathy Scale, Erratic Behaviour subscale; SRP-AB = Self-Report Psychopathy Scale, Antisocial Behaviour subscale; SRP-T = Self-Report Psychopathy Scale, Total score; EQ-CE = Empathy Quotient, Cognitive Empathy subscale; EQ-ER = Empathy Quotient, Emotional Reactivity subscale; EQ-SS = Empathy Quotient, Social Skills subscale; EQ-T = Empathy Quotient, total score *p<.05. **p<.01. 101 Appendix C: Scales The Interpersonal Reactivity Scale (Davis, 1980) Personal Distress Factor 1. 
In emergency situations, I feel apprehensive and ill-at-ease. 2. I tend to lose control during emergencies. 3. Being in a tense emotional situation scares me. 4. When I see someone who badly needs help in an emergency, I go to pieces. 5. When I see someone get hurt, I tend to remain calm. 6. I am usually pretty effective in dealing with emergencies. 7. I sometimes feel helpless when I am in the middle of a very emotional situation Perspective-taking Factor 1. I believe that there are two sides to every question and try to look at them both. 2. When I 'm upset at someone, I usually try to "put myself in his shoes" for a while. 3. I try to look at everybody's side of a disagreement before I make a decision. 4. I sometimes find it difficult to see things from the "other guy's point of view. 5. Before criticizing somebody, I try to imagine how I would feel i f I were in their place. 6. If I 'm sure I 'm right about something, I don't waste much time listening to other people's arguments. 7. I sometimes try to understand my friends better by imaging how things look from their perspective. 102 Empathic Concern Factor 1. When I see someone being taken advantage of, I feel kind of protective towards them. 2. When I see someone being treated unfairly, I sometimes don't feel very much pity for them. 3. I often have tender, concerned feelings for people less fortunate than me. 4. I would describe myself as a pretty soft-hearted person. 5. Sometimes I don't feel very sorry for other people when they are having problems. 6. Other people's misfortunes do not usually disturb me a great deal. 7. I am often quite touched by things that I see happen. Fantasy Factor 1. When I am reading an interesting story or novel, I imagine how I would feel i f the events in the story were happening to me. 2. I really get involved with the feelings of the characters in the novel. 3. I am usually objective when I watch a movie or play and I don't often get completely caught up in it. 4. After seeing a play or movie, I have felt as though I were one of the characters. 5 . 1 daydream and fantasize, with some regularity, about things that might happen to me. 6. Becoming extremely involved in a good book or movie is somewhat rare for me. 7. When I watch a good movie, I can very easily put myself in the place of the leading character. 103 The Empathy Quotient (Baron-Cohen & Wheelwright, 2003) Cognitive Empathy Factor 1. I can easily tell i f someone else wants to enter a conversation. 2. I can pick up quickly i f someone says one thing but means another. 3. I am good at predicting how someone w i l l feel. 4. I am quick to spot when someone in a group is feeling awkward or uncomfortable. 5. I can easily tell i f someone is interested or bored with what I am saying. 6. I can sense i f I am intruding, even i f the other person doesn't tell me. 7. I can tune into how someone else feels rapidly and intuitively. 8. I can easily work out what another person might want to talk about. 9. I can tell i f someone is masking their true emotion. 10. I am good at predicting what someone w i l l do. 11. I can usually appreciate the other person's viewpoint, even i f I don't agree with it. Social Skills Factor 1 I find it difficult to explain to others things that I understand easily, when they don't understand it the first time. 2 I find it hard to know what to do in a social situation. 3 Friendships and relationships are just too difficult, so I tend not to bother with them. 
4 I often find it difficult to judge i f something is rude or polite. 5 I don't tend to find social situations confusing. 6 I don't consciously work out the rules o f social situations. 104 Emotional Reactivity Factor 1. I really enjoy caring for other people. 2. It is hard for me to see why some things upset people so much. 3. I find it easy to put myself in somebody else's shoes. 4. If I say something that someone else is offended by, I think that that's their problem, not mine. 5. I can't always see why someone should have felt offended by a remark. 6. Seeing people cry doesn't really upset me. 7. I get upset i f I see people suffering on news programmes. 8. Friends usually talk to me about their problems as they say that I am very understanding. 9. Other people often say that I am insensitive, though I don't always see why. 10.1 usually stay emotionally detached when watching a film. 11.1 tend to get emotionally involved with a friend's problems. No specific factor, part of EQ total score 1. People often tell me that I went too far in driving my point home in a discussion. 2. It doesn't bother me too much i f I am late meeting a friend. 3. In a conversation, I tend to focus on my own thoughts rather than on what my listener might be thinking. 4. When I was a child, I enjoyed cutting up worms to see what would happen. 5. If anyone asked me i f I like their haircut, I would reply truthfully, even i f I didn't like it. 105 6. I am very blunt, which some people take to be rudeness, even though this is unintentional. 7. When I talk to people, I tend to talk about their experiences rather than my own. 8. It upsets me to see an animal in pain. 9. I am able to make decisions without being influenced by people's feelings. 10. People sometimes tell me that I have gone too far with teasing. 11. If I see a stranger in a group, I think that it is up to them to make an effort to jo in in. 106 The Balanced Inventory of Desirable Responding (Paulhus, 1991) 1. M y first impressions of people usually turn out to be right. 2. It would be hard for me to break any of my bad habits. 3. Many people I meet are rather stupid. 4. I have not always been honestwith myself. 5. I always know why I like things. 6. When my emotions are aroused, it biases my thinking. 7. Many people think that I am exceptional. 8. I am not a safe driver when I exceed the speed limit. 9. I am fully in control of my own fate. 10. It's hard for me to shut off a disturbing thought. 11. I never regret my decisions. 12. I sometimes lose out on things because I can't make up my mind soon enough. 13. The reason I vote is because my vote can make a difference. 14. People don't seem to notice me and my abilities. 15. I am a completely rational person. 16. I rarely appreciate criticism. 17. I am very confident of my judgments 18. I have sometimes doubted my ability as a lover. 19. It's all right with me i f some people happen to dislike me. 20. I 'm just an average person. 21.1 sometimes tell lies i f I have to. 22. I never cover up my mistakes. 107 23. There have been occasions when I have taken advantage of someone. 24. I never swear. 25. I sometimes try to get even rather than forgive and forget. 26. I always obey laws, even i f I'm unlikely to get caught. 27. I have said something bad about a friend behind his/her back. 28. When I hear people talking privately, I avoid listening. 29. I have received too much change from a salesperson without telling him/her. 30. I always declare everything when asked by police or customs officials. 31. 
31. When I was young I sometimes stole things.
32. I have never dropped litter on the street.
33. I sometimes drive faster than the speed limit.
34. I never read sexy books or magazines.
35. I have done things that I don't tell other people about.
36. I never take things that don't belong to me.
37. I have pretended to be sick to get out of work or school.
38. I have never damaged a library book or store merchandise without reporting it.
39. I have some pretty awful habits.
40. I don't gossip about other people's business.

Self-Report Psychopathy Scale (SRP-III; Paulhus, Hemphill, & Hare, in press)

Callous Affect Factor
1. I often feel guilty about hurting other people's feelings.
2. Most people are wimps.
3. It tortures me to see an injured animal.
4. I am sometimes mean to people who love me.
5. My friends would probably say that I am a warm person.
6. I avoid horror movies.
7. I look after my own needs and other people have to look after theirs.
8. When I do something wrong, I feel ashamed afterward.
9. I never cry at movies.
10. People sometimes say that I'm cold-hearted.
11. I love violent sports and movies.
12. I'm a soft-hearted person.
13. People are too sensitive when I tell them the truth about themselves.
14. People cry way too much at funerals.
15. I never feel guilty over something I've done.
16. I sometimes dump friends that I don't need any more.

Interpersonal Manipulation Factor
1. I think I could "beat" a lie detector.
2. I purposely flatter people to get them on my side.
3. I have pretended to be someone else in order to get something.
4. I don't think of myself as tricky or sly.
5. I would get a kick out of 'scamming' someone.
6. I tend to trust other people to be honest.
7. It's fun to see how far you can push people before they get upset.
8. I find it difficult to manipulate people.
9. Most people have stolen something from a store.
10. Sometimes you have to pretend you like people to get something out of them.
11. People can usually tell if I am lying.
12. I can talk people into anything.
13. Most people tell lies every day.
14. You can get what you want by telling people what they want to hear.
15. A lot of people are "suckers" and can easily be fooled.
16. I would never step on others to get what I want.

Antisocial Behaviour Factor
During my teenage years I have...
1. Stolen money from my parents.
2. Tried to force someone to have sex.
3. Skipped out on paying for things such as movies, bus or subway rides, or food.
4. Cheated on school tests.
5. Spread an untrue story about someone else.
6. Handed in a school essay that I copied at least partly from someone else.
7. Been involved in delinquent gang activity.
8. Stolen (or tried to steal) a motor vehicle, such as a car or motorcycle.
During my teenage years I NEVER...
9. Broke into a building or vehicle in order to steal something or to vandalize.
10. Attacked someone physically with the idea of hurting him or her.
11. Talked back to a teacher.
12. Swore at my mother or father.
13. Took an illegal drug.
14. Shoplifted from a store.
15. Sneaked out of the house at night.
16. Got arrested.

Vancouver Index of Acculturation (VIA; Ryder, Alden, & Paulhus, 2000)
1. I often participate in my heritage cultural traditions
2. I often participate in mainstream North American cultural traditions
3. I would be willing to marry a person from my heritage culture
4. I would be willing to marry a North American person
5. I enjoy social activities with people from the same heritage culture as myself
6. I enjoy social activities with typical North American people
7. I am comfortable working with people of the same heritage culture as myself
8. I am comfortable working with typical North American people
9. I enjoy entertainment (e.g. movies, music) from my heritage
10. I enjoy North American entertainment (e.g. music, movies)
11. I often behave in ways that are typical of my heritage culture
12. I often behave in ways that are typically North American
13. It is important for me to maintain or develop the practices of my heritage culture
14. It is important for me to maintain or develop North American cultural practices
15. I believe in the values of my heritage culture
16. I believe in mainstream North American values
17. I enjoy the jokes and humor of my heritage culture
18. I enjoy typical North American jokes and humor
19. I am interested in having friends from my heritage culture
20. I am interested in having North American friends.
