UBC Theses and Dissertations


Motion cues enhance gaze processing Anderson, Nicola Christine Cole 2012


Full Text

MOTION CUES ENHANCE GAZE PROCESSING

by

Nicola Christine Cole Anderson

B.Sc., The University of British Columbia, 2009

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE STUDIES (Psychology)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

August 2012

© Nicola Christine Cole Anderson, 2012

Abstract

In four experiments, we investigated the role of motion in gaze perception. In Experiment 1, we developed and evaluated a comprehensive stimulus set of small eye movements at three gaze angles (1, 2 and 3 degrees of visual angle) and demonstrated that observers were able to detect and discriminate these small eye movements with a high degree of fidelity. In Experiments 2 and 3, we evaluated discrimination accuracy and confidence for dynamic and static gaze stimuli. We demonstrated that the high sensitivity to gaze in Experiment 1 was due predominantly to the presence of the motion signal in the video stimuli: accuracy for dynamic gaze was significantly higher than accuracy for static gaze. In addition, increasing the size of the gaze angle (i.e., signal strength) increased accuracy for static gaze, despite the fact that confidence for these stimuli remained consistently moderate. This latter result suggests that the dynamic gaze signal is qualitatively different from the static gaze signal. In Experiment 4, we tested this possibility by reversing the contrast polarity of half of the gaze stimuli, a manipulation that has been shown to disrupt normal gaze processing. We reasoned that if the perception of static and dynamic gaze is fundamentally different, contrast reversal would differentially affect these two types of gaze stimuli. Indeed, contrast reversal impaired the perception of static, but not dynamic, gaze, confirming that the perception of dynamic and static gaze is qualitatively different.
Preface

All research reported was conducted at UBC's Brain & Attention Research Laboratory and was supervised by Dr. A. Kingstone. I was responsible for all program creation, for the completion or supervision of data collection, and for the data analysis and writing of any work that resulted from the research. Ethical approval for this research was provided by UBC's Behavioural Research Ethics Board under approval number H10-00527.

A version of Chapter 2 has been published: Anderson, N. C., Risko, E. F., & Kingstone, A. (2011). Exploiting human sensitivity to gaze for tracking the eyes. Behavior Research Methods, 43, 843-852. I conducted the program creation, data collection, analysis and writing in close collaboration with Dr. Evan Risko.

Table of Contents

Abstract
Preface
Acknowledgements
Dedication
1 Introduction
  1.1 Human Sensitivity to Gaze Direction
  1.2 Specialized Gaze Processing
  1.3 Sources of Gaze Direction Information
  1.4 Potential Role of Motion
  1.5 The Present Investigation
2 Experiment 1: Evaluating Accuracy to Gaze Direction
  2.1 Introduction
  2.2 Eye Movement Database
  2.3 Gaze Sensitivity Evaluation
    2.3.1 Participants
    2.3.2 Apparatus
    2.3.3 Stimuli
    2.3.4 Procedure
  2.4 Results
    2.4.1 Data Handling
    2.4.2 Detection
    2.4.3 False Alarms
    2.4.4 Signal Detection
    2.4.5 Summary for Correct Detection and False Alarms
    2.4.6 Direction Discrimination
    2.4.7 Summary for Direction Discrimination
    2.4.8 Model Analysis
  2.5 Experiment 1 Discussion
3 Experiment 2: Motion and Gaze Direction: Videos vs. Static Stimuli
  3.1 Introduction
  3.2 Method
    3.2.1 Participants
    3.2.2 Apparatus
    3.2.3 Stimuli
    3.2.4 Procedure
  3.3 Results
    3.3.1 Design
    3.3.2 Accuracy
    3.3.3 Confidence
  3.4 Experiment 2 Discussion
4 Experiment 3: Motion and Gaze Direction: Dynamic vs. Static Displays
  4.1 Introduction
  4.2 Method
    4.2.1 Participants
    4.2.2 Apparatus
    4.2.3 Stimuli
    4.2.4 Procedure
  4.3 Results
    4.3.1 Accuracy
    4.3.2 Confidence
    4.3.3 Summary of Accuracy and Confidence
    4.3.4 Decision Time
  4.4 Discussion
5 Experiment 4: Motion, Luminance and Gaze Direction
  5.1 Introduction
  5.2 Method
    5.2.1 Participants
    5.2.2 Apparatus
    5.2.3 Stimuli
    5.2.4 Procedure
  5.3 Results
    5.3.1 Accuracy
    5.3.2 Confidence
    5.3.3 Summary of Accuracy and Confidence Results
  5.4 Accuracy When Confidence = 0
  5.5 Discussion
6 General Discussion
  6.1 Experiment Summaries and Conclusions
  6.2 Multiple Cues in Gaze Perception
  6.3 Study Limitations
  6.4 Future Directions
7 Conclusions
References

List of Tables

Table 2.1: Detection and discrimination means and standard deviations for each model.

List of Figures

Figure 2.1: Proportion of correctly detected eye movements. Error bars represent the standard error of the mean.
Figure 2.2: Proportion of correctly discriminated eye movements. Error bars represent the standard error of the mean.
Figure 2.3: Proportion of responses to each direction. Each axis on the graph represents the model's actual eye movement direction.
Figure 3.1: Accuracy for static and dynamic gaze at each gaze angle. Error bars represent the standard error of the mean.
Figure 3.2: Confidence for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.
Figure 4.1: Timeline of a static trial.
Figure 4.2: Accuracy for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.
Figure 4.3: Confidence for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.
Figure 4.4: Decision time for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.
Figure 5.1: Example reverse contrast stimulus.
Figure 5.2: Accuracy for left and right gaze at each gaze angle. Error bars represent the standard error of the mean.
Figure 5.3: Accuracy for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.
Figure 5.4: Accuracy for normal and reverse contrast gaze in the static condition. Error bars represent the standard error of the mean.
Figure 5.5: Accuracy for normal and reverse contrast gaze in the dynamic condition. Error bars represent the standard error of the mean.
Figure 5.6: Confidence responses for static and dynamic gaze at each gaze angle. Error bars represent the standard error of the mean.
Figure 5.7: Confidence for normal and reverse contrast gaze in the static condition. Error bars represent the standard error of the mean.
Figure 5.8: Confidence for normal and reverse contrast gaze in the dynamic condition. Error bars represent the standard error of the mean.
Figure 5.9: Proportion of correct responses when confidence was reported as 0 (i.e., a "guess" response). Error bars represent the standard error of the mean.

Acknowledgements

A huge thanks to my supervisor, Dr. Alan Kingstone, for his endless support and constant enthusiasm. Thank you for giving me this wonderful opportunity; it has changed my life. I would also like to thank all of the Brain and Attention Research Lab members for their support and their willingness to spend hours at the white board. A special thank you to Dr. Evan Risko for reading my extensive emails (and then answering them!), for teaching me how to love data, and for providing so much inspiration to all of us.

I wouldn't be here without the continuing support of my family. Thank you to my wonderful grandpa and my big brother for sparking my curiosity in vision and perception. Mike, I've always wanted to see through your eyes. Thank you to my grandma for teaching me that all education is worth the trouble (even if it's not a degree in literary theory!). Bill, thank you for all the chats by the fireplace and for helping me to think things through. Mommy, you're my best friend. Thank you for always truly listening to me. I owe a heartfelt thank you to Christopher for simply being there at the end of the day and for your encouragement and support throughout. I love you.

Dedication

For Grandpa.

1 Introduction

The eyes represent an important social stimulus. They indicate when, and where, others might be directing their attention in the world, which in turn helps to support complex social behaviour and cooperation among individuals. The ability to detect gaze, make eye contact, and follow gaze develops early (Maurer, 1985; Vecera & Johnson, 1995) and is thought to underpin the development of a theory of mind (Baron-Cohen, 1995).
In later life, gaze contact and gaze following in more dynamic settings have been shown to be important in regulating conversational turn-taking (Kendon, 1967), effectively communicating ideas and promoting conversational understanding (Richardson & Dale, 2005), and determining social status within a group (Foulsham, Cheng, Tracy, Henrich & Kingstone, 2010). In order to support these functions, the human attentional system appears to have evolved to prioritize gaze as a critical stimulus (e.g., Birmingham, Bischof & Kingstone, 2008) and to react to gaze direction efficiently, even automatically (Friesen & Kingstone, 1998). Indeed, there is much evidence suggesting that humans are experts at determining the location of another's gaze (e.g., Gibson & Pick, 1963; Cline, 1967; Anstis, Mayhew & Morley, 1969; Anderson, Risko & Kingstone, 2011). In the present investigation we focus on understanding the mechanisms that support this ability.

1.1 Human Sensitivity to Gaze Direction

Studies of human sensitivity to gaze direction typically fall into two camps: those that examine dyadic gaze and those that examine triadic gaze. Dyadic investigations typically ask whether a "looker" or model is looking at or away from the participant. Triadic gaze, in contrast, concerns the three-dimensional perception of gaze direction toward an object in space. In the following section I will outline previous work that examined dyadic gaze, and subsequently review work examining triadic gaze.

Humans are remarkably sensitive to whether or not someone is making eye contact with them. One of the first demonstrations of this sensitivity was the seminal work of James Gibson and Anne Pick (1963). In this work, observers judged whether a "looker" was gazing into their eyes from a distance of 200 cm. The "looker" looked either at the observer's nasion (the indentation where the nose meets the forehead, directly between the eyes) or at several targets arrayed horizontally behind the observer.
When the looker's head was oriented to the front, observers could correctly judge gaze direction (i.e., "the looker is no longer looking in my eyes") when the model's iris was deviated by approximately 1 minute of arc from the observer's perspective. Cline (1967) later corroborated these results, finding an acuity for horizontal iris deviations of 0.5 minutes of arc. Accuracy for gaze direction, particularly for eye contact, thus appears to be close to average human levels of visual acuity.

As gaze direction discrimination moves from dyadic to triadic, accuracy appears to decline. Cline (1967) found that when participants judged the location of a model's gaze to points arrayed away from mutual gaze, accuracy declined. Judgments of gaze deviations of 8 and 12 degrees of visual angle in eccentricity on both the vertical and horizontal planes were much less accurate than judgments of mutual gaze. Furthermore, as gaze deviations increased, accuracy decreased. Perhaps Cline's most significant contribution to the gaze perception literature, though, was his conclusion that dyadic gaze may constitute a special condition of gaze direction judgment that is extremely accurate compared to judgments of triadic gaze. Using a method similar to Cline's (1967), Anstis and colleagues (1969) found that at large gaze eccentricities, participants tend to overestimate gaze direction by up to 10 degrees of visual angle.
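The arcminute thresholds above can be related to physical iris displacement with elementary trigonometry: a lateral displacement d viewed from distance D subtends an angle of atan(d/D). A minimal sketch of the conversion (the function names are mine, not from the thesis):

```python
import math

def visual_angle_arcmin(displacement_mm: float, distance_mm: float) -> float:
    """Visual angle, in minutes of arc, subtended by a small lateral
    displacement viewed from a given distance."""
    return math.degrees(math.atan2(displacement_mm, distance_mm)) * 60.0

def displacement_mm(angle_arcmin: float, distance_mm: float) -> float:
    """Inverse: the lateral displacement that subtends a given angle."""
    return distance_mm * math.tan(math.radians(angle_arcmin / 60.0))

# At Gibson and Pick's 200 cm viewing distance, a 1 arcmin threshold
# corresponds to an iris displacement of roughly 0.58 mm on the looker's eye:
print(round(displacement_mm(1.0, 2000.0), 2))       # 0.58 (mm)
print(round(visual_angle_arcmin(0.58, 2000.0), 2))  # 1.0 (arcmin)
```

The sub-millimetre figure makes concrete why this level of gaze acuity is often described as approaching the limits of ordinary visual acuity.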
In the work by Cline (1967) and Anstis et al. (1969), the participant could not see the object at which the model was looking; thus, there was no joint visual attention between the model and the participant. To fill this gap, Symons, Lee, Cedrone and Nishimura (2004) conducted an experiment investigating joint attention, in which the object the model was looking at was visible to the participant. Participants judged whether the model was looking to the left or right of a coloured peg on a target board. In Experiment 1, the model sat directly across from the participant, while in Experiment 2 participants watched a digital version of the model making eye movements to the target peg. Participants were accurate to within approximately 1 to 1.5 degrees of visual angle (Symons et al., 2004, p. 13), which represents iris deviations of a few minutes of arc. Accuracy appeared to decline slightly (though not significantly) in the digital condition. Across both experiments, accuracy significantly decreased when the model covered one eye, a result that led the authors to suggest that both eyes are critically important in gaze direction judgements. In addition, Symons et al. (2004) explicitly compared static and dynamic gaze judgements in Experiment 2, finding that motion had no appreciable influence on gaze perception; if anything, it interfered with gaze judgements. This result will be discussed in detail in Section 1.3.

Bock, Dicke and Thier (2008) examined participants' accuracy in judging a model's gaze while the model looked at targets arrayed on a circular response board located between the model and the participant. The model's gaze deviation was approximately 15 degrees of visual angle when looking at any point around the circular board, and the study examined accuracy for gaze directions other than those on the horizontal and vertical planes. Participants showed both a cardinal and an upward bias: they were more likely to indicate that the model was looking at targets located on the cardinal planes (straight vertical and horizontal) when gaze was directed diagonally, and they were more likely to indicate that the model was looking above where gaze was actually directed.

Taken together, the work reviewed above demonstrates that humans are very good at judging both dyadic and triadic gaze. The former is limited solely by human levels of visual acuity, whereas the latter appears to contain some systematic elements of bias.
1.2 Specialized Gaze Processing

The sensitivity to both dyadic and triadic gaze demonstrated by the psychophysical work reviewed above has led to more recent speculation about a system or brain network specialized for the processing of gaze direction (e.g., the Eye Direction Detector; Baron-Cohen, 1995; Perrett & Emery, 1994; Langton, Watt & Bruce, 2000; Birmingham & Kingstone, 2009).

There is evidence for high-level mechanisms that process gaze direction. Seyama and Nagayama (2006) repeatedly presented participants with multiple images of models looking to the left or right. Following an adaptation phase of gaze directed to the left, direct gaze in the test phase appeared shifted to the right (and vice versa). Similarly, Jenkins, Beaver and Calder (2006) demonstrated that adaptation to left and right gaze caused gaze at test to appear reliably shifted toward straight ahead. This aftereffect persisted across changes in stimulus size and head orientation, suggesting that gaze-specific adaptation is not the result of low-level stimulus properties. Taken together, these adaptation studies suggest that there are distinct populations of neurons that specifically code for different gaze directions.

Recent brain imaging and neurophysiological work supports the idea of a specialized gaze processing network. Work using an fMRI adaptation procedure similar to the adaptation method of Jenkins and colleagues (2006) indicated that different gaze directions (left and right) are coded separately in the superior temporal sulcus (STS) and inferior parietal lobule (Calder et al., 2007). However, gaze information has also been shown, across a variety of studies and through a multitude of techniques, to be processed in the amygdala (Kawashima et al., 1999; Akiyama et al., 2007), the medial prefrontal cortex (Calder et al., 2002), and the fusiform gyrus (George, Driver & Dolan, 2001; Hoffman & Haxby, 2000).
Most commonly, gaze is thought to be processed in the posterior STS (pSTS; Nummenmaa & Calder, 2009; Allison, Puce & McCarthy, 2000; Nummenmaa, Passamonti, Rowe, Engell & Calder, 2010), an area that has been implicated in the perception of biological and implied motion (Allison et al., 2000), theory of mind (Gallagher & Frith, 2003; Saxe & Kanwisher, 2003), face processing (Haxby, Hoffman & Gobbini, 2000), as well as speech, lip-reading and audiovisual integration (for review, see Hein & Knight, 2008). Differential effects between mutual and averted gaze have also been found, with averted gaze eliciting stronger pSTS activation than mutual gaze (Nummenmaa & Calder, 2009).

Autism spectrum disorders (ASD) are associated with changes in white and grey matter in the pSTS and superior temporal gyrus (for review, see Zilbovicius et al., 2006). ASD is characterized by social and communicative deficits in behaviour (Dalton et al., 2005), along with pronounced impairments in coordinating visual attention with others and in understanding their intentions, thoughts and feelings, especially when making these inferences from images of the eyes (Baron-Cohen et al., 1999). In addition, neurological evidence has shown that autistic traits in the general population, as measured by the autism-spectrum quotient (Baron-Cohen, Wheelwright, Skinner, Martin & Clubley, 2001), are correlated with decreased white matter volume in the pSTS (von dem Hagen et al., 2010). ASD has largely been studied in reference to impairments in gaze processing, although it has been demonstrated that children with ASD perform comparably to controls in a task involving direct versus averted gaze (e.g., "which one is looking at you?"), with deficits appearing only when they are asked to infer the gazer's intention (Baron-Cohen et al., 1995).
For each experiment that localizes a specific aspect of gaze processing to a specific brain area (Hoffman & Haxby, 2000; Calder et al., 2007), as often as not there is a study with conflicting findings (Pelphrey, Viola & McCarthy, 2004; Kawashima et al., 1999; George et al., 2001). It has recently been proposed that the reason for these conflicting findings is that the experimental context changes across experiments (Birmingham & Kingstone, 2009). For example, Pelphrey and his colleagues (2004) found stronger STS activation for direct compared to averted gaze in a task involving a virtual actor dynamically shifting gaze either toward or away from the participant. This is in direct contrast to reports of stronger STS activation for averted compared to direct gaze in a task employing static stimuli (Hoffman & Haxby, 2000). Pelphrey and colleagues (2004) suggest that the reason for this difference lies in the use of dynamic as opposed to static gaze. Taken together, there is strong evidence that a specialized gaze processing system does exist. However, in order to narrow gaze processing down to a specific area or network of areas, future work will need to take context into consideration (Birmingham & Kingstone, 2009), especially considering that a subtle difference between static and dynamic gaze in the scanner may have a profound impact on the conclusions drawn (e.g., Hoffman & Haxby, 2000, versus Pelphrey et al., 2004).

One critical question in the literature is whether gaze perception involves specialized and dedicated brain architecture or whether it is the result of an integration of low-level featural information extracted from the eye as a stimulus. Humans possess the largest ratio of white sclera to dark iris of any primate species (Kobayashi & Kohshima, 2001). One
One  6  explanation for this difference is that the structure of the eye may have evolved to simplify the computation of gaze direction when gaze signalling became particularly adaptive in cooperative societies. In principle, properties of simple cells found in the visual cortex can signal gaze direction (Watt, 1999). 1.3  Sources of Gaze Direction Information Early psychophysical work demonstrated that the perception of gaze direction is a  function of the location of the iris within the visible part of the white sclera. As work has progressed, it appears that two fundamental information sources contribute to gaze direction processing: luminance and geometrical cues (Ando, 2002; 2004; Ricciardelli, Baylis & Driver, 2000; Bock et al., 2008; Symons et al., 2004; Jenkins, 2007; Olk, Symons & Kingstone, 2008). The luminance distribution across the eye is the relative amount of light and dark lowspatial frequency information from the sclera and iris that is visible to an observer. It has been proposed that the visual system uses the heuristic that it is the dark part of the eye - the iris - that "does the looking" (Ricciardelli et al., 2000, p. B12). When the luminance distribution is altered, the discriminability of gaze direction becomes impaired. Indeed, altering the luminance characteristics across the sclera and iris can even cause a change in the perceived direction of gaze. In a series of experiments, Ando (2002) presented participants with images of a model looking straight ahead or with eyes shifted to the left or right. Critically, on successive images, either the left or right side of the sclera was shaded slightly darker than the other side. Small shifts in iris deviation created rather large overestimation of gaze shifts (an iris displacement of about 2.5 minutes of arc produced an apparent gaze shift of 7.7 degrees visual angle). 
In other  7  words, the shading of the white sclera caused perceived gaze to shift dramatically in the direction of the darker sclera. The effect of luminance changes are so strong that altering the luminance characteristics across the sclera and iris can also cause a complete reversal in perceived gaze direction. Sinha (2000) demonstrated this effect with the development of the “Bogart Illusion.” When the luminance characteristics of the eye region were altered by reversing the contrast polarity of the sclera and iris, perceived gaze direction correspondingly reversed. Sinha (2000) suggested that the estimation of gaze direction is “cognitively impenetrable” because participants reported that they knew that the iris was represented by the light region of the eye, but nevertheless perceived gaze direction consistent with the dark sclera (Sinha, 2000, p. 1005). Ricciardelli and colleagues (2000) extended this work in a more thorough experimental investigation of this phenomenon. Participants were presented with images of normal and contrast reversed gaze, with gaze directed left, right or straight ahead, crossed with left, right or straight ahead head orientations. Perceived gaze direction was reversed for contrast reversed gaze, except when both the gaze and the head of the model was positioned straight ahead. This was true whether or not the surrounding face context was positive or negative polarity. Ricciardelli and colleagues (2000) suggested, like Sinha (2000), that the visual system indeed uses an inflexible contrast rule when determining gaze direction. Geometrical cues refer to the high spatial frequency information present in the form of the iris and its location within the sclera and surrounding eyelid. 
The work reviewed above highlights that observers are extremely sensitive to the gaze of others (Gibson & Pick, 1963; Cline, 1967; Anstis et al., 1969); however, at larger deviations from central gaze, observers tend to overestimate gaze direction (Anstis et al., 1969). Anstis and colleagues (1969) 'removed' the eyelids from an artificial eye and socket, and showed that the observed overestimation of gaze direction was attenuated. This was taken as evidence that the position of the eyes is judged "largely by the position of the iris relative to the eyelids" (p. 488), supporting a geometrical account of gaze discrimination. It is possible that the overestimation effect is a result of the iris not being visible between the two sides of the sclera, reducing the amount of geometrical information available and, therefore, reducing accuracy for gaze direction. Thus, the geometrical account posits that the circular (and darker) iris deviates relative to the amount of sclera visible between the eyelids, signalling the direction of gaze. The calculation of gaze direction appears to rely on both the luminance distribution across the eye and the geometrical cues provided by the circular iris and its spatial location within the sclera. Recently, Jenkins (2007) developed a visual illusion wherein one half of a model’s eyes contained high spatial frequency pupil information, while the other half contained low spatial frequency, darker pupil information. Perceived gaze direction was shifted toward the darker side at large viewing distances, but toward the lighter, more detailed side at closer viewing distances. This indicates that both geometrical and luminance information is useful for gaze direction discrimination, depending on the particular context or task constraint employed. The multiple-cue account of gaze perception was further emphasized in recent work by Olk and colleagues (2008).
In this work, participants were presented with images of a model looking either left, right or straight ahead. These gaze directions were crossed with images where the model’s head was directed left, right or straight ahead. The images were either normal or contrast reversed. Head direction shifted to the left or right was proposed to alter geometrical information by changing the shape of the eyes. Reversing the contrast polarity was proposed to alter luminance information. Critically, rather than making a left-right forced-choice decision, participants were allowed to respond “straight ahead”, in contrast to the earlier work by Ricciardelli and colleagues (2000). Results suggested that when head direction and perceived gaze direction were in opposition (e.g. head left, gaze right or reverse contrast gaze left), participants were most likely to respond that gaze was directed straight ahead. Based on this result, the authors suggest that perceived gaze direction reflects the outcome of a competition between geometrical and luminance information. Similarly, Ando (2002) hypothesized that gaze discrimination may be a compromise between slow processing of high spatial frequency geometrical cues and fast processing of low spatial frequency luminance cues. Taken together, these accounts indicate that the perception of gaze direction likely utilizes the cues that provide the most information given the particular context. 1.4  Potential Role of Motion  Although previous work suggests that luminance and geometrical cues are critical for efficient processing of the direction of another’s gaze, in the present investigation we were interested in another potential source of directional information: the movement of the eyes. Motion is a rich signal that provides cues about object speed and direction (Hubel & Wiesel, 1979; Maunsell & Van Essen, 1983; Movshon & Newsome, 1992), depth (e.g.
Rogers & Graham, 1979), and whether an object is biological (e.g. Johansson, 1973). In addition, the human perceptual system is highly sensitive to motion signals (Nakayama, 1985). Thus, it seems possible that motion can provide a cue to the direction of gaze. Motion is said to be a "primary sensation" (Coren, Ward & Enns, 1999, p. 405). Evidence for this claim comes from work examining adaptation, where repeatedly exposing the visual system to a moving pattern causes fatigue in movement-sensitive neurons, resulting in a decreased ability to detect motion in the same direction (e.g. Hunzelmann & Spillman, 1984; Coren et al., 1999). In addition, motion-selective cells have been observed in the primary visual cortex of monkeys (Hubel & Wiesel, 1968). A candidate pathway for the perception of motion information from the eye is the magnocellular visual pathway of the geniculostriate system. The geniculostriate system is divided into two distinct pathways: the slower parvocellular pathway, primarily implicated in form and colour perception, and the faster magnocellular pathway, primarily implicated in the detection of luminance changes (i.e. motion perception; Coren et al., 1999). If motion acts as a cue to gaze direction, it may do so independently of luminance and geometrical cues, a point that will be revisited in the General Discussion. Interestingly, research to date assessing the contribution of eye motion to gaze discrimination has found little support for the intuitive notion that motion contributes to the perception of gaze direction. For example, Symons et al. (2004) asked observers to indicate whether a model was looking to the left or right of a series of defined targets on a peg board located below and between the observer and model.
The model either made approximate eye contact with an observer, then fixated a peg on the board (dynamic condition), or occluded her eyes while they were moving and revealed them to the observer once she fixated the appropriate peg (static condition). No difference in accuracy was found between the dynamic and static conditions. Bock et al. (2008) asked observers to triangulate a model's gaze to targets closely spaced around a circular board, the circumference of which was located at approximately 15 degrees of visual angle in eccentricity from the model's straight gaze. Bock et al. (2008) compared dynamic and static gaze in both live and photographed conditions. Again, no significant differences in acuity were found between dynamic and static gaze. A close reading of these studies suggests that the tasks used in the Symons et al. (2004) and Bock et al. (2008) investigations may have minimized the contribution of motion information by requiring participants to make fine discriminations by triangulating a model's gaze to targets at large eccentricities from mutual gaze. If the motion of the eyes acts as a cue to gaze direction, then it may provide little information relevant to this type of eye-target triangulation. For example, in Symons et al. (2004), the observer was asked to report whether the model was looking slightly to the left or right of a given target peg. This target peg could appear either to the left or right of the model. The global motion of a model's eye movement would always be consistent in this task (e.g. a large eye movement to the left or right), regardless of which side of the target peg the model looked at. In this case, motion may not provide an effective cue for fine target triangulation. Based on the work reviewed above, motion does not appear to influence gaze perception.
The question remains whether this conclusion is valid only for situations involving fine levels of target triangulation, or whether it extends to a more general principle of motion in gaze processing. For example, in the triangulation tasks used by Symons and colleagues (2004) and Bock and colleagues (2008), participants had to follow the gaze of a model who looked first at their nasion and then at the target board in the dynamic condition. Once the model had initiated an eye movement, it seems plausible that the participant would extract the initial directional information from the movement of the model's eyes and then move their own eyes to the approximate area on the target board. The task was to match the perceived direction of the model's gaze once the model had already fixated (i.e. static eye position) the appropriate target. In addition, in these tasks, the model's eye movement to the target board would be large enough to require the participant to make an eye movement themselves in order to follow the model's gaze to the target. This would obscure any motion of the model's eyes near the target. In contrast, in a left/right discrimination task, in line with the work investigating the cues used in gaze perception (Ricciardelli et al., 2000; Ando, 2002), motion may provide a much stronger signal. In a discrimination task, the participant would compare successive gaze positions as they change. Thus, in order to investigate motion as a cue used for gaze perception, it is critical to examine it in a discrimination, rather than triangulation, paradigm. This is the focus of Experiments 2 - 4. 1.5  The Present Investigation  In a series of experiments, we first examined the sensitivity of the human perceptual
Previous work has focussed on very small deviations of gaze in a dyadic setting (Gibson & Pick, 1963; Cline, 1967), and very large deviations of gaze in triadic gaze triangulation (Symons et al., 2004; Bock et al., 2008). In Experiment 1, we examined gaze sensitivity to small eye movements in a detection and discrimination task, rather than a dyadic or triadic task. We developed a new stimulus set of small (1, 2 and 3 degree visual angle) eye movements in 8 radial directions. This stimulus set was then evaluated in an eye movement detection and discrimination task. In Experiment 2, we evaluated the role of motion in fine gaze direction discrimination judgments using a left-right, forced-choice design more in line with previous work examining the cues used in gaze perception (Ando, 2002; Ricciardelli et al., 2000). We compared accuracy to left-right gaze in static (final eye position only) and dynamic (video of the eye looking straight ahead, then to the left or right) conditions. For completeness, in Experiment 3, we matched the static and dynamic stimuli used in Experiment 2 by creating apparent motion using two frames of the videos used in Experiment 2, or by eliminating the motion signal by presenting a short flicker (blank screen) between the two motion frames. In Experiment 4, we evaluated whether dynamic gaze is qualitatively different from static gaze. We manipulated the contrast polarity of static and dynamic gaze stimuli in order to disrupt normal gaze processing across static and dynamic stimuli. If this contrast reversal affects both static and dynamic gaze equally, then the two types of gaze stimuli draw from similar sources of information. Alternatively, if contrast reversal differentially affects static and dynamic gaze, we can conclude that the two types of gaze stimuli are qualitatively different.
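In essence, the contrast polarity manipulation described above inverts pixel intensities in the eye region of each frame, so that the dark iris becomes light and the light sclera becomes dark. A minimal sketch of such a manipulation, assuming 8-bit grayscale frames represented as nested lists and a hypothetical rectangular eye region (this is an illustration, not the actual stimulus-processing code used in the thesis):

```python
def reverse_contrast(frame, region):
    """Invert 8-bit grayscale intensities inside a rectangular region.

    frame  - list of rows of pixel values in 0..255
    region - (top, left, bottom, right) bounds of the eye region
             (hypothetical coordinates for illustration)
    """
    top, left, bottom, right = region
    out = [row[:] for row in frame]  # copy so the original frame is untouched
    for y in range(top, bottom):
        for x in range(left, right):
            out[y][x] = 255 - out[y][x]  # dark iris -> light, light sclera -> dark
    return out
```

Applied frame by frame, the same operation produces contrast reversed versions of both static images and dynamic videos, which is what allows the manipulation to be matched across the two stimulus types.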
2  Experiment 1: Evaluating Accuracy to Gaze Direction1  2.1  Introduction  The aim of the present experiment was first, to generate a comprehensive stimulus set of small eye movements suitable for psychophysical exploration and second, to evaluate human sensitivity to small changes in gaze position. This is in contrast to previous work that utilized rather large eye movements in examining sensitivity to gaze (Symons et al., 2004; Bock et al., 2008). Given that previous psychophysical work found high sensitivity for dyadic gaze when the head is directed straight ahead, we chose a paradigm similar to that used by Cline (1967) and Anstis and colleagues (1969). Participants judged whether a model made an eye movement in one of 8 radial directions. In addition, like Cline (1967) and Anstis and colleagues (1969), no third object was visible to participants; they simply responded whether an eye movement was made, and if so, in which direction. This eliminated any distortions from having a 'third' object visible to the participants (Lobmaier, Fischer & Schwaninger, 2006), the necessity of complicated set-ups, and the need to have the model look down at a response board. In addition, this methodology allowed for the detailed recording of small changes in the model's eye movements using a high-resolution web camera. We expected that even though the model was not looking directly at the camera, these small eye movements would be detectable to observers, given that the head was directed straight ahead.  1  A version of Chapter 2 has been published. Anderson, N.C., Risko, E.F., and Kingstone, A. (2011). Exploiting human sensitivity to gaze for tracking the eyes. Behavior Research Methods, 43, 843-852.
2.2  Eye Movement Database  We recorded a total of 1512 gaze stimuli from four models (2 men and 2 women), of which 1344 contained an eye movement in one of 8 radial directions at three possible gaze angles (1, 2 or 3 degrees of visual angle) and 168 were videos of the same models not making an eye movement (i.e., fixating the center of a display). To create the gaze stimuli, each model was placed in a chin rest and asked to make eye movements in response to targets that were presented on a Dell 2407WFP 17 inch (diagonal) monitor located approximately 60 cm in front of the model. An SR Research Eyelink 1000 was used in conjunction with SR Research Experiment Builder to present the stimuli and verify the eye movements made. Eye gaze position was tracked by pupil and corneal reflection and was sampled at a rate of 1000 Hz. Before beginning, a 9 point calibration and validation was conducted. The web camera was centered at the top of the monitor and angled slightly downward to record videos of the model’s gaze. The web camera’s native software was used to position and zoom the camera such that each person's head covered roughly 10 cm of the video window. Each model participated in two sessions of eye movement recording in order to generate stimuli at two different web camera resolutions: one in which the camera was set to encode at 640x480 pixels at a frame rate of 15 fps, referred to as the low resolution condition, and one in which the camera was set to encode at 960x720 pixels at a frame rate of 15 fps, referred to as the high resolution condition. Sessions were counterbalanced across gender, such that one female and one male started with the high resolution session and the other male and female started with the low resolution session. Each session lasted approximately one hour. The high resolution video clips were resized to 640x480 pixels.
Thus they were identical in size to the low resolution videos, but contained more pixels per inch and were subjectively more clear and detailed. Each individual “trial” consisted of the presentation of a central fixation cross in conjunction with a tone that was used to signal the beginning of each trial. The model was required to fixate the cross for up to 1000 ms, after which a circular target (12 pixels in diameter, subtending 0.68º visual angle) would appear in one of the eight radial directions and at one of the three gaze angles as described above. The model was then instructed to move their eyes to this circular target. In the no eye movement condition, the central fixation cross was replaced by the target. To ensure that the model’s eye movements corresponded to the correct gaze angle and direction (within the limits of the Eyelink 1000's eye-tracking capabilities), each target was surrounded by an invisible interest area (circular, 32 pixels in diameter, subtending 1.8º visual angle) and the model was required to accurately fixate within this interest area before the trial would terminate. Each of the possible 24 eye movement locations (8 directions by 3 gaze angles) was repeated 7 times throughout stimulus generation, with 3 no eye movement trials for each of the 3 targets that would appear in any one direction, for a total of 189 trials: 168 eye movement trials and 21 no eye movement trials. Trials were randomized, with breaks and re-calibrations as needed throughout. These videos were then manually cut into 1.5 second long segments containing the gaze stimuli using ffmpeg, open source video processing software published under the GNU Lesser General Public License version 2.1. The tone that occurred in conjunction with the central fixation during stimulus generation was used as a reference for the beginning of the trial.
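The gaze-contingent accuracy check described above reduces to a simple geometric test: a trial terminates only when the model's fixation lands inside the circular invisible interest area around the target. A minimal sketch (the pixels-per-degree constant follows from the 12 px / 0.68º target reported above; function and variable names are illustrative, not taken from the Experiment Builder project):

```python
import math

# ~17.6 px per degree, implied by the 12 px target subtending 0.68 deg
PX_PER_DEG = 12 / 0.68

def within_interest_area(fix_x, fix_y, target_x, target_y, diameter_px=32):
    """True if a fixation (in pixels) falls inside the circular invisible
    interest area (32 px, ~1.8 deg of visual angle) centred on the target."""
    return math.hypot(fix_x - target_x, fix_y - target_y) <= diameter_px / 2
```

Note that the 32 px interest area diameter and the 12 px / 0.68º target are mutually consistent: 32 px divided by PX_PER_DEG gives roughly the 1.8º reported in the text.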
A typical clip would show the person looking at the fixation point, then executing an eye movement or not (depending on the particular trial). Clips did not show the person moving their eyes back to the central fixation cross. 2.3  Gaze Sensitivity Evaluation  2.3.1 Participants  Eight participants (4 men and 4 women) volunteered to participate in the experimental phase. Presentation order was counterbalanced such that half saw the high resolution videos first, and the other half saw the low resolution videos first. 2.3.2 Apparatus  Stimuli were presented on a Dell 2407WFP 17 inch monitor with a resolution of 1024x768 pixels. Participants were instructed to sit comfortably and adjust the chair such that their eyes were slightly higher than the middle of the monitor, which was 60 cm away. 2.3.3 Stimuli  Stimuli were the 1512 web camera video clips from the stimulus generation phase described above. The SR Research Experiment Builder was used to display the stimuli. In order to load the videos into Experiment Builder, they needed to be passed through the DivX codec (DivX Inc, San Diego, CA) and have their audio removed. SR Research’s SplitAVI tool was used to batch convert and remove the audio of all the video files. Excess space around the clip was coloured black. On average, the models’ heads subtended 9.76º of visual angle horizontally. 2.3.4 Procedure  Participants were instructed that on each trial they would be presented with a video clip and that they had to decide in which direction an eye movement was made, or whether no eye movement was made. Each trial consisted of one video clip presented for its entire duration (i.e., 1.5 s). After the video clip ended, the last frame of the clip remained visible on the screen until a response was made or 10 s elapsed. Participants responded to the direction of the eye movement that they perceived using the number pad.
Each number on the number pad corresponded to one of the possible radial directions presented in the clip. Thus, if a participant perceived that the eye moved up and to the left, they would respond using the “7” key on the number pad. If they perceived the eye to be moving down and to the right, they would respond using the “3” key on the number pad, etc. If the participant did not perceive an eye movement at all, they would press the “5” key on the number pad. After the participants responded, a blank screen was presented, and participants pressed the space-bar to proceed to the next trial. Each participant completed two separate testing sessions, counterbalanced across resolution. Within each session, trials were randomly blocked by model, such that participants would respond to all 189 gaze clips of one model, then move on to the next, etc. Each session lasted approximately 45 minutes. 2.4  Results  2.4.1 Data Handling  Responses were broken down into two components: detection and discrimination. Correct detection occurred if the participant responded that the eye moved on eye movement trials, regardless of whether they were correct in their direction assessment. Direction discrimination was defined as the number of eye movements at each gaze angle and direction that were correctly discriminated (i.e. participants responded with the correct direction) out of the number of eye movements at each gaze angle and direction that were correctly detected. These two measures, detection and discrimination, were analysed separately and the results are presented below. Figure 2.1 and Figure 2.2 show the proportion correctly detected and discriminated, respectively, for each gaze angle and at each direction. 2.4.2 Detection  A 2 (resolution) x 3 (gaze angle) x 8 (direction) analysis of variance was conducted on the proportion of eye movements that were correctly detected.
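The two scoring measures defined in Data Handling can be sketched as follows; note that discrimination is conditional on detection, so the two measures are not redundant. The trial records and field names here are illustrative, not the actual data format used in the thesis:

```python
def score(trials):
    """Split responses into detection and detection-conditioned discrimination.

    Each trial is a dict with (illustrative field names):
      'moved'     - True if the clip contained an eye movement
      'response'  - reported direction ('N', 'NE', ...) or None for
                    a "no eye movement" response
      'direction' - actual direction, or None on no-movement trials
    """
    movement = [t for t in trials if t['moved']]
    detected = [t for t in movement if t['response'] is not None]
    correct = [t for t in detected if t['response'] == t['direction']]
    detection = len(detected) / len(movement)
    # discrimination counts only trials that were already detected
    discrimination = len(correct) / len(detected) if detected else 0.0
    return detection, discrimination
```

For example, a participant who notices two of three eye movements but names the direction of only one of them scores 2/3 on detection and 1/2 on discrimination.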
There was a marginal main effect of resolution, F(1,7) = 3.79, MSE = 0.11, p = .093, such that eye movement detection was better for the high resolution videos (0.72) than the low resolution videos (0.69). There was a main effect of gaze angle, F(2, 14) = 266.31, MSE = 0.27, p < .001, such that 2 degree eye movements (0.79) were more likely to be detected than 1 degree eye movements (0.41), t(7) = 28.5, p < .001, and 3 degree eye movements (0.92) were more likely to be detected than 2 degree eye movements, t(7) = 5.52, p < .001. In addition, the difference in detection between 1 and 2 degree eye movements (0.38) was significantly larger than the difference in detection between 2 and 3 degree eye movements (0.12), t(7) = 10.11, p < .001. There was also a main effect of direction, F(7,49) = 19.54, MSE = 0.27, p < .001. Inspection of the mean detection rates at each direction showed an advantage for upward eye movements (NE, N and NW) and a paired samples t-test confirmed that these eye movements (0.80) were significantly easier to detect than the eye movements in all other directions (0.67), t(7) = 8.81, p < .001. There was a significant interaction between resolution and gaze angle, F(2,14) = 13.44, MSE = 0.06, p < .001. Paired t-tests comparing the effect of resolution (high – low) at the three gaze angles revealed that the effect of resolution decreased with increases in the size of the eye movement. Specifically, the effect of resolution at a gaze angle of 1 degree (0.08) was marginally larger than the effect of resolution at 2 degrees (0.03), t(7) = 2.14, p = .070, and the effect of resolution at 2 degrees was significantly larger than the effect of resolution at 3 degrees (-0.01), t(7) = 3.01, p = .020. Only the effect of resolution in the 1 degree eye movement condition was significantly different from 0, t(7) = 6.32, p < .001. There was a marginally significant interaction between resolution and direction, F(7, 49) = 2.01, MSE = 0.01, p = .072.
Inspection of the means suggested a large effect of resolution in the straight downward direction and, in fact, the effect of resolution (high – low) on detection of straight downward eye movements (0.09) was significantly greater than the effect of resolution in all other directions (0.03), t(7) = 4.89, p = .002. There was a significant interaction between gaze angle and direction, F(14, 98) = 8.43, MSE = 0.06, p < .001. In order to interpret this interaction, the slope relating the size of the eye movement to detection was calculated for each subject at each direction. Thus, shallower slopes indicate a smaller effect of eye movement size on detection. Visual inspection of the gaze angle by direction slopes indicated that eye movement size had less of an effect on the upward eye movements. A paired samples t-test confirmed that the average slope of the upward directions (NW, N, NE; 0.19) was significantly shallower than the average slope of the other directions (0.29), t(7) = 5.724, p < .001. Lastly, there was a significant resolution by gaze angle by direction interaction, F(14, 98) = 2.88, MSE = 0.01, p = .001. The slopes relating gaze angle to detection were again calculated, this time for each subject at each direction and resolution. The interaction appears to result from the fact that increasing resolution reduces the effect of gaze angle, but only in certain directions: S (slope difference, 0.07), t(7) = 3.61, p = .009, NW (slope difference, 0.09), t(7) = 8.96, p < .001 and N (slope difference, 0.07), t(7) = 3.56, p = .009. Figure 2.1 shows the proportion of correctly detected eye movements across all conditions. Figure 2.1: Proportion of correctly detected eye movements. Error bars represent the standard error of the mean.
2.4.3 False Alarms  A univariate analysis of variance was conducted on the proportion of false alarms on no eye movement trials (i.e., the participant indicated that an eye movement was made when it was not). False alarms were rare (0.12) and a paired samples t-test revealed that there was no significant difference between high (0.125) and low (0.121) resolution trials, t(7) = 0.166, p = .873. 2.4.4 Signal Detection  We also conducted a signal detection analysis. A measure of sensitivity, d', was calculated for each subject across each resolution, gaze angle and direction. In order to account for cases where the hit rate was 100% and d' takes on an unbounded value, hit rate proportions of one were converted, as suggested by Macmillan and Creelman (1991), to [1 – 1/(2N)]. d' values were submitted to a 2 (resolution) x 3 (gaze angle) x 8 (direction) analysis of variance. The pattern of results was similar, as might be expected, to the detection results above; however, the marginal main effect of resolution (in the detection analysis) disappeared, F(1,7) = 0.245, MSE = 0.723, p = .636. The only other difference in this analysis was in the resolution by gaze angle interaction, F(2,14) = 6.74, MSE = .706, p = .009. The effect of resolution at 1 degree (0.20) was no longer marginally different from the effect of resolution at 2 degrees (0.14), t(7) = 0.606, p = .564, and in addition, it was no longer significantly different from zero, t(7) = 1.24, p = .256. Nonetheless, the pattern of results using d’ was largely equivalent to the detection results. Critically, the d' values at 1 degree (1.04) were significantly different from zero in a one-sample t-test, t(7) = 17.91, p < .001, indicating that viewers were indeed sensitive to these small eye movements, over and above chance. In the next section, direction discrimination while holding eye movement detection constant is assessed.
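The d' computation with the extreme-rate correction described above can be sketched as follows (a standard equal-variance signal detection calculation; the symmetric adjustment of zero rates to 1/(2N) is an assumption extending the 1 - 1/(2N) rule quoted from Macmillan and Creelman, 1991):

```python
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    """Sensitivity d' = z(hit rate) - z(false alarm rate).

    Rates of exactly 1 or 0 are adjusted to 1 - 1/(2N) or 1/(2N)
    so that the inverse-normal (z) transform stays finite.
    """
    z = NormalDist().inv_cdf

    def adjusted_rate(successes, failures):
        n = successes + failures
        rate = successes / n
        if rate == 1.0:
            rate = 1 - 1 / (2 * n)
        elif rate == 0.0:
            rate = 1 / (2 * n)
        return rate

    hit_rate = adjusted_rate(hits, misses)
    fa_rate = adjusted_rate(false_alarms, correct_rejections)
    return z(hit_rate) - z(fa_rate)
```

Without the correction, a subject who detects every 3 degree eye movement would have an infinite d'; with it, a perfect 100-trial hit rate becomes 0.995 and d' remains finite.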
2.4.5 Summary for Correct Detection and False Alarms  In summary, detection was better for larger eye movements and for eye movements in the upward directions. The size of the eye movement had less effect on the detection of upward eye movements than those in the other directions. Resolution improved the detection of 1 degree eye movements in general and also improved detection of straight downward eye movements, but had no effect on the proportion of false alarms, which were relatively infrequent. Sensitivity measures closely matched those of the detection results, but also revealed a significant advantage of the 1 degree eye movements over chance. 2.4.6 Direction Discrimination  Correct discrimination was calculated as the number of correct responses divided by the total number of eye movements that were detected in each direction and at each gaze angle. This provided a discrimination value that was corrected for detection. A 2 (resolution) x 3 (gaze angle) x 8 (direction) analysis of variance was conducted on these values. There was a main effect of gaze angle, F(2, 14) = 180.23, MSE = 0.02, p < .001, such that responses were significantly more accurate for 2 degree eye movements (0.68) than 1 degree eye movements (0.44), t(7) = 10.82, p < .001, and significantly more accurate for 3 degree eye movements (0.81) than 2 degree eye movements, t(7) = 15.07, p < .001. In addition, the difference in accuracy between 1 and 2 degree eye movements (0.24) was significantly greater than the difference in accuracy between 2 and 3 degree eye movements (0.13), t(7) = 4.86, p = .002. There was a main effect of direction, F(7,49) = 21.60, MSE = 0.05, p < .001.
Inspection of the direction means showed an advantage for the cardinal directions (N, E, S, W) over the oblique directions (NW, NE, SW, SE), and averaging accuracy into cardinal and oblique groups showed a significant advantage in terms of accuracy for cardinal (0.77) relative to oblique (0.51) directions, t(7) = 9.32, p < .001. There was a significant interaction between resolution and direction, F(7,49) = 5.31, MSE = 0.01, p < .001. Inspection of the means indicated that resolution had less effect on cardinals than obliques. Again, means were grouped into cardinal and oblique directions, and the effect of resolution was significantly higher in obliques (0.09) than cardinals (0.02), t(7) = 3.30, p = .013. In addition, single sample hypothesis tests showed that the effect of resolution on oblique directions was significantly different from 0, t(7) = 3.57, p = .009, and that the effect of resolution on the cardinal directions was not significantly different from 0, t(7) = 0.388, p = .709. There was a significant interaction between gaze angle and direction, F(14, 98) = 8.52, MSE = 0.01, p < .001. The slope relating gaze angle to discrimination was calculated for each subject at each direction, shallower slopes indicating a smaller effect of eye movement size on discrimination. Again, grouping slope scores into cardinal and oblique directions showed that cardinals (mean slope, 0.12) were less affected by gaze angle than obliques (mean slope, 0.25), t(7) = 6.69, p < .001. No other significant interactions were found. 2.4.7 Summary for Direction Discrimination  Direction discrimination improved with the size of the eye movement, and this improvement was more pronounced between 1 and 2 degree eye movements than between 2 and 3 degree eye movements. Cardinal eye movements were better discriminated than oblique eye movements, and discrimination of cardinal eye movements was less affected by the size of the eye movement than discrimination of the obliques.
Lastly, resolution improved discrimination of the oblique eye movements but not eye movements in the cardinal directions. Figure 2.2 shows the proportion of correctly discriminated eye movements across all conditions. Figure 2.2: Proportion of correctly discriminated eye movements. Error bars represent the standard error of the mean. 2.4.8 Model Analysis  We conducted a 2 (resolution) x 3 (gaze angle) x 4 (model) x 8 (direction) analysis of variance on both the detection and discrimination results. In the detection results, there was a significant resolution by model interaction, F(3,21) = 8.85, MSE = 0.05, p < .001, a gaze angle by model interaction, F(6,42) = 6.74, MSE = 0.03, p < .001, and a model by direction interaction, F(21,147) = 5.22, MSE = 0.02, p < .001. In addition, there was a significant 3-way interaction between gaze angle, model and direction, F(42,294) = 2.36, MSE = 0.02, p < .001. All other interactions were the same as in the previous detection results. In the discrimination results, there was a significant main effect of model, F(3,21) = 13.37, MSE = 0.06, p < .001, a significant resolution by model interaction, F(3,21) = 5.03, MSE = 0.07, p = .009, a significant gaze angle by model interaction, F(6,42) = 2.46, MSE = 0.06, p = .040, a significant model by direction interaction, F(21,147) = 3.15, MSE = 0.07, p < .001, a significant resolution by model by direction interaction, F(21,147) = 1.75, MSE = 0.05, p = .030, as well as a significant gaze angle by model by direction interaction, F(42,294) = 2.37, MSE = 0.04, p < .001. All other interactions were the same as in the previous discrimination results.
Inspection of the means suggested that the model's gender may have influenced the detection and discrimination of the eye movements. We report the previous analysis to support the notion that accuracy of eye movement detection will vary with model, as can be expected given variation in eye shape, pupil size, etc. We do not attempt to break down or provide explanations for these interactions given that (a) there were only four models and (b) any differences are likely idiosyncratic to the models we used and thus of little value to researchers. However, the idea of individual differences in eye movement discriminability certainly warrants further investigation, if for no other reason than the potential evolutionary consequences of variation in this aspect of our physical makeup (Kobayashi & Kohshima, 2001). Table 2.1 provides a breakdown of the detection and discrimination means for each model. Table 2.1: Detection and discrimination means and standard deviations for each model.

                           Model 1 (M)   Model 4 (M)   Model 2 (F)   Model 3 (F)
Detection mean (SD)        0.64 (0.48)   0.61 (0.49)   0.64 (0.48)   0.63 (0.48)
Discrimination mean (SD)   0.59 (0.35)   0.59 (0.35)   0.62 (0.33)   0.69 (0.34)

2.5  Experiment 1 Discussion  In this first investigation we sought to develop a stimulus set for examining human sensitivity to very small eye movements. The results revealed that eye movements as small as 1 degree of visual angle can be detected and discriminated by human coders with a high degree of fidelity. Once a saccade is a mere 3 degrees, detection is over 90% and correct discrimination over 80%. In addition, detection and discrimination improved across gaze angles in all directions. One degree of visual angle from the model’s perspective in this experiment corresponds to an iris deviation of approximately 1.2 minutes of arc2 from the observer’s perspective; a value that is comparable to previous reports of dyadic (Cline, 1967) and triadic (Symons et al., 2004) gaze.
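This rough conversion can be reproduced with simple trigonometry. The sketch below assumes an eyeball radius of about 1.2 cm and the 60 cm observer distance mentioned in the text; the radius is an illustrative assumption, not a value taken from the thesis, so the result should be read as order-of-magnitude only:

```python
import math

EYE_RADIUS_CM = 1.2   # assumed radius of the human eyeball (illustrative)
VIEW_DIST_CM = 60.0   # observer-to-model distance assumed in the text

def iris_shift_arcmin(rotation_deg, radius=EYE_RADIUS_CM, distance=VIEW_DIST_CM):
    """Arcminutes of visual angle subtended, for the observer, by the
    lateral iris displacement produced by a given eye rotation."""
    # lateral displacement of the iris on the eyeball surface
    shift_cm = radius * math.sin(math.radians(rotation_deg))
    # angle that displacement subtends at the observer's eye, in arcmin
    return math.degrees(math.atan2(shift_cm, distance)) * 60
```

Under these assumptions, a 1 degree eye rotation displaces the iris by roughly 1.2 arcmin of visual angle at 60 cm, consistent with the estimate quoted above.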
Directional effects revealed that eye movements in the upward directions were particularly noticeable, and eye movements in the cardinal directions were better discriminated than those in the oblique directions. For clarity, Figure 2.3 shows the responses to each direction. The peak along the NW axis represents accurate responses to NW eye movements. Any deviation (in this case toward the N axis) results in greater spread, indicating error. In this figure, the cardinal bias is clear, in that responses are arrayed more tightly around the N, W, E and S axes. In addition, the upward bias is also clear, in that NW and NE errors are clustered around the N axis. The upward bias may have resulted from the position of the camera above the model, which may have exaggerated the upward eyelid movement in this experiment and revealed more iris and sclera to the participants. This latter result corroborates work by Anstis and colleagues (1969), in which revealing more sclera by widening the eyelids of an artificial eyeball improved the estimation of gaze direction. The cardinal bias is in line with the only other work to examine eye movements in directions other than the cardinal directions (Bock et al., 2008). It is possible that this bias represents a perceptual effect, in that gaze perception preferentially codes cardinal eye movements. This point is corroborated by work suggesting that there exists a cardinal bias in motion perception (Coletta, Segu & Tiana, 1993). However, it is also possible that the models' eye movements were simply more accurate in the cardinal directions despite the requirement that they land within a given area of the target.

Footnote 2: This estimate is based on a participant sitting 60 cm from the screen. As we did not restrain participants during the task, this value is a rough estimate that will be used for illustration and comparison to previous work.
The latter is a possibility, particularly for the smaller eye movements, as the invisible boundary used for defining an accurate trial was the same size regardless of the eccentricity of the target. The boundaries for targets arrayed closer to the initial fixation position actually overlapped.

Figure 2.3: Proportion of responses to each direction. Each axis on the graph represents the model's actual eye movement direction.

One common theme in the psychophysical gaze following literature is that without proper stimulus control it is difficult to draw conclusions about the specific cues that support accurate responses to gaze information (Symons et al., 2004). By using a gaze-contingent design in the development of our stimuli, we were guaranteed a certain measure of control during stimulus generation. This method can be adapted for even tighter control than was used in this experiment (i.e. a more stringent invisible boundary) and may be useful in future experiments exploring gaze perception itself. Specifically, it provides the ability to know, at any given time during stimulus generation, where the model fixated in relation to the target, as well as the amplitude, speed and accuracy of their saccade. Thus, any changes in these parameters can be compared to observer accuracy. Although this work demonstrated that humans are indeed remarkably sensitive to small eye movements, it is not clear why this is the case. Given that observers were above chance at discriminating eye movements even at the smallest gaze angles, it appears that the sensitivity to gaze in this experiment is comparable to that for dyadic gaze (Cline, 1967). It is possible that observers were exhibiting a form of gaze sensitivity akin to that used for dyadic gaze. One could imagine that participants were viewing the straight gaze of the models as eye contact (although the models were looking approximately 12.18 degrees below the camera).
If participants had adopted this strategy, then some of the mechanisms involved in processing direct gaze, which are privileged according to Cline (1967), may have been co-opted for use in this task. This is an intriguing possibility given the now widespread use of video conferencing in day-to-day life, which typically does not allow for mutual gaze. Recent work has suggested that individuals very quickly learn which gaze direction signals "looking at me" in desktop video conferencing (Grayson & Monk, 2003). Future work will need to tease out whether this is the case.

Another possibility is that observers were using the motion of the iris in the videos to determine gaze direction. Subtle changes may be more perceptible when accompanied by a motion signal (especially given the visual system's sensitivity to motion). It was challenging for the models to keep their eyes absolutely still for the unnatural amount of time (1.5 s) that was required. As a result, small motion signals may have been present in some of the no eye movement trials. Nevertheless, for 1 degree eye movements with the participant sitting approximately 60 cm from the screen, the model's iris deviation from the participant's perspective would be less than 1 mm (approximately 1.2 minutes of arc). This small change in iris position is well above detectable motion thresholds (Nakayama & Silverman, 1985). It seems reasonable that a motion signal associated with such a small deviation would provide an additional boost in detection and discrimination. This possibility is the focus of Experiments 2, 3 and 4.

3 Experiment 2: Motion and Gaze Direction: Videos vs. Static Stimuli

3.1 Introduction

Experiment 1 demonstrated that people are remarkably sensitive to the gaze direction of others.
Participants could reliably judge whether an eye movement was made and, if so, its direction, even when the eye movements were as small as 1 degree of visual angle (an iris deviation of approximately 1.2 minutes of arc). This sensitivity is comparable to judgments of mutual gaze (1.10 minutes of arc; Gibson & Pick, 1963). In Experiment 1, participants judged the gaze direction of a model making dynamic eye movements. One possibility is that the sensitivity observed in Experiment 1 was due to the dynamic nature of the stimulus. Given that people move their eyes roughly three times per second (Henderson, 2003), it seems likely that this motion signal would be noticed by conspecifics and may provide a useful cue to gaze direction. To directly test whether the motion signal was instrumental in the gaze sensitivity demonstrated in Experiment 1, in Experiment 2 we compared gaze discrimination of static (final eye position only) images against the video stimuli of Experiment 1 in a left/right, forced-choice design, more in line with work examining the cues used in gaze perception (Ando, 2002; Ricciardelli et al., 2000). If motion is indeed a critical source of information for gaze discrimination, then observers should be more accurate at discriminating dynamic compared to static gaze. In addition, across the next three experiments we asked observers to rate their confidence in their direction responses in order to assess the perceived task difficulty of responding to static compared to dynamic gaze. This is taken as an indirect measure of the quality of the signal that is generated for static and dynamic gaze. This was particularly important given the variability of responses found in Experiment 1 to eye movements at the smaller gaze angles, and it allowed us to assess the signal quality across static and dynamic gaze as gaze angle increases (i.e. as signal strength increases).
Studies to date have not included a measure of confidence in conjunction with a gaze discrimination task. If the signal quality of dynamic gaze is higher than that of static gaze, then we would expect confidence to be high across all dynamic stimuli. Here, and for the next two experiments, we examined only left and right gaze directions. This brings the present work more closely in line with previous investigations of the mechanisms of gaze perception (Ando, 2002; Ricciardelli et al., 2000), and controls for the upward bias present in Experiment 1. A forced-choice design, combined with confidence responses, provides measures of direction discrimination ability and a sense of whether these directional decisions are based on low or high quality gaze information.

3.2 Method

3.2.1 Participants

Twenty-five participants were recruited from the University of British Columbia Human Subjects Pool in exchange for course credit or 5 dollars (CAD).

3.2.2 Apparatus

Stimuli were presented on a Dell 2407WFP 17 inch (diagonal) flat screen monitor and participants were seated comfortably approximately 60 to 80 cm away. Responses were recorded on a standard keyboard number pad that was within easy reach of the participants.

3.2.3 Stimuli

Stimuli were adapted from those developed in Experiment 1 (see Section 2.2). Only videos in which eye movements were made to the left or right were selected for the purposes of this experiment. To recap, these eye movement videos were recorded with a web camera placed on top of a computer monitor and depicted four different models' (50% female) eye movements. Eye movements were made from a central fixation point to a target to the left or right. Targets were positioned at 1, 2 or 3 degrees of visual angle (from the model's perspective) away from the central fixation point located in the middle of the screen. Each video was 640x480 pixels in size and lasted approximately 1.5 seconds; these videos constituted the 'dynamic' stimuli used in the current work.
To create the 'static' stimuli, a frame in which the model's gaze was already deviated to the left or right was extracted from each video using ffmpeg. There were 192 left and right eye movements in total from the four models (48 from each); 64 were made at each of the three gaze angles from fixation, and half were leftward, the other half rightward.

3.2.4 Procedure

Participants were seated comfortably in front of the computer monitor and were given detailed instructions about the procedure. Static and dynamic gaze stimuli were presented to participants in a random blocked design with 24 trials in each block. Participants were made aware of the blocked design during instruction and were told before each block which type of stimulus they would encounter in the next block, either static “pictures” or dynamic “videos.” All 192 different gaze stimuli were displayed in this fashion and were counterbalanced across participants such that, across two participants, all 192 eye movements would be seen in both static and dynamic form.

During each trial, participants were presented with the static or dynamic gaze stimulus and were asked to respond to the direction that they perceived the eye looked, in the dynamic case, or was already looking, in the static case. Participants responded with the number pad such that if they perceived rightward gaze, they were to press “6,” and for leftward gaze, they were to press “4.” Stimuli remained on screen until the direction response. After each direction judgment, participants were asked to rate their confidence in their direction response. Confidence responses were given using number pad keys “0” to “9,” where “0” corresponded to a guess and “9” to being absolutely sure. To advance to the next trial, participants pressed the space bar.

3.3 Results

3.3.1 Design

There are three factors in the design of this experiment.
"Motion" is defined as the stimulus type, either dynamic (video) or static (image). "Direction" indicates whether the gaze stimulus moved left or right, in the case of dynamic stimuli, or had already moved left or right, in the case of the static stimuli. "Gaze angle" refers to the magnitude of the gaze deviation from centre (1, 2 or 3 degrees visual angle).

3.3.2 Accuracy

Accuracy is defined as the proportion of correct direction judgments made by participants. A 2 (motion) by 2 (direction) by 3 (gaze angle) analysis of variance was conducted on the proportion of correct responses. There was a main effect of motion, F(1, 24) = 24.10, MSE = 0.05, p < .001, such that accuracy was significantly higher for dynamic gaze stimuli (M = 0.84) than static gaze stimuli (M = 0.72). There was a main effect of gaze angle, F(2, 48) = 110.39, MSE = 0.01, p < .001. Inspection of the means and paired-samples t-tests revealed that accuracy for gaze angles of 2 degrees (M = 0.80) was significantly higher than for gaze angles of 1 degree (M = 0.67), t(24) = 2.43, p < .001, and accuracy for gaze angles of 3 degrees (M = 0.87) was significantly higher than for gaze angles of 2 degrees, t(24) = 8.18, p < .001. Trend analysis revealed that this main effect contained both linear, F(1,24) = 269.14, MSE = 0.01, p < .001, and quadratic trends, F(1, 24) = 5.36, MSE = 0.01, p < .05. Inspection of the means reveals that this quadratic trend arises from the larger effect of gaze angle between 1 and 2 degree gaze angles (mean difference = 0.14) than between 2 and 3 degree gaze angles (mean difference = 0.07). Figure 3.1 shows the accuracy to static and dynamic stimuli across the three gaze angles. All other main effects and interactions were non-significant (all p's > 0.05).

Figure 3.1: Accuracy for static and dynamic gaze at each gaze angle.
Error bars represent the standard error of the mean.

3.3.3 Confidence

Confidence responses were submitted to a 2 (motion) by 2 (direction) by 3 (gaze angle) repeated measures analysis of variance. There was a main effect of motion, F(1, 24) = 36.78, MSE = 6.95, p < .001, such that confidence was higher for dynamic stimuli (M = 6.08) than static stimuli (M = 4.24). There was also a main effect of gaze angle, F(2, 48) = 71.78, MSE = 1.06, p < .001. Inspection of the means along with paired samples t-tests revealed that confidence was higher for gaze angles of 2 degrees (M = 5.32) than gaze angles of 1 degree (M = 4.19), t(24) = 7.49, p < .001, and similarly, confidence was higher for gaze angles of 3 degrees (M = 5.89) than 2 degrees, t(24) = 5.36, p < .001. There was a significant interaction between motion and gaze angle, F(2, 48) = 36.10, MSE = 0.71, p < .001. To help interpret this interaction, slopes were calculated for the confidence responses of each participant across the three possible gaze angles (1, 2 and 3 degrees), separately for static and dynamic stimuli. Thus, each participant's slope value represents the effect of gaze angle for static and dynamic stimuli. Shallower slopes indicate that confidence was uninfluenced by gaze angle, whereas steeper, positive slopes indicate that confidence increased with the magnitude of the gaze angle. A paired samples t-test across the slope values for static and dynamic stimuli indicated a much steeper slope, and therefore a stronger effect of gaze angle, for the dynamic stimuli (mean slope = 1.32) as opposed to the static stimuli (mean slope = 0.40), t(24) = 6.33, p < .001. From inspection of Figure 3.2, it is clear that confidence is moderate across all gaze angles for static gaze, whereas confidence for dynamic gaze increases sharply at 2 and 3 degree gaze angles.
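The per-participant slope analysis described above can be sketched as an ordinary least-squares fit of mean confidence against gaze angle (1, 2, 3 degrees), computed separately for static and dynamic trials. The confidence values below are invented purely for illustration; they are not data from this experiment.

```python
def slope(x, y):
    """Least-squares slope of y regressed on x (here: confidence vs. gaze angle)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))

gaze_angles = [1, 2, 3]
# Hypothetical mean confidence ratings for one participant:
dynamic_conf = [4.8, 6.1, 7.4]  # rises steeply with gaze angle
static_conf = [4.0, 4.3, 4.7]   # stays roughly flat

print(slope(gaze_angles, dynamic_conf))  # steep slope (about 1.3)
print(slope(gaze_angles, static_conf))   # shallow slope (about 0.35)
```

A shallow slope means confidence barely changed with gaze angle; a steep positive slope means confidence grew with the size of the eye movement, which is the pattern reported for the dynamic stimuli.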
Figure 3.2: Confidence for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.

There was also a significant interaction between motion, direction and gaze angle, F(2, 48) = 3.43, MSE = 0.48, p = .041. Similar to the above, slopes relating gaze angle to confidence responses were calculated for each direction, for both static and dynamic gaze stimuli. No systematic differences were found in the comparison of these slope values beyond what might be expected from the motion by gaze angle interaction and the main effects. However, inspection of the means and paired samples t-tests revealed a marginally significant drop in confidence in the dynamic condition for rightward gaze angles of 2 degrees (M = 6.17) versus leftward gaze angles of 2 degrees (M = 6.64), t(24) = 2.05, p = .052. All other main effects and interactions were non-significant (all p's > 0.05).

3.4 Experiment 2 Discussion

Dynamic gaze stimuli were easier to discriminate than static gaze stimuli. In addition, larger gaze angles were easier to discriminate than smaller gaze angles. This latter result is consistent with geometric accounts of gaze perception in which participants use the position of the iris to determine gaze direction (Anstis et al., 1969). Accuracy remained above chance, even for the 1 degree static stimuli, again corroborating earlier findings of human sensitivity to gaze direction, even with static stimuli (Symons et al., 2004). However, the advantage for stimuli that contained a motion signal reveals that gaze direction discrimination can be significantly aided by a dynamic signal. Thus, it seems likely that the motion signal present in the video stimuli was driving the sensitivity to gaze discrimination seen in Experiment 1.
There is an interesting dissociation between confidence and accuracy in these results, namely the presence of a motion by gaze angle interaction in confidence but not accuracy. In the confidence results, there is a sharper increase in confidence as gaze angle increases for dynamic stimuli, whereas confidence for static stimuli was generally low. This interaction does not occur in accuracy: although confidence was low for all static stimuli, as the iris deviation increased, accuracy increased. Participants were not confident in their decisions about static stimuli, even when the signal quality (gaze angle) increased. This suggests that there may be a qualitative difference between static and dynamic stimuli.

4 Experiment 3: Motion and Gaze Direction: Dynamic vs. Static Displays

4.1 Introduction

In Experiment 2, participants were allowed as much time as needed to respond to both the static and dynamic stimuli. In addition, when presented with the dynamic stimuli, participants were able to use the initial straight-ahead gaze position as a comparison with the final gaze position. For completeness, and in order to better equate the static and dynamic stimuli, Experiment 3 is an extension of Experiment 2 in which the motion signal is manipulated through the use of a flicker paradigm (Rensink, O'Regan & Clark, 1997). Participants were presented with two frames from the videos developed in Experiment 1: one frame from the beginning of the video with the model looking straight ahead, and a second frame from the end of the video with the model looking to the left or right. Critically, to eliminate the motion signal in the static condition, a blank screen was presented briefly (200 ms) between the two frames, reminiscent of a flicker paradigm (Rensink et al., 1997) (see footnote 3). In the dynamic condition, the gaze stimulus frames were displayed without this blank screen, thus creating the perception of apparent motion of the eye between the frames.
According to psychophysical accounts, the displacements of the iris at each gaze angle (1.2, 2.4 and 3.6 minutes of arc, respectively) are clearly within the range of detectable motion (Nakayama & Silverman, 1985). In the dynamic condition, the blank (or flicker) screen was presented either before or after the two gaze frames.

Footnote 3: The 200 ms blank was chosen based on pilot testing, although theoretically the motion signal should be eliminated with a much smaller stimulus onset asynchrony (80-90 ms; Rensink et al., 1997; Coren et al., 1999).

4.2 Method

4.2.1 Participants

Thirty-five participants were recruited from the University of British Columbia Human Subjects Pool for course credit or 5 dollars (CAD).

4.2.2 Apparatus

Stimuli were presented on a Dell 2407WFP 17 inch (diagonal) flat screen monitor and participants were seated comfortably approximately 60 to 80 centimetres away. Responses were recorded on a standard keyboard number pad that was within easy reach of the participants.

4.2.3 Stimuli

Stimuli were extracted from the same database described in Experiment 1; however, for the current work the first and last frames of the gaze stimulus videos were used. Thus, two frames from each of the 192 videos were extracted: one in which the model was looking at the central target on the screen in front of them (from the participant's perspective, the model looking straight ahead), and the last frame of the video, in which the model is looking either 1, 2 or 3 degrees to the left or right (see Figure 4.1). To create the static stimuli, the first extracted image (the model looking straight ahead) was displayed for 1000 msec, followed by a brief, 200 msec blank display, and then, finally, by the second extracted image (the model looking left or right) for 1000 msec.
For the dynamic condition, the perception of apparent motion was produced by displaying the two frames one after the other, each for 1000 msec; the blank display was counterbalanced to appear either before or after the two successive images.

Figure 4.1: Timeline of a static trial.

4.2.4 Procedure

The procedure was identical to Experiment 2, except that the final gaze stimulus frame in the static condition and in half of the dynamic trials was presented for only 1000 msec before the response screens were presented. In the other half of the dynamic trials, a blank stimulus screen was displayed for 200 ms before the response screens were presented.

4.3 Results

4.3.1 Accuracy

A 2 (motion) by 2 (direction) by 3 (gaze angle) repeated measures analysis of variance was conducted on the proportion of correct direction responses. Critically, there was a main effect of motion, F(1, 34) = 399.77, MSE = 0.02, p < .001, such that accuracy was significantly higher for dynamic stimuli (M = 0.89) compared to static stimuli (M = 0.64). There was a main effect of direction, F(1, 34) = 9.39, MSE = 0.07, p = .004, such that accuracy was higher for leftward gaze stimuli (M = 0.81) compared to rightward gaze stimuli (M = 0.73). In addition, there was a main effect of gaze angle, F(2, 68) = 78.75, MSE = 0.01, p < .001. Inspection of the means and paired samples t-tests revealed that 2 degree gaze angles (M = 0.77) were easier to discriminate than 1 degree gaze angles (M = 0.69), t(34) = 6.03, p < .001, and 3 degree gaze angles (M = 0.84) were easier to discriminate than 2 degree gaze angles, t(34) = 7.88, p < .001. Figure 4.2 shows the accuracy responses for static and dynamic stimuli across the three gaze angles.

Figure 4.2: Accuracy for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.
The main effects of direction and gaze angle were qualified by a significant direction by gaze angle interaction, F(2, 68) = 11.05, MSE = 0.01, p < .001. Slopes relating gaze angle to accuracy were calculated for each participant, separately for left and right gaze stimuli. These slopes represent the effect of gaze angle on leftward and rightward gaze stimuli, respectively. Paired samples t-tests on these slopes revealed that the effect of gaze angle was greater for rightward gaze stimuli (mean slope = 0.10) compared to leftward gaze stimuli (mean slope = 0.05), t(34) = 4.13, p < .001. Accuracy was generally high for leftward gaze stimuli across all gaze angles, whereas accuracy for rightward gaze stimuli declined significantly at the smaller gaze angles. Specifically, accuracy for 1 degree gaze angles was significantly higher for leftward stimuli (M = 0.76) than rightward stimuli (M = 0.62), t(34) = 4.39, p < .001, and similarly, accuracy for 2 degree gaze angles was significantly higher for leftward stimuli (M = 0.80) than rightward stimuli (M = 0.74), t(34) = 2.06, p = .047. These directional effects may reflect a general bias to respond leftward when in doubt (see footnote 4). They may represent a fruitful avenue for further investigation; however, given that direction does not interact with motion in a meaningful way, they will not be discussed further in the context of these experiments. Similar to Experiment 2, there was again no interaction between motion and gaze angle, F(2, 68) = 1.65, MSE = 0.01, p = .200. All other main effects and interactions were non-significant (all p's > 0.05).

4.3.2 Confidence

Confidence responses were submitted to a 2 (motion) by 2 (direction) by 3 (gaze angle) repeated measures analysis of variance. There was a main effect of motion, F(1, 34) = 296.46, MSE = 5.26, p < .001, such that confidence was significantly higher for dynamic stimuli (M = 6.78) compared to static stimuli (M = 2.91). There was a main effect of gaze angle, F(2, 68) = 102.70, MSE = 0.66, p < .001. Inspection of the means and paired samples t-tests revealed that confidence was higher for 2 degree gaze stimuli (M = 4.92)
There was a main effect of gaze angle, F(2, 68) = 102.70, MSE = 0.66, p < .001. Inspection of the means and paired samples t-tests revealed that confidence was higher for 2 degree gaze stimuli (M = 4.92) 4  Participants made significantly more leftward responses (M = 104) than rightward responses (M = 88) in  general, t(34) = 3.22, p = .003.  44  compared to 1 degree stimuli (M = 4.10), t(34) = 8.79, p < .001 and similarly, confidence was higher for 3 degree gaze stimuli (M = 5.50) compared to 2 degree gaze stimuli, t(34) = 6.83, p < .001. These main effects were qualified by a motion by gaze angle interaction, F(2, 68) = 46.28, MSE = 0.63, p < .001 and a direction by gaze angle interaction, F(2, 68) = 9.14, MSE = 0.34, p < .001. To investigate the motion by gaze angle interaction, the slopes relating gaze angle to confidence (effect of gaze angle) were calculated for the dynamic and static stimuli, respectively. The effect of gaze angle was significantly higher for dynamic stimuli (mean slope = 1.10) than for static stimuli (mean slope = 0.30), t(34) = 7.69, p < .001. Essentially, confidence slightly increases across gaze angles for static stimuli, but increases significantly as gaze angles become larger for dynamic stimuli. For example, confidence for the 3 degree gaze angles in the static condition (M = 3.30) was lower than the 1 degree gaze angles in the dynamic condition (M = 5.51). Figure 4.3 shows confidence responses for static and dynamic stimuli at each of the three gaze angles.  45  9  Confidence Rating  8 7 6 5 4 3 2  dynamic  1  static  0 1°  2° Gaze Angle (Degrees Visual Angle)  3°  Figure 4.3: Confidence for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean. To investigate the gaze angle by direction interaction, the slopes relating gaze angle to confidence were calculated again, this time for each direction. 
The effect of gaze angle was significantly greater for rightward gaze stimuli (mean slope = 0.83) than leftward gaze stimuli (mean slope = 0.56), t(34) = 3.91, p < .001. Inspection of the means revealed that this slope difference (the effect of gaze angle) seems to result from significantly increased confidence for rightward 3 degree gaze angles (M = 5.68) compared to leftward 3 degree gaze angles (M = 5.31), t(34) = 2.54, p = .016. All other main effects and interactions were non-significant (all p's > 0.05).

4.3.3 Summary of Accuracy and Confidence

Accuracy and confidence both increased when the stimuli contained a motion signal. Again, similar to Experiment 2, the size of the gaze angle increased both accuracy and confidence. In addition, there was a motion by gaze angle interaction in confidence, but not accuracy. Confidence remained low across all three gaze angles for the static stimuli, but increased significantly for larger, dynamic eye movements, whereas accuracy increased linearly as signal strength (gaze angle) increased. Similar to Experiment 2, this dissociation between confidence and accuracy suggests that there is a qualitative difference between participants' perceptions of static and dynamic gaze. Even though participants were not confident of their decisions regarding static stimuli, their accuracy increased with signal strength. Another interesting dissociation between confidence and accuracy was found in the direction by gaze angle interaction. The main effect of direction in the accuracy results showed that participants were more accurate in responding to leftward gaze stimuli, whereas no main effect of direction was found in the confidence responses. Taken together, these results suggest that although accuracy was higher for leftward gaze, participants were not aware of it. This could reflect a leftward response bias, but considering that direction does not interact with motion, we do not explore these directional effects further.
4.3.4 Decision Time

In Experiment 3, because the static and dynamic stimuli were equated in terms of how long participants spent with the gaze stimuli, and because responses were given after the stimuli were removed, it was reasonable to conduct an analysis of decision time. The following analysis should be taken with some caution, however, as participants were not explicitly instructed to respond as quickly as possible. We encouraged them to maintain a fast pace, but the task itself was not explicitly speeded.

To equate decision times across the two types of dynamic trials and the static trials, 200 ms (the length of the blank screen presentation) was subtracted from the decision time for the dynamic trials in which the blank screen was presented last. Participant response times were then submitted to an outlier trimming procedure. Decision times that were 2.5 standard deviations or more from the mean decision time for each participant were considered outliers. Means for this comparison were based on participant responses within each condition; that is, means were computed for each participant at each level of the independent variables (e.g., static, leftward, 1 degree gaze angle). This resulted in 3.37% of trials being removed across the entire data set. Trimmed decision times were submitted to a 2 (motion) by 2 (direction) by 3 (gaze angle) analysis of variance. There was a main effect of motion, F(1, 34) = 75.22, MSE = 319,424.96, p < .001, such that decision time was significantly faster for dynamic (M = 697.54 ms) compared to static (M = 1175.90 ms) stimuli. In addition, there was a main effect of gaze angle, F(2, 68) = 15.73, MSE = 57,697.47, p < .001. Inspection of the means along with paired samples t-tests revealed that decision time was significantly faster for 2 degree gaze stimuli (M = 931.18 ms) than 1 degree gaze stimuli (M = 1019.87 ms), t(34) = 2.89, p = .007, and faster for 3 degree gaze stimuli (M = 859.11 ms) than 2 degree gaze stimuli, t(34) = 2.56, p = .015.
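The decision-time preprocessing described above, subtracting the 200 ms blank from the dynamic trials in which the blank came last and then discarding decision times more than 2.5 standard deviations from each participant-by-condition mean, can be sketched as follows. The trial dictionaries and field names here are invented for illustration; they are not the actual data format used in this work.

```python
from statistics import mean, stdev

BLANK_MS = 200
CUTOFF_SD = 2.5

def preprocess(trials):
    """trials: dicts with 'subject', 'motion', 'direction', 'angle',
    'rt' (decision time, ms) and 'blank_last' (bool).
    Returns the trials surviving the 2.5 SD trim."""
    # Equate blank-last dynamic trials with the other trial types.
    for t in trials:
        if t["blank_last"]:
            t["rt"] -= BLANK_MS
    # Group trials into participant-by-condition cells.
    cells = {}
    for t in trials:
        key = (t["subject"], t["motion"], t["direction"], t["angle"])
        cells.setdefault(key, []).append(t)
    # Keep only trials within CUTOFF_SD of their cell mean.
    kept = []
    for cell in cells.values():
        rts = [t["rt"] for t in cell]
        m = mean(rts)
        sd = stdev(rts) if len(rts) > 1 else 0.0
        kept.extend(t for t in cell if abs(t["rt"] - m) <= CUTOFF_SD * sd)
    return kept
```

Computing the trim within each participant-by-condition cell, rather than over the whole data set, prevents the large static/dynamic difference in overall decision time from masquerading as outliers.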
These main effects were qualified by a significant motion by gaze angle interaction, F(2, 68) = 7.44, MSE = 83,562.63, p = .001. The slopes relating the size of the gaze angle to decision time were calculated for static and dynamic stimuli. Paired samples t-tests revealed that the effect of gaze angle was significantly steeper for dynamic (mean slope = -133.02) compared to static (mean slope = -27.73) stimuli, t(34) = 3.15, p = .003. This difference is illustrated in Figure 4.4. Interestingly, decision times for static stimuli do not seem to depend on the gaze angle: when the slopes relating gaze angle to decision time for static and dynamic stimuli were compared to 0 in one-sample t-tests, only the dynamic slope differed from 0, t(34) = 6.35, p < .001.

Figure 4.4: Decision time for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.

4.4 Discussion

In Experiment 3, we manipulated the availability of a motion cue in a gaze direction discrimination task. As in Experiment 2, results clearly indicated that this motion cue plays a significant role in the discrimination of gaze direction, evidenced by increased accuracy and confidence when the motion cue is present. Again, the presence of a motion by gaze angle interaction in confidence, but not accuracy, suggests that there are qualitative differences in the perception of static and dynamic gaze stimuli. This point is further corroborated by the motion by gaze angle interaction in decision time. Not only were participants more accurate and more confident with dynamic stimuli as gaze angle (i.e. signal strength) increased, their decisions were also significantly faster. In the static condition, accuracy increased with signal strength, lending support to geometrical accounts of gaze perception (Symons et al., 2004; Bock et al., 2008).
However, overall accuracy and confidence were lower, and decision times slower, for static compared to dynamic stimuli. This suggests that the signal quality was relatively poor for static compared to dynamic stimuli.

5 Experiment 4: Motion, Luminance and Gaze Direction

5.1 Introduction

Experiments 2 and 3 demonstrated that the high sensitivity to gaze direction found in Experiment 1 is likely due to the presence of the motion signal. Results suggested that the perception of dynamic gaze may be fundamentally different from the perception of static gaze. Participants responded more accurately and with higher confidence to dynamic stimuli compared to static stimuli, and these values increased with gaze angle. In contrast, accuracy for static stimuli increased with increasing gaze angle without much change in the relatively low levels of confidence. In addition, Experiment 3 demonstrated that participants take more time to respond to static stimuli regardless of gaze angle, while decision time decreased with increasing gaze angle for dynamic stimuli. Collectively, these data converge on the hypothesis that motion serves as a qualitatively different kind of gaze signal than that arising from the perception of a static gaze signal. An alternative and more parsimonious hypothesis is that gaze direction is ultimately based on the same type of information for both dynamic and static shifts in gaze, but the quality of that information is profoundly greater for dynamic images. As a result, subjects are more confident, accurate, and faster with dynamic compared to static images. Experiment 4 tests between these two possibilities by examining the perception of gaze direction when iris deviation cues are made ambiguous through contrast reversal. As mentioned in the introduction, contrast reversal impairs the perception of gaze direction by altering the luminance distribution across the eye, often causing a reversal in perceived gaze direction.
This has led to the theory that the visual system relies on an “inflexible” contrast rule that is cognitively impenetrable, such that the dark part of the eye, the iris, exclusively signals the direction of gaze (Sinha, 2000; Ricciardelli et al., 2000, p. B12). This theory has recently been qualified by accounts suggesting that the influence of luminance and geometrical (iris deviation) cues on gaze perception reflects the outcome of a competition between rival gaze signals (Olk et al., 2008). This suggests that luminance and geometrical cues influence a common gaze processing mechanism. If the perception of gaze direction draws on the same type of information for dynamic and static stimuli, then we would expect contrast reversal to impair gaze perception across both dynamic and static gaze stimuli. On the other hand, if dynamic gaze perception is based on a qualitatively different signal than static gaze perception, the impact of contrast reversal should be very different for these two types of stimuli. Using a paradigm similar to that of previous work (Ricciardelli et al., 2000; Olk et al., 2008), in Experiment 4 we reversed the contrast polarity of half the gaze stimuli from Experiment 3.

5.2 Method

5.2.1 Participants

Twenty-four participants were recruited from the University of British Columbia, and participated for course credit or 5 dollars (CAD).

5.2.2 Apparatus

Stimuli were presented on a Dell 2407WFP 17 inch (diagonal) flat screen monitor and participants were seated comfortably approximately 60 to 80 centimetres away. Responses were recorded on a standard keyboard number pad that was within easy reach of the participants.

5.2.3 Stimuli

Stimuli were adapted from those used in Experiment 3, where two frames were extracted from the video files initially created in Experiment 1.
The first frame was one in which the model was looking at the central target on the screen in front of them (from the participant's perspective, this is the model looking straight ahead). The second frame was taken from a few tens of milliseconds before the end of the video, where the model is looking 1, 2 or 3 degrees to the left or right. Each of these frames was then converted to greyscale, and a reverse contrast polarity version of each was created (see Figure 5.1).

Figure 5.1: Example reverse contrast stimulus.

To create the dynamic condition, the perception of smooth motion was produced by displaying the two frames one after the other. For the static stimuli, only the second extracted image (the model looking left or right) was presented. In contrast to Experiment 3, the blank stimulus was not included. This was to eliminate any detrimental effects of the flicker when it occurred just before a response was made⁵.

5.2.4 Procedure

Participants were seated comfortably in front of the computer monitor and were given detailed instructions about the procedure. Half of the trials were presented in normal contrast and the other half in reverse contrast polarity, blocked across the two halves of the experiment, the ordering of which was counterbalanced. Static and dynamic gaze stimuli were presented to participants in a random blocked design of 24 trials in each block, with 4 blocks each for normal and reverse contrast stimuli. Participants were made aware of the blocked design during instruction and were told before each block whether the model would "have already looked to the left or right" (static trials) or "will look straight ahead, then to the left or right" (dynamic trials). All 192 different gaze stimuli were displayed in this fashion and were counterbalanced across participants. Thus, across four participants all 192 eye movements would be seen in static, dynamic, normal and reverse contrast form.
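The greyscale conversion and polarity reversal described in the Stimuli section can be sketched as follows. This is an illustrative reconstruction, not the thesis's stimulus-preparation code; the BT.601 luma weights and the toy image are assumptions:

```python
import numpy as np

def to_greyscale(rgb):
    """Convert an 8-bit RGB image (H x W x 3) to greyscale using the
    standard ITU-R BT.601 luma weights."""
    weights = np.array([0.299, 0.587, 0.114])
    return np.rint(rgb.astype(float) @ weights).astype(np.uint8)

def reverse_contrast(grey):
    """Reverse contrast polarity: dark regions (the iris) become light
    and light regions (the sclera) become dark."""
    return (255 - grey).astype(np.uint8)

# A toy "eye": a dark iris patch on a light sclera background.
eye = np.full((4, 4, 3), 230, dtype=np.uint8)  # light sclera
eye[1:3, 1:3] = 40                             # dark iris

grey = to_greyscale(eye)
flipped = reverse_contrast(grey)
```

After reversal the formerly dark iris is the lightest region of the image, which is exactly the ambiguity the contrast manipulation is meant to create.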
During each trial, participants were presented with the static or dynamic gaze stimulus and were asked to respond with the direction that they perceived the eye to have looked, in the dynamic case, or to already be looking, in the static case. Participants were required to respond with the number pad such that if they perceived rightward gaze, they were to press “6,” and for leftward gaze, they were to press “4.” Stimuli remained on screen until the direction response was made. After each direction judgement, participants were asked to rate their confidence in their direction response. Confidence responses were given using number pad numbers “0” to “9,” where “0” corresponded to a "guess" response and “9” to an "absolutely sure" response. To advance to the next trial, participants were required to press the space bar.

⁵ In order to assess the effect of the blank stimulus on accuracy in the dynamic condition of Experiment 3, a 2 (blank location: first, last) by 2 (direction) by 3 (gaze angle) analysis of variance was conducted on the proportion of correct responses to dynamic stimuli. There was a main effect of blank location, F(1, 34) = 5.314, MSE = .014, p = .027, such that accuracy was greater when the blank stimulus occurred before the two gaze stimulus frames (M = 0.90), rather than after (M = 0.87). For this reason, we eliminated the use of the flicker stimulus in Experiment 4.

5.3 Results

5.3.1 Accuracy

A 2 (contrast: reverse contrast vs. normal contrast) by 2 (motion) by 2 (direction) by 3 (gaze angle) within-subjects analysis of variance was conducted on the proportion of correct responses. There was a main effect of contrast, F(1, 23) = 14.04, MSE = 0.051, p = .001, such that accuracy was significantly better for normal contrast (M = 0.78) than reverse contrast stimuli (M = 0.71). There was a main effect of motion, F(1, 23) = 109.11, MSE = 0.08, p < .001, such that accuracy was greater for dynamic (M = 0.87) than static stimuli (M = 0.62).
There was a main effect of gaze angle, F(2, 46) = 30.15, MSE = 0.03, p < .001, such that accuracy increased as gaze angle increased. Paired-samples t-tests confirmed that accuracy was significantly greater for 3 degree gaze angles (M = 0.80) than for 2 degree gaze angles (M = 0.74), t(23) = 4.88, p < .001, and accuracy for 2 degree gaze angles was significantly greater than accuracy for 1 degree gaze angles (M = 0.68), t(23) = 3.40, p = .002.

There was a significant direction by gaze angle interaction, F(2, 46) = 7.25, MSE = 0.01, p = .002. The slopes relating accuracy to gaze angle were calculated for leftward and rightward eye movements respectively. A paired-samples t-test on these slope values revealed that the effect of gaze angle was significantly greater for rightward gaze stimuli (mean slope = .09) than for leftward gaze stimuli (mean slope = .04), t(23) = 4.24, p < .001 (see Figure 5.2). Again, this directional effect may reflect a response bias unique to this experiment. Given that no other directional effects were found, the interpretation of this effect is beyond the scope of this work.

Figure 5.2: Accuracy for left and right gaze at each gaze angle. Error bars represent the standard error of the mean.

Critically, there was a significant contrast by motion interaction, F(1, 23) = 18.96, MSE = 0.04, p < .001. Inspection of the means and paired-samples t-tests revealed that this interaction results from the fact that in the static condition, accuracy to reverse contrast gaze (M = 0.55) is significantly lower than accuracy to normal contrast gaze (M = 0.69), t(23) = 6.68, p < .001. Interestingly, no such difference was found in the dynamic condition (p = .94). There was a significant motion by gaze angle interaction, F(2, 46) = 6.23, MSE = 0.02, p = .004.
Inspection of the means and paired-samples t-tests revealed that this interaction appears to result from the fact that for static stimuli, accuracy significantly increases from gaze angles of 2 degrees (M = 0.59) to 3 degrees (M = 0.69), t(23) = 4.72, p < .001, while remaining relatively stable between gaze angles of 1 degree (M = 0.57) and 2 degrees, t(23) = 0.87, p = .400. The reverse pattern occurs for dynamic stimuli, where accuracy significantly increases from gaze angles of 1 degree (M = 0.79) to 2 degrees (M = 0.89), t(23) = 4.36, p < .001, and from gaze angles of 2 degrees to 3 degrees (M = 0.92), t(23) = 2.16, p = .041. This motion by gaze angle interaction appears to arise from a ceiling effect present in the dynamic stimuli, where accuracy levels off near 90% at 2 and 3 degree gaze angles (see Figure 5.3).

Figure 5.3: Accuracy for static and dynamic stimuli at each gaze angle. Error bars represent the standard error of the mean.

The above two-way interactions were qualified by a significant interaction between contrast, motion and gaze angle, F(2, 46) = 6.49, MSE = .018, p = .003. To investigate this interaction, the slopes relating gaze angle to accuracy were calculated for reverse contrast, normal contrast, static and dynamic stimuli. This slope value represents the effect of gaze angle on accuracy for each of these factors. For static stimuli, the effect of gaze angle on accuracy was significantly greater for normal contrast (mean slope = 0.10) than reverse contrast (mean slope = 0.03), t(23) = 3.37, p = .003 (see Figure 5.4). These slope values were submitted to two one-sample t-tests against 0. Only the normal contrast slope was significantly greater than 0, t(23) = 3.41, p = .002. For dynamic stimuli, no significant difference was found between normal (mean slope = 0.05) and reverse contrast (mean slope = 0.08) stimuli (see Figure 5.5).
Both the normal and reverse contrast slopes were significantly different from 0 in a one-sample t-test: normal, t(23) = 7.33, p < .001; reverse contrast, t(23) = 8.30, p < .001. All other main effects and interactions were non-significant (all p's > .05).

Figure 5.4: Accuracy for normal and reverse contrast gaze in the static condition. Error bars represent the standard error of the mean.

Figure 5.5: Accuracy for normal and reverse contrast gaze in the dynamic condition. Error bars represent the standard error of the mean.

Thus, it appears that contrast reversal impairs discrimination of static gaze only. Accuracy to static stimuli improves as gaze angle increases, but only for normal contrast stimuli. Accuracy is generally low across static, reverse contrast stimuli. Motion appears to improve accuracy and disambiguate reverse contrast stimuli, showing similar levels of accuracy regardless of contrast type across all gaze angles (see Figure 5.5).

5.3.2 Confidence

Confidence responses were submitted to a 2 (contrast) by 2 (motion) by 2 (direction) by 3 (gaze angle) analysis of variance. There was a main effect of contrast, F(1, 23) = 7.55, MSE = 2.62, p = .011, such that confidence was significantly higher for normal (M = 5.66) than for reverse contrast stimuli (M = 5.24). There was a main effect of motion, F(1, 23) = 43.57, MSE = 14.41, p < .001, such that confidence was significantly higher for dynamic (M = 6.53) than for static stimuli (M = 4.41). There was also a main effect of gaze angle, F(2, 46) = 63.50, MSE = 1.54, p < .001.
Inspection of the means and paired-samples t-tests revealed that confidence for gaze angles of 2 degrees (M = 5.58) was significantly higher than for gaze angles of 1 degree (M = 4.69), t(23) = 7.97, p < .001, and confidence for gaze angles of 3 degrees (M = 6.07) was significantly higher than for gaze angles of 2 degrees, t(23) = 4.86, p < .001. Trend analysis revealed that this main effect contains both linear, F(1, 23) = 79.60, MSE = 2.39, p < .001, and quadratic trends, F(1, 23) = 4.86, MSE = 0.68, p = .014. Inspection of the means reveals that this quadratic trend arises from the larger effect of gaze angle between 1 and 2 degrees (mean difference = 0.89) than that between 2 and 3 degrees (mean difference = 0.49).

There was an interaction between motion and gaze angle, F(2, 46) = 45.03, MSE = 1.56, p < .001. The slope relating confidence to gaze angle was calculated for static and dynamic stimuli. Paired-samples t-tests conducted on these slopes revealed that the effect of gaze angle on confidence was significantly larger for dynamic stimuli (mean slope = 1.28) compared to static stimuli (mean slope = 0.13), t(23) = 7.33, p < .001 (see Figure 5.6).

Figure 5.6: Confidence responses for static and dynamic gaze at each gaze angle. Error bars represent the standard error of the mean.

This interaction was qualified by a marginal contrast by motion by gaze angle interaction, F(2, 46) = 3.12, MSE = 1.56, p = .054. Again, the slopes relating confidence to gaze angle were calculated for reverse contrast, normal contrast, static and dynamic stimuli. This slope value represents the effect of gaze angle on confidence for each of these factors. Paired-samples t-tests were conducted on these slope values to investigate the effect of contrast reversal on confidence for static and dynamic stimuli.
For static stimuli, the effect of gaze angle on confidence was significantly greater for normal contrast (mean slope = 0.28) than reverse contrast stimuli (mean slope = 0.03), t(23) = 3.15, p = .005 (see Figure 5.7). No significant difference was found between normal (mean slope = 1.23) and reverse contrast (mean slope = 1.34) stimuli in the dynamic condition, t(23) = 0.72, p = .477 (see Figure 5.8).

Figure 5.7: Confidence for normal and reverse contrast gaze in the static condition. Error bars represent the standard error of the mean.

Figure 5.8: Confidence for normal and reverse contrast gaze in the dynamic condition. Error bars represent the standard error of the mean.

5.3.3 Summary of Accuracy and Confidence Results

Accuracy and confidence both increased when the stimuli contained a motion signal. As expected, when the luminance ratio across the eye was disrupted by contrast reversal, accuracy and confidence significantly dropped. However, upon the introduction of a motion signal to the reverse contrast stimuli, accuracy and confidence increased to levels similar to dynamic, normal contrast stimuli. Similar to Experiments 2 and 3, confidence was particularly low for static gaze stimuli, with only a small increase in confidence for normal over reverse contrast stimuli.

5.4 Accuracy When Confidence = 0

In order to examine the relationship between confidence and accuracy when participants made "guess" responses, accuracy was calculated for those responses where participants indicated a confidence of 0. These responses were then parsed across static and dynamic stimuli. Note that not all participants indicated a confidence of 0, and so each subject contributed a different number of responses.
In order to mitigate these unequal cell sizes somewhat, we report here the results for Experiments 2, 3 and 4 combined. Only normal contrast stimuli results were included in this analysis. One-sample t-tests against chance (0.5) were conducted on accuracy to static and dynamic stimuli when confidence was 0. Only accuracy to the dynamic stimuli was significantly above chance (M = 0.57), t(72) = 2.15, p = .035 (note that 5 subjects did not report a confidence of 0, so N is correspondingly reduced here). This suggests that the motion signal may have boosted accuracy in the absence of awareness (see Figure 5.9).

Figure 5.9: Proportion of correct responses when confidence was reported as 0 (i.e., a "guess" response). Error bars represent the standard error of the mean.

5.5 Discussion

Contrast reversal indeed produced a disruption in gaze perception, but only for static stimuli. In the static condition, gaze angle improved accuracy for normal gaze (similar to Experiments 2 and 3) but it did not appear to improve accuracy to contrast reversed gaze (see Figure 5.4). This is consistent with the interfering effects of contrast reversal on gaze perception reported in previous work using static cues (Ricciardelli et al., 2000; Olk et al., 2008). Consistent with Experiments 2 and 3, the motion signal significantly enhanced both accuracy and confidence responses. We predicted that if dynamic and static gaze perception draw on qualitatively similar mechanisms, contrast reversal would affect both static and dynamic conditions equally. In contrast, motion attenuated the effect of contrast reversal. In fact, motion completely disambiguated these contradictory luminance cues. In the dynamic condition, there was no difference between accuracy to normal and reverse contrast gaze. This suggests first that the notion of an “inflexible” contrast rule is unjustified (Ricciardelli et al., 2000; Sinha, 2000).
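The guess-trial analysis described above can be sketched as follows. The per-subject accuracies are hypothetical placeholders, not the combined data from Experiments 2 to 4:

```python
import numpy as np
from scipy import stats

CHANCE = 0.5  # two-alternative (left/right) direction judgement

def accuracy_vs_chance(acc_per_subject):
    """One-sample t-test of per-subject guess-trial accuracy against
    chance. Subjects who never gave a confidence-0 response are coded
    as NaN and dropped, reducing N accordingly."""
    acc = np.asarray(acc_per_subject, dtype=float)
    acc = acc[~np.isnan(acc)]
    t, p = stats.ttest_1samp(acc, CHANCE)
    return acc.mean(), t, p, len(acc)

# Hypothetical per-subject accuracies on confidence-0 dynamic trials;
# NaN marks a subject who never reported a confidence of 0.
dynamic_guess_acc = [0.60, 0.55, 0.58, 0.62, 0.50, 0.57, np.nan, 0.61]
mean_acc, t, p, n = accuracy_vs_chance(dynamic_guess_acc)
```

Dropping the NaN subjects before testing mirrors the reduced N noted in the text for subjects who never used the "guess" rating.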
Second, it confirms that dynamic gaze provides a qualitatively different signal than static gaze. Motion overrides the strong disruption to gaze perception caused by contrast reversal. This is in line with previous work demonstrating that the visual system integrates multiple cues in gaze perception (Jenkins, 2007; Olk et al., 2008) and extends this work to include motion as one of those cues.

6 General Discussion

6.1 Experiment Summaries and Conclusions

In Experiment 1 we developed a comprehensive stimulus set of small eye movements made by four different models. The eye movements were in 8 radial directions at distances of 1, 2 and 3 degrees visual angle from the model's perspective. We then evaluated observers' sensitivity to detect and discriminate these small eye movements. Gaze shifts of 1, 2 and 3 degrees visual angle from the model's perspective correspond to iris deviations from the observer's perspective of approximately 1.2 (1 degree gaze angle), 2.4 (2 degree gaze angle) and 3.6 (3 degree gaze angle) minutes of arc. Observers were reliably able to detect and discriminate the smallest eye movements in this stimulus set, with sensitivity (d') for 1 degree eye movements significantly above chance. 1.2 minutes of arc is slightly above the threshold for dyadic gaze reported by Cline (1.10 minutes of arc; 1967, p. 45) and below thresholds reported for triadic gaze (0.85 – 8 minutes of arc; Symons et al., 2004). The results of Experiment 1 confirmed that humans are remarkably sensitive to small eye movements.

The results of Experiment 1 also highlighted a few aspects of bias in eye movement detection and discrimination. First, upward eye movements are more easily detected, and once detected, upward oblique eye movements are often mistaken for straight upward (N) eye movements. This bias may be due to the fact that the camera recording the model was located above the model, thus exaggerating the eyelid movement.
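The correspondence between the model's gaze rotation and the iris deviation seen by the observer can be reconstructed geometrically. The sketch below assumes a 12 mm eyeball radius and a 60 cm viewing distance, and ignores any scaling between the filmed face and its on-screen size; these are assumptions rather than parameters taken from the thesis, but under them the calculation reproduces the approximate values quoted above:

```python
import math

EYE_RADIUS_MM = 12.0      # assumed human eyeball radius
VIEW_DISTANCE_MM = 600.0  # assumed observer-to-screen distance (60 cm)

def iris_shift_arcmin(gaze_deg):
    """Visual angle (in minutes of arc) subtended at the observer by
    the lateral iris displacement produced by a gaze rotation of
    `gaze_deg` degrees."""
    # Lateral displacement of the iris on the eyeball surface.
    shift_mm = EYE_RADIUS_MM * math.sin(math.radians(gaze_deg))
    # Angle that displacement subtends at the observer's eye.
    return math.degrees(math.atan(shift_mm / VIEW_DISTANCE_MM)) * 60.0

# Gaze rotations of 1, 2 and 3 degrees map onto roughly 1.2, 2.4 and
# 3.6 minutes of arc, matching the values quoted in the text.
```

Because the angles involved are tiny, the relation is effectively linear, which is why each additional degree of gaze rotation adds about 1.2 arcmin of iris deviation.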
Second, there was a cardinal bias, such that eye movements in the cardinal directions were more easily discriminated. The bias in detection for upward eye movements and in discrimination for all cardinal eye movements may be either a perceptual effect or a product of the models' making slightly more accurate eye movements in the cardinal directions.

In Experiment 2, we sought to investigate the mechanism that observers used when judging gaze direction in Experiment 1. Given that dyadic gaze is purported to represent a special case of gaze perception (Cline, 1967), it is surprising that the sensitivity observed in Experiment 1, which employed neither a dyadic nor strictly triadic method, is close to dyadic levels of gaze discrimination. Experiment 2 investigated the possibility that the motion signal present in the video stimuli used in Experiment 1 was responsible for the observed sensitivity. The video stimuli from Experiment 1 were compared against a single frame displaying the end-point of the eye movement. Specifically, in the dynamic condition, participants viewed videos of the models making eye movements to the left or right, while in the static condition, participants viewed a single frame displaying the eye already directed left or right. They were then asked to indicate the direction of the perceived eye movement in the dynamic condition, or the direction that the eye was looking in the static condition. In addition, participants were asked to rate their confidence in their direction response.

Experiment 3 closely resembled Experiment 2, with slight changes in methodology in order to equate the static and dynamic stimuli. Instead of presenting participants with video and image stimuli, the first and last frames were extracted from each of the left and right videos from Experiment 1.
To create the dynamic stimuli, the first and last frames were presented in succession (each for 1 s), causing the apparent motion of the model's iris from straight ahead to the left or right. To eliminate the dynamic motion signal, a blank flicker stimulus was presented between the two frames for 200 ms. The participant's task was the same as in Experiment 2.

The results of Experiments 2 and 3 were unequivocal. Motion improved both accuracy and confidence in gaze direction judgements. In addition, the size of the gaze angle improved accuracy in the static condition in a linear fashion even though confidence was low across static stimuli. Specifically, in the static condition, as the signal strength arising from the deviation of the iris increased, accuracy increased, but confidence remained low. The decision time analysis of Experiment 3 revealed that decision time decreased with increasing gaze angle for dynamic stimuli, but remained slow across all gaze angles for static images. Taken together, these data suggest that the perception of dynamic gaze is qualitatively different from the perception of static gaze.

Experiment 4 tested this possibility by comparing gaze discrimination accuracy across static and dynamic stimuli when half of those stimuli were contrast reversed. Contrast reversal has been demonstrated to disrupt normal gaze processing by reversing the normal luminance properties of gaze. This disruption has led to the prominent theory of gaze perception that the visual system follows an inflexible contrast rule, such that it is the dark part of the eye that "does the looking" (Ricciardelli et al., 2000, p. B12; Sinha, 2000). Geometrical cues have been shown to attenuate contradictory luminance cues (Olk et al., 2008), suggesting that gaze perception relies on a combination of luminance and geometrical cues.
We predicted that if dynamic gaze perception draws on a different mechanism from static gaze perception, then contrast reversal would impair static, but not dynamic gaze. The results supported this prediction, in that motion completely disambiguated contradictory luminance cues. This suggests first that dynamic gaze signals are indeed qualitatively different from static gaze signals and second, that the notion of an “inflexible” contrast rule (Ricciardelli et al., 2000; Sinha, 2000) is untenable.

6.2 Multiple Cues in Gaze Perception

Experiments 1 to 4 demonstrated that dynamic gaze perception is qualitatively different from static gaze perception and that motion is a cue used in judgements of gaze direction. However, Experiments 1 to 4 also demonstrated the significant role that gaze angle and luminance play in gaze perception. Motion was not manipulated independently of gaze angle (as gaze angle increases, the motion signal increases), so it is difficult to identify the relative contributions of motion and gaze angle (a geometrical cue) in the dynamic stimuli. Looking only at the static results across Experiments 2, 3 and 4 may provide some insight into the role that gaze angle and luminance play in gaze discrimination.

In the static stimuli of Experiments 2, 3 and 4, accuracy significantly increased as gaze angle increased. This corroborates previous work suggesting that iris deviation information is used to determine static gaze direction (Symons et al., 2004; Bock et al., 2008; Anstis et al., 1969). Interestingly, accuracy increased with gaze angle even though confidence was relatively low across gaze angles. This suggests that the signal arising from the amount of iris deviation was relatively poor compared to that arising from both the iris deviation and motion signal combined, and additionally, that the perception of static gaze and the perception of dynamic gaze are qualitatively different.
Nevertheless, the positional information from the iris, despite being as small as 1.2 minutes of arc to the left or right, was enough for participants to extract relevant directional information in the static condition.

In Experiment 4, the amount of iris deviation did not improve accuracy to static, reverse contrast stimuli. This suggests that the contrast reversal manipulation indeed disrupted normal gaze perception based on cues from iris position. This corroborates previous work suggesting that luminance cues are vitally important in gaze perception (Ando, 2002; Ricciardelli et al., 2000; Sinha, 2000). In addition, this finding extends previous work by demonstrating that contrast reversal disrupts the perception of much smaller deviations in gaze than those typically employed, which are upwards of 25 degrees visual angle (Ricciardelli et al., 2000; Olk et al., 2008).

The present investigation demonstrated that motion, geometrical and luminance cues all play a significant role in gaze direction discrimination. This is in line with the multiple-cue account of gaze perception put forward by Jenkins (2007) and Olk and colleagues (2008). The basic premise behind this account is that the perception of gaze direction reflects the outcome of a competition between different gaze direction signals, whether geometric or luminance-based (Olk et al., 2008, p. 1304). Jenkins (2007) suggests that the visual system uses the cues that are most salient given the particular context. In his work, geometrical cues dominated at closer viewing distances, and luminance cues dominated at farther viewing distances. The present work extends these accounts to include motion as a particularly salient signal in gaze perception computations. Furthermore, dynamic gaze appears to be qualitatively unique, in that motion acts to disambiguate luminance cues that would normally disrupt static gaze perception.
The present work did not support the parsimonious hypothesis that the perception of gaze direction draws on the same type of information for both static and dynamic gaze. This raises the question of what mechanisms support each form of gaze perception. It is clear that geometrical cues from iris deviation support the perception of static gaze direction. In addition, the detrimental effect of contrast reversal on these stimuli suggests that luminance information is also critical for the perception of static gaze. On the other hand, the relative contributions of motion, luminance and geometrical cues in dynamic gaze remain unclear. Ando (2002) hypothesized that gaze discrimination may be a compromise between slow processing of high spatial frequency geometrical cues and fast processing of low spatial frequency luminance cues. Motion may represent another fast cue used for gaze perception. This latter point is particularly interesting given that luminance and motion signals share a common visual pathway in the brain (the magnocellular system; Coren et al., 1999). Perhaps the motion signal was able to override contradictory luminance cues because luminance and motion normally work together to signal direction of gaze. Motion may help to disambiguate the iris from the sclera in contrast-reversed gaze. This would explain why contrast reversal disrupts normal gaze processing in static, but not dynamic displays. Further work will need to be conducted to dissociate the relative contributions of motion and luminance in gaze perception.

6.3 Study Limitations

The present series of investigations has shed light on the factors that contribute to the perception of gaze direction. Nevertheless, this work leaves open several unanswered questions. In the dynamic stimuli it is difficult to assess the relative contribution of motion, geometrical and luminance cues because motion and gaze angle were not manipulated independently.
As gaze angle increased, so did the magnitude of the motion signal. A potential solution would be to employ a task more in line with Olk and colleagues (2008), where geometrical cues are manipulated with head turn, rather than gaze direction alone. Behavioural experiments able to dissociate these cues in gaze perception would be of use in studies interested in the brain localization of different forms of gaze perception (e.g., Baron-Cohen, 1995; Perrett & Emery, 1994; Langton et al., 2000; Nummenmaa et al., 2010).

An alternative possibility is that the mechanisms through which gaze direction is calculated are in fact not localized in the traditional sense. Rather, by virtue of the unique morphology of the human eye (Kobayashi & Kohshima, 2001), the visual system integrates cues from the dark iris and light sclera early in visual processing, then passes this information to a high-level gaze perception system dedicated to understanding the meaning of the seen gaze signal. This idea is supported by investigations suggesting that STS activation in response to gaze signals is highly situation-specific (Pelphrey et al., 2004; Pelphrey, Singerman, Allison & McCarthy, 2003). For example, when participants viewed a model shifting gaze in response to the onset of a target, STS activity was modulated by whether the model violated expectation by not looking at the onset (Pelphrey et al., 2003). This suggests that the function of high-level gaze processing systems (potentially the STS) is the interpretation, rather than the direction discrimination, of gaze.

It was difficult to draw comparisons between gaze sensitivity in the present work and previous psychophysical work (Gibson & Pick, 1963; Cline, 1967; Anstis et al., 1969; Symons et al., 2004; Bock et al., 2008) because participants were not all seated at the same distance from the monitor (i.e., no use of a chin rest).
If we had equated each participant's distance to the screen, it would have been possible to assess thresholds with more precision by calculating the relative change in the model's iris position from the participant's perspective. Although we assume that sensitivity in Experiment 1 is similar to levels of dyadic gaze detection (Cline, 1967), this comparison is somewhat speculative. Tighter control would be necessary in order to directly compare our results to this previous work.

6.4  Future Directions

Our work is the first to demonstrate that the perception of dynamic gaze is fundamentally different from the perception of static gaze. This point is particularly relevant for understanding how laboratory behaviour generalizes to more realistic situations. Certainly, we encounter moving eyes more often than static eyes in our day-to-day lives. Given that participant responses changed dramatically from static to dynamic gaze, it is possible that the adoption of even more realistic situations may similarly change behaviour.

Future work is needed to examine the true extent of motion's contribution to gaze perception. It is not known how the contribution of motion will scale as gaze deviations increase. As mentioned in the introduction, no appreciable effect of motion was shown in the triangulation tasks used by Symons and colleagues (2004) and Bock and colleagues (2008). In these tasks, a model fixated the nasion of the participant and then a target located on a board between them. One reason that motion may not have influenced triangulation judgements is that the participants needed to make eye movements themselves to follow the gaze of the model. Motion may have given a rough estimate of the model's initial eye movement direction, but was afterwards invisible to participants as they followed the model's gaze to the target. One can imagine that accuracy to static gaze increases linearly as gaze angle increases.
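The visual-angle geometry underlying this limitation can be sketched in a few lines. This is a minimal illustration only; the 0.2 cm on-screen iris shift and the 60 cm and 80 cm viewing distances are hypothetical values, not measurements from the present experiments:

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle (in degrees) subtended by an object of a given
    physical size at a given viewing distance."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

# Without a chin rest, the same physical iris displacement subtends a
# different visual angle for a participant seated nearer vs. farther away.
shift_at_60 = visual_angle_deg(0.2, 60)  # closer participant: larger angle
shift_at_80 = visual_angle_deg(0.2, 80)  # farther participant: smaller angle
```

Because the retinal signal shrinks with viewing distance, thresholds computed without equating distance conflate stimulus size with participant seating, which is why direct comparison to the earlier psychophysical work is speculative.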
If we extrapolate the gaze angle in Figure 4.2 to 6 degrees of visual angle, accuracy for static gaze would converge on accuracy for dynamic gaze (and reach a ceiling). It is possible that motion and gaze angle become equivalent directional signals as iris deviation increases, while detailed examination of gaze angle is needed for fine target triangulation. The question of whether the visual system can use the velocity of eye motion in gaze triangulation is an interesting avenue for future research. To investigate this possibility, triangulation tasks could be designed that do not require participants to make a large eye movement when following gaze.

The work reported here does not give a sufficient answer to the relative contributions of motion, luminance and geometrical cues in gaze perception. One possibility is that the analysis of reaction time (RT) can clarify this important question. This is a fruitful area for future research because the tasks used in the present work, as well as those employed in previous work interested in understanding the cues used in gaze perception (Ando, 2002; Ricciardelli et al., 2000; Olk et al., 2008), lend themselves well to RT investigations. This is particularly relevant given that in these tasks, gaze deviations are predominantly above threshold levels of detectability. The use of RT in this work will allow for the development of theoretical models of gaze perception. For example, gaze perception may be composed of several subprocesses, such as luminance, motion and form processing. Mental chronometry techniques can be used to parse the relative contributions of these processes in the computation of gaze direction (see Meyer, Osman, Irwin & Yantis, 1988). By factorially manipulating each of these processes in turn (as in the present work), their joint effects on RT can be used to infer whether the factors influence a common process or not (Sternberg, 1969; McClelland, 1979).
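The additive-factors logic behind this proposal (Sternberg, 1969) can be illustrated with a toy simulation. All of the numbers below (baseline RT, stage costs, noise) are invented for illustration, not estimates from our data; the point is only the pattern of the 2 x 2 interaction contrast:

```python
import random

random.seed(1)

def simulate_rt(luminance_hard: bool, motion_absent: bool,
                shared_stage: bool, n: int = 2000) -> float:
    """Mean RT (ms) from a toy two-stage model. If `shared_stage` is
    False, the two factor costs add (separate stages); if True, an
    extra cost arises only when both factors load one common stage."""
    total = 0.0
    for _ in range(n):
        rt = 400 + random.gauss(0, 20)        # baseline + noise
        rt += 50 if luminance_hard else 0     # cost of a degraded luminance cue
        rt += 50 if motion_absent else 0      # cost of removing the motion cue
        if shared_stage and luminance_hard and motion_absent:
            rt += 60                          # superadditive cost of a shared stage
        total += rt
    return total / n

def interaction_contrast(shared_stage: bool) -> float:
    """2 x 2 interaction contrast on mean RT."""
    return (simulate_rt(True, True, shared_stage)
            - simulate_rt(True, False, shared_stage)
            - simulate_rt(False, True, shared_stage)
            + simulate_rt(False, False, shared_stage))

separate = interaction_contrast(shared_stage=False)  # near zero: costs add
shared = interaction_contrast(shared_stage=True)     # near 60: superadditive
```

A near-zero interaction contrast is consistent with the factors tapping separate processing stages, whereas a reliably over-additive contrast implicates a shared stage (Sternberg, 1969; see McClelland, 1979, for cascaded alternatives).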
7  Conclusions

The results of the present series of investigations have shed light on the nature of gaze perception. We have demonstrated that observers are remarkably sensitive to subtle changes in gaze direction (Experiment 1) and that this sensitivity is largely due to the dynamic nature of these changes (Experiments 2 and 3). Further, we have shown that the perception of dynamic gaze is qualitatively different from the perception of static gaze (Experiment 4). Additionally, these results brought into question the notion of an "inflexible" contrast rule (Ricciardelli et al., 2000; Sinha, 2000). Together, these investigations reveal that the visual system integrates multiple cues in order to form a coherent perception of gaze direction, and that subtle differences in these cues can have a profound effect on how we perceive the eyes of others.

References

Akiyama, T., Kato, M., Muramatsu, T., Umeda, S., Saito, F., & Kashima, H. (2007). Unilateral amygdala lesions hamper attentional orienting triggered by gaze direction. Cerebral Cortex, 17, 2593-2600.
Allison, T., Puce, A., & McCarthy, G. (2000). Social perception from visual cues: role of the STS region. Trends in Cognitive Sciences, 4, 267-278.
Anderson, N.C., Risko, E., & Kingstone, A. (2011). Exploiting human sensitivity to gaze for tracking the eyes. Behavior Research Methods, 43, 843-852.
Ando, S. (2002). Luminance-induced shift in the apparent direction of gaze. Perception, 31, 657-674.
Ando, S. (2004). Perception of gaze direction based on luminance ratio. Perception, 33, 1173-1184.
Anstis, S.M., Mayhew, J.W., & Morley, T. (1969). The perception of where a face or television 'portrait' is looking. The American Journal of Psychology, 82, 474-489.
Baron-Cohen, S., Wheelwright, S., Skinner, R., Martin, J., & Clubley, E. (2001). The autism-spectrum quotient (AQ): evidence from Asperger syndrome/high-functioning autism, males and females, scientists and mathematicians.
Journal of Autism and Developmental Disorders, 31, 5-17.
Baron-Cohen, S., Campbell, R., Karmiloff-Smith, A., Grant, J., & Walker, J. (1995). Are children with autism blind to the mentalistic significance of the eyes? British Journal of Developmental Psychology, 13, 379-398.
Baron-Cohen, S. (1995). Mindblindness: An essay on autism and theory of mind. MIT Press.
Baron-Cohen, S., Ring, H.A., Wheelwright, S., Bullmore, E.T., Brammer, M.J., Simmons, A., & Williams, S.C.R. (1999). Social intelligence in the normal and autistic brain: an fMRI study. European Journal of Neuroscience, 11, 1891-1898.
Birmingham, E., & Kingstone, A. (2009). Human social attention: A new look at past, present, and future investigations. Annals of the New York Academy of Sciences, 1156, 118-140.
Birmingham, E., Bischof, W.F., & Kingstone, A. (2008). Gaze selection in complex social scenes. Visual Cognition, 16, 341-355.
Bock, S.W., Dicke, P., & Thier, P. (2008). How precise is gaze following in humans? Vision Research, 48, 946-957.
Calder, A.J., Beaver, J.D., Winston, J.S., Dolan, R.J., Jenkins, R., Eger, E., & Henson, R.N.A. (2007). Separate coding of different gaze directions in the superior temporal sulcus and inferior parietal lobule. Current Biology, 17, 20-25.
Calder, A.J., Lawrence, A.D., Keane, J., Scott, S.K., Owen, A.M., Christoffels, I., & Young, A.W. (2002). Reading the mind from eye gaze. Neuropsychologia, 40, 1129-1138.
Cline, M.G. (1967). The perception of where a person is looking. The American Journal of Psychology, 80, 41-50.
Coletta, N.J., Segu, P., & Tiana, C.L.M. (1993). An oblique effect in parafoveal motion perception. Vision Research, 33, 2747-2756.
Coren, S., Ward, L.M., & Enns, J.T. (1999). Sensation and Perception (5th ed.). New York: Harcourt Brace.
Dalton, K.M., Nacewicz, B.M., Johnstone, T., Schaefer, H.S., Gernsbacher, M.A., Goldsmith, H.H., Alexander, A.L., & Davidson, R.J. (2005). Gaze fixation and the neural circuitry of face processing in autism.
Nature Neuroscience, 8, 519-526.
Foulsham, T., Cheng, J.T., Tracy, J.L., Henrich, J., & Kingstone, A. (2010). Gaze allocation in a dynamic situation: Effects of social status and speaking. Cognition, 117, 319-331.
Friesen, C.K., & Kingstone, A. (1998). The eyes have it! Reflexive orienting is triggered by nonpredictive gaze. Psychonomic Bulletin & Review, 5, 490-495.
Gallagher, H., & Frith, C.D. (2003). Functional imaging in theory of mind. Trends in Cognitive Sciences, 7, 77-83.
George, N., Driver, J., & Dolan, R.J. (2001). Seen gaze-direction modulates fusiform activity and its coupling with other brain areas during face processing. NeuroImage, 13, 1102-1112.
Gibson, J.J., & Pick, A.D. (1963). Perception of another person's looking behaviour. American Journal of Psychology, 76, 386-394.
Grayson, D.M., & Monk, A.F. (2003). Are you looking at me? Eye contact and desktop video conferencing. ACM Transactions on Computer-Human Interaction, 10, 221-243.
Haxby, J.V., Hoffman, E.A., & Gobbini, M.I. (2000). The distributed human neural system for face perception. Trends in Cognitive Sciences, 4, 223-233.
Hein, G., & Knight, R.T. (2008). Superior temporal sulcus - it's my area: or is it? Journal of Cognitive Neuroscience, 20, 2125-2136.
Henderson, J.M. (2003). Human gaze control during real-world scene perception. Trends in Cognitive Sciences, 7, 498-504.
Hoffman, E.A., & Haxby, J.V. (2000). Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nature Neuroscience, 3, 80-84.
Hubel, D.H., & Wiesel, T.N. (1979). Brain mechanisms of vision. Scientific American, 82, 84-97.
Hunzelmann, N., & Spillman, L. (1984). Movement adaptation in the peripheral retina. Vision Research, 24, 1765-1769.
Jenkins, R. (2007). The lighter side of gaze perception. Perception, 36, 1266-1268.
Jenkins, R., Beaver, J.D., & Calder, A.J. (2006). I thought you were looking at me: Direction-specific aftereffects in gaze perception.
Psychological Science, 17, 506-513.
Johansson, G. (1973). Visual perception of biological motion and a model for its analysis. Perception & Psychophysics, 14, 201-211.
Kawashima, R., Sugiura, M., Kato, T., Nakamura, A., Hatano, K., Ito, K., ... & Nakamura, K. (1999). The human amygdala plays an important role in gaze monitoring: a PET study. Brain, 122, 779-783.
Kendon, A. (1967). Some functions of gaze-direction in social interaction. Acta Psychologica, 26, 22-63.
Kobayashi, H., & Kohshima, S. (2001). Unique morphology of the human eye and its adaptive meaning: comparative studies on external morphology of the primate eye. Journal of Human Evolution, 40, 419-435.
Langton, S.R.H., Watt, R.J., & Bruce, V. (2000). Do the eyes have it? Cues to the direction of social attention. Trends in Cognitive Sciences, 4, 50-59.
Lobmaier, J.S., Fischer, M.H., & Schwaninger, A. (2006). Objects capture perceived gaze direction. Experimental Psychology, 53, 117-122.
Maunsell, J.H.R., & Van Essen, D.C. (1983). Functional properties of neurons in middle temporal visual area of the macaque monkey: I. Selectivity for stimulus direction, speed, and orientation. Journal of Neurophysiology, 49, 1127-1147.
Maurer, D. (1985). Infants' perception of facedness. In T. Field & N. Fox (Eds.), Social Perception in Infants. Ablex.
McClelland, J.L. (1979). On the time relations of mental processes: An examination of systems of processes in cascade. Psychological Review, 86, 287-330.
Meyer, D.E., Osman, A.M., Irwin, D.E., & Yantis, S. (1988). Modern mental chronometry. Biological Psychology, 26, 3-67.
Movshon, J.A., & Newsome, W.T. (1992). Neural foundations of visual motion perception. Current Directions in Psychological Science, 1, 35-39.
Nakayama, K., & Silverman, G.H. (1985). Detection and discrimination of sinusoidal grating displacements. Journal of the Optical Society of America, 2, 267-274.
Nakayama, K. (1985). Biological image motion processing: A review.
Vision Research, 25, 625-660.
Nummenmaa, L., & Calder, A.J. (2009). Neural mechanisms of social attention. Trends in Cognitive Sciences, 13, 135-143.
Nummenmaa, L., Passamonti, L., Rowe, J., Engell, A.D., & Calder, A.J. (2010). Connectivity analysis reveals a cortical network for eye gaze perception. Cerebral Cortex, 20, 1780-1787.
Olk, B., Symons, L.A., & Kingstone, A. (2008). Take a look at the bright side: Effects of contrast polarity on gaze direction judgments. Perception & Psychophysics, 70, 1298-1304.
Pelphrey, K.A., Singerman, J.D., Allison, T., & McCarthy, G. (2003). Brain activation evoked by perception of gaze shifts: the influence of context. Neuropsychologia, 41, 156-170.
Pelphrey, K.A., Viola, R.J., & McCarthy, G. (2004). When strangers pass: Processing of mutual and averted social gaze in the superior temporal sulcus. Psychological Science, 15, 598-603.
Perrett, D.I., & Emery, N.J. (1994). Understanding the intentions of others from visual signals: neurophysiological evidence. Cahiers de Psychologie Cognitive, 13, 683-694.
Rensink, R.A., O'Regan, J.K., & Clark, J.J. (1997). To see or not to see: The need for attention to perceive changes in scenes. Psychological Science, 8, 368-373.
Ricciardelli, P., Baylis, G., & Driver, J. (2000). The positive and negative of human expertise in gaze perception. Cognition, 77, B1-B14.
Richardson, D.D., & Dale, R. (2005). Looking to understand: The coupling between speakers' and listeners' eye movements and its relationship to discourse comprehension. Cognitive Science, 29, 1045-1060.
Rogers, B.J., & Graham, M. (1979). Motion parallax as an independent cue for depth perception. Perception, 8, 125-134.
Saxe, R., & Kanwisher, N. (2003). People thinking about thinking people: The role of the temporo-parietal junction in "theory of mind." NeuroImage, 19, 1835-1842.
Seyama, J., & Nagayama, R.S. (2006). Eye direction aftereffect. Psychological Research, 70, 59-67.
Sinha, P. (2000). Here's looking at you, kid.
Perception, 29, 1005-1008.
Sternberg, S. (1969). The discovery of processing stages: Extensions of Donders' method. Acta Psychologica, 30 (Attention and Performance II), 276-315.
Symons, L.A., Lee, K., Cedrone, C.C., & Nishimura, M. (2004). What are you looking at? Acuity for triadic gaze. Journal of General Psychology, 131, 451-469.
Vecera, S.P., & Johnson, M.H. (1995). Gaze detection and the cortical processing of faces: evidence from infants and adults. Visual Cognition, 2, 59-87.
von dem Hagen, E.A.H., Nummenmaa, L., Yu, R., Engell, A.D., Ewbank, M.P., & Calder, A.J. (2011). Autism spectrum traits in the typical population predict structure and function in the posterior superior temporal sulcus. Cerebral Cortex, 21, 493-500.
Watt, R.J. (1999). What your eyes tell my eyes, and how your eyebrows try to stop them. Paper presented at the Tenth International Conference on Perception and Action, University of Edinburgh.
Zilbovicius, M., Meresse, I., Chabane, N., Brunelle, F., Samson, Y., & Boddaert, N. (2006). Autism, the superior temporal sulcus and social perception. Trends in Neurosciences, 7, 359-366.
