A MODEL AND ANALYSIS OF TWO-HANDED INTERACTION WITH A KEYBOARD AND POINTING DEVICE

by

Juliette Frances Link

B.A., The University of British Columbia, 2010

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF SCIENCE in THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES (Computer Science)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

October 2013

© Juliette Frances Link, 2013

Abstract

The benefits of two-handed interaction have been heavily researched, and new input devices that presumably better support such interaction are common. Yet the standard desktop setup today continues to be a keyboard and pointing device. Despite its universality, no systematic description of two-handed interaction with this setup exists. Our work fills that gap. We conducted observational studies with 37 design application users, focusing on three fundamental elements of two-handed interaction: (1) the configurations hands assume (the different positions the two hands are in relative to the input devices), (2) how much time is spent in each configuration, and (3) how often the hands move between the configurations. We propose a model that describes the patterns of two-handed interaction that occur broadly across all users, for example that every user had a configuration in which they spent the majority of their time (for most users this was one hand on the left side of the keyboard and one hand on the mouse). We also document where there are individual differences in patterns, namely in how much each hand moves. The three-staged analytic approach we took to arrive at these findings is described, along with implications for design.

Preface

I conducted all of the work presented in this thesis under the supervision of Dr. Joanna McGrenere and Dr. Kellogg Booth. The observational studies and interviews reported in Chapter 3 were conducted with the approval of the UBC Behavioural Research Ethics Board (BREB), under certificate number H08-02562. Parts of this thesis appear in a manuscript on which I am the lead author.¹

¹ Link, J., McGrenere, J., Booth, K.S., and Terry, M. A Model and Analysis of Two-handed Interaction with a Keyboard and Pointing Device. Manuscript in submission.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Figures
List of Abbreviations
Glossary
Acknowledgements
Dedication
Chapter 1: Introduction
  1.1 Motivation
  1.2 Approach & Contributions
  1.3 Thesis Overview
Chapter 2: Related Research
  2.1 Novel Techniques and Input Devices for Two-handed Interaction
  2.2 Asymmetric and Symmetric Two-handed Interaction
  2.3 Models of Two-handed Interaction
Chapter 3: Methodology
  3.1 Participants
  3.2 Tasks
  3.3 Apparatus
  3.4 Procedure
Chapter 4: Data Collection
  4.1 Which Tasks were Video Coded
  4.2 Data Captured from Video Coding
  4.3 Input Devices Used
  4.4 Unused Data
Chapter 5: Analysis
  5.1 Initial Analysis: Descriptive Statistics
    5.1.1 Similarities in Two-Handed Interaction Patterns
      Mouse Hand and Keyboard Hand
      Home Position
      Where Hands Move
    5.1.2 Differences in Two-handed Interaction Patterns
      Hand Movement for NPs vs. LPs
      The Effect of KH Home Position
      Speed/efficiency
    5.1.3 Asymmetric Two-handed Interaction
  5.2 Timelines: Visualizing Hand Movement Data
  5.3 Cluster Analysis
    5.3.1 Transition Matrices
    5.3.2 Comparing Variance in Hand Movement Patterns
    5.3.3 Cluster Analysis
      Tasks and Individual Differences
  5.4 Interview Analysis
    5.4.1 Do Participants Mind Moving Their Hands?
    5.4.2 Health Concerns
    5.4.3 Factors that Influence Hand Movement
  5.5 Reflection on Methods Used
Chapter 6: Discussion
  6.1 A Model of Two-handed Interaction
  6.2 What Influences Two-handed Interaction
  6.3 Value of Our Model of Two-handed Interaction
  6.4 Implications for Design
Chapter 7: Conclusion
  7.1 Summary of Contributions
  7.2 Limitations
    7.2.1 Limitations of Observing in Naturalistic Settings
    7.2.2 Limitations of Our Video Coding
  7.3 Future Work
References
Appendices
  Appendix A - Participant Information
  Appendix B - Timeline Groupings
  Appendix C - Study Materials
    C.1 Participant Recruitment Email (Naturalistic Participants)
    C.2 Screening Questionnaire
    C.3 Consent Forms
    C.4 Interview Questions

List of Tables

Table 1. Median number of MH and KH moves for all samples from LPs vs. NPs. (N = 39 samples)
Table 2. Number of hand moves for samples using an application with and without a command line. (N = 39 samples)
Table 3. Number of hand moves and time off home position for samples where participants' KH had a home position of "off" versus "keyboard (kb) left." (N = 18 samples)
Table 4. Number of hand moves for "fast" and "slow" task completion times. (N = 12 samples)

List of Figures

Figure 1. Example setup for Naturalistic Participants (NPs).
Figure 2. An example of a hand movement across the keyboard. In our coding scheme this would be a move from keyboard left, to keyboard right, and back to keyboard left (see keyboard regions in Figure 3).
Figure 3. Regions of the keyboard. Keyboard image (minus markup) in the public domain.
Figure 4. Average % of time each hand spent in each region. Regions with averages < 0.5% for both hands are omitted. kb = keyboard, MH = mouse hand, KH = keyboard hand. (N = 39 samples)
Figure 5. Average % of time hands spent in the eight most common configurations. (N = 39 samples)
Figure 6. Percentage of total moves by MH vs. KH for each sample. Ordered by increasing % MH. (N = 39 samples)
Figure 7. Number of moves per minute by MH vs. KH for each sample, ordered by increasing hand moves per minute for the MH. (N = 39 samples)
Figure 8. Example timeline from LNP1s1/3.
Figure 9. A 9x9 matrix that captures movement data for a single hand. We created two 9x9 matrices with this same layout for each sample - one for the MH and one for the KH. This matrix represents the average number of times (across all samples) that a participant's KH moved from any one region to any other region (including staying in the same region). Shading in the cells represents the frequency with which a move was made: the darker the shade, the more often participants' KH made that move.
Figure 10. A 31x31 matrix that captures movement data for two hands. We created a 31x31 matrix for each sample. This matrix represents the average number of times participants' two hands moved from any one configuration to any other configuration (including staying in the same configuration) - the darker the shade, the more often participants' hands made that move.

List of Abbreviations

LP - lab participant
NP - naturalistic participant
LNP - lab/naturalistic participant
KH - keyboard hand
MH - mouse hand

Glossary

Configuration - The different positions a user's two hands are in relative to the input devices.
Design application - Computer applications for which the creation and manipulation of on-screen objects is a significant part of a user's interaction with the application. These include image manipulation applications, and 2D and 3D graphics applications (including CAD).
Home position - The position in which a hand spends the majority of its time.
Keyboard hand - The hand that solely uses the keyboard.
Mouse hand - The hand that primarily operates the mouse.
Region - We divided the areas within which participants' hands moved into nine regions based on the input devices - mouse (for when a hand was in contact with the mouse), 2nd mouse (for one of our participants who used two pointing devices), and off (for when a hand was not in contact with any input device) - plus six areas of the keyboard: numpad, arrow keys, function keys, delete keys, keyboard right, and keyboard left.
Sample - The segments of the different tasks that were video coded.

Acknowledgements

First and foremost, I'd like to thank my awesome supervisors, Dr. Joanna McGrenere and Dr. Kellogg Booth. I'd also like to thank Dr. Michael Terry from the University of Waterloo, who collaborated with us on this research and served as a third member of the supervisory committee.

I thank Melissa Dawe Schmidt and the AutoCAD UX team at Autodesk for supporting my research and for making my summer internship a great experience. It was a fortunate coincidence that the study I ran at Autodesk aligned well with the study design I had already planned that was looking at how two hands are used with design applications. It was an added bonus that I was allowed to ask participants from the Autodesk study to also participate in my study.

Kailun Zhang helped with a significant portion of the video coding for this research, for which I am very grateful. I'd like to thank Steve Wolfman for his continued support in my Computer Science education, and for providing valuable feedback for this thesis as the "second reader." My labmates - Jessica Dawson, Louise Oram, Mona Haraty, Syavash Nobarany, Matt Negulescu, Matt Brehmer, Hasti Seifi, Junhao Shi, Oliver Schneider, Shathel Haddad, and many other MUX members (current and former) - helped make this thesis possible by providing advice and feedback throughout my degree program.
I thank everyone who participated in my observational studies for this research. And last, but certainly not least, I'd like to give special thanks to my family and to Reilly Wood for their unconditional support.

The research reported in this thesis was supported by the Networks of Centres of Excellence (NCE) program through the Graphics, Animation, and New Media (GRAND) network, by the Natural Sciences and Engineering Research Council of Canada (NSERC) through the Discovery Grant program, and by the Canada Foundation for Innovation (CFI), The University of British Columbia, and the Institute for Computing, Information and Cognitive Systems (ICICS), which provided research infrastructure.

Dedication

I dedicate this thesis to the memory of my mother, Geneviève Soulas-Link.

Chapter 1: Introduction

This chapter motivates our work, describes our general research approach, and summarizes our contributions to the field of Human-Computer Interaction. We provide a detailed summary of previous work from the literature in the next chapter, but cite some of the most important work here as part of the background that motivates our research.

1.1 Motivation

The benefits of two-handed (vs. one-handed) interaction have been well documented. For example, using two hands can save time by dividing labor, allowing some actions to be performed in parallel; and when each hand has a "home position," the amount of movement can be minimized (Buxton & Myers, 1986). Two-handed interaction is also more consistent with everyday skills from the physical world, so supporting the use of two hands can accelerate learning through skill transfer (Buxton & Myers, 1986; Owen, Kurtenbach, Fitzmaurice, Baudel, & Buxton, 2005). Balancing workload across two hands can also improve comfort by reducing the load placed on one limb and supporting different body positions (Odell, Davis, Smith, & Wright, 2004). The benefits of two-handed interaction are cognitive as well (Hinckley, Pausch, Proffitt, & Kassell, 1998; Leganchuk, Zhai, & Buxton, 1998). Two hands are better suited to certain tasks, like those that require more interactive manipulation and/or include uncovering information that is hidden, making two-handed interaction especially useful for tasks that are cognitively demanding (Leganchuk et al., 1998; Owen et al., 2005). This said, there are some cases where two hands can be worse than one, for example when each hand is assigned an independent subtask, because this can increase cognitive load or introduce the possibility of sequencing errors in some situations (Kabbash, Buxton, & Sellen, 1994).

Much of the research in two-handed interaction with desktop interfaces focuses on exploring and developing new techniques and technologies to support the use of two hands (Balakrishnan & Patel, 1998; Bier, Stone, Pier, Buxton, & DeRose, 1993; McLoone, Hinckley, & Cutrell, 2003). The extensiveness of that body of work implies that the standard desktop setup, namely a keyboard and pointing device, must be inadequately supporting two-handed interaction. MacKenzie & Guiard (2001) argued this explicitly. Despite this limitation, the fact remains that most users today continue to use a desktop with a keyboard and pointing device. Further, despite the breadth of research in two-handed interaction, to our knowledge, there has never been a systematic description of this common two-handed interaction setup.
1.2 Approach & Contributions

Our research aims to fill this gap in understanding of two-handed interaction with a keyboard and pointing device. As a first step we specify and examine specific components of two-handed interaction, namely movement patterns over time, including where the two hands are relative to each other and to the input devices.

We examine these two-handed interaction components in the context of computer applications for which the creation and manipulation of on-screen objects is a significant part of a user's interaction with the application. These include image manipulation applications, and 2D and 3D graphics applications (including CAD), which we refer to collectively as design applications. Due to the exploratory nature of this research, we thought it best to focus on a single class of applications. We considered several classes of applications, including office productivity applications, but in the end we chose to focus on design applications because they involve a frequent and interesting mix of keyboard entry and pointing/mousing, and because keyboards and pointing devices are important interaction components for many other applications.

We conducted observational studies and interviews with 37 design application users, both in the wild and in a lab setting. We focused our inquiry on three fundamental elements of two-handed interaction: (1) the configurations hands assume (the different positions the two hands are in relative to the input devices), (2) how much time is spent in each configuration, and (3) how often the hands move between the configurations. Our analysis was guided by several research questions: What are the patterns in two-handed interaction with respect to these elements? What influences these patterns? To what extent are there individual differences?

Our research makes the following contributions:

1) We propose a model of two-handed interaction using design applications with a keyboard and pointing device that describes the patterns that occur broadly across all users, for example that every user had a configuration in which they spent the majority of their time (for most users this was one hand on the left side of the keyboard and one hand on the mouse).
2) We further document where there are individual differences in patterns, namely in how much each hand moves.
3) We describe the three-staged analytic approach we took to arrive at these findings.
4) We outline implications for interaction design based on our findings.

1.3 Thesis Overview

This thesis is organized into seven chapters. Previous work relevant to this research is summarized in Chapter 2. Chapters 3 through 5 discuss our observational studies: Chapter 3 presents the methodology we used, Chapter 4 explains the data we collected, and Chapter 5 discusses our analytic approach. Chapter 6 summarizes the findings of this research, including a description of our model of two-handed interaction and implications for design. Chapter 7 concludes this thesis with a summary of our contributions and a discussion of directions for future work.

Chapter 2: Related Research

This chapter describes the current state of research in two-handed interaction with desktop interfaces, including novel interaction techniques and models used for understanding and improving two-handed interaction.
2.1 Novel Techniques and Input Devices for Two-handed Interaction

There are many examples of novel (or novel combinations of) techniques and input devices for two-handed interaction with desktop interfaces. Early in the study of two-handed interaction, Buxton & Myers (1986) showed that two hands outperformed one hand for simple selection/positioning and navigation/selection tasks using a 4-button puck and a slider. Later studies expanded two-handed interaction to a wider range of tasks, from drawing and editing geometric shapes (Bier et al., 1993; Casalta, Guiard, & Beaudouin-Lafon, 1999; Kurtenbach, Fitzmaurice, Baudel, & Buxton, 1997; Odell et al., 2004; Owen et al., 2005), to 3D interaction (Cutler, Fröhlich, & Hanrahan, 1997; Malik & Laszlo, 2004; Zeleznik, Forsberg, & Strauss, 1997), and scrolling and text editing (McLoone et al., 2003; Myers, Lie, & Yang, 2000).

These techniques and technologies (re)distribute subtasks across hands and make subtasks accessible in new ways. Examples of this research include proposals for new input devices like the Microsoft Office Keyboard (McLoone et al., 2003), which added a "left pod" to the keyboard to make navigation and editing options accessible to the left hand, the PadMouse (Balakrishnan & Patel, 1998) and TouchMouse (Hinckley, Czerwinski, & Sinclair, 1998), which augmented mice with touch sensors to provide new ways to activate modifiers and commands, and others (Cutler et al., 1997; Malik & Laszlo, 2004; Myers et al., 2000; Shaw & Green, 1997). Several studies explored the use of two pointing devices, including using two hands to control two cursors for 3D interaction (Zeleznik et al., 1997), and using a puck in the non-dominant hand and a stylus in the dominant hand for rectangle editing (Casalta et al., 1999) and curve manipulation (Owen et al., 2005). A few combined the use of two pointing devices with new interface tools and techniques, such as the "Toolglass widgets" (Bier et al., 1993) and the "two handed stretchies" technique (Kurtenbach et al., 1997). Most recently, two-handed interaction has been examined in the context of touch-based input and tablets. For example, Yee (2004) examined two-handed interaction with a tablet display where the dominant hand uses a stylus and the non-dominant hand operates a touchscreen. Wagner et al. (2012) explored bimanual touch interaction with a tablet where the non-dominant hand simultaneously provides support and interacts with the tablet in conjunction with the dominant hand.

In this research, we don't propose a new technique or technology; rather, we describe a new way to understand and model current two-handed interaction with a keyboard and pointing device. In our efforts to understand and model this interaction, we also reflect on the extent to which the benefits of using two hands are present.

2.2 Asymmetric and Symmetric Two-handed Interaction

Guiard (1987) identifies three classes of human everyday manual activities: unimanual, symmetric bimanual, and asymmetric bimanual. He emphasizes that, for most skilled manual activities, the two hands play different (asymmetric) roles. Guiard's principles of bimanual gestures are that (1) the left (non-dominant) hand provides frames of reference for the right (dominant) hand, (2) the right hand acts at finer spatial and temporal scales than the left, and (3) left hand actions precede right hand actions. His widely used "kinematic chain model" describes the asymmetric division of labor between hands.
The majority of two-handed interaction research, including most of the examples mentioned in the previous section (Section 2.1), deals with asymmetric bimanual interaction. However, there is some research that explores symmetric bimanual interaction (Balakrishnan & Hinckley, 2000; Latulipe, 2006). While we weren't specifically looking into asymmetric (or symmetric) two-handed interaction, we were interested in seeing the extent to which current two-handed interaction with a keyboard and pointing device is asymmetric or symmetric.

2.3 Models of Two-handed Interaction

Guiard's kinematic chain model and principles of bimanual gestures (Guiard, 1987) remain the principal model for two-handed interaction. While Guiard models the general use of two hands, many researchers have successfully applied Guiard's framework to the design of two-handed interactions with desktop interfaces. For example, the "Toolglass widgets" technique (Bier et al., 1993) was designed following Guiard's framework. Kabbash et al. (1994) tested four techniques for performing a compound drawing/color selection task - a unimanual technique and three bimanual techniques, including one that used the Toolglass widget - and found that the Toolglass technique had the best overall performance.

Buxton's three-state model of graphical input (Buxton, 1990) helps characterize some basic properties of input devices and interaction techniques using state-transition diagrams where each state represents a device state (e.g. out of range, tracking, and dragging for a tablet with stylus). This model can be useful in modeling two-handed input (Hinckley, Czerwinski, et al., 1998). Fitts' Law can be used to understand one-handed aspects of two-handed interaction. The more comprehensive GOMS model (Card, Moran, & Newell, 1983) can also be used to understand two-handed interaction. For example, Gray et al. (1992) found that a new, supposedly improved, workstation for telephone operators at NYNEX was slower than the old workstation, in part because the design of the workstation didn't take into account how operators used both hands for their tasks. The GOMS model Gray et al. used did account for the use of two hands and showed that the proposed keyboard had grouped commonly pressed keys together, which encouraged sequential one-handed key pressing instead of the faster, alternating two-handed key pressing that permitted a degree of parallelism on the old workstation.

While Buxton's three-state model, Fitts' Law, and the GOMS model can be applied to aspects of two-handed interaction, they are not models of two-handed interaction. In contrast to these more general models, our model is descriptive and specific to two-handed interaction with a keyboard and pointing device. We look at and model aspects of two-handed interaction that haven't been modeled before, namely our three fundamental elements: (1) the configurations hands assume, (2) how much time is spent in each configuration, and (3) how often hands move between the configurations.

The next chapter discusses the methodology we used for observing two-handed interaction.

Chapter 3: Methodology

To gain a systematic understanding of users' two-handed interaction behavior and address our research questions, we observed a variety of participants, some in their natural work environments performing their own work tasks, and others in a lab setting where their application, tasks, and apparatus were controlled for.
This chapter describes the participants, the apparatus, and the procedure for our observational studies. The study materials (including the recruitment email, consent forms, screening questionnaire, and interview questions) can be found in Appendix C.

3.1 Participants

We focused on experienced, regular users because we were interested in expert-level usage, not in learning effects for novice or casual users. Users were considered experienced if they used a design application for at least six hours per week and had been using that application for at least one year. For the majority of participants, their reported design application usage was far greater than this minimum: 68% of participants reported using their design application for 20+ hours/week, and 76% reported having used their application for 3+ years.

Naturalistic Participants (NPs): We observed 20 participants (5 female) in their natural work environments: 17 of these observations took place in a major Canadian city; the other 3 in a major U.S. city, as explained below. NPs included students from design programs at two universities, a contract drafter, as well as employees from architecture firms, a visual effects studio, and a game development company. NPs used a variety of design applications, including Photoshop, Illustrator, InDesign, Vectorworks, General CADD Pro, SketchUp, Maya, Unity, Nuke, Houdini, AutoCAD and Silhouette. NPs were recruited using snowball sampling, starting with the authors' professional contacts in design-related companies and university programs, who were asked to forward the study invitation to their employees, colleagues, or students.

Lab Participants (LPs): We observed 20 participants (7 female) as they participated in a larger Autodesk lab study that took place in a major U.S. city; the first author ran that study while on an internship. LPs included drafters, architects, designers, and students who regularly used AutoCAD. They were recruited from Autodesk's pool of local AutoCAD users who had signed up to participate in user studies.

Lab/Naturalistic Participants (LNPs): 3 participants participated as both LPs and NPs (meaning they first participated in the lab study and then, at a later date, they were also observed in their own workplace). This allowed us to compare 3 individuals' two-handed interaction in the two settings.

In total, we had 37 unique participants. All participants completed a preliminary screening questionnaire to determine if they used a design application and if they fit our criteria for being an experienced user.

3.2 Tasks

NPs performed one of their own work tasks, such as creating templates for business cards, modeling a 3D building, or creating a digital painting, using the design application they regularly used. We refer to these tasks as naturalistic tasks. LPs, by contrast, performed a set of 18 specified lab tasks in AutoCAD by following a task instruction booklet they were given as part of the Autodesk lab study (only partially described in this thesis). Tasks ranged from testing specific AutoCAD features (plotting, rendering) to creating and modifying drawings.

3.3 Apparatus

NPs were observed in their own workplaces, using their own software, computers, and input devices. A standalone video camera recorded participants' hands and input devices as they performed their task. If we had permission from a participant's manager, a second video camera recorded their screen (5 NPs had such permission). Figure 1 shows an example setup for NPs.
Figure 1. Example setup for Naturalistic Participants (NPs).

LPs performed their tasks in a usability lab at an Autodesk office. They all used the same software, computer, and input devices, except two participants who brought their own ergonomic mice. We recorded LPs' hands and input devices as well as their screens.

3.4 Procedure

NPs performed their naturalistic tasks for 15 to 20 minutes at the beginning of the observational session. If a participant did not complete their task within 20 minutes, they were stopped at that point. LPs performed all 18 lab tasks sequentially in the order specified in the instruction booklet. Participants worked on each task until it was completed or they exceeded the cut-off time for the task. Task times for the 18 tasks varied between 1 and 15 minutes. (The large variation was due to differences in both task complexity and the speed of the participant.)

After each participant performed his or her task(s), we conducted an open-ended interview. Participants were asked about their input device usage, hand movement, and any other interesting interactions that had been observed while they performed their task(s). Sessions with NPs lasted 40 to 60 minutes. Sessions with LPs lasted 1.5 to 3 hours.

A full list of participants, including what application and input devices they used, and the number of samples we coded from each participant, can be found in Appendix A.

In the next chapter we discuss what data we collected from these observational studies.

Chapter 4: Data Collection

We recorded approximately 67 hours of video data. Selecting which tasks to video code, and how to code them, took careful consideration. This chapter describes how we selected the tasks that were coded and the data we collected from coding those tasks.

4.1 Which Tasks were Video Coded

We coded the first 15 minutes of video for all naturalistic tasks performed by NPs and LNPs. Given that naturalistic tasks were not defined, we essentially normalized based on time. For lab tasks, we normalized on task. We selected one of the 18 lab tasks to code, hereafter referred to as lab task 1 (average task completion time: 11.81 min, s.d.: 3.31 min). In this task, LPs were given a printed copy of a floor plan and were told to recreate it using AutoCAD. We coded full videos of this task for the 11 LPs who had completed the task. The other 9 LPs had not completed the task and/or we did not have video for them because of a technical error.

Because coding multiple tasks for the same participant would allow us to explore whether or not there were individual differences in two-handed interaction patterns across tasks, we also coded a second task for all participants for whom we had video of a second task. This included 1 NP who performed two naturalistic tasks, and 7/11 LPs for whom we had coded lab task 1. For this second lab task, lab task 2, LPs were asked to modify an existing floor plan (average completion time: 6.78 min, s.d.: 3.51 min). Due to a technical error we did not have video for this task for the other 4/11 LPs. Out of the 18 lab tasks LPs performed, lab tasks 1 and 2 were selected because they were longer and not AutoCAD-specific.

We use the term samples to uniformly refer to the segments of the different tasks that were video coded. For NPs, samples are the first 15 minutes of their tasks. For LPs, samples are up to two complete tasks of the 18 tasks they performed.
We identify individual samples by the sample number: s*/#, where * is the sample number and # is the total number of samples from that participant (e.g., NP1s1/2 is the first of two samples we have for NP1). For LPs, s1/2 refers to lab task 1 and s2/2 refers to lab task 2. In total, we coded 39 samples from 29 different participants that spanned 8.01 hours of video. Nine participants had multiple samples: 8 (LP1-LP6, LNP2, NP1) had 2 samples and LNP1 had 3 samples. The number of samples we coded from each participant can be found in the Participant Information table in Appendix A.

4.2 Data Captured from Video Coding

The aspect of two-handed interaction we focused on was the movement of the two hands relative to the input devices. We only coded major movements: movements from one input device to another, and movements of more than three keys across the keyboard. Minor movements - movements while a hand was on the mouse, and movements of three keys or fewer across the keyboard - were not coded. We picked three keys because we found that to be the point at which the whole hand would have to move (rather than just reaching with the fingers). Figure 2 shows an example of a left hand movement across the keyboard.

Figure 2. An example of a hand movement across the keyboard. In our coding scheme this would be a move from keyboard left, to keyboard right, and back to keyboard left (see keyboard regions in Figure 3).

We divided the area participants' hands moved within into nine regions based on the input devices - mouse (for when a hand was in contact with the mouse), 2nd mouse (for one of our participants who used two pointing devices, described below), and off (for when a hand was not in contact with any input device) - plus six areas of the keyboard: numpad, arrow keys, function keys, delete keys, keyboard right, and keyboard left. Keyboard right was for all the keys on the right side of the "main area" of the keyboard; keyboard left was for all the keys on the left side. Major movements within the main area of the keyboard were always clearly to the right or left side. Figure 3 shows the six keyboard areas we used.

Figure 3. Regions of the keyboard. Keyboard image (minus markup) in the public domain.

Most of the regions emerged naturally. The mouse is separate, so it made sense to treat it as its own region. With the exception of keyboard right and keyboard left, the keyboard regions came from the existing groupings of keys on regular keyboards. The keyboard right vs. keyboard left distinction came from observing participants, and from having to account for the fact that major hand movements could take place within the "main area" of the keyboard. While coding the first few videos, we tested using these regions and found that coding movements between the regions covered all major hand movements.

For each sample, every time a hand moved to a different region, we coded which hand moved, the region it moved to, and the move time. Two investigators (including the author) did the video coding. One coded 12 samples, the other coded 25. The investigators coded 2 samples together to train the second investigator. One sample was then coded separately by both investigators to check for inter-coder reliability. Both coding files for that sample were checked to see that they had the same hand movements coded to the same regions with the same time stamps, which they did.
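To make the coding scheme concrete, the sketch below shows one way the coded events could be represented and expanded into per-interval region sequences like those used in the analyses that follow. It is an illustrative sketch only: the fields (hand, region, time) mirror the coding scheme described above, but the CodedMove record and the expand_to_intervals function are hypothetical names, not the actual coding or analysis software used for this thesis.

```python
from dataclasses import dataclass

REGIONS = ["mouse", "2nd mouse", "off", "numpad", "arrow keys",
           "function keys", "delete keys", "keyboard right", "keyboard left"]

@dataclass
class CodedMove:
    """One coded event: a hand arrives in a new region at a given time."""
    hand: str      # "MH" or "KH"
    region: str    # one of REGIONS
    time: float    # seconds from the start of the sample

def expand_to_intervals(moves, sample_length, step=0.5):
    """Expand one hand's coded moves into the region it occupies at every half-second interval."""
    moves = sorted(moves, key=lambda m: m.time)
    intervals, current, i, t = [], None, 0, 0.0
    while t <= sample_length:
        while i < len(moves) and moves[i].time <= t:
            current = moves[i].region   # the most recent move at or before time t
            i += 1
        intervals.append(current)
        t += step
    return intervals

# Example: a KH that starts on keyboard left, reaches to keyboard right, and returns.
kh_moves = [CodedMove("KH", "keyboard left", 0.0),
            CodedMove("KH", "keyboard right", 4.2),
            CodedMove("KH", "keyboard left", 6.0)]
print(expand_to_intervals(kh_moves, sample_length=8.0))
```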
4.3 Input Devices Used

During video coding, we also noted what input devices participants used. All participants used a keyboard and at least one pointing device. All participants used a full "regular" keyboard (like the one in Figure 3), except 3 (NP1, NP2, NP8) who used a laptop keyboard, and LNP3 who used an ergonomic keyboard. All participants used a mouse, except NP8 who used a Wacom tablet with stylus in addition to a mouse, NP14 who used a Wacom tablet with stylus instead of a mouse, and NP2 who used a trackpad. For simplicity, in the rest of this thesis we refer to all pointing devices as mice. Because NP8 used both a mouse and a tablet, we added the 2nd mouse region to our list.

All participants except two had their pointing device to the right of their keyboard. Both exceptions were left-handed individuals who used their pointing device on the left of their keyboard. (There were three left-handed individuals in total, but the third one used the same setup as right-handed individuals, with the pointing device to the right of the keyboard.) Video coding was done in the same way for left-handed and right-handed participants.

4.4 Unused Data

We also attempted to code what keys were pressed and what was on screen when a hand moved. However, this information was incomplete because it was not always possible to see in the videos what keys were being pressed, and because we only had screen recordings for about half of the samples. As a result, we did not analyze that data.

The next chapter describes the methods we used to analyze the data we collected from video coding and our findings from that analysis.

Chapter 5: Analysis

In order to answer our research questions we took a three-staged analytic approach, which we describe in this chapter. Our initial analysis consisted of summarizing the data through descriptive statistics. We looked at patterns in terms of where hands moved, how often they moved, and how much time they spent in certain regions. From there we plotted the data in timeline visualizations to see if time-based patterns might emerge through visual inspection. Finally, we modeled the data using matrices, which supported an algorithmic approach to clustering. In addition, we analyzed interview transcripts to see what participants had to say about how they used their two hands.

5.1 Initial Analysis: Descriptive Statistics

Our analysis showed similarities in two-handed interaction patterns across samples in terms of what regions hands moved to and how much time they spent in each region. We also found differences across samples in terms of how often hands moved.

5.1.1 Similarities in Two-Handed Interaction Patterns

In this section we report a mix of averages and medians. Averages reveal that there are regions that no participants' hands moved to at all, as well as regions that only a few participants' hands moved to; medians would be 0 in both of those cases. We otherwise largely report medians to give less weight to the especially active participants.

Mouse Hand and Keyboard Hand

All participants had one hand that primarily operated the mouse and one hand that solely used the keyboard. Participants' mouse hand (MH) spent an average of 87.6% of the time on the mouse. Participants' keyboard hand (KH) spent an average of 83.31% of the time on the keyboard (on any keyboard region). Participants spent an average of 75.51% of the time with their MH on the mouse and KH on the keyboard.
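Percentages like these can be read straight off the half-second interval sequences: count the intervals a hand spends in each region and take the region with the largest share (what the next subsection calls the hand's home position). The snippet below is a minimal sketch of that computation, not the actual analysis scripts.

```python
from collections import Counter

def region_time_shares(intervals):
    """Fraction of a sample's half-second intervals spent in each region."""
    counts = Counter(intervals)
    total = sum(counts.values())
    return {region: count / total for region, count in counts.items()}

def majority_region(intervals):
    """The region in which a hand spends the largest share of its time."""
    shares = region_time_shares(intervals)
    return max(shares, key=shares.get)

# Toy example: a KH that is mostly on keyboard left.
kh_intervals = ["keyboard left"] * 140 + ["keyboard right"] * 20 + ["off"] * 10
print(region_time_shares(kh_intervals))  # {'keyboard left': 0.82..., 'keyboard right': 0.11..., 'off': 0.05...}
print(majority_region(kh_intervals))     # 'keyboard left'
```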
Home Position

All participants' hands had a home position: the position in which a hand spends the majority of its time. Everyone's MH home position was on the mouse. For 34/39 samples, KH home position was on keyboard left; for 4 samples (LP8s1/1, LP4s1/2, LP4s2/2, LP5s1/2) KH home was off; and for 1 sample (LNPs2/3) KH home was keyboard right. The median percentage of time spent with the KH in home position was 82.74%. KH home position stayed the same across tasks for 7/9 participants with multiple samples.

Where Hands Move

On average, over the course of a sample, participants' MH moved to 3.74 different regions (s.d.: 1.45) and their KH moved to 3.44 regions (s.d.: 0.85). Figure 4 shows the average percentage of time each hand spent in each region they moved to.

Figure 4. Average % of time each hand spent in each region. Regions with averages < 0.5% for both hands are omitted. MH = mouse hand, KH = keyboard hand. (N = 39 samples)

Hands don't cross over into each other's home position: Participants' MH rarely moved to the left side of the keyboard. The KH also rarely moved to the numpad or arrow keys, so another way to look at this is that each hand only travelled so far.

Both hands move frequently to keyboard right, a "no hands land": The region between the two hands, keyboard right, was covered by both the MH and KH.

Participants were in an average of 7.68 different two-handed configurations over the course of a sample (s.d.: 2.63). Besides KH on keyboard left and MH on the mouse, the most common configurations were KH off and MH on the mouse (M = 13.13%) and KH keyboard left and MH keyboard right (M = 5.28%). Figure 5 shows the average percentage of time the two hands spent in the most common configurations.

Figure 5. Average % of time hands spent in the seven most common configurations. (N = 39 samples)

5.1.2 Differences in Two-handed Interaction Patterns

Even though we found consistency in what regions hands moved to, there was a large variation in how often hands moved between those regions and in how movement was distributed between hands. Half of participants had more MH moves than KH moves, and the other half had the reverse. A graph of the percentage of moves by MH vs. KH (Figure 6) illustrates this variation in distribution well. The variation in absolute movement for each hand is related but different and is shown in Figure 7 for comparison. We next explore a few factors that contribute to these differences.

Figure 6. Percentage of total moves by MH vs. KH for each sample, ordered by increasing % MH. (N = 39 samples)

Figure 7. Number of moves per minute by MH vs. KH for each sample, ordered by increasing hand moves per minute for the MH. (N = 39 samples)

Hand Movement for NPs vs. LPs

        median # of moves (per minute)
        MH      KH      Total (MH + KH)
LPs     2.76    4.72    7.48
NPs     2.47    1.33    3.80

Table 1. Median number of MH and KH moves for all samples from LPs vs. NPs. (N = 39 samples)

We found that LPs had a much higher median number of hand moves per minute for the KH than NPs (see Table 1). Our observations suggest that the main reason for this difference is that AutoCAD (the application used by all LPs) has a command line that was frequently used by participants. Command lines make more options accessible from the keyboard, which leads to more hand movement to/from and around the keyboard. Since all but one participant who used a command line was using AutoCAD, it's important to note that the patterns we saw for command line users may just apply to AutoCAD users or CAD applications. However, samples where CAD applications without command lines (Vectorworks and SketchUp) were used did not have the same patterns we saw for command line users, and the one NP who used a command line in an application other than AutoCAD (General CADD Pro) did have similar patterns. Table 2 shows the difference in hand movements for participants who used a command line vs. those who did not. The difference in KH movement is significant (p < .001, using a t-test).

command line?   median # of moves (per minute)
                MH      KH      Total (MH + KH)
yes (n = 22)    2.76    5.10    7.86
no (n = 17)     2.47    1.2     3.67

Table 2. Number of hand moves for samples using an application with and without a command line. (N = 39 samples)

These findings suggest that, not surprisingly, applications (at least in terms of whether or not they have a command line) do influence two-handed interaction patterns.
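The significance test reported above could be reproduced along the following lines. This is a hedged sketch: it uses SciPy's independent-samples t-test (the Welch variant, since the thesis does not specify which form was used) on placeholder values, not the actual per-sample data behind Table 2.

```python
from scipy import stats

# Per-sample KH moves per minute, split by whether the application has a command line.
# Placeholder values only; the real analysis compared the 22 command-line samples with
# the 17 non-command-line samples summarized in Table 2.
kh_moves_cmdline = [5.1, 4.8, 6.2, 5.5, 7.0, 4.9, 5.8]
kh_moves_no_cmdline = [1.2, 0.9, 1.5, 1.1, 1.4, 1.3, 1.0]

t_stat, p_value = stats.ttest_ind(kh_moves_cmdline, kh_moves_no_cmdline, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```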
The Effect of KH Home Position

KH home      median # of moves (per minute)   median % of task time off home position
position     MH      KH                       MH      KH
off          6.03    5.21                     25%     28%
kb left      2.49    4.64                     13%     23%

Table 3. Number of hand moves and time off home position for samples where participants' KH had a home position of "off" versus "keyboard (kb) left." (N = 18 samples)

For 4 of the LP samples (LP8s1/1, LP4s1/2, LP4s2/2, LP5s1/2), participants' KH had a home position of "off." We compared these samples to all other LP samples to see if and how KH home position affects two-handed interaction. As shown in Table 3, samples where participants had a KH home position of off had more MH and KH movement, with the doubling in MH movement being the starkest. Their hands also spent more time off home position. This suggests that users' choice of home position influences their two-handed interaction patterns, though it's unclear to what extent that decision is consciously made.

Speed/efficiency

task completion time     median # of moves (per minute)
group (median time)      MH      KH      Total
slow (14.12 min)         6.03    4.64    10.67
fast (5.42 min)          1.49    6.98    8.47

Table 4. Number of hand moves for "fast" and "slow" task completion times. (N = 12 samples)

We used the time it took each LP to complete the lab tasks (task completion time) as a measure of efficiency, in order to see if efficiency affected two-handed interaction patterns. We looked at two equal-sized groups of samples: the "slow" group was chosen to include the 3 samples with the longest task completion times from both lab tasks (6 samples total), and the "fast" group included the 3 samples with the shortest completion times from both lab tasks (6 samples total). As shown in Table 4, samples in the fast group moved their KH, on average, almost 5 times as much as their MH (a significant difference, p = .05 using a paired t-test). Samples in the slow group had much more evenly distributed MH and KH movement. For 3/6 samples in the slow group, participants had a KH home position of off, again suggesting that KH home position impacts efficiency.

The idea of hands having a home position that minimizes the distances hands travel, as we discuss in this section, has been previously identified as a benefit of two-handed interaction (Buxton & Myers, 1986), but our data documents the degree to which this actually occurs in the kind of two-handed interaction we observed.

5.1.3 Asymmetric Two-handed Interaction

These results show that in current two-handed interaction with a keyboard and mouse the two hands are asymmetric in terms of input device use (mouse vs. keyboard), the regions they move to, and how much they move. This asymmetry in input device use and hand movement is different from the asymmetry in hand roles that Guiard describes (Guiard, 1987), but the fact that we found some kind of asymmetry suggests Guiard's model and principles may apply to this type of interaction.

5.2 Timelines: Visualizing Hand Movement Data

Our summary data analysis provided interesting insights into general two-handed interaction patterns, but it missed an important part of our definition of two-handed interaction: hand movement patterns over time. To visualize and compare participants' hand movements over the course of a sample, we created timeline graphs that plotted the regions participants' MH and KH were in at every half-second interval. The y-axis represented the 9 regions hands were in and the x-axis represented time. Figure 8 shows an example of a timeline for a sample from LNP1, which reveals that his MH rarely moved off the mouse while his KH made regular moves from keyboard left to keyboard right and off. Using this format, we printed out the timelines for all our samples and spread them out on a large table so we could visually inspect and compare them.

Figure 8. Example timeline from LNP1s1/3.

We expected we might see different patterns depending on where participants were in their task (e.g. beginning, middle, or end). Our visual inspection of the timelines didn't reveal any such noticeable changes in patterns within samples, so further analysis for this was not performed. What did stand out, however, was that there were large differences in two-handed interaction patterns between samples, and that these differences seemed to group. We quickly and informally sorted some of the samples into eight different groups of 1 to 4 samples each, based on the amount that each hand moved and what regions each hand moved to. Timelines and descriptions of the eight groups can be found in Appendix B. To continue exploring these groupings, we moved to a more systematic, numerical approach using cluster analysis.
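A timeline of this kind can be produced directly from the per-interval region sequences, for example with a step plot. The sketch below is purely illustrative and makes no claim about the plotting tools actually used to produce Figure 8.

```python
import matplotlib.pyplot as plt

REGIONS = ["mouse", "2nd mouse", "off", "numpad", "arrow keys",
           "function keys", "delete keys", "keyboard right", "keyboard left"]

def plot_timeline(mh_intervals, kh_intervals, step=0.5):
    """Step plot of the region each hand occupies at every half-second interval."""
    times = [i * step for i in range(len(mh_intervals))]
    as_indices = lambda seq: [REGIONS.index(r) for r in seq]
    plt.step(times, as_indices(mh_intervals), where="post", label="MH")
    plt.step(times, as_indices(kh_intervals), where="post", label="KH")
    plt.yticks(range(len(REGIONS)), REGIONS)  # one row per region, as in the printed timelines
    plt.xlabel("time (s)")
    plt.legend()
    plt.show()

# Toy data: the MH stays on the mouse while the KH alternates between keyboard left and right.
mh = ["mouse"] * 40
kh = (["keyboard left"] * 10 + ["keyboard right"] * 5) * 2 + ["off"] * 10
plot_timeline(mh, kh)
```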
5.3 Cluster Analysis

In order to perform cluster analysis on our data, we created transition matrices that encapsulated all three elements of two-handed interaction for which we collected data: (1) the configurations hands were in, (2) how much time was spent in each of these configurations, and (3) how often hands moved between these configurations. In addition to further exploring the groupings we saw with timelines, the matrices also allowed us to compare variance in some elements of two-handed interaction.

5.3.1 Transition Matrices

We had three types of transition matrices for each sample, representing patterns in MH movement alone, KH movement alone, and two-handed movement combined (referred to as MH matrices, KH matrices, and 2H matrices, respectively). The MH and KH matrices were 9x9, with one row and one column for each of the nine regions. To fill the matrices, for every half-second interval, we looked at the region the hand was in at that time (current region) and at one interval prior to that (previous region). Each cell's value was the average number of times (per unit time) a hand moved from the previous region (the row's value) to the current region (the column's value). To keep track of how much time hands spent in each region, we included times where the current region was equal to the previous region (i.e. when hands stayed in the same spot). Figure 9 shows the layout for the 9x9 matrices.

The 2H matrices were 81x81, with one row and one column for each two-handed configuration, represented as a pair of hand positions, e.g. [KH@KeyboardLeft MH@Mouse]. Of the 81 possible configurations, participants only ever moved to 31, which allowed us to consider only 2H matrices of size 31x31. Even the 31x31 matrices were sparse, as can be seen in Figure 10. 2H matrices were filled in the same way as MH/KH matrices.

Matrices had two types of cells: cells that represented changes in region/configuration (i.e. moves), and cells that represented staying in the same region/configuration (i.e. time spent in regions/configurations). The cells that represented time spent in regions/configurations fell along the diagonal; moves were the off-diagonal cells.

Figure 9. A 9x9 matrix that captures movement data for a single hand. We created two 9x9 matrices with this same layout for each sample - one for the MH and one for the KH. This matrix represents the average number of times (across all samples) that a participant's KH moved from any one region to any other region (including staying in the same region). Shading in the cells represents the frequency with which a move was made: the darker the shade, the more often participants' KH made that move.

Figure 10. A 31x31 matrix that captures movement data for two hands. We created a 31x31 matrix for each sample. This matrix represents the average number of times participants' two hands moved from any one configuration to any other configuration (including staying in the same configuration) - the darker the shade, the more often participants' hands made that move.

5.3.2 Comparing Variance in Hand Movement Patterns

We used the matrices to measure variance in hand movement patterns by calculating the cosine similarity of every pair of matrices, and computing the mean and statistical variance of the resulting similarity scores. We did these calculations for just moves (by excluding the diagonal), just time spent in configurations (by excluding the off-diagonal), and a combination of both, for all MH, KH, and 2H matrices. For these calculations and our subsequent cluster analysis, we treated the matrices as one-dimensional vectors by appending one row after the next.

For the MH, we didn't see a lot of differences across samples (the average similarity score was high and the variance was low). For the KH, we saw much less similarity and a lot more variation. This tells us that the two hands were very asymmetric in terms of their movement patterns. Looking at both hands together showed even less similarity and more variation.

Both time spent in configurations and moves had high variation, but for the same pair of matrices the two measures could indicate different levels of similarity. For example, a low cosine similarity between two specific samples based on time spent in configurations didn't mean the same two samples would have a low cosine similarity based on moves (and vice versa). To increase the variance in cosine similarity, we combined moves and time spent in configurations (i.e. included the full 2H matrices). We started by weighting these components equally, but because values for time spent in configurations were naturally larger than values for moves, we then tried gradually increasing the weight assigned to moves to better balance each component. We found that variance peaked when cells representing moves were assigned a weight of nine.
5.3.3 Cluster Analysis

To further explore the groupings in samples we had found from inspecting the timelines, we ran a cluster analysis on 39 cases (representing all 39 samples). Since we found that including both hands as well as both moves and time spent in configurations (with moves assigned a weight of nine) had the largest variation, we used these vectors for our cluster analysis. Following a common approach to cluster analysis (Burns & Burns, 2008), a hierarchical cluster analysis using Ward's method was run and showed the cases split into four main clusters; a k-means analysis (using running means) was then conducted to determine the four clusters. Cluster analysis provides groupings but does not give meanings or labels to the groupings. We have given labels to each cluster to reflect the trends that we saw within it.

Cluster 1: High KH and MH movement - The first cluster included 7 samples (LP1s1/2, LP7s1/1, LP8s1/1, LP4s1/2, LP4s2/2, LP5s1/2, LP5s2/2) performed by 5 different participants. All 4 samples where the participant had a KH home position of "off" fell into this cluster. Six of the 7 samples were in the bottom (slow) half in terms of completion time for their task. In these samples participants moved both their hands a lot (Mdn MH moves/min: 6.52, Mdn KH moves/min: 4.64) as compared to the median from all samples (MH 2.49, KH 2.07).

Cluster 2: High KH movement - The second cluster included 5 samples (LNP1s1/3, LNP1s2/3, LNP1s3/3, LP6s2/2, LNP2s1/2) performed by 3 different participants. LNP1 had the fastest task completion time for both lab tasks. LP11 had the second fastest task completion time for the second lab task. Participants' MH barely moved (Mdn MH moves/min: 0.2 vs. 2.49 for all samples) while their KH moved a lot (8.93 vs. 2.07 for all samples).

Cluster 3: Extraordinarily high KH movement - Cluster 3 included 1 sample (NP3s1/1) from 1 participant who was clearly in his own category. Similar to the samples in cluster 2, for this sample the participant moved his KH a lot and his MH very little. However, his KH movements (24.73 moves/min) far outnumbered even those in cluster 2 (Mdn moves/min: 8.93). While there isn't a task completion time for this task, during observations it was noted that this participant seemed incredibly fast at completing his task.

Cluster 4: Low/moderate KH and MH movement - The fourth cluster included all 26 remaining samples performed by 23 different participants. Samples in this cluster had low to moderate MH and KH movement.

These clusters support our findings from our initial analysis that KH home position and speed/efficiency are strong factors in the differences that appear in two-handed interaction. We believe that given data from new samples, these four clusters would still stand, and that new samples could easily be classified into one of them. Gathering and analyzing new data could also help further define the largest cluster (cluster 4).

Tasks and Individual Differences

Another question we probed with cluster analysis was whether individuals' two-handed interaction patterns differed across tasks. There were 8 participants (LP1 - LP6, LNP2, NP1) who had samples from two tasks and 1 participant (LNP1) who had samples from three tasks. For 6 of those 9 participants (including the participant with three samples), their samples ended up in the same cluster.
For the 3 participants whose samples ended up in different clusters, changes in the number of hand movements between samples were between low and moderate or between moderate and high. No participant jumped from having very low movement of one hand to very high movement of the same hand (or vice versa). This means that across different tasks individuals tended to have similar two-handed interaction patterns.

We further explored these individual differences across tasks using cosine similarity. We calculated the cosine similarity for each of the 19 samples from these 9 participants with all other samples. For 8 of the 19 samples, the sample with the highest cosine similarity was the sample from the same participant (i.e. the samples that were most similar were the ones performed by the same individual). For 16 of the 19 samples, the sample from the same participant was in the top 25% in terms of cosine similarity scores (i.e. samples performed by the same individual were more similar to each other than to at least 75% of all other samples). Cosine similarity thus shows that samples from the same individual aligned very well with each other, but not perfectly.

5.4 Interview Analysis

To hear participants' perspective on their hand movement, we performed semi-structured interviews with all 37 participants. We asked participants to reflect on why they move their hands, and whether they mind having to do so. We transcribed, coded, and performed thematic analysis on all interviews. Here we report on the two findings that suggest the biggest implications for design. We also briefly summarize the reasons participants mentioned for moving their hands, which encompassed the largest portion of findings from our interview analysis.

5.4.1 Do Participants Mind Moving Their Hands?

No participant mentioned that they didn't like moving their KH across the keyboard. The 6 participants who were explicitly asked "Do you mind moving your [keyboard] hand across the keyboard?" replied "No." Of the 20 participants who discussed moving their MH off of the mouse, 3/20 said they minded having to move it, 11/20 said they didn't mind, 5/20 said they didn't mind but that not having to move their MH would be better, and 1/20 said he sometimes minded moving his MH.

5.4.2 Health Concerns

During interviews, when discussing their hand movement, 4 participants (LP4, LP9, NP13, NP12) mentioned health concerns.

Health concerns affected how or how often participants moved their hands. 3 participants mentioned that they like moving their hands because they think it's healthier, and 1 participant said he moves his hands to avoid behavior he felt was unhealthy, such as stretching his fingers out.

Health concerns also influenced input device choice, which can in turn influence hand movement. 4 participants (overlapping with 2 who raised health concerns) used an ergonomic mouse, which didn't seem to affect their two-handed interaction behavior. 1 participant (LNP3) used an ergonomic keyboard (split down the middle), which did affect behavior. He mentioned it made it harder for him to move his KH around the keyboard. Indeed, his number of KH moves (0.4 moves/min) was much lower than the overall median (2.07 moves/min), especially for an AutoCAD user, and his number of MH moves (3.47 moves/min) was higher than the overall median (2.49 moves/min).
5.4.3 Factors that Influence Hand Movement

Reasons participants mentioned for moving their hands included factors that directly affected their hand movements, as well as factors that determined whether they performed an action using the keyboard or mouse, which, in turn, influenced whether or not they had to move their hands. The factors they mentioned were: the state of the other hand (e.g. whether the other hand was busy or further away from a key), keyboard layout, speed (e.g. whether using the mouse or keyboard would be faster for a particular action), familiarity with the application, how options are made accessible in the application, force of habit, and affordances of input devices. Interestingly, one participant mentioned she moved her MH because pressing keys with the MH can seem more final: "That [pressing 'Enter' with MH] seems like more of a finality" (LP3).

5.5 Reflection on Methods Used

Our initial descriptive data analysis identified at a high level what was similar and what was different in two-handed interaction patterns across participants. The visual timelines didn't reveal changes in patterns over time as we thought they might, but they led us to the important insight that differences in patterns could be grouped into a small number of sets based on how often hands moved and where they moved. The transition matrices and cluster analysis enabled us to more precisely explore groupings. While eight groups came out of the quick informal visual inspection of timelines, only four groups emerged from the cluster analysis. These techniques are very different, and more data will be needed to reconcile their outcomes. In addition, the matrices were a compact way to encapsulate all our data in a way that showed the key components in the two-handed interaction patterns (e.g. KH patterns, MH patterns, home position, moves, time in configurations) and allowed us to separate or emphasize each of those components (e.g. by weighting the non-diagonal cells or by collapsing to a 9x9 matrix for just the KH or MH).

Chapter 6: Discussion

This chapter summarizes our findings and proposes a model of two-handed interaction with a keyboard and pointing device. It also discusses some implications for design that arise from our findings.

6.1 A Model of Two-handed Interaction

Based on our research we propose a descriptive model of two-handed interaction using design applications with a keyboard and pointing device. The model captures those patterns that occurred broadly across our users and thus, we anticipate, will generalize beyond the data we collected. Users use two input devices with design applications, a keyboard and a pointing device (usually a mouse), with the mouse placed to the right of the keyboard. One of their hands primarily operates the mouse (MH) and the other primarily the keyboard (KH). Each hand's position can be in one of 9 regions at any given time. The combination of the KH and MH positions makes a configuration. Each hand has a home position, which is the region in which that hand spends the majority of its time. In terms of coverage of regions, the KH predominantly covers keyboard left, keyboard right, and off, and the MH predominantly covers the mouse, keyboard right, and off. Hands do not cross into each other's home regions. The only region they both cover is keyboard right - "no hand's land".
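One concrete, purely illustrative way to hold these components in code is sketched below (in Python). The field names, types, and the row-normalized probability interpretation are our rendering, chosen to mirror the general four-tuple representation given next; they are not part of the model's definition.

from dataclasses import dataclass
import numpy as np

@dataclass
class TwoHandedModel:
    """Illustrative container for the (P, homeK, homeM, T) model.

    regions:      the set P of regions a hand can occupy (|P| = 9 here).
    home_kh:      home position of the keyboard hand (e.g. "keyboard_left").
    home_mh:      home position of the mouse hand (e.g. "mouse").
    transitions:  a |P|^2 x |P|^2 matrix over configurations (KH, MH pairs),
                  assumed row-normalized so entries read as probabilities.
    """
    regions: tuple
    home_kh: str
    home_mh: str
    transitions: np.ndarray

    def config_index(self, kh_region, mh_region):
        """Map a (KH, MH) configuration to its row/column index."""
        i = self.regions.index(kh_region)
        j = self.regions.index(mh_region)
        return i * len(self.regions) + j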
A general way to represent our model would be a four-tuple (P, homeK, homeM, T), where P is the set of regions that either hand could be in (in our case 9), homeK is the home position for the KH, homeM is the home position for the MH, and T : (P×P)² → R is a transition matrix of size |P|²×|P|² whose rows and columns correspond to all possible configurations and whose real-valued entries give the probability of the KH and MH moving from one configuration to another (including staying in the same configuration).

6.2 What Influences Two-handed Interaction

We used a mixed-methods and mixed-analytic approach to address our research questions. Given the inherent strengths and weaknesses of each method, and the inherent limitations of the data we collected, we are only partially able to tease apart the influential factors.

We didn't have any samples from the same participant using different applications, limiting our exploration of application as a factor. We did, however, find that whether or not an application has a command line does impact patterns. Additionally, participants mentioned that how options are made accessible within an application influenced their hand movements.

Our cluster analysis best addressed the impact of task and user. We found that users generally stayed in the same cluster across tasks, and at the very least never had extreme differences in the number of movements for each hand across tasks. By contrast, we found that samples of the same task (lab task 1 and lab task 2 across participants) spanned multiple clusters. This suggests that task is a weak determinant of two-handed interaction patterns. Overall, this suggests that individual differences across users are the larger determinant. Users make choices like where to keep their hands homed, whether to use the mouse or the keyboard for certain actions, and which hand to move to a region, all of which influence their two-handed interaction. It remains unclear, however, to what extent users are making these decisions consciously.

6.3 Value of Our Model of Two-handed Interaction

The context of two-handed interaction is very broad, and we chose to focus our inquiry specifically on the movement of the two hands relative to the input devices. For our analysis, we further narrowed our lens by selecting three fundamental components to code: (1) the configurations hands were in, (2) how much time was spent in each of these configurations, and (3) how often hands moved between these configurations. Our results shed light onto the aspects of these components that were most valuable.

KH movement patterns distinguished one sample from the next better than MH movements. Similarly, moves between configurations distinguished samples better than time spent in configurations.

Home positions - While every participant had a home position, a participant's chosen home position influenced how often they had to move their hands, which impacted efficiency.

Limited number of configurations - While there were 81 possible configurations, only a much smaller subset of 31 were observed. Some configurations are physically challenging or uncomfortable (e.g. MH on keyboard left, KH on numpad) and are unlikely to ever be observed. Other configurations were not observed simply because our participants' MH and/or KH never moved to certain regions (e.g. KH on numpad). It is not physically challenging or uncomfortable for hands to move to those regions, so those configurations might be observed with different tasks or applications.
Distinction of the two hands as keyboard vs. mouse hand - We had originally distinguished the two hands as dominant/non-dominant (following Guiard (1987)), but that distinction was the same as the mouse/keyboard hand distinction for all but one participant, who used the mouse with his non-dominant hand (a left-handed person who had adapted to work like a right-handed person). That one participant highlighted a deficiency in the dominant/non-dominant distinction. Given that other exceptions exist (e.g., ambidextrous persons, although not in our sample), the keyboard/mouse hand distinction provides a better representation of what participants were doing with each hand.

6.4 Implications for Design

The understanding of two-handed interaction gained from applying our model has implications for the design of input devices and software.

Designing for efficiency. One commonly chosen design criterion is efficiency: designing interactions so that users can do more in less time. Our results suggest that having a KH home position on the keyboard, low MH movement, and a highly active KH are related to high efficiency. Efficiency could be increased by having the MH move less as a result of making more actions accessible on the keyboard by the KH alone, which is consistent with efforts such as the Microsoft Office Keyboard (McLoone et al., 2003). Our participants' MH most commonly moved to the numpad, arrow keys, and keyboard right. Participants specifically reported that they needed two hands on the keyboard to press multiple keys that were spread out. To address this, the keyboard could be physically rearranged, for example by moving the numpad to the left side of the keyboard's main area, and software key mappings could be changed so that keys usually pressed by the MH are remapped to keys closer to the KH, thereby minimizing key press combinations that require two hands (for example, common shortcuts such as Ctrl-P).

Designing for health. Another criterion for design is health. Efficiency isn't everything, and in fact some of our participants reported just that. They had health concerns and felt there was a tradeoff between speed and health. Participants thought that increased hand movement (of the "major" sort we studied) was healthier, but to our knowledge there is no supporting research. By contrast, the field of ergonomics has extensive publications about health in computer usage in terms of posture (Gerr, Marcus, & Monteilh, 2004) and at the level of finger and wrist ("minor") movements (Thomsen, Gerr, & Atroshi, 2008). More research is required to determine a healthy level of "major" hand movement and to incorporate that into design.

Designing adaptive interfaces. Systems could incorporate two-handed interaction patterns into models of the user. For example, we observed that when a participant's hands were in the "off" region (excluding participants whose KH home position was "off"), the participant was often referring to external resources like manuals, suggesting that the off position may be related to competence in the task. This information could be used by an adaptive interface to provide contextual assistance to a user who is struggling.

Chapter 7: Conclusion

This final chapter summarizes the contributions and limitations of this research, and suggests several ideas for future work.

7.1 Summary of Contributions

Two-handed interaction with a keyboard and pointing device is not one-size-fits-all: there are individual differences, but there is also some predictability.
Certain patterns were found across all individuals, including having a home position for each hand, having hands that don't cross into each other's home position, having one hand that primarily operates the mouse, and having one hand that solely uses the keyboard. While some of these patterns have been found in previous research (Buxton & Myers, 1986), we've provided quantitative evidence on the extent to which they appear in current interaction with a keyboard and pointing device. Additionally, we found that certain factors (speed and KH home position) influence two-handed interaction patterns and that the user (as a factor) is a better predictor of interaction patterns than the tasks they do or the applications they use. Finally, we found that two-handed interaction patterns cluster. Our cluster analysis suggests that if we know enough about an individual's behavior to fit them into a cluster (like the amount of hand movement), we might be able to predict other behavior (like their KH home position). Altogether, this research shows there is predictability in two-handed interaction.

7.2 Limitations

Limitations arose both as a result of observing in naturalistic settings and of relying on video coding.

7.2.1 Limitations of Observing in Naturalistic Settings

The observational studies we conducted in naturalistic settings gave us great insight into current two-handed interaction in participants' everyday work; however, they made one-to-one comparison of the samples challenging. Finding companies who would allow a researcher to come into their office and observe their employees performing actual work tasks was also difficult. In the end, we found a few companies through professional and personal contacts who let us observe their employees, though for all of the companies we needed to sign NDAs and we were not allowed to record participants' screens. Despite working with a total of 37 participants, we only had more than one task for 9 participants, only one of whom was an NP. Except for that one participant, each NP performed only one task while we were observing. LPs did each perform multiple tasks, but we were only able to code multiple samples from 8 of them due to technical errors in video recording and to the large time investment required to video code multiple tasks. We discuss further limitations of our video coding in the next section.

7.2.2 Limitations of Our Video Coding

For our observations of participants we wanted to be as unobtrusive as possible, so we limited our video equipment setup to two small standalone video cameras with only one researcher to conduct the observation. No software was installed on participants' computers, and nothing was added to the participants' input devices or workspace. As a result, we had to rely on video coding to gather most of the quantitative data, which was an extremely time-consuming process. It took an average of approximately 2 hours to code 15 minutes of video. This was partially because, in addition to coding the time and region for every hand movement, we also coded what keys were pressed when a hand moved somewhere other than its home region. We ended up not using the keys-pressed data because of time constraints and because it was incomplete (we couldn't always see what keys were pressed). Only coding hand movements to regions other than the hand's home position saved us some time. We could easily assume that a hand was in its home position for any time where a movement hadn't been coded.
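As an illustration of that time-saving assumption, the sketch below (hypothetical Python, with invented parameter names) expands a sparse list of coded movements into the full half-second timeline by carrying the last coded region forward and assuming the home position before the first coded movement.

def expand_to_timeline(moves, home_region, n_intervals):
    """Reconstruct a half-second timeline from sparsely coded movements.

    moves: list of (interval_index, region) pairs coded from video, recorded
           only when a hand moved (including moves back to its home region).
    home_region: the hand's home position, assumed for all intervals before
                 the first coded movement.
    n_intervals: total number of half-second intervals in the sample.
    Returns a list with one region per interval.
    """
    timeline = []
    current = home_region
    coded = dict(moves)
    for t in range(n_intervals):
        if t in coded:
            current = coded[t]
        timeline.append(current)
    return timeline

# Example: a hand that leaves its mouse home at interval 10 and returns at 14.
# expand_to_timeline([(10, "keyboard_right"), (14, "mouse")], "mouse", 20)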
We tried automating hand movement tracking using ARToolKit ("ARToolKit," n.d.), but that required attaching paper markers to participants' hands, which made some participants uncomfortable. Additionally, we often couldn't get the video camera at a high enough angle to keep the markers fully visible as participants moved their hands. In the future, hand tracking in videos could be automated using blob tracking/image classifiers and tools like OpenCV ("OpenCV," n.d.) or Crayons (Fails & Olsen, 2003), which would significantly speed up the data collection process while still remaining as unobtrusive as possible. Logging mouse clicks and key presses would add an interesting layer to the data analysis, but would require installing software on participants' computers, which can be difficult to arrange, especially in the workplace where privacy and security are often significant concerns.

7.3 Future Work

Future work includes iterating on our model. Gathering data from additional participants would allow us to see how well they fit into our existing model and to adjust the model accordingly. We want to more deeply explore the factors that influence two-handed interaction, such as application and task, in order to better quantify their influence. Running lab experiments would address this challenge and would allow us to control for confounding factors. Finally, we need to assess the generalizability of our model and the analytic approach we used to develop it. We had a limited number of non-mouse pointing devices in our study, so although we did not see an impact of device type, further investigation may be needed to see if our model extends to other pointing devices. We suspect our model will apply to other kinds of applications that rely heavily on both the keyboard and the mouse, such as productivity applications, as well as to other domains, such as PC gaming, but more work is required to know for certain.

References

ARToolKit. (n.d.). Retrieved September 29, 2013, from http://www.hitl.washington.edu/artoolkit/

Balakrishnan, R., & Hinckley, K. (2000). Symmetric bimanual interaction. In Proceedings of the SIGCHI conference on Human factors in computing systems - CHI '00 (pp. 33-40). New York, New York, USA: ACM Press. doi:10.1145/332040.332404

Balakrishnan, R., & Patel, P. (1998). The PadMouse: facilitating selection and spatial positioning for the non-dominant hand. In Proceedings of the SIGCHI conference on Human factors in computing systems - CHI '98 (pp. 9-16). New York, New York, USA: ACM Press. doi:10.1145/274644.274646

Bier, E. A., Stone, M. C., Pier, K., Buxton, W., & DeRose, T. D. (1993). Toolglass and magic lenses. In Proceedings of the 20th annual conference on Computer graphics and interactive techniques - SIGGRAPH '93 (pp. 73-80). New York, New York, USA: ACM Press. doi:10.1145/166117.166126

Burns, R., & Burns, R. (2008). Chapter 23 - Cluster Analysis. In Business Research Methods and Statistics using SPSS. Retrieved September 29, 2013, from http://www.uk.sagepub.com/burns/chapters.htm

Buxton, W. (1990). A three-state model of graphical input. In Proceedings of the IFIP TC13 Third International Conference on Human-Computer Interaction - INTERACT '90 (pp. 449-456). North-Holland Publishing Co.

Buxton, W., & Myers, B. (1986). A study in two-handed input. ACM SIGCHI Bulletin, 17(4), 321-326.

Card, S., Moran, T. P., & Newell, A. (1983). The Psychology of Human Computer Interaction. Lawrence Erlbaum Associates.

Casalta, D., Guiard, Y., & Beaudouin-Lafon, M. (1999).
Evaluating two-handed input techniques: rectangle editing and navigation. In CHI  ?99 Extended Abstracts on Human Factors in Computing Systems - CHI EA '99 (pp. 236?237). doi:10.1145/632716.632862 38  Cutler, L. D., Fr?lich, B., & Hanrahan, P. (1997). Two-handed direct manipulation on the responsive workbench. In Proceedings of the 1997 symposium on Interactive 3D graphics - SI3D  ?97 (pp. 107?114). New York, New York, USA: ACM Press. doi:10.1145/253284.253315 Fails, J., & Olsen, D. (2003). A design tool for camera-based interaction. In Proceedings of the conference on Human factors in computing systems - CHI  ?03 (pp. 449?456). New York, New York, USA: ACM Press. doi:10.1145/642611.642690 Gerr, F., Marcus, M., & Monteilh, C. (2004). Epidemiology of musculoskeletal disorders among computer users: lesson learned from the role of posture and keyboard use. Journal of Electromyography and Kinesiology, 14(1), 25?31. doi:10.1016/j.jelekin.2003.09.014 Gray, W. D., John, B. E., & Atwood, M. E. (1992). The precis of Project Ernestine or an overview of a validation of GOMS. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI  ?92 (pp. 307?312). doi:10.1145/142750.142821 Guiard, Y. (1987). Asymmetric division of labor in human skilled bimanual action: The kinematic chain as a model. Journal of Motor Behavior, 19, 486?517. Hinckley, K., Czerwinski, M., & Sinclair, M. (1998). Interaction and modeling techniques for desktop two-handed input. In Proceedings of the 11th annual ACM symposium on User interface software and technology - UIST  ?98 (pp. 49?58). doi:10.1145/288392.288572 Hinckley, K., Pausch, R., Proffitt, D., & Kassell, N. F. (1998). Two-handed virtual manipulation. ACM Transactions on Computer-Human Interaction (TOCHI), 5(3), 260?302. doi:10.1145/292834.292849 Kabbash, P., Buxton, W., & Sellen, A. (1994). Two-handed input in a compound task. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI  ?94 (pp. 417?423). doi:10.1145/191666.191808 Kurtenbach, G., Fitzmaurice, G., Baudel, T., & Buxton, B. (1997). The design of a GUI paradigm based on tablets, two-hands, and transparency. In Proceedings of the ACM SIGCHI Conference on Human factors in computing systems - CHI  ?97 (pp. 35?42). doi:10.1145/258549.258574 Latulipe, C. (2006). A Symmetric Interaction Model for Bimanual Input. PhD thesis, The University of Waterloo. Retrieved September 9, 2013, from http://hdl.handle.net/10012/2915 Leganchuk, A., Zhai, S., & Buxton, W. (1998). Manual and cognitive benefits of two-handed input: an experimental study. ACM Transactions on Computer-Human Interaction (TOCHI), 5(4), 326?359. doi:10.1145/300520.300522 MacKenzie, I. S., & Guiard, Y. (2001). The two-handed desktop interface. In CHI  ?01 extended abstracts on Human factors in computing systems - CHI EA '01 (pp. 351?352). New York, New York, USA: ACM Press. doi:10.1145/634067.634275 39  Malik, S., & Laszlo, J. (2004). Visual Touchpad: A Two-Handed Gestural Input Device. In Proceedings of the 6th international conference on Multimodal interfaces - ICMI  ?04 (pp. 289?296). New York, New York, USA: ACM Press. doi:10.1145/1027933.1027980 McLoone, H., Hinckley, K., & Cutrell, E. (2003). Bimanual Interaction on the Microsoft Office Keyboard. In Human-Computer Interaction INTERACT?03 (pp. 49?56). Myers, B. A., Lie, K. P., & Yang, B.-C. (2000). Two-handed input using a PDA and a mouse. In Proceedings of the SIGCHI conference on Human factors in computing systems - CHI  ?00 (pp. 41?48). 
New York, New York, USA: ACM Press. doi:10.1145/332040.332405 Odell, D. L., Davis, R. C., Smith, A., & Wright, P. K. (2004). Toolglasses, marking menus, and hotkeys: a comparison of one and two-handed command selection techniques. In Proceedings of Graphics Interface - GI  ?04 (pp. 17?24). OpenCV. (n.d.). Retrieved October 01, 2013, from http://opencv.org/ Owen, R., Kurtenbach, G., Fitzmaurice, G., Baudel, T., & Buxton, B. (2005). When it gets more difficult, use both hands: exploring bimanual curve manipulation. In Proceedings of Graphics Interface - GI  ?05 (pp. 17?24). Shaw, C., & Green, M. (1997). THRED: a two-handed design system. Multimedia Systems, 5(2), 126?139. doi:10.1007/s005300050048 Thomsen, J. F., Gerr, F., & Atroshi, I. (2008). Carpal tunnel syndrome and the use of computer mouse and keyboard: a systematic review. BMC musculoskeletal disorders, 9(1), 134. doi:10.1186/1471-2474-9-134 Wagner, J., Huot, S., & Mackay, W. E. (2012). Bitouch and bipad: Designing bimanual interaction for hand-held tablets. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI  ?12 (pp. 2317?2326). doi:10.1145/2207676.2208391 Yee, K.-P. (2004). Two-handed interaction on a tablet display. In Extended abstracts of the 2004 conference on Human factors and computing systems - CHI EA  ?04 (pp. 1493?1496). New York, New York, USA: ACM Press. doi:10.1145/985921.986098 Zeleznik, R. C., Forsberg, A. S., & Strauss, P. S. (1997). Two pointer input for 3D interaction. In Proceedings of the 1997 symposium on Interactive 3D graphics - SI3D  ?97 (pp. 115?120). New York, New York, USA: ACM Press. doi:10.1145/253284.253316  40  Appendices  The following appendices contain additional information about our participants (Appendix A), some example timelines (Appendix B), and the materials we used for our observational study and interviews (Appendix C).41  Appendix A  - Participant Information Participant)ID) Participated)as)NP?) Participated)as)LP?) Female?) Lefty?) #)of)NP)samples) #)of)LP)samples) Application)used) Used)Command)Line?) Keyboard)used) Pointing)device)used)NP1$ yes$ $$ yes$ $$ 2$ $$ InDesign$ $$ Laptop$keyboard$ Mouse$NP2$ yes$ $$ $$ $$ 1$ $$ Illustrator$ $$ Laptop$keyboard$ Trackpad$NP3$ yes$ $$ $$ yes$ 1$ $$ General$CADD$Pro$ yes$ "Regular"$full$keyboard$ Mouse$NP4$ yes$ $$ $$ $$ 1$ $$ Maya$ $$ "Regular"$full$keyboard$ Mouse$NP5$ yes$ $$ $$ $$ 1$ $$ Maya$ $$ "Regular"$full$keyboard$ Mouse$NP6$ yes$ $$ $$ $$ 1$ $$ Unity$ $$ "Regular"$full$keyboard$ Mouse$NP7$ yes$ $$ yes$ $$ 1$ $$ Illustrator$ $$ "Regular"$full$keyboard$ Mouse$NP8$ yes$ $$ yes$ $$ 1$ $$ Photoshop$ $$ Laptop$keyboard$ Mouse$+$Tablet$with$stylus$NP9$ yes$ $$ yes$ $$ 1$ $$ SketchUp$ $$ "Regular"$full$keyboard$ Mouse$NP10$ yes$ $$ $$ $$ 1$ $$ Vectorworks,$Photoshop$ $$ "Regular"$full$keyboard$ Mouse$NP11$ yes$ $$ $$ $$ 1$ $$ Vectorworks$ $$ "Regular"$full$keyboard$ Mouse$NP12$ yes$ $$ $$ $$ 1$ $$ Vectorworks$ $$ "Regular"$full$keyboard$ Mouse$NP13$ yes$ $$ $$ $$ 1$ $$ Maya$ $$ "Regular"$full$keyboard$ Ergonomic$mouse$NP14$ yes$ $$ $$ yes$ 1$ $$ Silhouette$ $$ "Regular"$full$keyboard$ Tablet$with$stylus$NP15$ yes$ $$ $$ $$ 1$ $$ Nuke,$Houdini$ $$ "Regular"$full$keyboard$ Mouse$NP16$ yes$ $$ yes$ $$ 1$ $$ Nuke$ $$ "Regular"$full$keyboard$ Mouse$NP17$ yes$ $$ $$ $$ 1$ $$ Maya$ $$ "Regular"$full$keyboard$ Mouse$LP1$ $$ yes$ yes$ $$ $$ 2$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP2$ $$ yes$ $$ $$ $$ 2$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$42  Participant)ID) Participated)as)NP?) Participated)as)LP?) 
Female?) Lefty?) #)of)NP)samples) #)of)LP)samples) Application)used) Used)Command)Line?) Keyboard)used) Pointing)device)used)LP3$ $$ yes$ yes$ $$ $$ 2$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP4$ $$ yes$ $$ $$ $$ 2$ AutoCAD$ yes$ "Regular"$full$keyboard$ Ergonomic$mouse$LP5$ $$ yes$ $$ $$ $$ 2$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP6$ $$ yes$ $$ $$ $$ 2$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP7$ $$ yes$ yes$ $$ $$ 1$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP8$ $$ yes$ $$ $$ $$ 1$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP9$ $$ yes$ $$ $$ $$ 1$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP10$ $$ yes$ $$ $$ $$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP11$ $$ yes$ yes$ $$ $$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP12$ $$ yes$ $$ $$ $$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP13$ $$ yes$ $$ $$ $$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP14$ $$ yes$ $$ $$ $$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP15$ $$ yes$ yes$ $$ $$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP16$ $$ yes$ yes$ yes$ $$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LP17$ $$ yes$ yes$ $$ $$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LNP1$ yes$ yes$ $$ $$ 1$ 2$ AutoCAD$ yes$ "Regular"$full$keyboard$ Mouse$LNP2$ yes$ yes$ $$ $$ 1$ 1$ AutoCAD$ yes$ "Regular"$full$keyboard$ Ergonomic$mouse$LNP3$ yes$ yes$ $$ $$ 1$ $$ AutoCAD$ yes$ "Regular"$full$keyboard$/$Ergonomic$full$keyboard$ Mouse$Totals:) 20) 20) 12) 3) 21) 18) )) 21) )) ))    43  Appendix B  - Timeline Groupings Below are the timelines from each of the 8 groupings we identified from initial visual inspection of the timelines. The y-axis represents the 9 regions hands were in and the x-axis represents time. Blue = MH and red = KH.  Group 1: In order ? NP6s1/1, NP14s1/1, NP16s1/1. These timelines show a very low amount of movement of both hands.    Group 2: In order ? LP3s2/2, LP2s2/2. These timelines show a low amount of KH movement to off and MH movement to keyboard right.    44  Group 3: In order ? NP2s1/1, LNP3s1/1. These timelines show a low amount of KH movement to keyboard right and off, and a high amount of MH movement to keyboard right.   Group 4: In order ? LNP1s1/3, LNP1s2/3, LP6s2/2. These timelines show a high amount of KH movement to keyboard right and a low amount of MH movement to keyboard right.    Group 5: In order ? LNP2s2/2, LNP1s3/3. These timelines show a low amount of MH movement and a high amount of KH movement to keyboard right and off.   45  Group 6: NP3s1/1. This timeline shows a high amount of MH movement to keyboard right and a very high amount of KH movement to keyboard right.  Group 7: In order ? LP5s2/2, LP5s1/2, LP4s1/2, LP4s2/2. These timelines show a high amount of KH movement between off and keyboard left, and a high amount of MH movement to keyboard right.     Group 8: In order ? NP1s2/2, NP9s1/1. These timelines show a moderate amount of MH movement to keyboard right and a low amount of KH movement to off and keyboard right.  46  Appendix C  - Study Materials C.1 Participant Recruitment Email (Naturalistic Participants)  We are currently recruiting participants for a human-computer interaction study on interaction techniques. Please review the information below and email xxxxxx if you would like to participate.  Purpose: To investigate the use of two hands with computer applications.  
Procedure: For this study, you will be asked to perform one or two tasks that are part of your usual work in order to demonstrate how you normally use a computer application. You will then be asked to answer a few questions and explain some of what you did while performing the task(s). Video will be recorded while you are performing the task(s).  Eligibility: Any person 19 years of age or older, with a firm grasp of written and spoken English, without any physical or mental impairments, who regularly use computers.  Interested people will be sent an initial questionnaire. Participants will be screened based on computer experience and the type of computer applications they use.  Commitment: The interview will last for one to two hours. Participants will receive a $25 gift certificate as a token of our appreciation.  Location: The study will take place at your usual work location.  To participate, email: xxxxxx     47  C.2 Screening Questionnaire   General Computer Usage 1. How many years have you been using a computer? __ less than 5 years      __ 5 - 10 years      __ 10 - 15 years      __ more than 15 years  2. How often do you actively use a computer on an average day? __ ~25% of the day or less      __ ~50% of the day      __ ~75% of the day      __ ~90% of the day  3. How would you rate your computer expertise?  __ Novice      __ Intermediate      __ Expert  Expert/Frequent Usage 4. Please mark how often you use the following types of applications.... Word processors:  __ never      __ seldom      __occasionally      __ frequently Presentation programs:       __ never      __ seldom      __occasionally      __ frequently Spreadsheets:        __ never      __ seldom      __occasionally      __ frequently Email:         __ never      __ seldom      __occasionally      __ frequently Browsers:         __ never      __ seldom      __occasionally      __ frequently Video games:        __ never      __ seldom      __occasionally      __ frequently  5. Other than the kinds of applications listed above, what applications do you use frequently? Please list the application(s) you use most frequently (maximum 3), particularly any domain specific applications (e.g. photo/video/audio editing programs, image manipulation programs, CAD tools, etc.).      48   Please answer the following five questions for each of the applications you listed above?  Application 1:  Application 2: Application 3: 6. How frequently do you use the application? __ several times a day __ once a day __ several times a week __ once a week __ several times a month __ once a month __ several times a year __ several times a day __ once a day __ several times a week __ once a week __ several times a month __ once a month __ several times a year __ several times a day __ once a day __ several times a week __ once a week __ several times a month __ once a month __ several times a year 7. When was the last time you used the application?    8. What is the typical amount of time you use the application for?    9. What input devices do you use with the application? __ mouse __ keyboard __ stylus __ other: ____________________  __ other: ____________________  __ mouse __ keyboard __ stylus __ other: ____________________  __ other: ____________________  __ mouse __ keyboard __ stylus __ other: ____________________  __ other: ____________________  10. Have you done/learned anything to make yourself more efficient and/or faster when using the application (learning shortcuts, creating macros, etc.)? If yes, what?    
49  C.3 Consent Forms Participant Consent Form   Department of Computer Science  University of British Columbia Vancouver, BC, V6T 1Z4   Consent Form TWO-HANDED INTERACTIONS WITH GRAPHICAL APPLICATIONS Principal Investigator: Dr. Joanna McGrenere, Associate Professor in the Department of Computer Science at the University of British Columbia, phone: xxx-xxx-xxxx  Co-Investigators: Juliette Link, MSc student in the Department of Computer Science at the University of British Columbia, phone: xxx-xxx-xxxx Dr. Kellogg S. Booth, Professor in the Department of Computer Science at the University of British Columbia, phone: xxx-xxx-xxxx Dr. Michael Terry, Associate Professor at the Cheriton School of Computer Science at the University of Waterloo, phone: xxx-xxx-xxxx  Study Purpose and Procedures: ____________________ (Senior Manager?s name) has given permission for this project  to be conducted at ____________________ (name of organization).  This study investigates the use of two hands with computer applications. You will be asked to perform one or two tasks that are part of your usual work in order to demonstrate how you normally use computer applications. You will then be asked to answer a few questions and explain some of what you did while performing the task(s).   Portions of the research will be used for the UBC MSc research of Juliette Link. MSc theses are publicly available.  Option for Videotaping: We would like to videotape your hands and your computer screen as you complete your task(s) during the study for the purposes of data analysis and presenting our results in academic presentations and publication of research papers. Your face will not be videotaped. Please note that this is an optional procedure, which you are free to decline. Your refusal to be videotaped T he  U N IV ERS IT Y OF  B RIT IS H COLU MB IA 50  will in no way affect your eligibility for this study. Only the investigators of this study will have access to the original video files. When videos are used in presentations and publications, your identity, computer screen, and other identifiable information will be anonymized by masking and/or blurring.   Please initial below if you agree to the following statement... I agree to being videotaped, subject to whatever restrictions my company/manager has placed. I understand that the videos and still images captured from the videos may be used in academic presentations and/or publications. Initial here: ________  Confidentiality: The identities of participants and all data will be kept confidential. All data will be stored securely in a locked metal filing cabinet or in a password protected computer account. All data from individual participants will be coded so that their anonymity will be protected in any reports, research papers, thesis documents, and presentations on this work. The recordings will be stored for five years after the study, after which they will be permanently erased.   Contact Information about the study: This study is being conducted by Dr. Joanna McGrenere (xxx-xxx-xxxx), Juliette Link (xxx-xxx-xxxx), Dr. Kellogg S. Booth (xxx-xxx-xxxx), and Dr. Michael Terry (xxx-xxx-xxxx). You may contact any one of them if you have questions or desire further information about the study.  Contact for Information about the Rights of Research Participants: If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Information Line in the UBC Office of Research Services at xxx-xxx-xxxx.  
Consent: We intend for your experience in this study to be pleasant and stress-free.  Your participation in this study is entirely voluntary and you may refuse to participate or withdraw from the study at any time.   Your signature below indicates that you have received a copy of this consent form for your own records.  Your signature indicates that you consent to participate in this study.  You do not waive any legal rights by signing this consent form.   I, ________________________________, agree to participate in the study as outlined above. My participation in this study is voluntary and I understand that I may withdraw at any time.    ____________________________________________________ Participant?s Signature                                                     Date     ____________________________________________________ Researcher?s Signature                                                    Date   51  Manager Consent Form   Department of Computer Science  University of British Columbia Vancouver, BC, V6T 1Z4   Manager Consent Form TWO-HANDED INTERACTIONS WITH GRAPHICAL APPLICATIONS Principal Investigator: Dr. Joanna McGrenere, Associate Professor in the Department of Computer Science at the University of British Columbia, phone: xxx-xxx-xxxx  Co-Investigators: Juliette Link, MSc student in the Department of Computer Science at the University of British Columbia, phone: xxx-xxx-xxxx Dr. Kellogg S. Booth, Professor in the Department of Computer Science at the University of British Columbia, phone: xxx-xxx-xxxx Dr. Michael Terry, Associate Professor at the Cheriton School of Computer Science at the University of Waterloo, phone: xxx-xxx-xxxx  Study Purpose and Procedures: You are being asked to allow us to conduct a study within your organization that investigates the use of two hands with computer applications.   We will ask participants to perform one or two tasks that are part of their usual work in order to demonstrate how they normally use computer applications. They will then be asked to answer a few questions and explain some of what they did while performing the task(s).   Portions of the research will be used for the UBC MSc research of Juliette Link. MSc theses are publicly available.  Risk/Benefits: There are no known risks to participating. Participants may learn more about the computer application(s) and interaction techniques they use.  Option for Videotaping: With your permission and the permission of participants, we would like to videotape participants? hands, keyboards and computer screens as they complete their task(s) during the study for the purposes of data analysis and presenting our results in academic presentations and publication of research papers. Participants? faces will not be videotaped. Please note that this is an optional procedure, which you are free to decline. Refusal to allow videotaping will in no way affect participants? eligibility for this study. Only the investigators of this study will have access to the original video files. When videos are used in presentations and publications, any identifiable T he  U N IV ERS IT Y OF  B RIT IS H COLU MB IA 52  information will be anonymized by masking and/or blurring.   Please sign your initials by one or more of the following options for videotaping: ______ I allow videotaping of the computer screens of members of my organization who    participate in this study. ______ I allow videotaping of the hands and keyboards of members of my organization who participate in this study. 
______ I do not allow any videotaping.  Please sign your initials by one or more of the following options for data analysis and dissemination of study results, as needed: ______ I allow the researchers to use the videos and still images captured from the videos in academic presentations and/or publications, with any identifiable information anonymized by masking and/or blurring. ______ I allow only the investigators of this study to view the videos. ______ I allow only Juliette Link (the member of the research team conducting the study) to view the videos at any point.  If applicable, please sign your initials by the following option for NDAs: ______ I request that Juliette Link (the member of the research team conducting the study) sign a NDA. I will provide an NDA for Juliette Link to sign.  Confidentiality: The identities of participants and all data will be kept confidential. All data will be stored securely in a locked metal filing cabinet or in a password protected computer account. All data from individual participants will be coded so that their anonymity will be protected in any reports, research papers, thesis documents, and presentations on this work. The recordings will be stored for five years after the study, after which they will be permanently erased.   Contact Information about the study: This study is being conducted by Dr. Joanna McGrenere (xxx-xxx-xxxx), Juliette Link (xxx-xxx-xxxx), Dr. Kellogg S. Booth (xxx-xxx-xxxx), and Dr. Michael Terry (xxx-xxx-xxxx). You may contact any one of them if you have questions or desire further information about the study.  Contact for Information about the Rights of Research Participants: If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Information Line in the UBC Office of Research Services at 604-822-8598.  Consent: We intend for participants? experience in this study to be pleasant and stress-free.  Participation in this study is entirely voluntary and participants may refuse to participate or withdraw from the study at any time.   Your signature below indicates that you have received a copy of this consent form for your own records.  Your signature indicates that you consent to allow members of your organization to participate in this study. If you consent to only allow select individuals from your organization participate, please list the names of those individuals below: ____________________________________________________________________________________________________________________________________________________________ 53  ______________________________________________________________________________  You do not waive any legal rights by signing this consent form.   I, ________________________________, agree to allow members of my organization participate in the study as outlined above.    _______________________________________________________ Manager?s Signature                                                         Date                _______________________________________________________ Researcher?s Signature                                                    Date     54  C.4 Interview Questions  Note: Interviews were open-ended, the following questions served as guidelines only.  Questions to be asked at the start of the session? ? Clarify and expand on questions from Screener Form as necessary.  Questions to be asked (if necessary) while participant is performing their task(s)? 
If after approximately 15 minutes, the participant has not performed any of the expert methods they mentioned in the screener form, the interviewer will prompt them with...  ? ?I noticed you mentioned using ___ (some expert method(s) they mentioned in the screener). Can you show me when/how you would use those? Afterwards, ask why they were not using those methods initially (is it because they were doing a different task? Or using a different program?).  Questions to be asked after participant has finished performing their task(s)? After performing a task (or a few tasks) of their choice, participants will be asked to slowly walk through some of their actions and answer some questions.  In terms of the methods you use to make yourself more efficient/faster with applications... ? Why did you learn those methods? ? How did you learn those methods (in school, teaching yourself from online tutorials, etc.)? ? At what point did you learn those methods (while you were learning to use the application, after having used the applications, etc.)?  ? What were you using the keyboard for? Specifically when using it with just one hand? ? What were you using the mouse for? ? What were you using any other devices for? ? Why do you switch between the mouse and keyboard? When do you do so? How often do you have to do so? ? What improvements do you think could be made for the way you interact with the application? What already works very well? ? If there was a way to have the interaction be more bimanual (like, for example, more one hand on the keyboard one hand on the mouse), would that appeal to you? What challenges or benefits would you foresee about that? ? Would you be open to a follow up conversation should any questions arise after I review your video?  
