MEASURING STUDENTS' ENGAGEMENT AND LEARNING DURING PROBLEM-SOLVING IN INTRODUCTORY GENETICS: THE EFFECT OF PROBLEM-SOLVING AND SELF-REGULATED LEARNING PROMPTS

by

Heather Anne Fisher

B.A., McMaster University, 2012
Hons. B.Sc., McMaster University, 2011

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF

MASTER OF ARTS

in

The Faculty of Graduate and Postdoctoral Studies
(Science Education)

THE UNIVERSITY OF BRITISH COLUMBIA
(Vancouver)

August 2014

© Heather Anne Fisher, 2014

Abstract

Developing problem-solving skills is a major goal in most undergraduate science courses. However, these skills are rarely taught and supported explicitly. As courses shift away from didactic formats towards more interactive, problem-based ones, students' ability to problem-solve becomes even more integral to their success. Unfortunately, many students entering introductory science courses are new to problem-solving and struggle with it, requiring support to develop these skills. One possible support is prompting students throughout the problem-solving process, encouraging both content understanding and broad-based problem-solving skill development.

This research investigates the role of two types of prompting, exploring how they affect student engagement and learning during problem-solving. The study took place in an Introductory Genetics course in which students completed a scheduled weekly problem-based tutorial containing a question set and a quiz question. Tutorial sections were divided into one of three conditions, which included different combinations of prompts provided in addition to content-based questions. The first condition, Problem-Solving, encouraged positive problem-solving behaviours, such as stating known information and identifying relevant data, through answering content-related prompts. The second condition, Self-Regulated Learning, included the same positive problem-solving prompts and also asked students to reflect on why the prompts assisted them in problem-solving. A Control condition received no prompts and engaged only in the domain-specific problem-solving activity.

Responses to questions during and following the manipulation were coded on three scales – completion, correctness, and explanation – which represent three facets of engagement. Engagement Profiles were created to characterize student engagement throughout the question set. The three scales were used to explore the effect of condition, using the quiz question as a post-intervention measure of learning. Engagement Profile results demonstrated that students engaged with the question set differently across conditions, but there were no significant differences in quiz question responses on any of the scales.

This study contributes to educational research by comparing two forms of problem-solving support and suggesting a method to categorize student engagement during problem-solving. It also demonstrates the importance of measuring process, in addition to learning outcomes, to identify behavioural changes, and it proposes an application of self-regulated learning theory that is situated in context. Finally, course-specific recommendations were made.

Preface

This research study received approval from the UBC Research Ethics Board on August 29, 2013, certificate number H13-02215.
I, Heather Anne Fisher, was involved in all components of the conception, design, and implementation of this research study, including piloting and revising the tool, implementing it in the intervention discussed below, and completing all coding, statistical analysis, and writing of this manuscript. This study was conceived out of an interest in conducting an in situ intervention in a moderately controlled environment to explore the impacts of self-regulatory and problem-solving prompts on student engagement and learning.

All committee members actively participated throughout the research study and manuscript preparation in the form of feedback on resource design and statistical analysis, consultation in their areas of expertise, and continual input on drafts of the manuscript. In addition, Dr. Lisa McDonnell provided assistance with coding a portion of the data to ensure inter-rater reliability for the coding manual.

At this point, no part of this research has been published.

Table of Contents

Abstract
Preface
List of Tables
List of Figures
Acknowledgements
Dedication
1 Introduction
1.1 Rationale for study
1.2 Research questions and hypotheses
1.3 Overview of methodology
2 Literature Review
2.1 Approaches to science education at the University of British Columbia
2.2 Challenges in acquiring problem-solving skills in introductory courses
2.3 Problem-solving in scientific contexts
2.3.1 Positive problem-solving behaviours
2.3.2 Problem-solving and self-regulated learning
2.4 Prompting students during problem-solving
2.5 Student engagement and learning during problem-solving
3 Methodology
3.1 Research design
3.1.1 Conditions
3.1.2 Assigning conditions
3.1.3 Teaching Assistant preparation
3.2 Course context
3.3 Participants
3.3.1 Consent
3.4 Materials
3.5 Procedure
3.6 Data coding
3.6.1 Preliminary coding: Completeness, Correctness, and Explanation scores
3.6.2 Secondary coding: Engagement Profiles and Engagement Patterns
3.6.3 Inter-rater reliability
3.7 Analysis
3.7.1 Engagement Profiles and Engagement Patterns
3.7.2 Effect of condition on engagement and learning
4 Results
4.1 Descriptive statistics
4.1.1 Midterm 1 grades
4.1.2 Question Completeness
4.2 Engagement Profiles
4.3 Engagement Patterns
4.4 Effect of condition on Completeness, Correctness, and Explanation scales
5 Discussion
5.1 Main findings
5.1.1 Engagement Profiles
5.1.2 Engagement Patterns
5.1.3 Effect of condition on Completeness, Correctness, and Explanation scales
5.2 Significance of research
5.3 Limitations
6 Conclusions
6.1 Summary of findings
6.2 Challenges of measuring student engagement: a reflection for educational researchers
6.3 Future directions
7 References
8 Appendices
8.1 Appendix A: Summary of tutorial question set (for each condition)
8.2 Appendix B: Instructions provided to Teaching Assistants prior to tutorial
8.3 Appendix C: Course syllabus
8.4 Appendix D: Consent form
8.5 Appendix E: Tutorial question set (for each condition)
8.6 Appendix F: Coding manual
8.7 Appendix G: Multivariate (MANCOVA) analysis for question Completeness by condition (Midterm 1 covariate)
8.8 Appendix H: Midterm 1 by condition ANOVA
8.9 Appendix I: Engagement Pattern tables
8.10 Appendix J: Multivariate (MANCOVA) analysis for Quiz Question

List of Tables

Table 2.1: From Pintrich (2004) – Phases and areas for self-regulated learning
Table 3.1: Summary of questions and prompts in question set, by condition
Table 3.2: Teaching Assistant assignment of condition
Table 3.3: Methods of course evaluation
Table 3.4: Student demographics, major programs of enrolment (>10 students enrolled in program reported)
Table 3.5: Student demographics, year of study
Table 3.6: Distribution of group size, by condition
Table 3.7: Timeline for tutorial provided to Teaching Assistants
Table 3.8: Definition of codes for Completeness, Correctness, and Explanation scales
Table 3.9: Scale conversions used to create binary codes for Engagement Profiles
Table 3.10: Summary of Engagement Pattern categories
Table 4.1: Count of students who completed each question part; binary (0/1 = not attempted; 2/3 = attempted/completed)
Table 4.2: Multivariate (MANCOVA) analysis for question Completeness (average completeness, Questions 1-5)
Table 4.3: Engagement Profile count, following preliminary coding
Table 4.4: Engagement Profile distributions, by condition (count and percentage of condition)
Table 4.5: Summary of Engagement Pattern trends
Table 4.6: Engagement Pattern chi-square test of independence results
Table 4.7: Multivariate (MANCOVA) results, Quiz Question

List of Figures

Figure 1.1: Theoretical approach to research design
Figure 2.1: Zimmerman's (1989) model of self-regulated learning
Figure 2.2: Prompt and scaffold types utilized in intervention
Figure 2.3: Overview of engagement and learning in the research design
Figure 3.1: Scenario and data provided in Question 2a
Figure 3.2: Question 2a, as seen by the Control condition
Figure 3.3: Question 2a, as seen by the Problem-Solving condition
Figure 3.4: Question 2a, as seen by the Self-Regulated Learning condition
Figure 3.5: Question 2 reflection question for Self-Regulated Learning condition
Figure 3.6: Sample conceptual question with possible answers, provided by course coordinator
Figure 3.7: Engagement Profile diagrams used in original coding; Top left: Run-out-of-time, Top right: Pick-Up, Bottom left: Sampler, Bottom right: Other
Figure 4.1: Engagement Profile diagrams after coding; Top left: Complete, Top right: Run-out-of-time, Bottom left: Pick-Up, Bottom right: Sampler
Figure 6.1: Current and future analysis

Acknowledgements

I want to express my extreme thanks to the many individuals who helped make this study possible. To my co-supervisors, Dr. Marina Milner-Bolotin and Dr. Ido Roll, who remind me each day of the power of having passion for your work and whose work ethic and intelligence drive me to greater heights. To my committee members: Dr. Lisa McDonnell, for the endless hours of advice, feedback, and coding help; and Dr. Deborah Butler, for lending an ear to a novice storyteller and providing the space for me to think critically about all aspects of my work. In addition, I would like to thank Svetlana Chachashvili-Bolotin for her patience and coaching during the data analysis process.

Also to Kathleen, Nicole, Mark, and Karen for their unwavering emotional support of a perpetual student.
Finally, a special thanks to Matt Wright, who brings me back to Earth in the world of academia.

Dedication

To the researchers, educators, and support staff who make space in the learning process for mistakes.

1 Introduction

One of the main goals of higher education is to develop students' skills in areas such as critical thinking and communication, with students in science additionally developing reasoning and problem-solving skills (Carmel & Yezierski, 2013; Ogilvie, 2009; Webb, 2012; Willingham, 2008). At the University of British Columbia, a key goal of the Faculty of Science is "to deliver high-quality undergraduate science education that prepares students in the sciences to be productive members of a civil and sustainable society" (Faculty of Science Strategic Plan, 2011). This includes helping students develop problem-solving skills and acquire relevant content knowledge.

There are two major challenges involved in developing students' problem-solving skills: the challenge instructors face in teaching these skills, and the challenge students face in acquiring them. From the perspective of the instructor, explicitly teaching problem-solving skills requires knowing what strategies facilitate problem-solving and how to teach them. This is made more difficult by the lack of agreement on how best to teach problem-solving skills explicitly (Yeager & Walton, 2011). Additionally, since these skills are expected to develop over the course of a student's undergraduate degree, there is little incentive for instructors to formally teach problem-solving in a single course, let alone an introductory course where students' skills are underdeveloped. From the perspective of the student, problem-solving is a particularly difficult set of skills to acquire, as it is highly contextual, grounded in disciplinary knowledge and practices, and takes a great deal of expertise to apply appropriately (Ogilvie, 2009; Schoenfeld, 1992). To acquire these skills, students must first gain experience with problem-solving and then learn how to engage in problem-solving effectively. Neither of these is an easy feat, and in science fields, where problem-solving is often the foundation of the discipline, this makes for a steep learning curve for students entering introductory courses.

Since both instructors and students face a number of challenges, this research on problem-solving skill development could have been designed from a variety of perspectives. I chose to address problem-solving skill development from the perspective of the instructor, as the possible implications for instructional changes are widespread. This research study aims to evaluate how we can facilitate problem-solving skill development through the materials we provide to students in tutorials focused on solving conceptual problems.

Because this study was framed from the perspective of instructors, the considerations instructors must address in order to facilitate these skills, as well as our current understanding of who students in introductory science courses are and how they learn, are discussed further. Instructors must consider how to (a) develop students' problem-solving skills within a single context, (b) cultivate these skills with sufficient depth that students can apply them to novel contexts within the course, and (c) encourage students to extrapolate these skills in general ways that apply outside of the course context.
These challenges are compounded by the understanding that students in introductory science courses tend to lack even the most basic problem-solving skills and hold a number of biases that hinder effective learning strategies (Bjork, 1994; Kruger & Dunning, 1999).

First, simply spending time solving problems is not sufficient to ensure students develop these sophisticated skills (Willingham, 2008). Second, students who have surface-level content knowledge of a subject often believe they understand material more deeply than they do (Metcalfe, 1998; Kruger & Dunning, 1999). This is particularly problematic in introductory science courses for three reasons. First, science courses often include tutorials (problem-solving sessions) where students further explore applications of the content introduced in previous lecture material. Students might feel increased familiarity with the content during these tutorials, as they have encountered it earlier in the week during lectures. This familiarity could result in students engaging with material at a surface level because they feel they already understand it. Second, introductory courses tend to survey large amounts of material at a surface level to allow students to make broad sense of the field and identify topics of interest for future years of study. And third, students are often in their early undergraduate years and have novice skill sets. Together these reasons are troubling because when students believe they understand content, they are more likely to terminate their study efforts or to employ poor problem-solving habits (Bjork, 1994).

1.1 Rationale for study

Given that students are often poor judges of their own skills and understanding, designing educational materials that facilitate this skill development is important. This is especially true in science courses, where students need problem-solving skills to succeed in their undergraduate courses and future careers, and in introductory courses, where students are new to problem-solving and lack sophisticated skills. In this research study, I designed and implemented an intervention aimed at facilitating problem-solving skill development.

The intervention method was selected based on meta-analysis findings by Yeager and Walton (2011), who demonstrated that small social and psychological interventions embedded in the course context can have a significant impact on students' skill and content development. In this research, the small intervention took the form of prompts embedded in a question set that students worked on as part of an introductory science course tutorial. This format was selected because prompts provide an opportunity to explore multiple theoretical approaches to facilitating problem-solving in a controlled manner.

This research study explores two types of prompts by evaluating how they impact student engagement and learning. They are compared to a control condition that received no prompts. The two types of prompts were motivated by the following questions about problem-solving skill development: (1) Is it enough to guide students through the problem-solving process, or (2) is it necessary for them to think about how to problem-solve and the strategies that support problem-solving, in addition to being guided through the process? This resulted in three conditions: a control condition (for comparison) and two different prompting conditions, each of which addressed one of the questions posed above.
As mentioned earlier, the main challenge that students in introductory science courses face is gaining experience engaging with problem-solving and then developing effective problem-solving skills. Designing two prompting conditions (and comparing them to a control condition) provides insight into the development of problem-solving skills, but it does not describe the experience of engaging in problem-solving. This is a question of process rather than outcome. As a result, this research adopted a two-pronged approach to data coding and analysis in order to provide information about both the process of engagement and the learning outcomes of that engagement. In this case, that meant evaluating the work students completed during the intervention as well as a post-intervention measure of learning. Exploring the process students engage in during problem-solving provides in-context information about student behaviour during the period of learning, which is when students develop skills and conceptual understanding. Together, information about students' approaches to problem-solving and the impact of those skills on their learning creates a deeper image of student behaviour during skill development.

Overall, this study was designed to advance knowledge about how students engage in problem-solving when provided with differing prompts, as well as to evaluate the effect of these different prompts on student learning and engagement in problem-solving. Figure 1.1 summarizes this theoretical approach to the research design. In particular, student engagement was theorized as occurring during the question set, while learning was theorized as occurring throughout the question set and the quiz question. Engagement was confined to the question set because the formal nature of the quiz question ensured student engagement and therefore would not reveal much about how students choose to engage during problem-solving.

Figure 1.1: Theoretical approach to research design

1.2 Research questions and hypotheses

This study aims to investigate the effect of two types of prompts on student responses in a problem-solving tutorial in an Introductory Genetics course. In addition, the goal of this research is to characterize the effect of the two types of prompts on students' responses in situ as well as in near-transfer situations. The research questions addressed in this study are as follows:

1) How did students engage in the problem-solving tutorial question sets with varying types of prompts?
2) When compared to a control group, what was the effect of prompts (problem-solving and self-regulation) on how students engaged in the tutorial?
3) Was there an effect of prompting condition in the tutorial question set on student responses on a later, unprompted problem (as measured by completeness, correctness, and explanation)?

To address these research questions, all student responses to the question set were evaluated on three facets of engagement – completeness, correctness, and explanation quality. These three facets were used to describe student engagement throughout the question set. It was expected that students in the Control condition would complete more questions than the two prompting conditions because they did not have prompts to answer in addition to the main content-based questions.
In contrast, it was expected that students in the two prompting conditions would have more correct answers than the Control condition because the prompts encouraged them to focus on useful information and guided them through the problem-solving process.

There were also expected differences between the two prompting conditions. The first prompting condition was designed to answer the question "is it enough to guide students through the problem-solving process?", while the second was designed to answer the question "or is it necessary for them to think about how to problem-solve and the strategies that support problem-solving, in addition to being guided through the problem-solving process?" With these theoretical questions in mind, it was expected that students in the second prompting condition would provide deeper explanations than those in the first, because the nature of the second set of prompts encouraged students to articulate their reasoning in addition to understanding the correct answer. Finally, it was expected that the problem-solving skills developed during the tutorial would extend to a quiz question and that the overall trends would remain. In particular, we predicted increased correctness and explanation quality for the prompting conditions, and that students in the second prompting condition would provide the highest quality responses.

1.3 Overview of methodology

This research was a quasi-experimental quantitative study conducted in an Introductory Genetics course at the University of British Columbia in Vancouver, British Columbia, Canada. Students participating in a weekly tutorial, as part of the course requirements, took part in an intervention during a single week of these tutorials. Students completed a five-question tutorial and a quiz, and provided consent for their materials to be analyzed (N = 300). All questions were open-ended and required students to make sense of data and apply concepts in novel contexts.

There were three conditions, which were scaffolded to explore how additional prompts impacted student responses to tutorial questions. The first condition, Control, received no prompts and engaged only in the domain-specific problem-solving activity. The second condition, Problem-Solving, encouraged positive problem-solving behaviours, such as stating known information and identifying relevant data, through answering content-related prompts. The third condition, Self-Regulated Learning, included the same positive problem-solving prompts and also asked students to reflect on why the prompts assisted them in problem-solving.

Student responses to the question set and the quiz question were coded along the three dimensions of student engagement described above – Completeness, Correctness, and Explanation quality. Answers to prompts were coded for explanation only, as they were embedded within a question and completing them correctly was not necessary for overall question correctness. The results were analyzed in three different manners to answer the research questions.

First, Engagement Profiles were developed based on students' completeness scores on each question. Engagement Profiles provided an at-a-glance picture of student engagement by categorizing a student's overall approach to answering the question set. Second, students' scores on all three facets of engagement – Completeness, Correctness, and Explanation quality – were converted into a single measure called Engagement Patterns. These provided a picture of the overall quality of response for all students on a single question.
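To make the Engagement Profile idea concrete, the sketch below shows one plausible way to collapse per-question completeness codes into the profile categories named later in this thesis (Complete, Run-out-of-time, Pick-Up, Sampler). The binary cut-off follows the convention reported in Table 4.1 (codes 0/1 = not attempted; 2/3 = attempted/completed), but the specific profile rules here are my illustrative assumptions; the study's actual conversion rules are in Table 3.9 and the coding manual (Appendix F).

```python
# Illustrative sketch only: the real conversion rules live in Table 3.9 and
# Appendix F. The profile rules below are assumptions for illustration.

def to_binary(completeness_codes):
    """Collapse 0-3 completeness codes: 0/1 = not attempted, 2/3 = attempted."""
    return [1 if code >= 2 else 0 for code in completeness_codes]

def engagement_profile(completeness_codes):
    """Categorize one student's per-question codes into an Engagement Profile."""
    attempted = to_binary(completeness_codes)
    if all(attempted):
        return "Complete"                      # attempted every question
    if not any(attempted):
        return "None attempted"
    first = attempted.index(1)
    last = len(attempted) - 1 - attempted[::-1].index(1)
    if all(attempted[first:last + 1]):         # one contiguous run of attempts
        if first == 0:
            return "Run-out-of-time"           # started at Question 1, stopped early
        return "Pick-Up"                       # skipped early questions, then worked on
    return "Sampler"                           # scattered attempts with gaps

# Example: a student who attempted Questions 1-3 and then stopped.
print(engagement_profile([3, 3, 2, 0, 0]))     # -> "Run-out-of-time"
```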
Together, Engagement Patterns and Engagement Profiles described how students engaged in the question set (Research Question 1). The Engagement Profile and Engagement Pattern distributions across the three conditions were also analyzed (Research Question 2). Finally, student responses to Question 5 and the quiz question were analyzed as post-intervention measures to identify differences between conditions along the three facets of engagement and learning (Research Question 3). Midterm 1 grades were used as an indicator of content knowledge, ensuring no differences in prior knowledge existed between conditions prior to the intervention.
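The thesis reports multivariate (MANCOVA) analyses with Midterm 1 as a covariate (Appendices G and J) and a chi-square test of independence for the Engagement Pattern distributions (Table 4.6). A minimal sketch of these two kinds of tests, assuming a long-format data table with hypothetical column names (condition, midterm1, correctness, pattern) rather than the author's actual scripts, might look like this:

```python
# Sketch of the reported analyses, not the author's actual code.
# Column names and the input file are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.formula.api as smf

df = pd.read_csv("tutorial_scores.csv")  # hypothetical file

# ANCOVA-style model on one outcome scale, with Midterm 1 as covariate
# (the thesis runs a multivariate version across all three scales).
model = smf.ols("correctness ~ C(condition) + midterm1", data=df).fit()
print(model.summary())

# Chi-square test of independence: Engagement Pattern vs. condition.
table = pd.crosstab(df["pattern"], df["condition"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```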
In this chapter I described the challenge researchers and instructors face in attempting to develop science students' problem-solving skills in introductory courses, and the need for research that explores possible mechanisms to facilitate this skill development. In addition, I described the research questions and hypotheses of this study. In the next chapter, I will describe the context of the research study and the literature I drew upon to define problem-solving skills and one possible method for developing these skills.

2 Literature Review

Culture plays a significant role in society, including in decisions about how our youth are educated. Our cultural backgrounds pulse below the surface and are so ingrained in personal experience that it is often difficult to identify the biases and expectations they create. Cultural norms influence everything from the expectations learners bring with them to the classroom about how they should be taught and what it means to learn, to how institutions prioritize academic programs and design courses. This research was conducted in an undergraduate science course at the University of British Columbia. Understanding this cultural context sets the stage for how this research study was conceptualized and enacted, and it also informs the design of the educational intervention.

Science education is a well-established field in Canadian education research. Although it is composed of multiple disciplines, such as Biology, Chemistry, Physics, and Earth Sciences, these are often theorized together due to their common values and shared conceptions of knowledge (Kitsantas & Kavussanu, 2011). For example, constructivist epistemologies, which consider learners to be actively constructing knowledge and understanding based on prior experience, and a focus on the scientific method are common across the fields (Schoenfeld, 1992). In addition, science courses tend to share similar course structures, such as labs and tutorials, as well as pedagogical approaches. These pedagogical approaches are particularly important for situating this research study.

Science education research has been characterized by a focus on conceptual understanding, the development of students' skills in areas such as problem-solving, critical thinking, and scientific reasoning, and the ability to transfer understanding to novel contexts (Kitsantas & Kavussanu, 2011). Many of the pedagogical considerations being debated in science education focus on how to advance students' abilities in these areas.

2.1 Approaches to science education at the University of British Columbia

In 2007, the University of British Columbia made a commitment to these principles of science education through the creation of the Carl Wieman Science Education Initiative (CWSEI), which puts research-based understandings of science education into practice. This initiative saw the creation of dedicated post-doctoral positions to assist in the modification and development of courses that support student learning. The core principles were to "establish what students should learn, determine what students are actually learning, and improve student learning," with the ultimate goal of "achieving the most effective, evidence-based science education" (Wieman, Perkins & Gilbert, 2010). In fields like Biology, this has included the modification of many courses traditionally considered to be fact-based and focused on memorization. CWSEI researchers have worked to highlight existing courses focused on conceptual understanding and skills development, as well as developing new initiatives, including designing methods for measuring students' conceptual understanding (Semsar, Knight, Smith & Birol, 2011; Smith, Wood, Krauter & Knight, 2011; Knight, 2010), exploring students' biological science misconceptions (McDonnell & Kalas, 2013; Knight & Smith, 2010), and changing student approaches to problem-solving (Taylor, Smith, van Stolk & Spiegelman, 2010). The CWSEI has a strong culture of innovation, realized through the implementation of research-based practices into courses. This focus on innovative pedagogies was integral to the implementation of this research study.

2.2 Challenges in acquiring problem-solving skills in introductory courses

Acquiring the problem-solving skills necessary to be successful in an undergraduate program is not an easy task. The challenge is greater for students in introductory courses, as they are often novices in both the content and the strategies needed to successfully solve problems. Novice students' lack of experience poses a unique challenge for a number of reasons. For example, novice students struggle to make sense of information from multiple sources (Stadtler, Scharrer, Brummernhenrich & Bromme, 2013), develop complex understandings of content (de Bruin, Rikers & Schmidt, 2007), reason scientifically (Willingham, 2008), think critically (Carmel & Yezierski, 2013; Willingham, 2008), and solve problems in new contexts. And while it is necessary for students to struggle with these skills in order to develop them more deeply, it is nonetheless important to foster this struggle in a meaningful way. The task is compounded because novices must develop a deep understanding of content and sophisticated approaches to learning at the same time (Chase & Simon, 1973). These are not small hurdles to overcome, especially in a survey course where a large amount of material is covered in less detail to provide foundational knowledge for future courses. In such courses it is unlikely that students will develop a deep, nuanced understanding of content, which also makes it difficult for them to develop sophisticated problem-solving skills in response to the material.

In addition, instructors and researchers face a unique set of challenges in their efforts to advance students' problem-solving skills. First, skills such as problem-solving are extremely difficult to teach directly (Yeager & Walton, 2011).
In their meta-analysis of social-psychological interventions, Yeager and Walton (2011) demonstrated that explicitly articulating one's goals can often undermine an instructor's efforts to develop students' skills. They recommend that instructors integrate such interventions into the course structure unbeknownst to students. Once an intervention has been selected, the materials must be implemented in context and in an authentic manner (Devolder, 2012; Schraw, Crippen & Hartley, 2006). For the purposes of this research, that means developing prompts that are relevant to the content and that students feel are meaningful to their learning. Without this saliency, students are unlikely to engage with the prompts, and the intervention loses its purpose. These are two extremely difficult requirements, which I discuss further in this thesis.

2.3 Problem-solving in scientific contexts

Problem-solving is highly contextual, and novice learners often have not been exposed to a particular subject area in enough depth to have developed problem-solving skills in a meaningful and contextual manner. In addition, the highly contextual nature of problem-solving makes it difficult to define in absolute terms, as its meaning differs across disciplines, subjects, and courses. For this study, I drew upon multiple frameworks to develop a definition of problem-solving. First, the focus is on identifying positive problem-solving behaviours in the context of genetics, as this study aims to create conditions that will foster such behaviours in order to support students' development of content knowledge and problem-solving skills in that area. I draw on research about expert versus novice problem-solving behaviours and on scientific inquiry to define these problem-solving behaviours. I then turn to self-regulated learning theory, which provides a lens through which to think about problem-solving behaviours more generally.

2.3.1 Positive problem-solving behaviours

Differences between expert and novice approaches to problem-solving are well documented in the literature, including a smaller body of literature in genetics education (Smith & Good, 1984). There are a number of behaviours that experts engage in that make for successful and efficient problem-solving, and these form the foundation of the positive problem-solving behaviours of interest in this study. First, experts tend to have more strategies, based on previous experience, to help them solve problems. Second, they check their work as they go and modify their approaches if necessary. And finally, they approach problem-solving as conceptual and expect to make sense of information in order to create knowledge and understanding (Smith & Good, 1984).

Scientific inquiry is an integral part of scientific reasoning and promotes the development of expert-like views on the nature of science (Alters, 1997; Ryder, Leach & Driver, 1999). De Jong's model (in Elen, Clark, & Lowyck, 2006) describes scientific inquiry as an iterative process consisting of five steps: orientation, hypothesis generation, experimentation, drawing a conclusion, and making an evaluation. Students experience a similar version of these five steps during problem-solving, and the five steps are adapted here to reflect the definition of positive problem-solving behaviours used in this study.

De Jong (2006) describes orientation as the process of placing oneself in the problem.
In the first step of problem-solving, students orient themselves to a new problem by engaging in behaviours such as interpreting what the problem is asking and identifying relevant information. This is a key step in the problem-solving process and a stumbling block for many students who have not developed sophisticated problem-solving skills (Butler, 2002). Once students have made sense of what they are being asked to do in a problem, they must develop a hypothesis for why the problem is occurring. Articulating a coherent hypothesis is an important step in making a plan to move forward. Although the language of hypothesis generation is not often used in problem-solving, students with positive problem-solving approaches often make predictions about what is happening and use these predictions to inform their next steps (Smith & Good, 1984). Experimentation involves students testing their hypothesis and executing many of the technical aspects of problem-solving. Finally, students must interpret their answer, draw a conclusion, and evaluate whether their conclusion makes sense in the context of the problem. Reflection is necessary throughout this process (Butler & Winne, 1995). For example, students must reflect on previous knowledge during orientation, and they must reflect on their hypothesis and their understanding of the problem in order to make an evaluation.

2.3.2 Problem-solving and self-regulated learning

Self-regulated learning has been extensively theorized and shares many features with problem-solving. As a result, the self-regulated learning literature provides a useful theoretical lens through which to investigate problem-solving, and examining it offers a deeper explanation of what I mean by positive problem-solving behaviours. In this study I draw upon two theorists' models of self-regulated learning: Zimmerman (1989) and Pintrich (2004).

The first is the social-cognitive model by Zimmerman (1989), which considers how learning is impacted by cognitive, behavioural, and environmental factors (Figure 2.1). This includes the thoughts, feelings, and actions that learners engage in during the attainment of a goal (Zimmerman, 2011). The model proposes that students regulate their learning in response to their environmental context, adapt their behaviours based on experience, and modify their goals and cognitive strategies accordingly.

Figure 2.1: Zimmerman's (1989) model of self-regulated learning

In theory, all students are self-regulated learners, as they all participate in this feedback loop of modifying behaviours and strategies based on previous experience and environmental context. What distinguishes one student from another is the quality and quantity of their self-regulatory behaviours. Zimmerman and Martinez-Pons (1986) identified 14 behaviours indicative of self-regulated learning. Those important for sophisticated problem-solving include goal setting and planning, organizing and transforming, seeking information, monitoring, and self-evaluation. These behaviours occur at different points in the cycle of self-regulated learning.

The second model is by Pintrich (2004), which explores the phases a student goes through during self-regulated learning. The four phases he describes are (1) forethought, planning, and activation, (2) monitoring, (3) control, and (4) reaction and reflection (Table 2.1).
Pintrich (2004) characterized how a learner can effectively regulate their behaviour in each of these phases. The first phase includes students identifying prior knowledge and relevant strategies, judging how valuable and difficult the question is, and deciding whether they are interested in answering it; students also plan how to spend their time and how to monitor their own behaviour. The second phase includes students being aware of and checking their understanding, motivation, and behaviours while answering the question. Behaviours of particular importance are how much effort students put into solving the question, how long they are taking, and whether they need help to solve the problem. The third phase includes students using the information gathered in the second phase and selecting the strategies needed to modify their understanding, motivation, and behaviours accordingly. The fourth phase includes students reflecting on the choices they made and on the task itself.

Table 2.1: From Pintrich (2004) – Phases and areas for self-regulated learning

Phase 1: Forethought, planning, and activation
  Cognition: target goal setting; prior content knowledge activation; metacognitive knowledge activation
  Motivation/Affect: goal orientation adoption; efficacy judgments; perception of task difficulty; task value activation; interest activation
  Behaviour: time and effort planning; planning for self-observation of behaviour
  Context: perceptions of task; perceptions of context

Phase 2: Monitoring
  Cognition: metacognitive awareness and monitoring of cognition
  Motivation/Affect: awareness and monitoring of motivation and affect
  Behaviour: awareness and monitoring of effort, time use, and need for help; self-observation of behaviour
  Context: monitoring changing task and context conditions

Phase 3: Control
  Cognition: selection and adaptation of cognitive strategies for learning and thinking
  Motivation/Affect: selection and adaptation of strategies for managing motivation and affect
  Behaviour: increase/decrease effort; persist or give up; help-seeking behaviour
  Context: change or renegotiate task; change or leave context

Phase 4: Reaction and reflection
  Cognition: cognitive judgments
  Motivation/Affect: affective reactions
  Behaviour: choice behaviour
  Context: evaluation of task

Pintrich's (2004) model and Zimmerman's (1989) model can be explored in relation to the positive problem-solving behaviours described above. Using the first phase as an example, self-regulatory behaviours can be exercised in a number of ways during problem-solving. Students can regulate their cognition by setting goals for their learning and evaluating their previous knowledge and understanding of the question. Students can regulate their emotions by judging the difficulty of the task and determining the value of answering the question. Students can regulate their behaviours by planning out their time. Finally, students can regulate their environment by changing their perception of the task. Each of these phases (Pintrich, 2004) and self-regulatory behaviours (Zimmerman & Martinez-Pons, 1986) is closely related to the positive problem-solving behaviours described above. For example, the behaviours students undertake in the forethought, planning, and activation phase can significantly impact how students orient themselves to a problem. By thinking about scientific inquiry and self-regulated learning together, we gain a more complex and holistic picture of the problem-solving process.
The self-regulated learning literature also provides important information about student trends related to age, giftedness, and the role of context. Although there is extensive research on the developmental nature of self-regulated learning, it is important to apply this information critically, as there are no benchmarks for "how much" self-regulated learning students should exert. These trends are discussed here to provide insight into the nuances of self-regulated learning.

It is well documented that how students regulate their learning changes over time and becomes more sophisticated as learners gain experience in a new context (Wigfield, Klauda & Cambria, 2011; Zimmerman & Martinez-Pons, 1990). For example, children imitate positive learning behaviours, adolescents control their behaviour in structured settings, and adults self-regulate their behaviour without structure (Zimmerman, 2011). There are a number of established trends related to age. First, students' strategy use increases as they age (Zimmerman & Martinez-Pons, 1990). Second, the strategies they employ shift over time in relation to environmental demands: students move away from using primary sources toward their secondary interpretations of those sources (e.g., using self-generated notes instead of textbooks), shift from seeking help from parents to seeking help from teachers (and, in the case of students with high self-efficacy, from peers), and monitor their own learning more as they age (Zimmerman & Martinez-Pons, 1990).

There are also a number of trends documenting differences between students of the same age who are categorized as average or gifted. Gifted students demonstrate more strategy use than their average peers, especially in making sense of information and seeking help (Zimmerman & Martinez-Pons, 1990). Among students of the same age, self-regulated learning strategy use is a major predictor of performance (Pintrich & de Groot, 1990).

The self-regulated learning literature also provides strong evidence for the role context plays in problem-solving (Alexander, Dinsmore, Parkinson & Winters, 2011; Butler, 2002; Butler, Cartier, Schnellert, Gagnon & Giammarino, 2011; Kitsantas & Kavussanu, 2011). Context is important for two reasons. First, we learn content about a particular subject, and our ability to apply that knowledge outside of that context is limited. Second, students bring deep-seated beliefs about their capacity to learn, which are also dependent on the content being learned (Bandura, 1978; Partin, Haney, Worch, Underwood, Nurnberger-Haag, Scheuermann & Midden, 2011). For example, students often have strong beliefs (both positive and negative) about their ability to learn math. These beliefs impact their motivation to learn the subject and their confidence that they can master the content (Zimmerman, 2008). Together, these two factors mean that the skills students develop to regulate their learning depend on what they are learning and on their perceptions about learning that material. Students' motivation to learn and their interest in a subject also affect the context-dependent nature of self-regulated learning. Zimmerman notes that students are constantly self-regulating their learning; the key is whether they are doing so more or less efficiently.
A student can have well-developed approaches to regulating their learning in one context, having honed their self-regulated learning strategies through experience, while being a relative novice in another subject. Such a student could be highly motivated and goal-directed overall, yet their ability to effectively utilize their self-regulated learning strategies could differ drastically across contexts.

Finally, the self-regulated learning literature allows us to address another important factor in this research study: students worked in groups during the main portion of problem-solving. While some models address how environmental factors influence an individual student (e.g., Zimmerman, 1989), self-regulated learning is often theorized in relation to the individual and how they navigate learning. This can, and often does, include how an individual interacts with others, but the theory refers to how these interactions and other factors influence an individual's behaviours in the pursuit of an educational goal. Co-regulation is an adaptation of self-regulated learning theory that addresses how students impact one another when they participate together in the learning process (Jarvela & Hadwin, 2013). In contrast to self-regulated learning, co-regulated learning occurs when "individuals' regulatory activities are guided, supported, shaped, or constrained by and with others" (Jarvela & Hadwin, 2013). Although models of self-regulated and co-regulated learning have been distinguished from each other, they likely occur in tandem in most authentic educational environments. While this research does not address the implications of students working in groups, where students are likely participating in co-regulated and self-regulated learning at the same time, the distinction is important to make.

2.4 Prompting students during problem-solving

There is a fine line between designing materials that support students' successful problem-solving and creating an environment where students learn to take control over their own problem-solving processes. The challenge, and the question for many researchers, is whether leading students through problem-solving encourages this autonomy, or whether further prompts are needed. The prompts in this study were designed to address these two approaches to problem-solving support.

In this study, I used prompts as a form of scaffolding for students. Scaffolding includes any "tools, strategies, or guides that support students in gaining higher orders of understanding" (Devolder, 2012). Hannafin, Land and Oliver (1999) define four types of scaffolding, each of which serves a unique pedagogical function: conceptual, metacognitive, procedural, and strategic scaffolds (Hannafin, Land & Oliver, 1999; Zohar & Barzilai, 2013). Conceptual scaffolds direct a student toward information that is available in the problem and encourage them to make sense of it. Metacognitive scaffolds direct students to think about the problem in different ways and to consider different strategies that would help them solve it. Procedural scaffolds direct students' attention to resources or information available to them to successfully solve the problem. Strategic scaffolds direct students through strategies and approaches to successfully solve the problem (Hannafin, Land & Oliver, 1999).

Each of the four types of scaffolds was incorporated into the prompts developed for this study.
All prompts were embedded directly in the materials and were fixed (i.e., no feedback was provided). Prompts were used in the two experimental conditions and were delivered in three formats (Figure 2.2). The first experimental condition received the first type of prompt, while the second experimental condition received all three types.

The first type of prompt was intended to support positive problem-solving behaviours such as stating known information and identifying relevant data. These prompts were developed as conceptual scaffolds that directed the student toward relevant information to facilitate problem-solving (Devolder, 2012). An important feature of these prompts is that while they turn students in a direction that is meaningful for solving the problem, they do not guarantee students will answer the question correctly (van Merrienboer & Sweller, 2005). For example, a prompt might ask students to explicitly identify and state relevant information from the question, but if a student is not able to identify why this information is relevant, they might not successfully answer the question.

The second and third types of prompts were intended to encourage students to consider how the prompts support problem-solving, though they approached this from different perspectives. The second type comprised the majority of the additional prompts provided in the second experimental condition. Here students were asked to reflect on how the prompts helped them solve the problem at hand. These prompts were conceptual and procedural scaffolds that directed the student toward relevant information and asked them to consider why understanding that information was helpful for learning (Devolder, 2012). For example, after stating relevant information from the question, students might be asked to reflect on why that information is relevant and what it tells them about the question being asked. By supplementing the conceptual scaffold with a related procedural scaffold, students are provided with more support to explicitly make sense of the information, instead of being left to identify the relevance of the conceptual prompt on their own.

The third type of prompts consisted of a small subset delivered at the end of each question instead of being embedded within it. These prompts were metacognitive and strategic scaffolds that asked students to extrapolate the problem-solving skills they used in the current context for future use (Devolder, 2012). Metacognitive scaffolds in the form of prompts have been shown to shift students toward more sophisticated problem-solving strategies, including more critical analysis of materials and increased evidence-based decision-making (Peters & Kitsantas, 2010; Holmes, Day, Park, Bonn & Roll, 2014; Holmes, Day & Bonn, 2013).

Figure 2.2: Prompt and scaffold types utilized in intervention

2.5 Student engagement and learning during problem-solving

Intervention-based research often measures the outcome of student learning through formal assessment such as quizzes. This method of assessment, when coupled with a pre-intervention measure of knowledge, allows researchers to demonstrate students' learning gains as a result of the intervention. However, when there is a lack of understanding about the processes occurring during the intervention, a piece of the picture is missing. Another approach is to collect information about students during the intervention to explore these processes.
2.5 Student engagement and learning during problem-solving

Intervention-based research often measures the outcomes of student learning through formal assessment such as quizzes. This method of assessment, when coupled with a pre-intervention measure of knowledge, allows researchers to demonstrate students' learning gains as a result of the intervention. However, where there is a lack of understanding about the processes occurring during the intervention, a piece of the picture is missing. Another approach is to collect information about students during the intervention to explore these processes.

I adopted a two-pronged approach to research design for this study (Figure 2.3), looking at the outcomes of learning as well as at the process. In the context of this study, engagement is defined as students' active interaction with the materials during their participation in the problem-solving process. By measuring and characterizing engagement, I attempt to address the processes of problem-solving in the context of the intervention conditions. In terms of learning, as students answer questions they are developing problem-solving skills to some degree, as well as content knowledge (McDaniel, Anderson, Derbish & Morrisette, 2007). Whether these effects are positive or negative for their long-term learning and skill development remains unknown. My particular interest is in the learning and engagement that occurs during, and as a result of, the intervention.

Figure 2.3: Overview of engagement and learning in the research design

In this chapter I described the context of the research study, which took place within the Carl Wieman Science Education Initiative where the culture of educational research is strong, as well as the literature and theoretical models that helped inform my definition of the problem-solving skills undergraduate science students should develop. In addition, I described how the literature on prompting helped inform the research design. In the next chapter I will describe the research design in detail and expand on the coding and analysis methods.

3 Methodology

3.1 Research design

Knowing that students struggle to problem-solve in introductory science courses, this research study was created to explore possible mechanisms to support student problem-solving while they are engaging with problems. A quasi-experimental study was designed and implemented in the context of an Introductory Genetics course. This course was selected because of its focus on conceptual understanding and problem-solving, and because it included a weekly mandatory tutorial where students worked on open-answer questions that required them to make sense of data and apply concepts to novel contexts.

The research was designed around these weekly tutorials and an intervention that took place during a single week of the tutorials. The intervention was designed to explore the effect of two different types of prompts on student engagement and learning. As such, three conditions were created, one control condition and two experimental conditions, with each tutorial section assigned to a single condition.

3.1.1 Conditions

The three conditions were a Control condition, a Problem-Solving condition, and a Self-Regulated Learning condition. The three conditions built upon each other so that I could explore how additional prompts impacted student responses to tutorial questions. Each prompt was formatted as a single question and was provided directly after a tutorial question. Only a subset of the tutorial questions contained prompts, to allow for pre- and post-intervention measurements (Table 3.1). All prompts were fixed, and no feedback was provided to students after responding to a prompt. This allowed for consistency between tutorials and Teaching Assistants.
Table 3.1: Summary of questions and prompts in question set, by condition

  Question | Parts | Control    | Problem-Solving         | Self-Regulated Learning
  1        | 3     | Unprompted | Unprompted              | Unprompted
  2        | 3     | Unprompted | Problem-Solving prompts | Problem-Solving and Self-Regulated Learning prompts
  3        | 2     | Unprompted | Problem-Solving prompts | Problem-Solving and Self-Regulated Learning prompts
  4        | 1     | Unprompted | Problem-Solving prompts | Problem-Solving and Self-Regulated Learning prompts
  5        | 1     | Unprompted | Unprompted              | Unprompted
  6 (Quiz) | 4     | Unprompted | Unprompted              | Unprompted

The conditions built upon each other, from Control to Problem-Solving to Self-Regulated Learning. The Control condition received no prompts. The Problem-Solving and Self-Regulated Learning conditions contained prompts in Questions 2 through 4. For these questions, a prompt was included for all parts [see Appendix A for a summary of the question set, including prompts, provided to each condition]. For example, a different prompt was given in Questions 2a, 2b, and 2c. A sample question from the tutorial (Question 2a) is included in the description of each condition, to provide a concrete example of the questions and prompts that students experienced (Figures 3.1, 3.2, 3.3, 3.4, and 3.5).

This study was conducted in the context of genetics, specifically a unit involving genetic analysis, complementation, and mutagenesis for the purposes of identifying multiple genes involved in a single phenotype. Each question started by posing a scenario and providing students with data, then asked students to make sense of and interpret this information. If a question had parts, each part addressed a different aspect of the problem and built on previous answers to address more sophisticated topics. For example, in Question 2a, students were provided with a scenario where an animal was known to have three different coat colours. When animals with different coat colours were bred together, a variety of coat colours resulted in their offspring. The question included a data set that described the colours of the offspring and asked students to devise a genetic explanation for the breeding data (Figure 3.1).

  In minks, wild types have an almost black coat. Breeders have developed many pure lines of color variants for the mink-coat industry. Two such pure lines are platinum (blue gray) and Aleutian (steel gray). These lines were used in crosses, with the following results: [cross data shown in Figure 3.1]

Figure 3.1: Scenario and data provided in Question 2a

Control condition: Students completed a traditional tutorial problem set comparable to the course tutorial format. This question set included no prompts to support problem-solving. Students were asked to answer the question posed after being given the scenario and data set (Figure 3.2).

  Devise a genetic explanation of these three crosses. Show complete genotypes for the parents, the F1, and the F2 in the three crosses, and make sure that you show the alleles of each gene that you hypothesize for every mink.

Figure 3.2: Question 2a, as seen by the Control condition

Problem-Solving condition: Students completed a tutorial problem set with the same questions found in the Control condition. In addition to the main content-based question posed in the Control condition question set, prompts were provided after all question parts in Questions 2 through 4.
These prompts were intended to encourage students to engage in positive problem-solving behaviours, such as stating known information and identifying relevant data, by answering content-related prompts designed to facilitate successful completion of a problem. All prompts asked students to explicitly make sense of information that was integral to successful completion of the question. Although it was not necessary for a student to answer a prompt in order to answer the question, understanding the information would help students move in a productive direction. For example, in Question 2a students were asked to identify the ratios of the different coat colours in the data set (Figure 3.3). Understanding ratios helps students to identify patterns, which represent different genetic explanations.

  Devise a genetic explanation of these three crosses. Show complete genotypes for the parents, the F1, and the F2 in the three crosses, and make sure that you show the alleles of each gene that you hypothesize for every mink.

  What are the ratios observed in the scenario?

Figure 3.3: Question 2a, as seen by the Problem-Solving condition

Self-Regulated Learning condition: Students completed a tutorial problem set with the same questions and prompts found in the Problem-Solving condition. In addition, this condition contained a second set of prompts that asked students to reflect on how the problem-solving prompts supported their learning. For example, Question 2a asked students to articulate why understanding the ratios would be helpful for solving the question (Figure 3.4). As described above, answering this prompt would require students to understand that identifying ratios gives information about the number of genes involved in determining coat colour.

At the end of Questions 2 through 4, students were also asked to answer a final prompt that was slightly removed from the question context. This prompt encouraged them to make sense of overarching learning strategies and how those strategies could be applied in novel contexts. As such, students had the opportunity to address two aspects of self-regulation, reflective and contemplative, while answering questions. The aim of having both embedded and separated prompts was to solidify student conceptions of how to problem-solve, with the ultimate goal of students being able to apply those strategies in novel contexts. A sample reflection prompt from Question 2 is given in Figure 3.5.

  Devise a genetic explanation of these three crosses. Show complete genotypes for the parents, the F1, and the F2 in the three crosses, and make sure that you show the alleles of each gene that you hypothesize for every mink.

  What are the ratios observed in the scenario?

  What do the ratios tell you about the problem? Why is it important to make sense of the data to answer this question?

Figure 3.4: Question 2a, as seen by the Self-Regulated Learning condition

  What was a stumbling block you encountered when solving this problem? What did you do to successfully deal with it?

  What strategies did you use that allowed you to successfully answer this problem? How can you use them to successfully answer another problem?

Figure 3.5: Question 2 reflection prompt for the Self-Regulated Learning condition

3.1.2 Assigning conditions

In this study, participants were drawn from across a number of course sections, and tutorials were facilitated by seven different Teaching Assistants (see Table 3.2).
Each Teaching Assistant was responsible for facilitating only one condition (Control, Problem-Solving, or Self-Regulated Learning), to help ensure consistency of approach across sections. Beyond this, the 20 tutorial sections were randomly assigned to one of the three conditions. Numbers have been assigned to Teaching Assistants to ensure anonymity.

Table 3.2: Teaching Assistant assignment of condition

  Teaching Assistant | Condition       | Number of tutorial sections
  1                  | Control         | 2
  2                  | Control         | 3
  3                  | Problem-Solving | 3
  4                  | Problem-Solving | 2
  5                  | Problem-Solving | 2
  6                  | Self-Regulation | 3
  7                  | Self-Regulation | 4

3.1.3 Teaching Assistant preparation

Seven Teaching Assistants facilitated the tutorials. All Teaching Assistants were graduate students or post-doctoral researchers in the Biology department in the Faculty of Science at the University of British Columbia. Teaching Assistants participated in a weekly meeting, held on the Friday prior to the start of each new week of tutorials, to prepare for the upcoming week. I was able to use the meeting immediately preceding the intervention to prepare the Teaching Assistants.

The researcher and a former course instructor prepared the Teaching Assistants for approximately one hour, including time for Teaching Assistants to ask questions about the research, their role, and the procedure. This session addressed the background and details of the study, specific instructions on how to administer the intervention (such as ethics and consent), and the procedure for responding to student questions. Teaching Assistants were provided with a summary of the questions and prompts for each condition and a summary of the preparation instructions [Appendix B].

3.2 Course context

The Introductory Genetics course where the research took place was an undergraduate course offered three times per year at the University of British Columbia in the Department of Biology. It was a 200-level course, meaning it was intended to be completed in the second year of studies in a Bachelor of Science degree. The course is required for any student majoring in Biology and addresses fundamental genetics principles, including mutation, phenotype, segregation, linkage, complementation, and gene interaction, as well as applications of these principles [course syllabus available in Appendix C]. Two instructors taught the course, with lecture time divided equally. Both instructors were faculty members within the Department of Biology and had previously taught this course.

The course was 13 weeks in duration, with three one-hour lectures per week. In addition, students attended a 110-minute tutorial once per week, having enrolled in a tutorial section at the outset of the course. Prior to attending their registered tutorial section, students completed a weekly reading quiz related to the previous week's content, along with a set of related conceptual questions (approximately five questions). During the tutorial, students addressed questions stemming from earlier assignments and completed a second set of conceptual questions that often built upon earlier concepts or required additional application skills.

The course included extensive formative and summative assessment components. The evaluation scheme is described in Table 3.3. All methods of evaluation in the course focused on conceptual understanding of course material.
With the exception of the reading quizzes and in-class activities, all methods of evaluation required students to respond to conceptual questions with short-answer responses. Figure 3.6 provides an example of a conceptual question students answered prior to the tutorial where the intervention took place.

The intervention took place in the tutorials during Week 9 of the course in the Winter 2013 term, prior to Midterm 2. The topic of the weekly tutorials was complementation, which was identified by a previous course instructor as being conceptually difficult for students, and the choice was approved by the current course instructors (McDonnell & Kalas, 2013).

Table 3.3: Methods of course evaluation

  Evaluation Method   | Percentage of Grade | Details
  Midterm 1           | 15                  | 50 minutes; during week 5 of course
  Midterm 2           | 15                  | 50 minutes; during week 10 of course
  Final               | 55                  | 2.5 hours; following completion of course
  Tutorials           | 10                  | Bi-weekly quizzes during tutorials
  Reading Quizzes     | 3                   | Weekly online quiz related to previous week's content
  In-class activities | 15                  | Responses to conceptual questions (using electronic response systems)

  Talons (the sharp curved claws on the ends of owl feet) are made of the protein keratin, just like our fingernails. Imagine you perform a mutant screen for owls with missing or defective talons to find the genes controlling the process of making talons.

  a) If you find mutants lacking claws because the gene for β-keratin is mutated, describe how else the mutant phenotype might differ from wildtype besides the lack of claws.

  Notes from course coordinator: Lots of possible answers: e.g. the beak could be missing because it is also made of β-keratin; feathers could be affected (quills are made of β-keratin).

  b) What other kinds of genes might give you no talons, besides a mutation in the gene for keratin?

  Notes from course coordinator: This will be hard for students. Genes that control the growth of the limb (e.g. Hox genes, or more simply transcription factors involved in the development of the limb; students don't learn this in any previous classes unless their particular BIOL 121 instructor talked about Hox genes).

Figure 3.6: Sample conceptual question with possible answers, provided by course coordinator

3.3 Participants

All participants were enrolled in the Introductory Genetics course at UBC and were enrolled in a tutorial section. A total of 434 students were registered in the course. Table 3.4 describes the major programs of study for students enrolled in the course, and Table 3.5 outlines their year of study. Most students were enrolled in an undergraduate science program in their second year of study. Students must have successfully completed a full year of 100-level Biology at UBC as a prerequisite for entrance into this course.

Table 3.4: Student demographics, major programs of enrolment (programs with >10 students reported)

  Program                | Number of Students
  Biology                | 138
  Biochemistry           | 126
  Integrated Sciences    | 24
  Science One            | 24
  Applied Animal Biology | 23
  Life Science           | 12

Table 3.5: Student demographics, year of study

  Year of Study | Number of Students
  1             | 19
  2             | 327
  3             | 62
  4             | 20
  5             | 4
  Unknown       | 1

3.3.1 Consent

Students enrolled in the Introductory Genetics course were notified of the research study via their Learning Management System. This online notice outlined the details of participation and indicated that students would be notified in their tutorial section and provided with more information.
During their regularly scheduled tutorial section, students were notified that a research study was taking place and were provided with instructions for giving or denying consent [a copy of the consent form can be found in Appendix D]. Teaching Assistants explained that participation in the study entailed submitting their tutorial question set and quiz responses to be analyzed. It should be noted that all students were required to participate in the tutorial, but consent to have their responses analyzed was optional. All materials were collected at the end of tutorials to remove any social pressure to consent. If consent was not given, student responses were destroyed.

Students were asked to work in groups ranging from one to three students. As such, consent was required from all members of a group in order for that group to be included in the analysis. A total of 319 students gave consent and were part of a group in which all members also gave consent. Of these students, 300 submitted the question set and 298 also submitted the quiz question. These question set and quiz question responses were included for analysis.

Students were given course credit for submitting their final quiz during the tutorial. This quiz mark was given regardless of whether students provided consent to participate in the study. Each weekly quiz totaled 0.5% of a student's final grade.

3.4 Materials

Students completed a question set in their weekly tutorial related to the content addressed in lecture during the previous week; complementation was the topic during the intervention week. The tutorial question set was developed by a previous instructor of the course, in conjunction with the course coordinator. All questions were conceptual in nature and required students to make sense of information or data. A copy of the tutorial question set for each condition is provided in Appendix E.

The question set consisted of five questions, with a sixth question administered separately as a quiz. All students received the same version of Questions 1 and 5, while Questions 2 through 4 differed by condition (as described in Table 3.1). Question 6 (the quiz question) was identical for all conditions. Questions 1, 2, and 3 each contained parts (1a-c, 2a-c, and 3a-b), bringing the total number of question parts students were asked to complete during the tutorial to ten. The quiz had four parts (6a, 6b, 6c, and a bonus).

Students submitted their consent forms, question set, and quiz question to their Teaching Assistant at the end of their tutorial. As question sets were completed in varying group sizes, students listed the names of all group members on the first page of the question set. Consent forms and quiz question responses were collected for each individual. I collected all materials at the end of the week of tutorial sections for coding and analysis.

3.5 Procedure

Each section was given time at the outset of the tutorial for Teaching Assistants to give instructions, deal with logistics (such as handing out tutorial materials), and allow students time to read through the consent form. During this time, students were asked to arrange themselves in pairs or, in the case of uneven numbers, groups of three. Teaching Assistants were instructed to allow students to work individually if they did not wish to participate in groups. As such, groups ranged from one to three students (Table 3.6).
Having students work in pairs provided a more authentic tutorial environment, as students were permitted to work in groups in all other tutorials, while keeping groups small simplified coding for the purposes of later analysis and interpretation.

Table 3.6: Distribution of group size, by condition

  Group Size               | Control | Problem-Solving | Self-Regulation | Total Number of Groups
  1                        | 1       | 4               | 1               | 6
  2                        | 27      | 50              | 49              | 126
  3                        | 4       | 3               | 7               | 14
  Total Number of Students | 67      | 113             | 120             | 300 (in 146 groups)

Teaching Assistants were given approximate timelines to follow for the 110-minute tutorial (Table 3.7). Teaching Assistants used approximately 20 minutes at the outset of the tutorial to address questions outside the scope of the research study and to provide the instructions described above. Following this, students were given 55 minutes (excluding the time for instructions) to work freely on the tutorial question set. There were no requirements for students to work through questions in order; students could complete the question set as they saw fit.

Fifteen minutes prior to the quiz, Teaching Assistants indicated to students that they might want to move on to Question 5, as it would provide them with practice for the quiz. Question 5 was similar to the quiz in that both were unprompted in all conditions. It should be noted that students were not required to move on to Question 5, but all Teaching Assistants recommended it. The final twenty minutes of the tutorial were dedicated to the quiz, to ensure students had sufficient time to complete the question.

Table 3.7: Timeline for tutorial provided to Teaching Assistants

  Amount of time allocated (minutes) | Activity
  20                                 | Questions from previous tutorial work; Teaching Assistants provide instructions about study and consent
  55                                 | Free time for students to work on the question set
  15                                 | Students are recommended to spend this time working on Question 5
  20                                 | Students complete Quiz Question (for grade)

All students submitted their consent forms at the end of the tutorial regardless of whether consent was given. This was done to provide anonymity to students who chose not to consent and to reduce any perceived social pressure to participate in the study. All tutorial question sets and quizzes were also submitted at the end of the 110-minute tutorial.

3.6 Data coding

Data were coded in two phases. The preliminary coding provided a baseline for future analysis, scoring student responses on three facets of student engagement. The secondary coding supplemented these codes, creating a fuller picture of how an individual student approached the question set as well as how all students approached a single question. All data were coded using Microsoft Excel 2011.

3.6.1 Preliminary coding: Completeness, Correctness, and Explanation scores

Student engagement on the fourteen question parts was rated on three scales, which were identified to address multiple facets of student engagement in problem-solving. These three scales were Completeness, Correctness, and Explanation (summarized in Table 3.8). Multiple scales were included to explore the quality of student responses on multiple dimensions.

Completeness (scored 0-3): This scale was used to describe how much of the question students answered, independent of whether that answer was correct. It provided a measure of quantity for question responses.
Correctness (scored 0-2): This scale was used to describe the factual accuracy of the student answer, independent of the quality of the explanation of how the student arrived at that answer. The intention of this scale was to interpret the quality of the factual component of student responses.

Explanation (scored 0-2): This scale was used to describe the depth of explanation provided in light of student answers. The intention of this scale was to interpret the level of analysis and sense-making a student articulated while problem-solving.

Table 3.8: Definition of codes for Completeness, Correctness, and Explanation scales

  Scale        | Score | Description
  Completeness | 0     | Not attempted at all
               | 1     | Attempted at a basic level; not possible for raters to score correctness
               | 2     | Attempted; possible to score correctness
               | 3     | Completed question
  Correctness  | 0     | Incorrect
               | 1     | Partially correct
               | 2     | Correct
  Explanation  | 0     | No explanation; student response did not contribute any explanation beyond the correctness of the answer
               | 1     | Surface explanation; student response repeated or restated previously given information but did not provide any novel interpretation of the answer
               | 2     | Deep explanation; student interpreted the answer beyond restating previously given information

Remember that students completed question set responses in groups, whereas quiz questions were completed individually. In order to code responses, I needed to address this incongruence in the unit of analysis between the question set and the quiz question. As such, all question sets were coded and these scores were duplicated for each member of the group, allowing analysis at the level of the individual. Quiz questions were also coded at the level of the individual.

Each question part was coded on the three scales using a coding manual [Appendix F]. This resulted in fourteen scores for each scale (ten question parts and four quiz parts). An average score was also calculated for each question (including the quiz) on each of the three scales, allowing for comparisons between questions with different numbers of parts.

There were two cases of scoring that deviated from these basic coding rules. First, all prompts were coded in addition to the questions. As prompts were embedded in the questions, it was not necessary for students to complete them correctly to successfully answer the main question. As a result, prompts were not coded on the Completeness or Correctness scales; they were coded only on the Explanation scale, because student responses to the prompts added to the depth of their explanation. This resulted in multiple Explanation scores for the subset of questions with prompts. To address this, and to give students the most credit possible for their explanation quality, I used the highest Explanation score for each question in the analysis. Second, the fourth part of the quiz question was a bonus question that asked students to expand on their answer to the third part. These two parts had a high level of overlap and required students to make sense of the same data and explain themselves in a similar manner. As such, students' maximum scores (on all three scales) across the two question parts were used in all analyses.
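These special-case rules can be made concrete in a few lines. The following is a minimal Python sketch of the scoring rules just described; the class and function names are illustrative only (the actual coding was carried out in Microsoft Excel):

from dataclasses import dataclass
from statistics import mean

@dataclass
class PartScores:
    completeness: int  # 0-3, per Table 3.8
    correctness: int   # 0-2
    explanation: int   # 0-2

def question_explanation(part_explanation: int, prompt_explanations: list[int]) -> int:
    # Prompts were coded on the Explanation scale only; the highest
    # Explanation score for a question was used, giving students the most
    # credit possible for explanation quality.
    return max([part_explanation, *prompt_explanations])

def merge_quiz_part_c(part_c: PartScores, bonus: PartScores) -> PartScores:
    # Quiz part C and the bonus overlapped heavily, so the maximum score
    # on each scale across the two parts was used.
    return PartScores(
        completeness=max(part_c.completeness, bonus.completeness),
        correctness=max(part_c.correctness, bonus.correctness),
        explanation=max(part_c.explanation, bonus.explanation),
    )

def question_averages(parts: list[PartScores]) -> tuple[float, float, float]:
    # Per-question averages enable comparison between questions with
    # different numbers of parts.
    return (
        mean(p.completeness for p in parts),
        mean(p.correctness for p in parts),
        mean(p.explanation for p in parts),
    )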
3.6.2 Secondary coding: Engagement Profiles and Engagement Patterns

The preliminary coding provided a baseline for analysis, telling the story of what students did. What these codes did not provide was a narrative of how students engaged in problem-solving during the tutorial. In order to provide a comprehensive picture of the approaches students took, a second level of coding was designed.

Two aspects of student engagement were characterized in this level of coding. The first, Engagement Profiles, provided information about how an individual student approached the question set as a whole. The second, Engagement Patterns, provided a snapshot of how all students approached a single question. Together, these give a fuller, more complete picture of student engagement during problem-solving.

Engagement Profiles

Engagement Profiles were used to categorize the different approaches an individual student might use when solving problems in the overall question set (i.e., across Questions 1-5 and the quiz). Engagement Profiles were based on student Completeness scores, converted into a binary score (0 and 1 became 0; 2 and 3 became 1). Prior to analysis, the research team identified four major types of Engagement Profiles students would likely exhibit (Figure 3.7).

1. Run-Out-of-Time: When a student works ordinally through questions. This student does not skip any part of a question and continues working through questions until they run out of time. For example, a student might complete Questions 1, 2, and 3, but not attempt the remainder of the questions.

2. Pick-Up: When a student works ordinally through the questions up until a point, then skips over all remaining questions to answer the final question. For example, a student might complete Questions 1 and 2, then skip to Question 5.

3. Sampler: When a student attempts a minimum of two questions throughout the question set, working approximately in order. This student completes a portion of multiple questions but does not complete any question. For example, a student might attempt 1a and 1b, but not attempt 1c.

4. Other: This includes all students who did not fit into the other three profiles.

I identified these four Engagement Profiles based on a number of assumptions about possible student approaches. First, we predicted many students would work ordinally, attempting to answer each problem fully before moving on. Second, we predicted some students would work ordinally but at some point realize how much time remained and move on to the final question to prepare for the quiz. Third, we predicted some students would use a portion of a question as a test of their knowledge; for example, once they felt they either understood or did not understand the question, they would skip the remaining question parts and start a new question.

The intention was to code all data based on these four Engagement Profiles and then to explore the patterns of the students left in the Other profile. The Engagement Profiles would then be modified to account for as much data as possible.

Figure 3.7: Engagement Profiles diagrams used in original coding; top left: Run-out-of-Time, top right: Pick-Up, bottom left: Sampler, bottom right: Other
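Because the profiles are defined over binary Completeness scores, the categorization can be expressed as a simple rule-based classifier. The Python sketch below is illustrative only (the study's coding was done by hand); the Sampler rule, which requires question-level grouping, is simplified to a catch-all:

def completeness_to_binary(score: int) -> bool:
    # Completeness 0 or 1 -> not attempted; 2 or 3 -> attempted/completed.
    return score >= 2

def classify_profile(parts: list[bool]) -> str:
    """Assign one of the four original Engagement Profiles to an ordered
    vector of binary completion scores (one entry per question part)."""
    if not any(parts):
        return "Other"
    first_gap = parts.index(False) if False in parts else len(parts)
    tail = parts[first_gap:]
    if not any(tail):
        # Worked in order until stopping: Run-Out-of-Time. (A fully
        # complete set also lands here; the revised scheme in the Results
        # chapter gives it its own Complete profile.)
        return "Run-Out-of-Time"
    if parts[-1] and not any(tail[:-1]):
        # Worked in order, then skipped ahead to the final question: Pick-Up.
        return "Pick-Up"
    # Remaining scattered attempts; the Sampler rule (at least two questions
    # attempted, approximately in order, none finished) needs question-level
    # grouping, so this simplified sketch lumps the remainder into Other.
    return "Other"

# Example: completed Questions 1-2, then jumped to the final question.
# classify_profile([True, True, True, True, False, False, False, False, False, True])
# -> "Pick-Up"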
Engagement Patterns

Engagement Patterns were used to create a snapshot of how all students, regardless of condition, approached a single question. A similar approach to the Engagement Profiles was used to code Engagement Patterns, only in this case all three scales, Completeness, Correctness, and Explanation, were used to provide a fuller picture.

All scales were converted to binary scales to allow for categorization (see Table 3.9). These binary codes were used to develop Engagement Patterns, describing two overarching qualities of student responses to a given question: was it complete/correct, and was the explanation of a deep quality? Table 3.10 summarizes the possible Engagement Patterns.

Table 3.9: Scale conversions used to create binary codes for Engagement Patterns

  Scale        | Binary 0 | Binary 1
  Completeness | 0, 1     | 2, 3
  Correctness  | 0, 1     | 2
  Explanation  | 0, 1     | 2

Table 3.10: Summary of Engagement Pattern categories

  Engagement Pattern | Completeness/Correctness | Explanation | Description
  A                  | 1                        | 1           | Correct, with explanation
  B                  | 1                        | 0           | Correct, no explanation
  C                  | 0                        | 1           | Incorrect, with explanation
  D                  | 0                        | 0           | Incorrect, no explanation

3.6.3 Inter-rater reliability

The researcher and a former course instructor developed the coding manual and discussed acceptable answers prior to embarking on the inter-rater reliability check [Appendix F]. We identified the mandatory components of student responses required to achieve each level of the three codes, based on the definitions of each code, and documented these baselines in the manual prior to rating.

We independently rated more than 10% of all tutorial question sets to ensure consistency and agreement with the coding manual. Where the coding manual was not clear, the researcher and former course instructor discussed possible interpretations, came to a conclusion on the coding, and updated the coding manual for future use. Following the independent coding, we discussed all discrepancies between ratings and possible interpretations of student responses. All discrepancies were resolved and the coding manual was updated.

We rated 32 student question sets, randomly sampling from all three conditions to account for differences in group sizes (Control = 9, Problem-Solving = 12, Self-Regulation = 11). We achieved 91.47% overall agreement on these question sets, with agreement in individual conditions ranging from 89-94%. Given this level of agreement, the researcher coded all remaining question sets independently.
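The binary conversion in Table 3.9, the pattern assignment in Table 3.10, and the simple percent-agreement measure used above can be sketched as follows; again, this is illustrative Python, not the study's actual tooling:

def engagement_pattern(completeness: int, correctness: int, explanation: int) -> str:
    # Table 3.9: Completeness binarizes at 2-3, Correctness at 2, and
    # Explanation at 2 (deep explanation). Unattempted parts default to
    # category D, a limitation noted later in the Discussion.
    if completeness < 2:
        return "D"  # Incorrect, no explanation
    correct = correctness == 2
    deep = explanation == 2
    if correct:
        return "A" if deep else "B"  # Correct, with/without explanation
    return "C" if deep else "D"      # Incorrect, with/without explanation

def percent_agreement(rater_a: list[str], rater_b: list[str]) -> float:
    # Overall percent agreement between two raters' codes for the same items.
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)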
3.7 Analysis

First, Midterm 1 grades were used as a pre-test measure of students' prior knowledge. These grades are reported in decimal form (i.e. .75 corresponds to 75% on the midterm). An analysis of variance (ANOVA) was conducted to determine whether differences in prior knowledge existed among the three conditions. Next, a multivariate analysis of covariance (MANCOVA) was conducted to explore whether there were significant differences among conditions on question Completeness in the question set. Following this, the Engagement Profiles and Engagement Patterns were analyzed. Finally, the two post-intervention unprompted questions (Question 5 and the Quiz Question) were analyzed to explore the effect of condition on subsequent student engagement and learning.

3.7.1 Engagement Profiles and Engagement Patterns

Engagement Profiles and Engagement Patterns were coded to provide a narrative of engagement for a single student across the entire question set and for a single question across all conditions, respectively. Chi-square tests of independence were performed on the Engagement Profiles and the Engagement Patterns to explore differences between conditions; this test was chosen because both measures are nominal data.

3.7.2 Effect of condition on engagement and learning

Remember that the intervention took place in Questions 2, 3, and 4, with students in the Problem-Solving and Self-Regulated Learning conditions receiving prompts in these questions and the Control condition receiving no prompts. Question 5 and the Quiz Question – the latter given in the same tutorial session, in a set amount of time and separate from the question set – were both unprompted in all conditions. These two questions were analyzed as post-intervention measures of student learning and engagement. A MANCOVA was performed on Question 5 and the Quiz Question, with Condition as the independent variable and the three engagement scales (Completeness, Correctness, and Explanation scores) as the dependent variables. Students' Midterm 1 grades were used as a covariate. Interaction and main effects were explored. Since the quiz question contained multiple parts, average Completeness, Correctness, and Explanation scores were calculated for analysis.
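The analyses described in Sections 3.7.1 and 3.7.2 could be reproduced along the following lines with scipy and statsmodels; the data frame layout and the synthetic values below are illustrative placeholders, not the study's data:

import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
from statsmodels.multivariate.manova import MANOVA

# Chi-square test of independence on a condition-by-profile contingency
# table (the counts shown here are the Engagement Profile distributions
# reported later in Table 4.4; columns: Complete, Run-out-of-Time,
# Pick-Up, Sampler).
observed = np.array([
    [23, 32,  4,  8],   # Control
    [60, 34,  9, 10],   # Problem-Solving
    [12,  4, 95,  9],   # Self-Regulated Learning
])
chi2, p, dof, _ = chi2_contingency(observed)
print(f"chi2({dof}, N={observed.sum()}) = {chi2:.2f}, p = {p:.3g}")

# MANCOVA: the three engagement scales as dependent variables, condition
# as the independent variable, Midterm 1 grade as covariate. The data
# frame below is synthetic, for illustration only.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "condition": rng.choice(["Control", "PS", "SRL"], size=n),
    "midterm1": rng.uniform(0.4, 1.0, size=n),
    "completeness": rng.uniform(0.0, 3.0, size=n),
    "correctness": rng.uniform(0.0, 2.0, size=n),
    "explanation": rng.uniform(0.0, 2.0, size=n),
})
mv = MANOVA.from_formula(
    "completeness + correctness + explanation ~ C(condition) + midterm1",
    data=df,
)
print(mv.mv_test())  # reports Wilks' lambda, F, and p for each effect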
In this chapter I described the research design, course context, and materials, as well as the methods of coding and analysis. In the next chapter I will present the results of this coding and analysis: the Engagement Profiles and Engagement Patterns, the associated statistical analyses, and the statistical results of the effect of condition on students' responses to the later unprompted questions (Question 5 and the Quiz Question).

4 Results

4.1 Descriptive statistics

4.1.1 Midterm 1 grades

Student grades on Midterm 1, which took place prior to the intervention, were used as a pre-test measure of students' prior knowledge. A one-way ANOVA revealed no significant difference between the mean Midterm 1 grades of students in the three conditions (Control: M = .74, SD = .13; Problem-Solving: M = .74, SD = .17; Self-Regulated Learning: M = .74, SD = .16), F(2, 298) = .016, p > .98 [Appendix H].

4.1.2 Question Completeness

The Completeness scale was developed to give a rough indicator of students' engagement in the question set. An average Completeness score was created for each question and used to explore differences in overall engagement with the question set between the three conditions.

Table 4.1 summarizes the total number of students who completed each question in each condition. In the Control condition, students' Completeness scores decreased consistently with each question; the only point where more students completed a question than the previous one was Question 2a, and this was a small increase. In the Problem-Solving condition, students' Completeness scores tended to peak during the first part of each question and decrease over subsequent parts, with diminishing peaks for each question. In the Self-Regulated Learning condition, students' Completeness scores decreased consistently and quickly, with a single large increase for Question 5. The Quiz Question had high Completeness scores across all conditions, with all conditions having between 94% and 100% on all question parts.

Completeness scores were also used to test whether significant differences existed between conditions for a given question. Midterm 1 grades were used as a covariate in this analysis to control for prior knowledge. Interaction effects and main effects were explored, but no interaction effects were found; as such, only main effects are reported here.

MANCOVA analysis of question Completeness [Footnote 1] demonstrated that significant differences existed between conditions overall (λ = .849, F(10, 580) = 4.938, p < .01, η² = .078), summarized in Table 4.2, with complete tables available in Appendix G. This represented a small effect size. A test of between-subjects effects demonstrated significant differences between conditions on all questions except Question 4. It is not possible to determine the nuances of these differences between conditions, since post-hoc tests are not applicable when conducting a MANCOVA.

[Footnote 1: This analysis was also conducted at the level of question part, confirming that the same overall trends observed for question averages also existed for each question part. For convenience, only question-average results are included here.]

Significant differences in question Completeness were also found for the covariate, Midterm 1 (λ = .922, F(5, 290) = 4.925, p < .01, η² = .078), again a small effect size. A test of between-subjects effects demonstrated significant differences in question completion for all questions except Question 1. This demonstrates that prior knowledge did not impact whether students completed the first question of the tutorial, while completion of subsequent questions was related to prior knowledge.

This analysis demonstrates that students in the three conditions completed different questions during the tutorial, and that students with varying levels of prior knowledge completed different questions.

Table 4.1: Count of students who completed each question part; binary (0/1 = not attempted; 2/3 = attempted/completed)

  Question            | Control | % of Condition | Problem-Solving | % of Condition | Self-Regulation | % of Condition | Total | Total %
  1a                  | 65      | 97.0           | 113             | 100.0          | 120             | 100.0          | 298   | 99.3
  1b                  | 63      | 94.0           | 109             | 96.5           | 100             | 83.3           | 272   | 90.7
  1c                  | 62      | 92.5           | 109             | 96.5           | 105             | 87.5           | 276   | 92.0
  2a                  | 64      | 95.5           | 113             | 100.0          | 88              | 73.3           | 265   | 88.3
  2b                  | 55      | 82.1           | 101             | 89.4           | 66              | 55.0           | 222   | 74.0
  2c                  | 51      | 76.1           | 99              | 87.6           | 56              | 46.7           | 206   | 68.7
  3a                  | 50      | 74.6           | 103             | 91.2           | 51              | 42.5           | 204   | 68.0
  3b                  | 38      | 56.7           | 76              | 67.3           | 33              | 27.5           | 147   | 49.0
  4                   | 35      | 52.2           | 47              | 41.6           | 10              | 8.3            | 92    | 30.7
  5                   | 29      | 43.3           | 65              | 57.5           | 112             | 93.3           | 206   | 68.7
  Quiz - Part A       | 63      | 94.0           | 113             | 100.0          | 118             | 98.3           | 294   | 98.0
  Quiz - Part B       | 65      | 97.0           | 112             | 99.1           | 119             | 99.2           | 296   | 98.7
  Quiz - Part C/Bonus | 66      | 98.5           | 112             | 99.1           | 117             | 97.5           | 295   | 98.3

Table 4.2: MANCOVA analysis for question Completeness (average Completeness, Questions 1-5)

  Descriptive statistics: mean (SD) average Completeness by condition
  Question | Control (n = 67) | Problem-Solving (n = 113) | Self-Regulated Learning (n = 120) | Total (n = 300)
  1        | 2.73 (.56)       | 2.84 (.39)                | 2.61 (.67)                        | 2.72 (.56)
  2        | 2.40 (.93)       | 2.69 (.63)                | 1.65 (1.21)                       | 2.21 (1.07)
  3        | 1.90 (1.27)      | 2.23 (.94)                | 0.95 (1.20)                       | 1.65 (1.26)
  4        | 1.57 (1.43)      | 1.45 (1.23)               | 0.28 (.76)                        | 1.01 (1.26)
  5        | 1.06 (1.27)      | 1.54 (1.25)               | 2.48 (.72)                        | 1.81 (1.22)

  Multivariate tests
  Source    | df        | F     | Sig. | Wilks' λ | Partial η²
  Condition | (10, 580) | 4.938 | .000 | .849     | .078
  Midterm 1 | (5, 290)  | 4.925 | .000 | .922     | .078

4.2 Engagement Profiles

The purpose of creating Engagement Profiles was to provide insight into how students across all tutorial sections approached the question set overall. Originally, four Engagement Profiles were predicted to account for the majority of student responses. Students' responses were first categorized into these four Engagement Profiles to test the initial categorization scheme. Following this preliminary coding, the Other Engagement Profile was explored in more depth to identify any remaining patterns that characterized student approaches. Table 4.3 summarizes the distribution of students across the four original Engagement Profiles.
Table 4.3: Engagement Profile counts, following preliminary coding

  Engagement Profile | Count of Students | Percentage of Students
  Run-out-of-Time    | 143               | 47.66%
  Pick-Up            | 88                | 29.33%
  Sampler            | 0                 | 0%
  Other              | 69                | 23%
  Total              | 300               | 100%

Upon further exploration of the Other Engagement Profile, I identified that a large portion of these students had completed the entire question set and that no existing Engagement Profile accounted for them. In addition, no students met the criteria of the Sampler Engagement Profile.

Based on these results, two modifications were made to the Engagement Profile categories. First, a Complete Engagement Profile was created. Second, I noted that all of the remaining students in the Other Engagement Profile tended to follow the principle of the Sampler Engagement Profile, with one difference from the original definition: these students completed at least one question in full in addition to meeting the requirements of the Sampler Engagement Profile. The definition of the Sampler was therefore updated to reflect this pattern, and the Other category was removed. The definitions of the Engagement Profiles were modified to the following, with updated Engagement Profile images (Figure 4.1).

1. Complete: When a student completes the entire question set, answering all parts of each question.

2. Run-Out-of-Time: When a student works ordinally through questions. This student does not skip any part of a question and continues working through questions until they run out of time. For example, a student might complete Questions 1, 2, and 3, but not attempt the remainder of the questions.

3. Pick-Up: When a student works ordinally through the questions up until a point, then skips over all remaining questions to answer the final question. For example, a student might complete Questions 1 and 2, then skip to Question 5.

4. Sampler: When a student attempts a minimum of two questions throughout the question set, working approximately in order. This student can complete as many or as few parts of a question as they want, with no requirement on question completeness. This student answers a subset of questions without completing the question set. For example, a student might attempt 1a and 1b, but not attempt 1c.

Figure 4.1: Engagement Profiles diagrams after coding; top left: Complete, top right: Run-out-of-Time, bottom left: Pick-Up, bottom right: Sampler

Both the Control and Problem-Solving conditions had approximately 80% of students categorized in the Complete or Run-out-of-Time Engagement Profiles. In contrast, approximately 80% of the Self-Regulated Learning condition was categorized in the Pick-Up Engagement Profile. Table 4.4 summarizes the student distributions within these modified Engagement Profiles, by condition. These results show that students in the Control and Problem-Solving conditions approached the question set in similar manners, although more students in the Problem-Solving condition completed the entire question set (34.33% in the Control condition versus 53.10% in the Problem-Solving condition). The distribution of Engagement Profiles also shows that students in the Self-Regulated Learning condition approached the question set differently than students in the other two conditions. A chi-square test of independence confirmed that student Engagement Profiles differed significantly by condition, χ²(6, N = 300) = 185.59, p < .01.
Table 4.4: Engagement Profile distributions, by condition (count and percentage of condition)

  Engagement Profile | Control      | Problem-Solving | Self-Regulated Learning | Profile Total (% of total population)
  Complete           | 23 (34.33%)  | 60 (53.10%)     | 12 (10.00%)             | 95 (31.67%)
  Run-out-of-Time    | 32 (47.76%)  | 34 (30.09%)     | 4 (3.33%)               | 70 (23.33%)
  Pick-Up            | 4 (5.97%)    | 9 (7.96%)       | 95 (79.17%)             | 108 (36.00%)
  Sampler            | 8 (11.94%)   | 10 (8.85%)      | 9 (7.50%)               | 27 (9.00%)
  Grand Total        | 67 (100.00%) | 113 (100.00%)   | 120 (100.00%)           | 300 (100.00%)

4.3 Engagement Patterns

The purpose of the Engagement Patterns was to use the three scales – Completeness, Correctness, and Explanation – to create a single descriptor of how students approached each question. This is in contrast to Engagement Profiles, which describe how each student approached the entire question set. The Engagement Patterns provided a single measure of the quality of student responses for each question, describing whether a response was A) correct with a deep explanation, B) correct with no explanation, C) incorrect with a deep explanation, or D) incorrect with no explanation. Figure 4.2 provides a summary of the four categories and their shortened descriptors. It should be noted that the first two Engagement Patterns require students to have completed the question, while the final two Engagement Patterns include students who did not complete the question or who completed it incorrectly.

Figure 4.2: Summary of Engagement Patterns (Correct/Explanation; Correct/No Explanation; Incorrect/Explanation; Incorrect/No Explanation)

An Engagement Pattern was created for each question part in the question set and quiz question. Table 4.5 describes the Engagement Pattern trends for the three conditions. The raw count of students in each category (and the percentage of students in each condition, to provide a more accurate comparison given the differing group sizes) is available in Appendix I. I begin here by describing the overall Engagement Pattern trends for each condition. Following this, I explore whether the differences in Engagement Pattern categories were significant among the three conditions.

Table 4.5: Summary of Engagement Pattern trends

  Questions | Condition  | Trend
  1a-2a     | All        | Similar trends across conditions: correct with varying degrees of explanation quality
  2b-3b     | Control/PS | Correct with explanation
            | SRL        | Correct with explanation OR incorrect with no explanation (incorrect more common)
  4         | Control/PS | Correct or incorrect, but no explanation
            | SRL        | Incorrect, no explanation
  5         | Control/PS | Incorrect, no explanation
            | SRL        | Correct with explanation

Students in the Control condition began the question set with responses that were correct, with varying degrees of explanation quality. As they proceeded through the question set, their responses remained correct but tended to lack explanation. Near the end of the question set, students' responses shifted to incomplete or incorrect answers with varying degrees of explanation quality.

Students in the Problem-Solving condition began the question set with responses that were correct, with varying degrees of explanation quality. In contrast to the Control condition, students in the Problem-Solving condition maintained this trend throughout most of the question set.
In Questions 4 and 5, students in the Problem-Solving condition began to differentiate, and the largest portion of these students stopped providing explanations for their answers while showing varying degrees of correctness (Correct/No Explanation or Incorrect/No Explanation).

Students in the Self-Regulated Learning condition began the question set with responses that were correct, with varying degrees of explanation quality. Beginning in Question 2, this trend changed, and most students in the Self-Regulated Learning condition either had correct answers with deep explanations or incorrect answers with no explanation (Correct/Explanation or Incorrect/No Explanation). In Question 4, almost all of these students did not complete the question (remember that non-completion is categorized as Incorrect/No Explanation). Finally, in Question 5 these students had responses that were correct with varying degrees of explanation quality.

Finally, students in all conditions had similar Engagement Pattern trends on the Quiz Question, with a distribution of students across each of the categories. The exception here was that few students were categorized as incorrect with a deep explanation (Incorrect/Explanation).

A chi-square test of independence was conducted for each Engagement Pattern to explore whether there were significant differences in response quality among conditions. These results are summarized in Table 4.6. Although it does not describe the direction of the relationship, the chi-square tests of independence demonstrated that significant differences existed between conditions on all questions except Question 1a and Quiz Question Parts B and C/Bonus.

It should be noted that the categories Incorrect/Explanation and Incorrect/No Explanation, which represent students who had incorrect answers with varying explanation quality, were combined for analysis. I selected these two Engagement Patterns to combine because, if an answer was incorrect, the explanation quality was a less meaningful distinction: as coders, it was difficult to differentiate between a deep and a surface explanation when the answer was incorrect due to faulty logic and incorrect assumptions. In addition, had they been left as two categories, the low numbers of students in each of these two Engagement Patterns would have violated the assumptions of the chi-square test.

Table 4.6: Engagement Pattern chi-square test of independence results

  Question            | Population Size | Degrees of Freedom | Chi-square statistic | Significance
  1a                  | 300             | 4                  | 8.70                 | .069
  1b                  | 300             | 1                  | 6.35                 | .011
  1c                  | 300             | 4                  | 11.71                | .020
  2a                  | 300             | 4                  | 47.65                | < .01
  2b                  | 300             | 4                  | 44.45                | < .01
  2c                  | 300             | 4                  | 45.72                | < .01
  3a                  | 300             | 4                  | 80.62                | < .01
  3b                  | 300             | 4                  | 48.55                | < .01
  4                   | 300             | 4                  | 55.81                | < .01
  5                   | 300             | 4                  | 43.70                | < .01
  Quiz - Part A       | 300             | 4                  | 13.82                | < .01
  Quiz - Part B       | 300             | 4                  | 3.84                 | .428
  Quiz - Part C/Bonus | 300             | 4                  | 6.12                 | .191

4.4 Effect of condition on Completeness, Correctness, and Explanation scales

MANCOVA analysis [Footnote 2] was used to explore differences between conditions on the three scales measuring student engagement and learning, controlling for prior knowledge (as indicated by Midterm 1 grades). Interaction effects and main effects were explored, but no interaction effects were found for either question; as such, only main effects are reported here. Question 5 and the Quiz Question were both intended to be analyzed to explore the effect of the intervention on later, unprompted problem-solving (full results available in Appendix J).
[Footnote 2: Linear regression analysis was also conducted, due to the categorical nature of the scales. Identical trends were found, and therefore results from the regression analysis are not included in this manuscript.]

Originally, I intended to conduct multivariate analysis on both Question 5 and the Quiz Question, to have two post-intervention measures of students' engagement and learning. However, due to the extremely low completion rates on Question 5 (Table 4.1), it was not appropriate to conduct this analysis. As a result, the Quiz Question serves as the only post-intervention measure.

MANCOVA analysis of the Quiz Question demonstrated that no significant differences between conditions existed overall (λ = .975, F(6, 580) = 1.246, p = .281, η² = .013), summarized in Table 4.7, with full results available in Appendix J. Since there were no significant differences between the three conditions in the overall model, no further analysis of condition effects was conducted. In contrast, there was a significant relationship between Midterm 1 grades and Quiz Question scores overall (λ = .929, F(3, 290) = 7.339, p < .01, η² = .071). This represented a small effect size. A test of between-subjects effects demonstrated significant differences on the Correctness and Explanation scales, showing that students with higher prior knowledge had higher Correctness and Explanation scores than students with less prior knowledge. A Pearson correlation confirmed this relationship.

Table 4.7: MANCOVA results, Quiz Question

  Descriptive statistics: mean (SD) average quiz scores by condition
  Scale        | Control (n = 66) | Problem-Solving (n = 113) | Self-Regulated Learning (n = 119) | Total (n = 298)
  Completeness | 2.92 (.24)       | 2.96 (.18)                | 2.97 (.15)                        | 2.96 (.18)
  Correctness  | 1.54 (.41)       | 1.39 (.45)                | 1.40 (.48)                        | 1.43 (.46)
  Explanation  | 1.25 (.53)       | 1.03 (.50)                | 1.15 (.49)                        | 1.13 (.51)

  Multivariate tests (dependent variables: Completeness, Correctness, Explanation)
  Source    | df       | F     | Sig. | Wilks' λ | Partial η²
  Condition | (6, 580) | 1.246 | .281 | .975     | .013
  Midterm 1 | (3, 290) | 7.339 | .000 | .929     | .071

In this chapter I described the results of the Engagement Profiles and Engagement Patterns, the associated statistical analyses, and the statistical results of the effect of condition on students' responses to the later unprompted questions (Question 5 and the Quiz Question). In the next chapter I will interpret these results and discuss the impact of these findings, as well as the significance of the study and some of its limitations.

5 Discussion

5.1 Main findings

The primary goal of this research was to understand how different types of prompts might impact student engagement and learning while problem-solving. To do so, it was necessary to develop a method for categorizing engagement; this secondary goal was achieved by developing Engagement Profiles and Engagement Patterns. These two measures became integral in creating a fuller picture of how students approached the question set and quiz. As such, the Engagement Profiles and Engagement Patterns are interpreted first, to set the stage for the analysis of the three conditions using the Completeness, Correctness, and Explanation scales.

5.1.1 Engagement Profiles

The distribution of Engagement Profiles provides important information about how students in each condition engaged with the question set, allowing me to highlight differences in engagement between conditions.
It is important to remember throughout the discussion of Engagement Profile themes that the profiles were constructed solely from the Completeness scale and therefore reflect only one aspect of engagement, namely the quantity of engagement. The most important finding from the Engagement Profile data is that both the Control and Problem-Solving conditions had more than two-thirds of students categorized as Complete or Run-out-of-Time, while the Self-Regulated Learning condition had more than two-thirds of students categorized as Pick-Up. This demonstrates that, at a macro level, students in the Self-Regulated Learning condition approached the question set differently than the other two conditions.

It also raises the question of why students in the Self-Regulated Learning condition took such a different approach. Exploring students' Completeness scores more deeply, we can look to major points of change in the data – both where students stopped completing questions and where they started completing them again – to make sense of this difference between the Self-Regulated Learning condition and the other two conditions. In particular, there are two points of significant change in completeness that tell an interesting story about the approach students in the Self-Regulated Learning condition took to problem-solving.

Recall that in this question set students tackled five questions. The first question was unprompted in all conditions. It was only at Question 2 that the different prompts were introduced and the physical handout the question set was printed on appeared different to students in the three conditions. In Question 2, students in the Control condition received no prompts, students in the Problem-Solving condition received a single prompt, and students in the Self-Regulated Learning condition received two prompts (the first being the same as in the Problem-Solving condition). Additionally, at this point in the question set students in the Self-Regulated Learning condition saw the first of three final reflection prompts placed at the end of Questions 2, 3, and 4. Recall that these prompts were slightly removed from the context of the question and asked students to extrapolate general problem-solving strategies for future use. As the other, context-dependent prompts students received in the Self-Regulated Learning condition appeared almost identical to those in the Problem-Solving condition, they did not create large physical differences in the materials between the two conditions. In contrast, the final reflection prompts took up approximately half a page and were therefore the largest physical difference in materials between the two conditions.

Given that, it is interesting that the first major drop-off point for students in the Self-Regulated Learning condition was between Questions 2a and 2b, when the first of the final reflection prompts became visible to these students. At this point, 18.3% fewer students answered Question 2b than Question 2a. This implies that students in the Self-Regulated Learning condition saw what could be perceived as a large reflection question and skipped ahead in the question set until they found a question without a final reflection question. It also implies that students were not interested in engaging with written reflections, as opposed to the content-based, often numerical prompts found in the Problem-Solving condition, and that the presence of these reflections deterred them from even attempting the question.
It is not possible to know why students did not engage with these questions, but possible reasons might include that they were uninterested, did not find the reflections valuable, or were not comfortable engaging with written reflections. Although students in the Self-Regulated Learning condition did not engage with the prompts, they did self-regulate in the form of making a judgment based on their perception of the question, the task, and its value.

These results relate to the second major point of change in completeness scores. At Question 5 there was a large influx of students in the Self-Regulated Learning condition completing the question who had not completed the previous question (8.3% of students answered Question 4, while 93.3% of students answered Question 5). Question 4 contained one of the three final prompts removed from the question context, which was a major distinguishing feature between the Problem-Solving and Self-Regulated Learning conditions. In contrast, Question 5 was unprompted. The large number of students who did not complete Question 4 but completed Question 5 reinforces the explanation that students saw the final reflection questions and chose not to engage with the question at all because of the perception that they would be required to answer a large-scale reflection.

Returning to the Control and Problem-Solving conditions, which had similar distributions of Engagement Profiles, we can explore the nuances of how these students approached the question set differently. In these two conditions, more than two-thirds of students were categorized as Complete or Run-out-of-Time. These results demonstrate that students in the Control and Problem-Solving conditions approached the question set in similar fashions, working through questions in order and completing all parts of a question before moving on. The difference is that students in the Problem-Solving condition completed more questions than students in the Control condition (53.10% and 34.33%, respectively, completed the entire question set). This result was unexpected: because the Problem-Solving condition had prompts to respond to in addition to the main content-based questions, it was expected that these students would complete fewer questions than the Control condition, which had no prompts to answer. One possible explanation for this result is that the prompts in the Problem-Solving condition helped students to successfully complete questions, and to do so more efficiently than they would have without the prompts. Overall, these Engagement Profile results demonstrate that students in the Control and Problem-Solving conditions took similar approaches when engaging with the question set. The question now becomes how the differences in the number of questions completed interact with the quality of the students' responses.

5.1.2 Engagement Patterns

Engagement Patterns were intended to provide another dimension of analysis, supplementing the Engagement Profiles. Remember that Engagement Profiles described how an individual student approached the entire question set. In contrast, the Engagement Patterns described how all students approached each question part, using all three scales instead of just the Completeness scale. Using all three scales in a single measure adds value to the information gleaned from the Engagement Patterns because the scales were developed to measure multiple facets of engagement, reflecting its complexity and providing a deeper picture of student engagement. The three scales were used because we wanted to be able to distinguish quantity from quality, and to further divide quality into factual correctness and a student's ability to articulate their reasoning.

Recall that Engagement Patterns were created by collapsing students' scores on the three scales into a pair of binary judgments that reflect one of the following: Correct/Explanation, Correct/No Explanation, Incorrect/Explanation, or Incorrect/No Explanation.
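A minimal sketch of this collapse is shown below. The numeric score thresholds are hypothetical illustrations, not the study's actual coding rules; the handling of unattempted questions follows the footnote later in this section.

```python
# A minimal sketch of collapsing a response's scale scores into one of
# the four Engagement Patterns. Thresholds are hypothetical.
def engagement_pattern(attempted: bool, correctness: float,
                       explanation: float) -> str:
    if not attempted:
        # Unattempted questions were automatically categorized as
        # Incorrect/No Explanation (a limitation discussed below).
        return "Incorrect/No Explanation"
    correct = "Correct" if correctness >= 1.5 else "Incorrect"
    explained = "Explanation" if explanation >= 1.0 else "No Explanation"
    return f"{correct}/{explained}"

# Example: an attempted response scored 2 on Correctness, 0 on Explanation.
print(engagement_pattern(True, 2.0, 0.0))  # -> "Correct/No Explanation"
```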
The trends in Engagement Patterns are explored here and then interpreted through the chi-square test of independence analysis, which explores whether there are significant differences in Engagement Patterns between conditions for each question part.

For the first four parts of the question set (Questions 1a, 1b, 1c, and 2a), students' responses followed a similar trend. Most students across the three conditions tended to have correct answers, with varying degrees of explanation quality (either deep explanation or no explanation). Within these four question parts, the most common Engagement Pattern was a correct response with a deep explanation.

The trend shifts at Question 2b. This is the first question where the most common Engagement Pattern was not the same across the three conditions, and where incorrect Engagement Patterns (Incorrect/Explanation or Incorrect/No Explanation) became markedly more common. For the Control and Problem-Solving conditions, the majority of students had a correct answer with an explanation; but for the Self-Regulated Learning condition, the two Engagement Patterns that represented the majority of students were either correct with explanation or incorrect with no explanation. This is also the first question and condition where an incorrect answer was more common than a correct answer.

The dichotomy in Question 2b is likely due to the differences in Engagement Profiles. At this point, Pick-Up and Run-out-of-Time students are not all completing this question, which would automatically put them in an incorrect category3. In contrast, Complete (and potentially Run-out-of-Time) students are still completing this question, which leaves all four Engagement Patterns possible. This dichotomy of student responses falling almost evenly into complete/correct or incomplete/incorrect continues until Question 4.

3 Engagement Patterns incorporated Completeness scores, but for any question that a student did not attempt, the response was automatically categorized as Incorrect/No Explanation. This was problematic for analysis and interpretation because a student who does not attempt a question is not the same as a student who completes it but provides a low-quality answer, in terms of factual correctness and explanation quality. In the future, Engagement Patterns should solely be a measure of question quality and only rate students who attempted the question.

Student responses to Question 4 represent another shift in the trends. For Question 4, student responses in the Control condition fell into only two Engagement Patterns, with responses being either correct or incorrect with no explanation (as no students who answered this question provided an explanation).
These two Engagement Patterns also categorized the majority of students in the Problem-Solving condition, while incorrect with no explanation categorized the majority of students in the Self-Regulated Learning condition. This is the only question where the most common Engagement Pattern across all three conditions was an incorrect response with no explanation.

The final tutorial question (Question 5) most clearly demonstrates the differences between conditions, with the results relating directly to the Engagement Profiles. Students in the Control condition were categorized mostly as Complete or Run-out-of-Time. The majority of Complete students were categorized as having a correct answer with a deep explanation. All Run-out-of-Time students were categorized as having an Incorrect/No Explanation Engagement Pattern. This was expected, as a requirement of the Run-out-of-Time Engagement Profile was that students did not attempt Question 5, and, as mentioned previously, if students did not attempt a question it was categorized as Incorrect/No Explanation. Students in the Problem-Solving condition were also categorized mostly as Complete or Run-out-of-Time. For the Complete students, there was more diversity in the quality of their responses, although the majority of students had a correct answer with varying degrees of explanation quality. Similar to the Control condition, all Run-out-of-Time students were categorized as Incorrect/No Explanation. Students in the Self-Regulated Learning condition were mostly categorized as Pick-Up. This resulted in 93.3% of students in this condition answering Question 5, compared to 43.3% and 57.5% of students in the Control and Problem-Solving conditions, respectively. Almost half of all students in the Self-Regulated Learning condition answered this question correctly, with a deep explanation. This is almost twice the percentage of students categorized in this Engagement Pattern in the other two conditions. These results are likely due to students in the Self-Regulated Learning condition answering fewer questions earlier in the question set and therefore spending more time on Question 5 than students in the Control or Problem-Solving conditions. Remember that Teaching Assistants also recommended that students try Question 5 in preparation for the Quiz Question. It is reasonable to expect that students took this question seriously because they expected it to help prepare them for the quiz, and that students in the Self-Regulated Learning condition spent more time working on it than those in the other two conditions.

For Quiz Question – Part A, the most common Engagement Patterns for all three conditions were characterized by students not including an explanation in their response, whether their answer was correct or incorrect. For Quiz Question – Parts B and C/Bonus, there was a similar trend. For these questions, the most common Engagement Patterns were characterized by students either having a correct answer with an explanation or having an incorrect answer with no explanation. This trend was consistent across all three conditions.

In addition to these more qualitative trends, a chi-square test of independence was conducted to determine whether there were significant differences in Engagement Patterns between conditions. The chi-square tests of independence demonstrated that significant differences in Engagement Patterns existed between conditions on all questions except Question 1a and Quiz Question – Parts B and C/Bonus.
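A minimal sketch of such a test for a single question part follows. The counts are illustrative placeholders, not study data.

```python
# A minimal sketch of a chi-square test of independence for one
# question part. The counts are illustrative placeholders, not study
# data: rows are conditions, columns are the four Engagement Patterns
# (Correct/Explanation, Correct/No Explanation, Incorrect/Explanation,
# Incorrect/No Explanation).
from scipy.stats import chi2_contingency

observed = [
    [25, 15, 10, 17],  # Control
    [40, 25, 15, 33],  # Problem-Solving
    [30, 10, 15, 65],  # Self-Regulated Learning
]

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
# A small p-value indicates that Engagement Pattern frequencies are not
# independent of condition for this question part.
```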
Since the chi-square test of independence does not describe the direction of the relationship, the most important conclusion we can draw from this is that students in all conditions answered the first question and the (majority of the) Quiz Question with the same overall quality. This means it is unlikely that we would see an effect of condition on the Quiz Question, as Engagement Patterns take into account all three measures of engagement and learning used for assessment. In contrast to these two questions, there were significant differences between conditions in the quality of students' responses on the remainder of the questions. The question then becomes: what factors influenced these differences in quality? As these questions were analyzed directly, we can use the findings from the analysis of the Quiz Question to explore a possible hypothesis, which is that prior knowledge influenced the quality of responses.

5.1.3 Effect of condition on Completeness, Correctness, and Explanation scales

The main purpose of this research study was to explore whether different types of prompts would impact students' responses during problem-solving, using Question 5 and the Quiz Question as post-intervention measures. As mentioned previously, Question 5 was designed to be a post-intervention measure, but due to extremely low completion rates by students, analysis of the question was not possible. This was not necessarily a negative, as there were other factors that made Question 5 a less-than-perfect measure. For example, students had varying amounts of time to complete this question depending on how they approached the question set. By looking at the Completeness scores across the three conditions and extracting information from the Engagement Profiles data, we can demonstrate this. Students in the Self-Regulated Learning condition were primarily categorized as Pick-Up students and completed fewer questions in the question set than the other two conditions. In addition, a higher proportion of these students completed Question 5 than in the other two conditions. It is likely, then, that these students spent more time working on this question than students in the other conditions. As a result of these less-than-ideal circumstances, I was comfortable not using Question 5 as a post-intervention measure.

Regardless of the reasons it was not included for analysis, this meant that the Quiz Question became the single measure of student engagement and learning. Multivariate (MANCOVA) analysis on the Quiz Question demonstrated that no significant differences between conditions existed overall, so the effect of condition could not be analyzed further. In contrast, significant differences were found for prior knowledge (as indicated by Midterm 1 grade) on the Correctness and Explanation scales. This demonstrates that students with higher prior knowledge had higher Correctness and Explanation scores than students with less prior knowledge.

These results indicate that the intervention did not have an impact on the student engagement and learning measures during the Quiz Question. Instead, prior knowledge was the more important influence on students' Correctness scores and Explanation depth. Perhaps more important, though, is the relationship between these findings and students' Engagement Profiles. Although the intervention does not appear to have influenced later work, it did significantly impact student behaviour during the question set.
Students in the three conditions approached the question set in unique ways as a result of the differences in materials. This demonstrates that student engagement during the process of problem-solving is malleable, and that there is value in evaluating students during that process. Overall, this intervention influenced how students engaged in problem-solving during the question set, but did not impact students' responses to the Quiz Question.

5.2 Significance of research

The significance of this research is considered along three dimensions – the theoretical contributions that can be made, the practical applications, and the methodological innovations and contributions.

First, this research contributes to self-regulated learning theory by exploring it in relation to problem-solving, and by anchoring it in context. Self-regulated learning theories, which are well-established, comprehensive, and address the complexity of the construct, lend strength to the theoretical understandings of problem-solving. In this research study, I explored how self-regulated learning theory can be used as a lens through which problem-solving can be better understood. In addition, this research responds to the need for self-regulated learning theories to be anchored in context (Butler & Schnellert, 2008). Since self-regulated learning occurs in context during the process of learning, it is difficult to measure in a quiz. Using the approach of categorizing engagement and learning, we can gather insight into students' self-regulated learning and problem-solving behaviours. For example, during the question set students made judgments about the value of a question and decided whether it was meaningful enough to engage with. By categorizing this engagement, we can hypothesize about students' self-regulated learning during the question set. These patterns of engagement provide an important connection between self-regulated learning behaviours and learned content.

Second, this research has practical implications for researchers interested in problem-solving at three levels – the course, the field of genetics education, and science education more generally. At the course level, the findings of this research were provided to the course instructors to inform future tutorial development. These findings will inform overall approaches to the course, with the major recommendation being to create a culture of reflection for students by including content-based and reflective prompts in the tutorial materials. Beyond this, the findings can be situated within the field of genetics education, which often shares common core principles such as problem-solving, data analysis, and conceptual questioning. Previous research has established that problem-solving is highly contextual, with students' subject-specific beliefs and motivations playing an important role in their engagement (Wigfield, Klauda & Cambria, 2011), making the findings most applicable in genetics courses. In addition, genetics education is a relatively young field of disciplinary education research, so subject-specific findings help to establish understandings about problem-solving specific to genetics education (Middendorf & Pace, 2004).

Finally, we can shift the lens of analysis away from the context of genetics to problem-solving in science courses more generally. There are many subjects and courses that focus on problem-solving, conceptual understanding, and making sense of data.
Courses that are underpinned by these values can benefit from this research by extrapolating the findings to their field. For example, a large body of research in physics education focuses on how students can make sense of data and solve conceptual problems (such as Ogilvie, 2009; Webb, 2012). Although these studies are often set in courses and labs instead of tutorials, the principles remain the same, and instructors can apply these questioning strategies in alternative contexts.

This research also advances our understanding of how two different types of prompts impact student engagement and learning during problem-solving. It is often difficult to directly compare the effect of prompts because the educational contexts are seldom the same. In this research study, we were able to directly compare two prompting conditions to a control condition. This direct comparison demonstrated that students who are exposed to different prompts engage with problems in different manners, with students engaging less when they are asked to answer questions that are not obviously related to the problem, particularly questions that are abstract or reflective.

Third, this research advances the methodological approaches used to measure educational outcomes. Traditionally, student learning is measured through a post-test, using a measure such as quiz scores as an indicator of comprehension. At the outset of this research, we identified that in order to explore student problem-solving in a holistic manner, we needed to measure comprehension in a more holistic and complex way. As such, we designed a study that explored engagement as an intermediate measure of learning, allowing us to explore both the process and the outcome of students' problem-solving. We believe that this is a novel approach to measurement and that it has great potential to be applied in other contexts.

In addition, throughout this research process we discovered that we needed a method for categorizing student engagement that could summarize the rich information the multiple scales gave, but in a single measure. We developed the Engagement Profiles and Engagement Patterns as a result of this need. Engagement Profiles were successful because of their simplicity – they provided a strong descriptive measure that summarized each student's approach to problem-solving. Each scale individually provides information about students, and the Engagement Patterns supplement this by providing a measure that summarizes a student's engagement and learning for each question.

5.3 Limitations

There are a number of limitations of the study that should be discussed. First, the quasi-experimental nature of the research limited the ability to control many factors within the study. As a result of the in situ environment, factors such as participants and teaching assistants could not be controlled for. This includes the prior knowledge students bring to the intervention, their motivation, and the beliefs they hold about subject-specific factors, such as the importance of the course material, as well as more general beliefs about learning, knowledge, and understanding. This limitation is tempered by the benefits in situ research provides for educational research, as it allows for real-world connections and conclusions.

Second, the constraints of the research setting resulted in additional limitations. For example, it was important to keep the tutorial sessions authentic for students. In regular tutorial sessions, students had been encouraged to work in groups.
This posed a challenge because while tutorial questions were completed in groups, quizzes were completed independently, making it difficult to determine the unit of analysis. Working in groups was also problematic, as it would not have been possible to distinguish the contributions of each student to solving a given problem. As a result, the research was designed to have students work in pairs (groups of three or individuals were also permitted if necessary). This maintained the collaborative environment but allowed more control for data analysis. In addition, because the quizzes were completed individually, the unit of analysis selected was the individual, which required us to make assumptions about how much each member of the group contributed to the problem-solving process and whether all members were actively engaged throughout the tutorial.

Another constraint of the research setting was that it limited how the prompts could be formatted. There is a large body of research on computer-based learning environments (Bulu & Pedersen, 2010; Azevedo, Cromley & Seibert, 2004; Manlove, Lazonder & de Jong, 2007) and the value of providing adaptive prompts that are sensitive to the individual student's understanding and needs (Devolder, van Braak & Tondeur, 2012; Koedinger, Aleven, Roll & Baker, 2009; Roll, Wiese, Long, Aleven & Koedinger, 2014). Since the tutorials were low-tech in nature, it was not possible to provide this kind of adaptive prompting for students. The literature about prompting students also indicates that providing feedback is important for student learning. In this context, it was not possible to provide timely feedback on prompts due to the large number of students participating in the low-tech environment. Moreover, Teaching Assistants were not trained to provide feedback on the prompts, and such comprehensive training would be outside the scope of this study.

The final constraint of the research setting was the number of Teaching Assistants involved in the research. As stated earlier, a primary concern was controlling for the differences between Teaching Assistants. We attempted to limit this by assigning each Teaching Assistant to a single condition and by participating in their weekly preparation sessions to provide instructions. In addition, there was concern about Teaching Assistants providing feedback to students about content as well as about problem-solving approaches to the questions. As a result, Teaching Assistants were asked to limit the feedback they provided to students, instead encouraging them to answer the prompts. Teaching Assistants were instructed to only answer questions once students had attempted these prompts.

Third, Teaching Assistants set the tone and culture of tutorials for their students. This impacted the research in two ways. First, there was a concern that Teaching Assistants might not be invested in the research or find it valuable. Since they were the primary link to students and provided all instructions, it was important for them to encourage students to take the intervention seriously. Second, the culture of the tutorial might have impacted the way students approached the question set. For example, a Teaching Assistant might have encouraged students throughout the term to focus on finishing all of the questions rather than working through fewer questions in depth. It was not possible to control for these differences, and this remains a limitation of the study.
Fourth, the scope of this study limited the research to a single week of intervention. This is problematic because providing an authentic educational context is important for students to engage with the materials in a genuine manner, or to 'buy in'. Without this investment on the part of the students, it is difficult to determine whether (and the degree to which) prompts impact students' problem-solving skills, because they have not experienced the intervention fully. This was the case with many students in the Self-Regulated Learning condition, evidenced by the large number of students (95/120) categorized as Pick-Up students – answering only questions that did not contain prompts. Self-regulated learning prompts, in particular, proved difficult to make authentic to students.

Two approaches to self-regulation prompts were used in this research – those that were highly contextual and integrated directly into the questions, and those that were decontextualized to allow students to make large-scale connections about problem-solving. So few students responded to the decontextualized prompts that it was not possible to conduct a deep analysis of them. Students appeared to be much more willing to engage with contextualized self-regulation prompts, but not to the same degree that they engaged with the highly contextualized problem-solving prompts. In the future, these types of interventions should take place over multiple weeks to develop a culture of answering questions that are not solely related to content knowledge. Alternatively, an intervention of this nature should take place in a course where this culture already exists and students have practice engaging with these types of prompts. This would also allow for longitudinal research that can address questions about the effects of skill development on engagement and learning.

Finally, problem-solving is developmental, and how students construct these skills changes over time. For example, problem-solving at the undergraduate level is very different in nature from problem-solving in elementary school or at the graduate level (Bransford, Donovan & Pellegrino, 2003). For that matter, problem-solving skills develop rapidly during the undergraduate years alone. This research contributes to our understanding of how students solve problems at a point in their educational careers where possessing these skills is necessary for success, but where students have not necessarily developed sophisticated problem-solving skills yet. Understanding the limitations of this context and the possibility for generalization is important for future research.

In this chapter I described the main findings and their meaning, the significance of this research study at multiple levels (the course, the field of genetics, and problem-solving courses generally), and a number of the limitations of the study. In the next chapter I will summarize these findings and contemplate some of the challenges I experienced conducting this study, as well as possible future directions.

6 Conclusions

6.1 Summary of findings

This research study addressed the following three research questions.

1) How did students engage in the problem-solving tutorial question sets with varying types of prompts?

2) When compared to a control group, what was the effect of prompts (problem-solving and self-regulation) on how students engaged in the tutorial?
3) Was there an effect of prompting condition in the tutorial question set on student responses to a later, unprompted problem (as measured by completeness, correctness, and explanation)?

Engagement Profiles demonstrated that students in different conditions engaged with the question set in different manners. In this case, students in the Control and Problem-Solving conditions generally took the approach of working through the question set in order, either completing all questions or working until they ran out of time. Between these two conditions, students in the Problem-Solving condition completed more questions overall than students in the Control condition. In contrast, students in the Self-Regulated Learning condition generally answered only unprompted questions, skipping over the midsection of the tutorial and picking up again once the prompts stopped. In general, the Engagement Profiles demonstrated that students employ multiple strategies in their approaches to problem-solving and that these strategies are adapted to the material they are presented with. In addition, it appears that students make judgments about the value of answering a question prior to attempting it.

Question 5 was intended to be a post-intervention measure of differences between conditions. I intentionally created two post-intervention measures, due to concern that the increased pressure of the quiz setting would introduce uncontrolled variables that were different from those experienced during the tutorial questions. Although students were encouraged to move on to Question 5 prior to completing the question set, the Engagement Profiles demonstrate that many students did not do so. These students, categorized as Run-out-of-Time, were concentrated in the Control and Problem-Solving conditions. Due to the low numbers of students in these conditions attempting Question 5, it proved to be a poor post-intervention measure.

The Quiz Question served as the main post-intervention measure, as all students were given the same amount of time to complete it and it had high overall completion rates. There were no significant differences on the Quiz Question between the conditions for the three scales – Completeness, Correctness, and Explanation. In contrast, when prior knowledge was analyzed there were significant differences on students' Correctness and Explanation scores. This demonstrated that students with more prior knowledge had more correct answers with deeper explanations than students with less prior knowledge.

6.2 Challenges of measuring student engagement: a reflection for educational researchers

Educational researchers are faced with a number of challenges when conducting interventions in authentic educational environments. There were a number of challenges I faced (and some that I overcame) throughout this process that I felt others considering work in this area could benefit from. These range from selecting an appropriate educational setting in which to conduct the research, to creating materials, to developing a coding manual. The most important lessons I learned are summarized here as well.

The first challenge faced when conducting research in authentic educational environments is selecting, and getting access to, an appropriate course. In my case, this was less of a challenge due to the culture of educational research and improvement at the institution.
Additionally, I knew that I was interested in conducting research in the early years of undergraduate education because there was such potential for developing students' skills at this time point. Finally, I selected genetics education for two reasons: it focuses on skills and conceptual understanding, and my area of expertise is biology. As a result, I was able to connect with other biology education researchers and get permission from the course instructors to do an intervention within their course. I had strong support from the course coordinator and the biology education research group to implement this study, which was invaluable.

The second challenge I faced was developing the materials to be used in the intervention. The questions were developed with the help of a former course instructor (and biology education researcher) and were approved by the course coordinator. This was a very collaborative process, and all parties were invested in creating strong conceptual questions that were appropriate for the intervention. Together, we engaged in a long period of discussion to determine the overall length of the question set, considering questions such as: how many questions should be included? which questions should have prompts? how many prompts should be included? As a result of these discussions, I decided on five questions, to allow for pre- and post-intervention unprompted questions as well as three prompted questions. I felt this struck the balance between having too few questions to measure changes in student engagement and having too many questions for students to complete. In reality, only experience can tell whether this balance was struck. In my case, it appears that five questions might have been too many, and I could have removed the pre-intervention unprompted question (Question 1) without sacrificing any of my research questions.

The final challenge I faced was developing the coding manual to analyze student responses. Using three scales was extremely effective for addressing my research questions, as it allowed me to tease apart multiple aspects of student engagement and learning. Having a measure of quantity (Completeness) and two measures of quality (Correctness and Explanation) added depth to my analysis and gave students credit for multiple aspects of their work. Creating the Engagement Profiles allowed me to then bring together these individual pieces of information to create a holistic picture of students in the three conditions. However, the coding manual took a great deal of time to develop, as it was difficult to come up with concrete definitions of each of the codes prior to conducting the inter-rater reliability check. I created the clearest definitions possible for each code in the three scales, but for each tutorial question it was necessary to define the specific answers that would fall into those categories, due to the dependence on context to make sense of Correctness and Explanation.

During the inter-rater reliability process, we struggled particularly to apply the Explanation code to incorrect answers. This is common in educational research, as incorrect answers often rest on faulty logic, rooted in misunderstandings of the concepts. It proved extremely difficult to judge the quality and depth of student explanations when they contained such obvious misconceptions and were often nonsensical.
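Agreement between raters in an exercise like this is commonly quantified with Cohen's kappa. A minimal sketch follows, using hypothetical Explanation codes from two raters (0 = no explanation, 1 = superficial, 2 = deep); the values are not study data.

```python
# A minimal sketch of an inter-rater reliability check using Cohen's
# kappa. The two lists are hypothetical Explanation codes assigned by
# two raters to the same set of student responses.
from sklearn.metrics import cohen_kappa_score

rater_a = [2, 1, 0, 2, 2, 1, 0, 0, 2, 1]
rater_b = [2, 1, 0, 2, 1, 1, 0, 1, 2, 1]

kappa = cohen_kappa_score(rater_a, rater_b)
print(f"Cohen's kappa = {kappa:.2f}")
# Values near 1 indicate strong agreement beyond chance; disagreements
# like those above mark where code definitions need sharpening.
```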
Finally, the coding manual, which worked relatively well with some discussion between the raters, did not apply as well to the Quiz Question. In all of the tutorial questions, students could assume that the data they were provided with were correct. In contrast, the Quiz Question asked students to question why the data were incorrect and to postulate a hypothesis for the data, knowing that it did not make sense. Students struggled with this question. Their difficulty making sense of the question made their responses extremely difficult to code, as the responses were often incorrect and their logic was difficult to follow. Students' difficulties with this question were not expected, as they had analyzed similar data during the tutorial in a different context, with most students who attempted the similar tutorial question answering it correctly. It was not possible to prevent this challenge, but it is an important warning for future studies about students' difficulties transferring knowledge to a new context.

Although I have discussed many challenges that we faced, there were two recurring lessons that had the most significant impact on our research and merit further reflection. The most meaningful lesson I learned was the importance of developing prompts that were authentic to students. In this work, authenticity meant embedding prompts in context and rooting skill development deeply in the content. This was not difficult for the problem-solving prompts, which lent themselves to the content. For these prompts, students were asked content-based questions that guided them through the problem-solving process. As such, students appeared to react much more positively to the prompts and, for the most part, engaged with them.

The self-regulated learning prompts were not as successful. The two types of self-regulated learning prompts included in this study provide some insight into the value of contextual prompts. The main self-regulated learning prompts were provided after each problem-solving prompt and were embedded directly in the question. Although they were process-oriented or reflective in nature, they related directly to the question being answered. While many students attempted these prompts, student engagement with them was much lower than with the problem-solving prompts. This is an inherent difficulty in self-regulated learning interventions, as students often do not see the benefit of answering prompts related to process, focusing instead on getting the correct answer. This has a great deal to do with the culture of learning within an institution, program, and course, and students often require a great deal of practice answering prompts about process.

The other important lesson I learned was that fewer, well-designed questions are likely more effective than adding questions that serve a data-driven purpose. Although I designed the question set intentionally and had a purpose for including each question, many students did not complete the entire question set, which undermined these intentions. In the end I was able to identify alternative ways to interpret the data, but I had to change the original plan. For example, we wanted to use both Question 5 and the Quiz Question as post-intervention measures because I was concerned the "high-stakes"4 environment of the quiz might impact students' behaviour.
I had designed Question 5 to ensure we could draw conclusions about student engagement and learning within the low-stakes context of the question set. Due to the low completion rates of Question 5 by students in the Control and Problem-Solving conditions, this analysis did not hold the same weight.

4 Students received 0.5% for completing the quiz, not for correctness. Our concern was that the formality of writing a quiz would influence students' perceptions of the stakes associated with their answers.

In addition, we did not anticipate students skipping over such large portions of the question set. We were not able to control which questions students answered, due to the low-tech nature of the tutorial. This had the most impact on the prompts, which students appear to have passed over to focus instead on the main content-based portion of the questions. If students (particularly in the Self-Regulated Learning condition) had not had the option of answering only unprompted questions, the findings could have been significantly impacted, as I originally intended to analyze the quality of prompt responses independent of question responses. Instead, I had to shift my interpretation of the Self-Regulated Learning condition, as it appears many of these students did not experience the intervention as I designed it. In addition, I left Question 1 unprompted so that I could have a pre-intervention measure of student engagement, in addition to Midterm 1 grades (which were used to demonstrate comprehension of the material prior to the intervention). Due to the length of the question set, many students answered Question 1 but did not make it through the prompted questions. In the end, the value added by having an unprompted question at the outset of the tutorial was not equal to the information we lost by students not answering as many prompted questions.

There is always a fine balance to strike between developing content knowledge and skills, particularly when the focus is on skills. This was especially true in this case, as I was working with students in an introductory undergraduate course. This context posed two challenges, in addition to finding a balance between content and skill development. Survey courses such as the one in this study tend to draw students in the early years of their undergraduate degrees, and many of these students have not yet developed sophisticated problem-solving skills. In addition, the broad "surveying" of many genetics topics, and the fact that the course is mandatory for many students, make it difficult to know whether students are intrinsically motivated to understand the concepts (Partin & Haney, 2012). Conversely, the benefit of these factors is that they make introductory courses prime contexts in which to conduct an intervention, due to the need and potential for students' growth.

Although I faced a number of challenges, the potential benefits significantly outweighed them. By reflecting on the challenges and the lessons learned, I hope that others in turn will be able to develop strong research projects in these rich educational environments.

6.3 Future directions

There are many possible avenues for future research related to this topic. This section provides practical recommendations specific to this study, as well as theoretical recommendations for educational research on problem-solving generally. To begin, there are a number of future analyses that could be conducted to provide greater depth to these findings.
In addition, a number of modifications should be made to the study design; the most necessary are in the areas of Teaching Assistant preparation, length of implementation, reviewing the materials for future use, and revising the unit of analysis.

The current analysis demonstrated that prior knowledge was a significant influence on Quiz Question performance. In the future, it would be interesting to conduct the following analyses: irrespective of condition, were there significant differences in Quiz Question responses across Engagement Profiles, and were there significant differences in Midterm 1 grades across Engagement Profiles (Figure 6.1)?

Figure 6.1: Current and future analysis

Teaching Assistants play a pivotal role in students' course experiences and were the primary implementers of this intervention. As such, they played an integral role in how students experienced the intervention, making it necessary to address concerns over their influence and how they are prepared to implement the intervention. Our Teaching Assistant preparation dealt with the logistics of the intervention, during which we attempted to control for their impact as much as possible. In the future, the impact of Teaching Assistants should be accounted for in the research questions (such as by analyzing results by Teaching Assistant as well as by condition). Additionally, the preparation session should be more comprehensive. It would be useful to conduct research on how to train Teaching Assistants to implement this type of intervention.

One of the challenges of this research study was finding an appropriate course in which to implement the intervention. Although the course we worked with had a culture of innovation and of implementing research-based instructional strategies, there was no culture of self-reflection about learning. This made a single-week intervention difficult, as it was not authentic to students' previous experiences. In the future, it is extremely important to cultivate this culture of reflection by incorporating prompts like these into multiple aspects of the course, training students in how to reflect on their learning, or extending the intervention over multiple weeks so students become accustomed to seeing prompts and engaging with them. Once this culture is fostered, the stage will be set for more analysis of the impact of the prompts themselves.

In addition, the results demonstrated that the intervention did not have an impact on students' responses on the Quiz Question. However, differences in Engagement Profiles between the conditions demonstrated that students' engagement during problem-solving is malleable, providing a reasonable expectation that an effect of condition is possible if the other recommendations – such as Teaching Assistant preparation, a culture of reflection, or a longer intervention – are implemented. Consequently, the materials should be reviewed and modified for future use, rooting them more deeply in theoretical understandings of problem-solving.

Another important issue to address is the unit of analysis. We allowed students to work in pairs to retain authenticity in the tutorial structure, where students traditionally worked in small groups to collaborate on questions. Working in pairs has many benefits, as students learn from each other in meaningful ways (Vygotsky & Kozulin, 2011), but it made analysis of individual work difficult.
Also, it was necessary to evaluate individual work, as students completed the quiz individually. These kinds of challenges are to be expected when working in authentic classroom settings, but in the future, designing a method that uses a consistent unit of analysis would be an asset. This is not to say that the unit of analysis needs to be at the level of the individual, but that consistency is key.

This research has proposed a method for describing and evaluating student engagement, characterizing students' approach to the process of problem-solving in addition to measuring learning. The method of categorizing students' engagement has the potential to be applied to problem-solving research more generally. This research also made theoretical connections between the self-regulated learning and problem-solving literatures. These theoretical connections need to be explored in greater depth – by both the theory and research communities. In conclusion, this research study has made practical, theoretical, and methodological contributions, creating space for a variety of future research in the area of problem-solving as well as genetics education.

7 References

Advancing Science: University of British Columbia Strategic Plan. (2011).

Alexander, P., Dinsmore, D., Parkinson, M., & Winters, F. (2011). Self-Regulated Learning in Academic Domains. In Handbook of Self-Regulation of Learning and Performance (pp. 393–407). Routledge.

Alters, B. (1997). Whose Nature of Science? Journal of Research in Science Teaching, 34(1), 39–55.

Azevedo, R., Cromley, J. G., & Seibert, D. (2004). Does adaptive scaffolding facilitate students' ability to regulate their learning with hypermedia? Contemporary Educational Psychology, 29(3), 344–370.

Bandura, A. (1978). Self-efficacy: Toward a unifying theory of behavioral change. Advances in Behaviour Research and Therapy, 1(4), 139–161.

Belenky, D. M., & Nokes-Malach, T. J. (2012). Motivation and Transfer: The Role of Mastery-Approach Goals in Preparation for Future Learning. Journal of the Learning Sciences, 21(3), 399–432.

Bransford, J., Donovan, S., & Pellegrino, J. (2003). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

Bulu, S. T., & Pedersen, S. (2010). Scaffolding middle school students' content knowledge and ill-structured problem solving in a problem-based hypermedia learning environment. Educational Technology Research and Development, 58(5), 507–529.

Butler, D. L. (2002). Individualizing instruction in self-regulated learning. Theory into Practice, 41, 81–92.

Butler, D. L., Cartier, S. C., Schnellert, L., Gagnon, F., & Giammarino, M. (2011). Secondary students' self-regulated engagement in reading: Researching self-regulation as situated in context. Psychological Test and Assessment Modeling, 53(1), 73–105.

Butler, D. L., & Schnellert, L. (2008). Bridging the research-to-practice divide: Improving outcomes for students. Education Canada, 48(5), 36–40.

Butler, D., & Winne, P. (1995). Feedback and Self-Regulated Learning: A Theoretical Synthesis. Review of Educational Research, 65(3), 245–281.

Carmel, J., & Yezierski, E. (2013). Are We Keeping the Promise? Investigation of Students' Critical Thinking Growth. Research and Teaching, 42(5), 71–81.

Chase, W. G., & Simon, H. A. (1973). Perception in chess. Cognitive Psychology, 4, 55–81.

De Bruin, A. B. H., Rikers, R. M. J. P., & Schmidt, H. G. (2007). Improving metacomprehension accuracy and self-regulation in cognitive skill acquisition: The effect of learner expertise. European Journal of Cognitive Psychology, 19(4–5), 671–688.
Devolder, A., van Braak, J., & Tondeur, J. (2012). Supporting self-regulated learning in computer-based learning environments: Systematic review of effects of scaffolding in the domain of science education. Journal of Computer Assisted Learning, 28(6), 557–573.

Elen, J., Clark, R. E., & Lowyck, J. (2006). In Handling complexity in learning environments: Theory and research (pp. 107–128). Oxford, UK; Boston, MA: Elsevier. Retrieved from http://www.myilibrary.com?id=64112

Hannafin, M., Land, S. M., & Oliver, K. (1999). Open learning environments: Foundations, methods, and models. In C. Reigeluth (Ed.), Instructional Design Theories and Models (pp. 115–140). Mahwah, NJ: Lawrence Erlbaum Associates.

Holmes, N. G., Day, J., Park, A. H., Bonn, D. A., & Roll, I. (2014). Making the failure more productive: Scaffolding the invention process to improve inquiry behaviours and outcomes in productive failure activities. Instructional Science, 42(4), 523–538.

Järvelä, S., & Hadwin, A. (2013). New Frontiers: Regulating Learning in CSCL. Educational Psychologist, 48(1), 25–39.

Kitsantas, A., & Kavussanu, M. (2011). Intentional Conceptual Change: The Self-Regulation of Science Learning. In Handbook of Self-Regulation of Learning and Performance (pp. 203–216). Routledge.

Knight, J. (2010). Biology Concept Assessment Tools: Design and Use. Microbiology Australia, 31(1).

Knight, J., & Smith, M. (2010). Different but Equal? How Non-Majors and Majors Approach and Learn Genetics. CBE – Life Sciences Education.

Koedinger, K. R., Aleven, V., Roll, I., & Baker, R. S. J. d. (2009). In vivo experiments on whether supporting metacognition in intelligent tutoring systems yields robust learning. In D. J. Hacker, J. Dunlosky, & A. C. Graesser (Eds.), Handbook of metacognition in education (pp. 383–412). New York: Routledge.

Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134.

Manlove, S., Lazonder, A. W., & de Jong, T. (2007). Software scaffolds to promote regulation during scientific inquiry learning. Metacognition and Learning, 2(2–3), 141–155.

McDaniel, M. A., Anderson, J. L., Derbish, M. H., & Morrisette, N. (2007). Testing the testing effect in the classroom. European Journal of Cognitive Psychology, 19(4–5), 494–513.

McDonnell & Kalas (2012, July). Profile of common genetics misconceptions in 1st to 4th year undergraduate biology students. Poster session presented at the Society for the Advancement of Biology Education Research (SABER) National Meeting, Minneapolis, MN.

Merriënboer, J. J. G., & Sweller, J. (2005). Cognitive Load Theory and Complex Learning: Recent Developments and Future Directions. Educational Psychology Review, 17(2), 147–177.

Metcalfe, J., Kornell, N., & Son, L. K. (2007). A cognitive-science based programme to enhance study efficacy in a high and low risk setting. European Journal of Cognitive Psychology, 19(4–5), 743–768.

Middendorf, J., & Pace, D. (2004). Decoding the disciplines: A model for helping students learn disciplinary ways of thinking. New Directions for Teaching and Learning, 2004(98), 1–12.

Ogilvie, C. (2009). Changes in students' problem-solving strategies in a course that includes context-rich, multifaceted problems. Physical Review Special Topics – Physics Education Research, 5(2).

Osborne, J., Simon, S., & Collins, S. (2003). Attitudes towards science: A review of the literature and its implications. International Journal of Science Education, 25(9), 1049–1079.
Partin, M. L., & Haney, J. J. (2012). The CLEM model: Path analysis of the mediating effects of attitudes and motivational beliefs on the relationship between perceived learning environment and course performance in an undergraduate non-major biology course. Learning Environments Research, 15(1), 103–123.

Partin, M. L., Haney, J. J., Worch, E. A., Underwood, E. M., Nurnberger-Haag, J. A., Scheuermann, A., & Midden, W. R. (2011). Yes I Can: The Contributions of Motivation and Attitudes on Course Performance among Biology Nonmajors. Journal of College Science Teaching, 40(6), 86–95.

Peters, E. E., & Kitsantas, A. (2010). Self-regulation of student epistemic thinking in science: The role of metacognitive prompts. Educational Psychology, 30(1), 27–52.

Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16(4), 385–407.

Roll, I., Wiese, E., Long, Y., Aleven, V., & Koedinger, K. R. (2014). Tutoring self- and co-regulation with intelligent tutoring systems to help students acquire better learning skills. In R. Sottilare, A. Graesser, X. Hu, & B. Goldberg (Eds.), Design Recommendations for Adaptive Intelligent Tutoring Systems: Volume 2 – Adaptive Instructional Strategies (pp. 169–182). Orlando, FL: U.S. Army Research Laboratory.

Roll, I., Holmes, N. G., Day, J., & Bonn, D. (2012). Evaluating metacognitive scaffolding in guided invention activities. Instructional Science, 40, 691–710.

Schoenfeld, A. H. (1992). Learning to think mathematically: Problem solving, metacognition, and sense making in mathematics. In Handbook of Research on Mathematics Teaching and Learning (pp. 334–370).

Schraw, G., Crippen, K. J., & Hartley, K. (2006). Promoting Self-Regulation in Science Education: Metacognition as Part of a Broader Perspective on Learning. Research in Science Education, 36(1–2), 111–139.

Semsar, K., Knight, J., Smith, M., & Birol, G. (2011). The Colorado Learning Attitudes about Science Survey (CLASS) for Use in Biology. CBE – Life Sciences Education, 10, 268–278.

Smith, M., Wood, B., Krauter, K., & Knight, J. (2011). Combining Peer Discussion with Instructor Explanation Increases Student Learning from In-Class Concept Questions. CBE – Life Sciences Education, 10, 55–63.

Smith, M. U., & Good, R. (1984). Problem solving and classical genetics: Successful versus unsuccessful performance. Journal of Research in Science Teaching, 21, 895–912.

Son, L. K. (2007). Introduction: A metacognition bridge. European Journal of Cognitive Psychology, 19(4–5), 481–493.

Stadtler, M., Scharrer, L., Brummernhenrich, B., & Bromme, R. (2013). Dealing With Uncertainty: Readers' Memory for and Use of Conflicting Information From Science Texts as Function of Presentation Format and Source Expertise. Cognition and Instruction, 31(2), 130–150.

Taconis, R., Ferguson-Hessler, M. G., & Broekkamp, H. (2001). Teaching science problem solving: An overview of experimental work. Journal of Research in Science Teaching, 38(4), 442–468.

Taylor, J. L., Smith, K. M., & Spiegelman, G. B. (2009, May). Presentation at the American Society for Microbiology Conference for Undergraduate Educators, Fort Collins, CO.

Taylor, J., Smith, K. M., van Stolk, A. P., & Spiegelman, G. (2010). Using Invention to Change How Students Tackle Problems. CBE – Life Sciences Education.

Vygotsky, L. S., & Kozulin, A. (2011). The Dynamics of the Schoolchild's Mental Development in Relation to Teaching and Learning. Journal of Cognitive Education & Psychology, 10(2), 198–211.
Webb, D. J. (2012). Improving student's problem-solving ability as well as conceptual understanding without sacrificing the physics content of a class. arXiv preprint arXiv:1210.3385. Retrieved from http://arxiv.org/abs/1210.3385

Wieman, C., Perkins, K., & Gilbert, S. (2010). Transforming Science Education at Large Research Universities. Change Magazine, March/April 2010, 7–14.

Wigfield, A., Klauda, S., & Cambria, J. (2011). Influences on the Development of Academic Self-Regulatory Processes. In Handbook of Self-Regulation of Learning and Performance (pp. 33–48). Routledge.

Willingham, D. T. (2008). Critical thinking. Arts Education Policy Review, 109(4). Retrieved from http://mres.gmu.edu/pmwiki/uploads/Main/CritThink.pdf

Yeager, D. S., & Walton, G. M. (2011). Social-Psychological Interventions in Education: They're Not Magic. Review of Educational Research, 81(2), 267–301.

Zimmerman, B. J. (1989). A social cognitive view of self-regulated academic learning. Journal of Educational Psychology, 81, 329–339.

Zimmerman, B. (2008). Investigating Self-Regulation and Motivation: Historical Background, Methodological Developments, and Future Prospects. American Educational Research Journal, 45(1), 166–183.

Zimmerman, B. (2011). Self-Regulated Learning and Performance: Attaining Self-Regulation. In Handbook of Self-Regulation of Learning and Performance (pp. 13–39). Routledge.

Zimmerman, B. J., & Martinez-Pons, M. (1990). Student differences in self-regulated learning: Relating grade, sex, and giftedness to self-efficacy and strategy use. Journal of Educational Psychology, 82(1), 51.

Zimmerman, B., & Martinez-Pons, M. (1986). Development of a Structured Interview for Assessing Student Use of Self-Regulated Learning Strategies. American Educational Research Journal, 23(4), 614–628.

Zimmerman, B., & Schunk, D. (2011). Self-Regulated Learning and Performance: An Introduction and an Overview. In Handbook of Self-Regulation of Learning and Performance (pp. 1–14). Routledge.

Zohar, A., & Barzilai, S. (2013). A review of research on metacognition in science education: Current and future directions. Studies in Science Education, 49(2), 121–169.

8 Appendices

8.1 Appendix A: Summary of tutorial question set (for each condition)

Summary of Tutorial Materials

Conditions
1) Control (Blue questions)
2) Problem-Solving (Blue and red questions)
3) Self-Regulation (Blue, red, and purple questions)

Note: spacing of questions is not to scale.

QUESTION 1

The Mexican cavefish lives in a series of unconnected caves. Fish found in the caves have been blind for millennia, and interestingly, cavefish can still interbreed with surface fish! The surface fish can see. Your research goal is to determine whether blind cavefish from different populations are the result of mutations in the same gene or different genes, and to determine which genes are responsible for blindness. You isolate 5 populations of blind fish from 5 different cave ponds. You select an individual fish from each population and cross them each with a sighted surface fish. All of the offspring are sighted.

You then set up a series of crosses such that individual fish from each of the 5 blind populations are mated with each other. You obtain the following results, where + represents offspring that are sighted, and – represents blind offspring:

[Table of pairwise cross results not reproduced in this text extraction.]
a. Consider all of the information provided and determine how many genes are working to produce sight in these fish. Explain your reasoning. Be sure to define gene and allele symbols/labels.

b. Below is an illustration of where each population originated from. Draw the chromosomes labeled with alleles to indicate what gene(s) are causing blindness in the fish from each blind population, as well as the surface sighted fish. The fish are 2n=8. State any assumptions you are making.

c. You isolate a 6th blind population. When you mate a pure breeding blind fish from population #6 with sighted fish all of the offspring are blind. Could you use this blind fish from population #6 for complementation testing with the other blind strains? Explain why or why not.

QUESTION 2

In minks, wild types have an almost black coat. Breeders have developed many pure lines of color variants for the mink-coat industry. Two such pure lines are platinum (blue gray) and Aleutian (steel gray). These lines were used in crosses, with the following results:

a. Devise a genetic explanation of these three crosses. Show complete genotypes for the parents, the F1, and the F2 in the three crosses, and make sure that you show the alleles of each gene that you hypothesize for every mink.

What are the ratios observed in the scenario?

What do the ratios tell you about the problem? Why is it important to make sense of the data to answer this question?

b. Predict the F1 phenotypic ratios from crossing sapphire with platinum and with Aleutian pure lines.

What information do you know from the scenario that is necessary to answer this question?

Why is this information important? How does this help you answer the question?

c. If these F1s self, what will the phenotypic ratios in the two F2 populations be?

What work have you already completed that is useful to answer this question?

How do you know that previous work is useful to answer this question?

What was a stumbling block you encountered when solving this problem? What did you do to successfully deal with it?

What strategies did you use that allowed you to successfully answer this problem? How can you use them to successfully answer another problem?
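The prompts in Question 2 turn on recognizing and interpreting characteristic two-gene ratios. As an editorial illustration only, not part of the original tutorial materials, the sketch below enumerates a dihybrid F2 by brute force; the two-gene model, the gene symbols, and the phenotype rule are assumptions chosen for illustration, not the answer key for the mink problem.

```python
from itertools import product
from collections import Counter

def gametes(genotype):
    """All equally likely gametes from a genotype like ('Aa', 'Bb'),
    assuming the two genes assort independently."""
    return list(product(*genotype))

def cross(p1, p2):
    """Count offspring genotypes from the cross p1 x p2."""
    counts = Counter()
    for g1 in gametes(p1):
        for g2 in gametes(p2):
            geno = tuple(''.join(sorted(a + b)) for a, b in zip(g1, g2))
            counts[geno] += 1
    return counts

def phenotype(geno):
    """Illustrative rule only: either recessive homozygote dilutes the coat."""
    if geno[0] == 'aa' and geno[1] == 'bb':
        return 'double mutant'
    if geno[0] == 'aa':
        return 'mutant a'
    if geno[1] == 'bb':
        return 'mutant b'
    return 'wild type'

# Self the dihybrid F1 and tally F2 phenotypes.
f2 = Counter()
for geno, n in cross(('Aa', 'Bb'), ('Aa', 'Bb')).items():
    f2[phenotype(geno)] += n
print(f2)  # 9 wild type : 3 mutant a : 3 mutant b : 1 double mutant
```

Departures from 9:3:3:1 (a 9:7 or 9:3:4 collapse, for example) are what the "What do the ratios tell you?" prompt points at; swapping in a different phenotype() rule reproduces those modified ratios.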
QUESTION 3

You are investigating a heritable heart disease. This disease results in abnormal development of the heart such that mice with this condition usually die at a much younger age than healthy mice. You are trying to determine how many genes are involved in this hereditary heart disease.

You have mice from 4 different strains, all exhibiting the abnormal heart condition (mouse #1-4). You crossed each of these mutants to true breeding wild-type and all of the offspring are wild-type. Below is the data from several crosses you performed to identify if the mice have mutations in the same gene or different genes. A “+” symbol represents normal, healthy mice, whereas a “-” symbol represents mice that died very young due to heart disease.

Mouse    #1   #2   #3   #4
#1        -    -    +    +
#2        -    -    +    +
#3        +    +    -    +
#4        +    +    +    -

a. What can you conclude based on this data? Show all of your work including genes and genotypes.

What information in the data is important to understand to answer this question?

How do you make sense of this information? Why is it important to understand the +/- data presented to answer this question?

b. A fifth mouse (#5) with heart disease is identified from a separate population. When this mouse is crossed to a true breeding healthy mouse half of the offspring are healthy, the other half have heart disease. Similarly, when this mouse (#5) is crossed to each of the four mice listed above the results are the same for each cross: half of the offspring are healthy, the other half are diseased. What does this suggest? Show all of your work, including genotypes. State any assumptions you are making.

Draw a diagram/representation of the new information that was just presented in this scenario.

Compare this representation of the data to the information you deciphered in Part A. Does this new information make sense compared to what you identified in Part A? Is it possible that there is another explanation for this new information?

How did creating a representation of the problem help you solve the problem? Why did it help you solve the problem?

The quiz question focuses on your ability to problem-solve. What did you do here that was helpful to solve the problem? When you do the quiz at the end of the tutorial, how can you use these problem-solving approaches to help you be successful on the quiz?

QUESTION 4

Recall the talon question from Part 1.

Imagine that you use X-rays to mutagenize fertilized owl eggs that are in the 1 cell stage of development (diploid zygote). You mutagenize two batches of fertilized eggs and let the eggs develop, and birds hatch, creating two M1 populations: A and B.

You count the number of wild-type (WT) and mutant owls in each population. Here is the data from your phenotype screening and counting. Assume one owl mating with one other produces one offspring.

Population A: M1: 11 WT, 1 mutant | M2: 11 WT, 2 mutant | M3: 9 WT, 3 mutant
Population B: M1: 10, all WT | M2: 10 WT | M3: 7 WT, 3 mutant
(M2: each M1 is crossed with a wild-type bird to create M2. M3: M2s randomly mate with each other to create M3; multiple mating occurs.)

What can you conclude based on this data? How are the mutants in population A different and/or the same from those in population B?

What are you being asked to do in this question?

What data seems important to you to be able to answer this question? What information doesn’t make sense to you/strikes you as something you should know more about?

How do you know this data is important to understand in order to solve the problem?

The tutorial questions tackle many different concept areas but all focus on your ability to solve problems. Were there approaches to interpreting problems and important data that you used across all the questions? What were they?

What clues were you able to identify in the questions to help you approach the problem? How will you approach the quiz question now that you have successfully answered these questions?
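For Question 4, the diagnostic observation is which generation first shows mutant birds. As a hedged editorial sketch (again, not part of the original materials), the simulation below tracks a single induced allele through the M1, M2, M3 scheme described in the question, under the simplifying assumptions of one autosomal locus and strict dominance or recessiveness; the M3 outcome in the recessive case depends on which carriers happen to mate.

```python
import random
from collections import Counter

random.seed(7)  # one illustrative run; the recessive M3 tally varies by chance

def phenotype(geno, dominant):
    """geno is a pair of alleles: '+' wild type, 'm' induced mutant."""
    if dominant:
        return 'mutant' if 'm' in geno else 'WT'
    return 'mutant' if geno == ('m', 'm') else 'WT'

def child(p1, p2):
    return (random.choice(p1), random.choice(p2))

for dominant in (True, False):
    m1 = ('m', '+')                                   # founder carrying one induced allele
    m2 = [child(m1, ('+', '+')) for _ in range(10)]   # each M1 crossed to wild type
    m3 = [child(random.choice(m2), random.choice(m2)) # M2s randomly inter-mated
          for _ in range(12)]
    tallies = {gen: Counter(phenotype(g, dominant) for g in pop)
               for gen, pop in [('M1', [m1]), ('M2', m2), ('M3', m3)]}
    print('dominant:' if dominant else 'recessive:', tallies)
```

A dominant allele is already visible in M1, while a recessive allele typically stays hidden until two carriers meet in M3, which is the contrast the population A versus population B data invites students to notice.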
QUESTION 5

Filled-in symbols indicate deafness. Three families from different geographical locations are shown here, and the mating interactions between families. The families starting with individuals I-1 x I-2 and I-3 x I-4 were from one geographical location, the family starting with individuals II-5 x II-6 were from another distinct geographical location, and the family starting with individuals II-7 x II-8 were from yet another distinct geographical location. You can see how members of these families eventually come together in the III and IV generations.

a. Study the pedigree and explain how deafness could be inherited in each family. Propose a hypothesis to explain why none of the individuals in generation V are deaf, and why the deafness occurs in offspring of III-2 x III-3.

QUESTION 6 (QUIZ)

A genetics researcher at UBC studying sensory perception did a mutagenesis in Caenorhabditis elegans for the purpose of selecting mutants that, unlike wild type worms, did not move in the direction of a food source. He isolated three worms defective in sensing food from the M2 mutagenized population. These mutants were named Hungry (hgy #1, hgy #2, hgy #3). He took the three mutants, crossed them to each other and determined whether the F1 progeny were normal or defective in sensing food in order to determine how many genes were identified by the three mutants. He got the following results:

Phenotypes of the F1 progeny resulting from the indicated cross:

          hgy #1   hgy #2   hgy #3
hgy #1    hgy      hgy      hgy
hgy #2             hgy      wild type
hgy #3                      hgy

Remember, you are being marked on your answer and your process! Show all of your work and explain yourself as clearly as possible.

a. What did the researcher forget to do? Explain why this/these steps are important. (4 points)
b. Explain what is unusual about the results the researcher obtained. (2 points)
c. Provide one reasonable hypothesis for the unusual results and suggest how many genes have been identified. Explain in a few sentences how you would test your hypothesis. (4 points)

Bonus points: can you come up with a second hypothesis to explain the unusual result?
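Questions 1 and 3 and the quiz all hinge on reading a pairwise complementation table. As an editorial sketch, the function below groups mutants that fail to complement ("-" cells) into putative genes, under the standard assumption that each strain carries a single recessive mutation; it is exactly this assumption that Question 1c and the quiz's anomalous hgy results ask students to question.

```python
def complementation_groups(mutants, fails):
    """Merge mutants connected by a failure to complement; each resulting
    group corresponds to one putative gene (assuming single recessive
    mutations and no intragenic complementation)."""
    groups = []
    for m in mutants:
        hits = [g for g in groups
                if any((m, x) in fails or (x, m) in fails for x in g)]
        for g in hits:
            groups.remove(g)
        groups.append(set().union(*hits, {m}) if hits else {m})
    return groups

# '-' (non-complementing) off-diagonal cells from the Question 3 mouse table
fails = {(1, 2)}
print(complementation_groups([1, 2, 3, 4], fails))  # [{1, 2}, {3}, {4}] -> three genes
```

Feeding in the quiz's hgy table instead (pairs (1, 2) and (1, 3) fail to complement) would merge all three mutants into one group even though hgy #2 and hgy #3 complement each other; that non-transitive pattern flags a violated assumption rather than a simple gene count.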
8.2 Appendix B: Instructions provided to Teaching Assistants prior to tutorial

Cheat Sheet – Summary of Instructions for TAs

• Students have been told that their tutorial will be slightly different this week in instructions at the top of Part I
• Tell students to get into pairs and write their names on the sheet
• Pairs are working together on Part 2
• Remember – students don’t have to give consent but everyone’s work will be collected for a few days. If there is no signature on the form, then their data won’t be used.
• Timeline
  o Encourage students to try question 5 when there is about 35 minutes left in tutorial. Tell them it is a good practice quiz question!
  o Run the quiz during the last 20 minutes of the tutorial. Give them 20 minutes to write the quiz.
• Remember to collect EVERYTHING and bring all your documents to the prep session on Thursday. We will copy everything and return them so students can pick them up if they want.

TA Instructions for Biol 234 Intervention

All TAs:

Summary of tutorial timeline:

Time (min)   Event       Control      Problem-Solving   Self-Regulation
20           Introduction and questions from previous work: instructions below (all conditions)
55           Q1          No prompts   No prompts        No prompts
             Q2          No prompts   Red prompts       Red and purple prompts
             Q3          No prompts   Red prompts       Red and purple prompts
             Q4          No prompts   Red prompts       Red and purple prompts
15           Q5          No prompts   No prompts        No prompts
20           Q6 - Quiz   Instructions to focus on process and accuracy; no further prompts (all conditions)

1. During the introduction
a. Let students know that the course instructors are trying out something different with tutorials this week to help students get the most out of their time.
b. Get students to arrange themselves in pairs.
   i. Note: who the pairs are doesn’t matter, just that they are all in groups of two. You may have one group of three if you have an odd number, but there should only be one at most.
c. Hand out tutorial questions. Students will receive one copy of the tutorial for each pair. Students will only submit one copy of the tutorial and it should contain all of the work for both students.
   i. Ask students to write the names of both students on their tutorial so we can track people’s work.
d. Each group should be given two copies of the research consent form.
   i. Instructions to students: all students will complete the tutorials, as usual. But students are not required to submit their responses to be included in the research study. Students should take a few minutes to read through the consent form and decide if they want to sign it. At the end of the tutorial when all of the worksheets are collected, students will submit their consent forms as well. If the consent form is unsigned, the data will be destroyed. If the consent forms are signed the data will be included in the study.
e. Once students have read the instructions they can get started on the questions!

2. Questions 1-4
a. These questions are the “intervention”. Students should work through them for an hour. It is not a problem if they don’t finish all of them, but it is important for them to try to be as detailed as possible and answer questions in full before moving on to the next one. This is especially true for the experimental conditions, where students might be tempted to ignore the prompts at the end of questions to move on to more “content”.

3. Question 5
a. With between 10-15 minutes left until the quiz, you should encourage students to move on to question 5. Let students know that this question is really good practice for the quiz question so they will get extra practice for the upcoming quiz.
b. Note: this question is completely unprompted and of similar difficulty to the quiz question. This is our “post” measure of students’ problem-solving abilities (because we expect them to write less on a quiz question due to their focus on getting the “right answer”).

4. Question 6 – quiz question
a. Students will have 10 minutes to complete the quiz question individually. This question functions as it usually would in your tutorial – you will be marking them. The only difference is that we have prompted students to focus on the answer and the process, as both will be marked. This will hopefully encourage students to show their thinking!

5. Collecting materials
a. At the end of the tutorial you should have the following materials collected:
   i. Tutorial worksheets (one per pair, with names of both members on the paper)
   ii. Consent forms (two per pair)
   iii. Quizzes (one per person with names on the paper)
b. We will collect all of the materials, and make copies of the quizzes so we can return them to you for marking.

Specific Conditions

Control (Blue)
• The Control condition functions like a regular tutorial
• All questions are unprompted, except the added instructions in the tutorial
• Answer student questions the same way you usually would in tutorial
• Make sure you collect all worksheets, consent forms and quizzes
• For question 5: This question is the closest thing to the quiz question that they will get! The benefit here is that they can work through it in pairs and if they need to go find information they are able to. You can answer questions how you usually would in a quiz scenario, but remind students to try to answer the question as if it were a quiz scenario.

Experimental Condition 1 – Problem-solving Facilitated (Blue + Red)
• Question 1 – This question looks the same in all tutorials and is the same kind of question they would see normally!
  o This question creates a baseline for student responses.
  o Support students as you usually would
• Questions 2-4 – These questions have prompts in them that will help students to work through the problem-solving process. They are likely questions that you would encourage students to think about if they were having difficulties answering the question, but we have made them explicit questions they need to answer instead.
• Scenarios:
  o Students are confused about the content of the question
    § Make sure they have tried to answer the prompts before asking you questions
    § If they have tried the prompts, ask them to explain their reasoning so far so you can understand where they are stuck
    § Suggest where they could go to find an answer to prompts, e.g. the text
  o Students are confused about the prompts themselves
    § Rephrase the prompt
    § Suggest problem-solving techniques that they can try out
      • Ex. Question 2 asks them to make sense of the ratios… you could suggest that they think back to the concepts being focused on this week or that they can look at their notes to find similar information and try to identify the concepts they are being asked to apply.
    § Do not give them the answers – it is important that they struggle through these questions and come up with their own answers.
• Question 5 – This question is the closest thing to the quiz question that they will get! The benefit here is that they can work through it in pairs and if they need to go find information they are able to. You can answer questions how you usually would in a quiz scenario, but remind students to try to answer the question as if it were a quiz scenario.
• Question 6 – Proceed as you usually would in a quiz situation!

Experimental Condition 2 – Self-Regulation Supported (Blue + Red + Purple)
• Question 1 – This question looks the same in all tutorials and is the same kind of question they would see normally!
  o This question creates a baseline for student responses.
  o Support students as you usually would
• Questions 2-4 – These questions have two kinds of prompts in them. The first (red) will help students to work through the problem-solving process. The second (purple) will help students to make connections between the content and the problem-solving process in a meaningful way.
  o The first set of questions are likely questions that you would encourage students to think about if they were having difficulties answering the question, but we have made them explicit questions they need to answer instead.
  o The second set of questions come in two forms – connected to the content and abstracted from the content. You might ask the first form of these questions, but it is unlikely that you would ask the second form in a traditional tutorial.
• Scenarios:
  o Students are confused about the content of the question
    § Make sure they have tried to answer the prompts before asking you questions
    § If they have tried the prompts, ask them to explain their reasoning so far so you can understand where they are stuck
    § Suggest where they could go to find an answer to prompts, e.g. the text
  o Students are confused about the problem-solving prompts (red)
    § Rephrase the prompt
    § Suggest problem-solving techniques that they can try out
      • Ex. Question 2 asks them to make sense of the ratios… you could suggest that they think back to the concepts being focused on this week or that they can look at their notes to find similar information and try to identify the concepts they are being asked to apply.
    § Do not give them the answers – it is important that they struggle through these questions and come up with their own answers.
  o Students are confused about the self-regulation prompts (purple)
    § Try to get them to talk through their logic and then get them to write down what they talked to you about
    § Ask them to walk through the process they went through to answer the blue and red prompts. This will likely articulate the problem-solving process. We just want to make sure that they are actively articulating the process so they are forced to reflect on it.
    § Note: they are more likely to get confused about the final prompt at the very end of each question because it is more abstract than the others. Encourage them to look back at their answers and to walk through the process to answer these questions.
• Question 5 – This question is the closest thing to the quiz question that they will get! The benefit here is that they can work through it in pairs and if they need to go find information they are able to. You can answer questions how you usually would in a quiz scenario, but remind students to try to answer the question as if it were a quiz scenario.
• Question 6 – Proceed as you usually would in a quiz situation!

8.3 Appendix C: Course syllabus

Biology 234: Fundamentals of Genetics
Fall 2013

The Major Theme: This course examines fundamental genetic principles: mutation, phenotype, segregation, linkage, complementation, and gene interaction, as well as many applications of these fundamentals.

Text:
• Griffiths, A.J.F., Wessler, S.R., Carroll, S.B. and Doebley, J. 2011. Introduction to Genetic Analysis, 10th Edition. W.H. Freeman and Company, New York.
• Available at the UBC Bookstore in looseleaf with e-book access ($98), new hardcover with e-book access ($152), used hardcover ($114), or rental hardcover ($85), and/or as an e-book ($98, but the price should go down). (You cannot sell softbound looseleaf copies back to the bookstore, but are free to sell them privately to other students as we will use the book again.)
• Copies of texts are available for short term loan in the Genetics Help Office (Room 2521). We do not recommend the 9th edition. There are changes in terms of text content as well as the end of chapter problems (some problems are new whereas others have shifted chapters and numbers). We are not extremely familiar with the 9th edition and cannot advise on what to read or what problems to do in that edition. If you want to compare the 9th edition with a copy of the new 10th edition you can sit in the Genetics Help Office and do that (BIOSCI 2519).

Connect: Course materials will be provided on-line using UBC’s Connect interface www.connect.ubc.ca. You can use the Connect discussion board to get assistance from your fellow classmates outside of class and tutorial.

Reading Quizzes: Lists of readings for each week are found on Connect. There are reading quizzes based on the readings for the upcoming week that are due each Monday before 9am. (For the first week only there is a reading quiz due by 9am on Friday, Sept 6th.) Your grade on each quiz will be an average based on your 2 possible attempts.

Tutorials: Attendance in tutorials is mandatory and vital to your success in this course. You cannot register in a tutorial which conflicts with your lecture. Contact Dr. Berezowsky for any issues related to tutorial registration. Tutorial questions are based on material covered in lecture the previous week.
Assigned tutorial problems for each week will be listed on Connect. Show your solved problems to your TA in tutorial and receive 0.5 marks for completion. There will also be 0.5 marks for a weekly tutorial quiz question. There are 12 tutorials but only 10 tutorial marks so you may miss a tutorial and still get full marks for this component.

Evaluation*: (one page, double-sided sheet of hand-written notes allowed for midterms and final exam)
Midterm 1    Fri Oct 4th    50 min   (15%)
Midterm 2    Wed Nov 6th    50 min   (15%)
Final        TBA            3 hrs    (55%)
Tutorials (~10%), reading quizzes (~3%), in-class activities, etc. (4%)   (15%)
*Tentative, and possibly subject to change

Missed quiz or exams: Any missed midterm exam(s) must be reported to Dr. Berezowsky with supporting documentation and upon approval, the midterm weight will be added to the final exam. Alternatively, a mark of 0% will be entered. There are no “makeup” midterm exams.

Any missed final exam must be reported to Science Advising (or the appropriate Faculty advising office) with supporting documentation. Standing Deferred status will only be granted if the student is in good academic standing in the course (see UBC calendar). Deferred exams will be scheduled through the Registrar’s Office and are generally held in late July to early August following the completion of the academic year. Deferred exams are not held within the same examination period from which they are deferred.

There are no provisions to alter the final exam schedule due to travel plans, etc. The last day of exams is Dec 18th so do not schedule flights earlier than Dec 19th. If you qualify under the “three exams within 24 hours” provision you must see Science Advising (or the appropriate Faculty advising office).

Please note this course schedule is tentative. Always check Connect for readings, tutorial questions, and possible changes to this schedule. Text readings are from Griffiths, 10th ed.; see Connect for tutorial problems.

PHENOTYPE AND MUTATION
Wed Sept 4th: Introduction. Readings: Ch 1: 1-11, 17-18, 23-25; Ch 8: section 8.2, 288-292 (for reading quiz due Friday 9am); Ch 7: Fig 7-25, p. 273. See CONNECT for a more detailed reading guide. Tutorials start in Week 2 and are based on topics from the previous week.
Fri Sept 6th: DNA, Genomes, Genes
Mon Sept 9th: DNA as code, Chromosome structure, Mutations. Readings for this block: Ch 7: 272-273, 275-278; Ch 1: 12-14; Ch 2: 40, 44-46; Ch 16: 553-558 and 560-562; Ch 19: 683-685; article on phenotypes in armadillos (weblink in Connect); Ch 10: 345-347; Ch 4: 138-139 & Fig 4-15 (p. 140). See CONNECT for a more detailed reading guide and for tutorial problems.
Wed Sept 11th: Phenotype
Fri Sept 13th: Reverse Genetic Analysis (predicting functional effects of various mutations in various parts of a gene at the RNA and protein levels, e.g. loss of function)
Mon Sept 16th: Dominance versus recessiveness. See CONNECT for readings.
Wed Sept 18th: National Day of Reconciliation. In lieu of attending class students are required to attend events either at the PNE or on campus.
Fri Sept 20th: Reverse genetic analysis cont.

SEGREGATION
Mon Sept 23rd: Mitosis and Meiosis
Wed Sept 25th: 1 gene, 2 alleles, 3 alleles
Fri Sept 27th: 2 genes, 2 alleles
Mon Sept 30th: X-linkage, dosage
Wed Oct 2nd: Pedigrees, Probability
Fri Oct 4th: Midterm #1

LINKAGE
Mon Oct 7th: Recombination
Wed Oct 9th: Linkage
Fri Oct 11th: Why do we map genes? Linkage cont.
Mon Oct 14th: Thanksgiving. No class.
Wed Oct 16th: Molecular markers. Reading quiz due Wed. 9am.
Fri Oct 18th: Molecular markers continued.

COMPLEMENTATION
Mon Oct 21st: Mutant screens
Wed Oct 23rd: Mutant screens continued. Complementation
Fri Oct 25th

APPLICATIONS OF THE GENETIC PILLARS
Mon Oct 28th: Gene Interaction
Wed Oct 30th: Gene Interaction
Fri Nov 1st: Gene Interaction
Mon Nov 4th: Review
Wed Nov 6th: Midterm #2
Fri Nov 8th: Ploidy. Reading quiz due Friday 9am.
Mon Nov 11th: Remembrance Day. No class.
Wed Nov 13th: Ploidy. Reading quiz due Wed. 9am.
Fri Nov 15th: Genomics
Mon Nov 18th: Genomics
Wed Nov 20th: Somatic cell genetics
Fri Nov 22nd: Cancer
Mon Nov 25th: Cancer
Wed Nov 27th: Wrap up and review
Fri Nov 29th: Review. Last day of classes

8.4 Appendix D: Consent form

University of British Columbia Consent Agreement

Title: Measuring self-regulation during problem set-up under two scaffolding conditions

As students of Biol 234, you are being asked to participate in a research study. Before you give your consent to be a volunteer, it is important that you read the following information and ask as many questions as necessary to be sure you understand what you will be asked to do.

Purpose of the Study: Undergraduate students are expected to gain a wide variety of skills to be successful, including the ability to regulate their own behaviour and problem solve. These skills are difficult to develop, especially in the early years of undergraduate degrees when students are adjusting to university. This study looks to understand how students go about problem solving in Biol 234, which requires strong problem-solving skills to be successful.

Description of the Study: Students in Biol 234 will participate in one tutorial where the materials are slightly different from the previous tutorials. Two different tutorials are possible: a content-based tutorial or a skills-based tutorial. The tutorial type has been randomly assigned to tutorial sections. Students will be asked the same questions in each tutorial, but the sub-questions will be slightly different. At the completion of the tutorial, students will complete one quiz question, which is a regular requirement of the tutorial component of the course. Tutorial answers and quiz answers will be evaluated for evidence of self-regulatory behaviours and understanding of course material. This study seeks to understand how these resources interplay with students’ problem-solving abilities.

All students have also been asked to complete a survey regarding their motivation, self-regulation, self-efficacy and goal orientation attitudes prior to participating in the tutorial.

What is Experimental in this Study: There are two different types of resources being introduced into the tutorial to evaluate if there is a difference in students’ problem-solving abilities. Both resources are expected to increase students’ problem-solving abilities compared to the traditional tutorial.

Risks or Discomforts: The researchers are not aware of any apparent risks of the study, as all conditions are intended to improve student learning and course organization. Regardless, we understand that students might feel uncomfortable participating in a research study. Should this situation arise, please inform the co-investigator Heather Fisher at any point during the term to remove yourself from the study. All data will be destroyed should you wish to discontinue with the study, with no consequences to the student.
Benefits of the Study: We expect you will benefit from participation in the study in the following ways:
a) Increased self-regulation and problem-solving skills.
b) Increased understanding of the material.
c) Potential changes to the course organization.

Confidentiality: All the data collected in the study will be strictly confidential and nobody except for the researchers will have access to it. All the data will be stored electronically on a password-protected computer on a secure UBC server. The data will be erased and destroyed five years after the completion of the study (Fall 2018). Confidentiality will be maintained during the publication of the results of the study: no names or any other personal information will be included in the publications. All personal information will be removed from the study to ensure confidentiality.

Incentives to Participate: The participant will not be paid to participate in this study.

Voluntary Nature of Participation: Participation in this study is voluntary. Your choice of whether or not to participate will not influence your future relations with the University of British Columbia and the Faculty of Science. If you decide to participate, you are free to withdraw your consent and to stop your participation at any time without penalty or loss of benefits to which you are entitled.

Questions about the Study: If you have any questions about the research, please contact the study investigators at any point. If you have questions regarding your rights as a human subject and participant in this study, you may contact the University of British Columbia Behavioural Research Ethics Board for information at the Office of Research Services.

Agreement: Your signature below indicates that you have read the information in this agreement and have had a chance to ask any questions you have about the study. Your signature also indicates that you agree to be in the study and have been told that you can change your mind and withdraw your consent to participate at any time. You have been given a copy of this agreement.

You have been told that by signing this consent agreement you are not giving up any of your legal rights.

_____________________________________
Name of Participant (please print)

_____________________________________   __________________
Signature of Participant                 Date

_____________________________________   __________________
Signature of Investigator                Date

8.5 Appendix E: Tutorial question set (for each condition)

Control Condition

QUESTION 1

The Mexican cavefish lives in a series of unconnected caves. Fish found in the caves have been blind for millennia, and interestingly cavefish can still interbreed with surface fish! The surface fish can see. Your research goal is to determine if blind cavefish from different populations are the result of mutations in the same gene or different genes and to determine which genes are responsible for blindness. You isolate 5 populations of blind fish from 5 different cave ponds. You select an individual fish from each population and cross them each with a sighted surface fish. All of the offspring are sighted.

You then set up a series of crosses such that individual fish from each of the 5 blind populations are mated with each other. You obtain the following results where + represents offspring that are sighted, and – represents blind offspring:

a. Consider all of the information provided and determine how many genes are working to produce sight in these fish.
Explain your reasoning. Be sure to define gene and allele symbols/labels.

b. Below is an illustration of where each population originated from. Draw the chromosomes labeled with alleles to indicate what gene(s) are causing blindness in the fish from each blind population, as well as the surface sighted fish. The fish are 2n=8. State any assumptions you are making.

c. You isolate a 6th blind population. When you mate a pure breeding blind fish from population #6 with sighted fish all of the offspring are blind. Could you use this blind fish from population #6 for complementation testing with the other blind strains? Explain why or why not.

QUESTION 2

In minks, wild types have an almost black coat. Breeders have developed many pure lines of color variants for the mink-coat industry. Two such pure lines are platinum (blue gray) and Aleutian (steel gray). These lines were used in crosses, with the following results:

a. Devise a genetic explanation of these three crosses. Show complete genotypes for the parents, the F1, and the F2 in the three crosses, and make sure that you show the alleles of each gene that you hypothesize for every mink.

b. Predict the F1 phenotypic ratios from crossing sapphire with platinum and with Aleutian pure lines.

c. If these F1s self, what will the phenotypic ratios in the two F2 populations be?

QUESTION 3

You are investigating a heritable heart disease. This disease results in abnormal development of the heart such that mice with this condition usually die at a much younger age than healthy mice. You are trying to determine how many genes are involved in this hereditary heart disease.

You have mice from 4 different strains, all exhibiting the abnormal heart condition (mouse #1-4). You crossed each of these mutants to true breeding wild-type and all of the offspring are wild-type. Below is the data from several crosses you performed to identify if the mice have mutations in the same gene or different genes. A “+” symbol represents normal, healthy mice, whereas a “-” symbol represents mice that died very young due to heart disease.

Mouse    #1   #2   #3   #4
#1        -    -    +    +
#2        -    -    +    +
#3        +    +    -    +
#4        +    +    +    -

a. What can you conclude based on this data? Show all of your work including genes and genotypes.

b. A fifth mouse (#5) with heart disease is identified from a separate population. When this mouse is crossed to a true breeding healthy mouse half of the offspring are healthy, the other half have heart disease. Similarly, when this mouse (#5) is crossed to each of the four mice listed above the results are the same for each cross: half of the offspring are healthy, the other half are diseased. What does this suggest? Show all of your work, including genotypes. State any assumptions you are making.

QUESTION 4

Recall the talon question from Part 1.

Imagine that you use X-rays to mutagenize fertilized owl eggs that are in the 1 cell stage of development (diploid zygote). You mutagenize two batches of fertilized eggs and let the eggs develop, and birds hatch, creating two M1 populations: A and B.

You count the number of wild-type (WT) and mutant owls in each population. Here is the data from your phenotype screening and counting. Assume one owl mating with one other produces one offspring.
Population A: M1: 11 WT, 1 mutant | M2: 11 WT, 2 mutant | M3: 9 WT, 3 mutant
Population B: M1: 10, all WT | M2: 10 WT | M3: 7 WT, 3 mutant
(M2: each M1 is crossed with a wild-type bird to create M2. M3: M2s randomly mate with each other to create M3; multiple mating occurs.)

What can you conclude based on this data? How are the mutants in population A different and/or the same from those in population B?

QUESTION 5

Filled-in symbols indicate deafness. Three families from different geographical locations are shown here, and the mating interactions between families. The families starting with individuals I-1 x I-2 and I-3 x I-4 were from one geographical location, the family starting with individuals II-5 x II-6 were from another distinct geographical location, and the family starting with individuals II-7 x II-8 were from yet another distinct geographical location. You can see how members of these families eventually come together in the III and IV generations.

a. Study the pedigree and explain how deafness could be inherited in each family. Propose a hypothesis to explain why none of the individuals in generation V are deaf, and why the deafness occurs in offspring of III-2 x III-3.
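Question 5's pedigree reasoning reduces to small probability calculations over single-gene crosses, combined with the possibility that deafness in different source populations involves different genes. A brief editorial sketch, with illustrative allele symbols:

```python
from fractions import Fraction

def p_recessive_child(p1, p2):
    """P(child is homozygous recessive at one gene); parents given as allele pairs."""
    hits = sum(a == 'd' and b == 'd' for a in p1 for b in p2)
    return Fraction(hits, 4)  # four equally likely allele combinations

print(p_recessive_child(('D', 'd'), ('D', 'd')))  # carrier x carrier -> 1/4
print(p_recessive_child(('d', 'd'), ('D', 'D')))  # affected x homozygous normal -> 0
```

The second line suggests one hypothesis for generation V: two deaf parents homozygous for recessive mutations in different genes produce children heterozygous at both genes, and hence hearing, by the same complementation logic as Questions 1 and 3.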
Problem-Solving Condition

QUESTION 1

The Mexican cavefish lives in a series of unconnected caves. Fish found in the caves have been blind for millennia, and interestingly cavefish can still interbreed with surface fish! The surface fish can see. Your research goal is to determine if blind cavefish from different populations are the result of mutations in the same gene or different genes and to determine which genes are responsible for blindness. You isolate 5 populations of blind fish from 5 different cave ponds. You select an individual fish from each population and cross them each with a sighted surface fish. All of the offspring are sighted.

You then set up a series of crosses such that individual fish from each of the 5 blind populations are mated with each other. You obtain the following results where + represents offspring that are sighted, and – represents blind offspring:

a. Consider all of the information provided and determine how many genes are working to produce sight in these fish. Explain your reasoning. Be sure to define gene and allele symbols/labels.

b. Below is an illustration of where each population originated from. Draw the chromosomes labeled with alleles to indicate what gene(s) are causing blindness in the fish from each blind population, as well as the surface sighted fish. The fish are 2n=8. State any assumptions you are making.

c. You isolate a 6th blind population. When you mate a pure breeding blind fish from population #6 with sighted fish all of the offspring are blind. Could you use this blind fish from population #6 for complementation testing with the other blind strains? Explain why or why not.

QUESTION 2

In minks, wild types have an almost black coat. Breeders have developed many pure lines of color variants for the mink-coat industry. Two such pure lines are platinum (blue gray) and Aleutian (steel gray). These lines were used in crosses, with the following results:

a. Devise a genetic explanation of these three crosses. Show complete genotypes for the parents, the F1, and the F2 in the three crosses, and make sure that you show the alleles of each gene that you hypothesize for every mink.

What are the ratios observed in the scenario?

b. Predict the F1 phenotypic ratios from crossing sapphire with platinum and with Aleutian pure lines.

What information do you know from the scenario that is necessary to answer this question?

c. If these F1s self, what will the phenotypic ratios in the two F2 populations be?

What work have you already completed that is useful to answer this question?

QUESTION 3

You are investigating a heritable heart disease. This disease results in abnormal development of the heart such that mice with this condition usually die at a much younger age than healthy mice. You are trying to determine how many genes are involved in this hereditary heart disease.

You have mice from 4 different strains, all exhibiting the abnormal heart condition (mouse #1-4). You crossed each of these mutants to true breeding wild-type and all of the offspring are wild-type. Below is the data from several crosses you performed to identify if the mice have mutations in the same gene or different genes. A “+” symbol represents normal, healthy mice, whereas a “-” symbol represents mice that died very young due to heart disease.

Mouse    #1   #2   #3   #4
#1        -    -    +    +
#2        -    -    +    +
#3        +    +    -    +
#4        +    +    +    -

a. What can you conclude based on this data? Show all of your work including genes and genotypes.

What information in the data is important to understand to answer this question?

b. A fifth mouse (#5) with heart disease is identified from a separate population. When this mouse is crossed to a true breeding healthy mouse half of the offspring are healthy, the other half have heart disease. Similarly, when this mouse (#5) is crossed to each of the four mice listed above the results are the same for each cross: half of the offspring are healthy, the other half are diseased. What does this suggest? Show all of your work, including genotypes. State any assumptions you are making.

Draw a diagram/representation of the new information that was just presented in this scenario.

Compare this representation of the data to the information you deciphered in Part A. Does this new information make sense compared to what you identified in Part A? Is it possible that there is another explanation for this new information?

QUESTION 4

Recall the talon question from Part 1.

Imagine that you use X-rays to mutagenize fertilized owl eggs that are in the 1 cell stage of development (diploid zygote). You mutagenize two batches of fertilized eggs and let the eggs develop, and birds hatch, creating two M1 populations: A and B.

You count the number of wild-type (WT) and mutant owls in each population. Here is the data from your phenotype screening and counting. Assume one owl mating with one other produces one offspring.

Population A: M1: 11 WT, 1 mutant | M2: 11 WT, 2 mutant | M3: 9 WT, 3 mutant
Population B: M1: 10, all WT | M2: 10 WT | M3: 7 WT, 3 mutant
(M2: each M1 is crossed with a wild-type bird to create M2. M3: M2s randomly mate with each other to create M3; multiple mating occurs.)

What can you conclude based on this data? How are the mutants in population A different and/or the same from those in population B?

What are you being asked to do in this question?

What data seems important to you to be able to answer this question? What information doesn’t make sense to you/strikes you as something you should know more about?

QUESTION 5

Filled-in symbols indicate deafness.
Three families from different geographical locations are shown here, and the mating interactions between families. The families starting with individuals I-1 x I-2 and I-3 x I-4 were from one geographical location, the family starting with individuals II-5 x II-6 were from another distinct geographical location, and the family starting with individuals II-7 x II-8 were from yet another distinct geographical location. You can see how members of these families eventually come together in the III and IV generations.

a. Study the pedigree and explain how deafness could be inherited in each family. Propose a hypothesis to explain why none of the individuals in generation V are deaf, and why the deafness occurs in offspring of III-2 x III-3.

Self-Regulated Learning Condition

QUESTION 1

The Mexican cavefish lives in a series of unconnected caves. Fish found in the caves have been blind for millennia, and interestingly cavefish can still interbreed with surface fish! The surface fish can see. Your research goal is to determine if blind cavefish from different populations are the result of mutations in the same gene or different genes and to determine which genes are responsible for blindness. You isolate 5 populations of blind fish from 5 different cave ponds. You select an individual fish from each population and cross them each with a sighted surface fish. All of the offspring are sighted.

You then set up a series of crosses such that individual fish from each of the 5 blind populations are mated with each other. You obtain the following results where + represents offspring that are sighted, and – represents blind offspring:

a. Consider all of the information provided and determine how many genes are working to produce sight in these fish. Explain your reasoning. Be sure to define gene and allele symbols/labels.

b. Below is an illustration of where each population originated from. Draw the chromosomes labeled with alleles to indicate what gene(s) are causing blindness in the fish from each blind population, as well as the surface sighted fish. The fish are 2n=8. State any assumptions you are making.

c. You isolate a 6th blind population. When you mate a pure breeding blind fish from population #6 with sighted fish all of the offspring are blind. Could you use this blind fish from population #6 for complementation testing with the other blind strains? Explain why or why not.

QUESTION 2

In minks, wild types have an almost black coat. Breeders have developed many pure lines of color variants for the mink-coat industry. Two such pure lines are platinum (blue gray) and Aleutian (steel gray). These lines were used in crosses, with the following results:

a. Devise a genetic explanation of these three crosses. Show complete genotypes for the parents, the F1, and the F2 in the three crosses, and make sure that you show the alleles of each gene that you hypothesize for every mink.

What are the ratios observed in the scenario?

What do the ratios tell you about the problem? Why is it important to make sense of the data to answer this question?

b. Predict the F1 phenotypic ratios from crossing sapphire with platinum and with Aleutian pure lines.

What information do you know from the scenario that is necessary to answer this question?

Why is this information important? How does this help you answer the question?

c.
If these F1s self, what will the phenotypic ratios in the two F2 populations be?

What work have you already completed that is useful to answer this question? How do you know that previous work is useful to answer this question?

What was a stumbling block you encountered when solving this problem? What did you do to successfully deal with it?

What strategies did you use that allowed you to successfully answer this problem? How can you use them to successfully answer another problem?

QUESTION 3

You are investigating a heritable heart disease. This disease results in abnormal development of the heart such that mice with this condition usually die at a much younger age than healthy mice. You are trying to determine how many genes are involved in this hereditary heart disease.

You have mice from 4 different strains, all exhibiting the abnormal heart condition (mouse #1-4). You crossed each of these mutants to true breeding wild-type and all of the offspring are wild-type. Below is the data from several crosses you performed to identify if the mice have mutations in the same gene or different genes. A “+” symbol represents normal, healthy mice, whereas a “-” symbol represents mice that died very young due to heart disease.

Mouse    #1   #2   #3   #4
#1        -    -    +    +
#2        -    -    +    +
#3        +    +    -    +
#4        +    +    +    -

a. What can you conclude based on this data? Show all of your work including genes and genotypes.

What information in the data is important to understand to answer this question?

How do you make sense of this information? Why is it important to understand the +/- data presented to answer this question?

b. A fifth mouse (#5) with heart disease is identified from a separate population. When this mouse is crossed to a true breeding healthy mouse half of the offspring are healthy, the other half have heart disease. Similarly, when this mouse (#5) is crossed to each of the four mice listed above the results are the same for each cross: half of the offspring are healthy, the other half are diseased. What does this suggest? Show all of your work, including genotypes. State any assumptions you are making.

Draw a diagram/representation of the new information that was just presented in this scenario.

Compare this representation of the data to the information you deciphered in Part A. Does this new information make sense compared to what you identified in Part A? Is it possible that there is another explanation for this new information?

How did creating a representation of the problem help you solve the problem? Why did it help you solve the problem?

The quiz question focuses on your ability to problem-solve. What did you do here that was helpful to solve the problem? When you do the quiz at the end of the tutorial, how can you use these problem-solving approaches to help you be successful on the quiz?

QUESTION 4

Recall the talon question from Part 1.

Imagine that you use X-rays to mutagenize fertilized owl eggs that are in the 1 cell stage of development (diploid zygote). You mutagenize two batches of fertilized eggs and let the eggs develop, and birds hatch, creating two M1 populations: A and B.

You count the number of wild-type (WT) and mutant owls in each population. Here is the data from your phenotype screening and counting. Assume one owl mating with one other produces one offspring.
Population A: M1: 11 WT, 1 mutant | M2: 11 WT, 2 mutant | M3: 9 WT, 3 mutant
Population B: M1: 10, all WT | M2: 10 WT | M3: 7 WT, 3 mutant
(M2: each M1 is crossed with a wild-type bird to create M2. M3: M2s randomly mate with each other to create M3; multiple mating occurs.)

What can you conclude based on this data? How are the mutants in population A different and/or the same from those in population B?

What are you being asked to do in this question?

What data seems important to you to be able to answer this question? What information doesn’t make sense to you/strikes you as something you should know more about?

How do you know this data is important to understand in order to solve the problem?

The tutorial questions tackle many different concept areas but all focus on your ability to solve problems. Were there approaches to interpreting problems and important data that you used across all the questions? What were they?

What clues were you able to identify in the questions to help you approach the problem? How will you approach the quiz question now that you have successfully answered these questions?

QUESTION 5

Filled-in symbols indicate deafness. Three families from different geographical locations are shown here, and the mating interactions between families. The families starting with individuals I-1 x I-2 and I-3 x I-4 were from one geographical location, the family starting with individuals II-5 x II-6 were from another distinct geographical location, and the family starting with individuals II-7 x II-8 were from yet another distinct geographical location. You can see how members of these families eventually come together in the III and IV generations.

a. Study the pedigree and explain how deafness could be inherited in each family. Propose a hypothesis to explain why none of the individuals in generation V are deaf, and why the deafness occurs in offspring of III-2 x III-3.

Quiz Question

A genetics researcher at UBC studying sensory perception did a mutagenesis in Caenorhabditis elegans for the purpose of selecting mutants that, unlike wild type worms, did not move in the direction of a food source. He isolated three worms defective in sensing food from the M2 mutagenized population. These mutants were named Hungry (hgy #1, hgy #2, hgy #3). He took the three mutants, crossed them to each other and determined whether the F1 progeny were normal or defective in sensing food in order to determine how many genes were identified by the three mutants. He got the following results:

Phenotypes of the F1 progeny resulting from the indicated cross:

          hgy #1   hgy #2   hgy #3
hgy #1    hgy      hgy      hgy
hgy #2             hgy      wild type
hgy #3                      hgy

Remember, you are being marked on your answer and your process! Show all of your work and explain yourself as clearly as possible.

a. What did the researcher forget to do? Explain why this/these steps are important. (4 points)

b. Explain what is unusual about the results the researcher obtained. (2 points)

c. Provide one reasonable hypothesis for the unusual results and suggest how many genes have been identified. Explain in a few sentences how you would test your hypothesis. (4 points)

Bonus points: can you come up with a second hypothesis to explain the unusual result?
8.6 Appendix F: Coding manual

[The coding manual appears as full-page images in the original document and is not reproduced here.]

8.7 Appendix G: Multivariate (MANCOVA) analysis for question Completeness by condition (Midterm 1 covariate)

Between-Subjects Factors
Condition   N
1.0         66
2.0         113
3.0         119

Descriptive Statistics
            Condition   Mean     Std. Deviation   N
1Comp_Ave   1           2.7264   .55634           67
            2           2.8437   .38594           113
            3           2.6083   .66899           120
            Total       2.7233   .55948           300
2Comp_Ave   1           2.4030   .93481           67
            2           2.6932   .63016           113
            3           1.6472   1.20634          120
            Total       2.2100   1.06983          300
3Comp_Ave   1           1.9030   1.26503          67
            2           2.2257   .94015           113
            3           .9542    1.20048          120
            Total       1.6450   1.26166          300
4_Comp      1           1.57     1.427            67
            2           1.45     1.225            113
            3           .28      .756             120
            Total       1.01     1.264            300
5_Comp      1           1.06     1.266            67
            2           1.54     1.254            113
            3           2.48     .722             120
            Total       1.81     1.219            300

Multivariate Tests (a)
Effect                 Statistic            Value   F         Hypothesis df   Error df   Sig.   Partial Eta Squared
Intercept              Pillai's Trace       .453    48.093b   5.000           290.000    .000   .453
                       Wilks' Lambda        .547    48.093b   5.000           290.000    .000   .453
                       Hotelling's Trace    .829    48.093b   5.000           290.000    .000   .453
                       Roy's Largest Root   .829    48.093b   5.000           290.000    .000   .453
Condition * Midterm1   Pillai's Trace       .068    2.052     10.000          582.000    .026   .034
                       Wilks' Lambda        .933    2.046b    10.000          580.000    .027   .034
                       Hotelling's Trace    .071    2.040     10.000          578.000    .028   .034
                       Roy's Largest Root   .042    2.438c    5.000           291.000    .035   .040
Midterm1               Pillai's Trace       .078    4.925b    5.000           290.000    .000   .078
                       Wilks' Lambda        .922    4.925b    5.000           290.000    .000   .078
                       Hotelling's Trace    .085    4.925b    5.000           290.000    .000   .078
                       Roy's Largest Root   .085    4.925b    5.000           290.000    .000   .078
Condition              Pillai's Trace       .155    4.884     10.000          582.000    .000   .077
                       Wilks' Lambda        .849    4.938b    10.000          580.000    .000   .078
                       Hotelling's Trace    .173    4.991     10.000          578.000    .000   .079
                       Roy's Largest Root   .138    8.014c    5.000           291.000    .000   .121
a. Design: Intercept + Condition * Midterm1 + Midterm1 + Condition
b. Exact statistic
c. The statistic is an upper bound on F that yields a lower bound on the significance level.

Tests of Between-Subjects Effects
Source                 Dependent Variable   Type III Sum of Squares   df    Mean Square   F         Sig.   Partial Eta Squared
Corrected Model        1Comp_Ave            6.273a                    5     1.255         4.224     .001   .067
                       2Comp_Ave            86.302b                   5     17.260        19.829    .000   .252
                       3Comp_Ave            120.953c                  5     24.191        20.035    .000   .254
                       4_Comp               121.601d                  5     24.320        20.063    .000   .254
                       5_Comp               114.604e                  5     22.921        20.447    .000   .258
Intercept              1Comp_Ave            69.190                    1     69.190        232.960   .000   .442
                       2Comp_Ave            12.766                    1     12.766        14.666    .000   .048
                       3Comp_Ave            .949                      1     .949          .786      .376   .003
                       4_Comp               .006                      1     .006          .005      .944   .000
                       5_Comp               5.366                     1     5.366         4.787     .029   .016
Condition * Midterm1   1Comp_Ave            1.690                     2     .845          2.845     .060   .019
                       2Comp_Ave            4.087                     2     2.044         2.348     .097   .016
                       3Comp_Ave            4.316                     2     2.158         1.787     .169   .012
                       4_Comp               1.968                     2     .984          .812      .445   .005
                       5_Comp               4.306                     2     2.153         1.921     .148   .013
Midterm1               1Comp_Ave            .237                      1     .237          .798      .373   .003
                       2Comp_Ave            14.043                    1     14.043        16.133    .000   .052
                       3Comp_Ave            20.945                    1     20.945        17.347    .000   .056
                       4_Comp               13.607                    1     13.607        11.225    .001   .037
                       5_Comp               10.322                    1     10.322        9.208     .003   .030
Condition              1Comp_Ave            2.153                     2     1.076         3.624     .028   .024
                       2Comp_Ave            13.320                    2     6.660         7.651     .001   .049
                       3Comp_Ave            8.986                     2     4.493         3.721     .025   .025
                       4_Comp               2.484                     2     1.242         1.025     .360   .007
                       5_Comp               15.981                    2     7.990         7.128     .001   .046
Error                  1Comp_Ave            87.319                    294   .297
                       2Comp_Ave            255.913                   294   .870
                       3Comp_Ave            354.989                   294   1.207
                       4_Comp               356.386                   294   1.212
                       5_Comp               329.566                   294   1.121
Total                  1Comp_Ave            2318.556                  300
                       2Comp_Ave            1807.444                  300
                       3Comp_Ave            1287.750                  300
                       4_Comp               782.000                   300
                       5_Comp               1427.000                  300
Corrected Total        1Comp_Ave            93.592                    299
                       2Comp_Ave            342.214                   299
                       3Comp_Ave            475.943                   299
                       4_Comp               477.987                   299
                       5_Comp               444.170                   299
a. R Squared = .067 (Adjusted R Squared = .051)
b. R Squared = .252 (Adjusted R Squared = .239)
c. R Squared = .254 (Adjusted R Squared = .241)
d. R Squared = .254 (Adjusted R Squared = .242)
e. R Squared = .258 (Adjusted R Squared = .245)
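The SPSS design above can be reproduced in open tooling. Below is a minimal sketch using statsmodels, assuming a long-format data file with one row per student pair; the file name and column names (Q1_comp through Q5_comp for the five completeness scores, plus condition and midterm1) are hypothetical, since the study's data file is not reproduced here.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical file and column names, for illustration only.
df = pd.read_csv('completeness_scores.csv')

manova = MANOVA.from_formula(
    'Q1_comp + Q2_comp + Q3_comp + Q4_comp + Q5_comp ~ C(condition) * midterm1',
    data=df,
)
print(manova.mv_test())  # Pillai's trace, Wilks' lambda, Hotelling's trace, Roy's root
```

The `C(condition) * midterm1` term mirrors the SPSS design: condition, the Midterm 1 covariate, and their interaction.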
8.8 Appendix H: Midterm 1 by condition ANOVA

Control condition = 1.00; Problem-Solving condition = 2.00; Self-Regulated Learning condition = 3.00

Descriptives: Midterm1
         N     Mean    Std. Deviation   Std. Error   95% CI Lower   95% CI Upper   Minimum   Maximum
1.00     67    .7384   .13002           .01588       .7066          .7701          .36       .98
2.00     113   .7350   .16796           .01580       .7037          .7663          .00       .97
3.00     120   .7383   .16271           .01485       .7089          .7677          .14       1.00
Total    300   .7371   .15760           .00910       .7192          .7550          .00       1.00

ANOVA: Midterm1
                 Sum of Squares   df    Mean Square   F      Sig.
Between Groups   .001             2     .000          .015   .985
Within Groups    7.426            297   .025
Total            7.426            299

Multiple Comparisons (Dependent Variable: Midterm1; Tukey HSD)
(I) Condition   (J) Condition   Mean Difference (I-J)   Std. Error   Sig.    95% CI Lower   95% CI Upper
1.00            2.00            .00331                  .02438       .990    -.0541         .0607
1.00            3.00            .00002                  .02411       1.000   -.0568         .0568
2.00            1.00            -.00331                 .02438       .990    -.0607         .0541
2.00            3.00            -.00329                 .02073       .986    -.0521         .0455
3.00            1.00            -.00002                 .02411       1.000   -.0568         .0568
3.00            2.00            .00329                  .02073       .986    -.0455         .0521
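Appendix H's check that the three conditions were equivalent on Midterm 1 is a one-way ANOVA with Tukey HSD follow-ups. A sketch under the same hypothetical file layout as above:

```python
import pandas as pd
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

df = pd.read_csv('completeness_scores.csv')  # hypothetical file, as above

# One-way ANOVA of Midterm 1 score across the three conditions.
groups = [g['midterm1'].to_numpy() for _, g in df.groupby('condition')]
f_stat, p_value = stats.f_oneway(*groups)
print(f'F({len(groups) - 1}, {len(df) - len(groups)}) = {f_stat:.3f}, p = {p_value:.3f}')
# Appendix H reports F = .015, p = .985: no detectable difference between conditions.

# Tukey HSD pairwise comparisons, as in the Multiple Comparisons table.
print(pairwise_tukeyhsd(df['midterm1'], df['condition']))
```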
8.9 Appendix I: Engagement Pattern tables

Each table cross-tabulates Engagement Profile by response category (A/B/C/D) within each condition. Cells are given as A/B/C/D/Total per condition; "–" marks an empty cell. Count tables report students; percentage tables report the percentage of each condition.

Question 1a – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          10/11/2/–/23      40/20/–/–/60      4/8/–/–/12        95
Run-out-of-Time   17/15/–/–/32      27/7/–/–/34       –/4/–/–/4         70
Pick-Up           2/2/–/–/4         7/–/2/–/9         59/30/4/2/95      108
Sampler           4/2/–/2/8         6/3/–/1/10        9/–/–/–/9         27
Subtotal          33/30/2/2/67      80/30/2/1/113     72/42/4/2/120     300

Question 1a – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          14.9/16.4/3.0/–/34.3     35.4/17.7/–/–/53.1       3.3/6.7/–/–/10.0
Run-out-of-Time   25.4/22.4/–/–/47.8       23.9/6.2/–/–/30.1        –/3.3/–/–/3.3
Pick-Up           3.0/3.0/–/–/6.0          6.2/–/1.8/–/8.0          49.2/25.0/3.3/1.7/79.2
Sampler           6.0/3.0/–/3.0/11.9       5.3/2.7/–/0.9/8.8        7.5/–/–/–/7.5
% of Condition    49.3/44.8/3.0/3.0/100    70.8/26.5/1.8/0.9/100    60.0/35.0/3.3/1.7/100

Question 1b – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          –/19/–/4/23       –/60/–/–/60       –/12/–/–/12       95
Run-out-of-Time   –/27/–/5/32       –/32/–/2/34       –/2/–/2/4         70
Pick-Up           –/2/–/2/4         –/7/–/2/9         –/75/–/20/95      108
Sampler           –/4/–/4/8         –/5/–/5/10        –/7/–/2/9         27
Subtotal          –/52/–/15/67      –/104/–/9/113     –/96/–/24/120     300

Question 1b – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          –/28.4/–/6.0/34.3        –/53.1/–/–/53.1          –/10.0/–/–/10.0
Run-out-of-Time   –/40.3/–/7.5/47.8        –/28.3/–/1.8/30.1        –/1.7/–/1.7/3.3
Pick-Up           –/3.0/–/3.0/6.0          –/6.2/–/1.8/8.0          –/62.5/–/16.7/79.2
Sampler           –/6.0/–/6.0/12.0         –/4.4/–/4.4/8.8          –/5.8/–/1.7/7.5
% of Condition    –/77.6/–/22.4/100        –/92.0/–/8.0/100         –/80.0/–/20.0/100

Question 1c – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          14/7/–/2/23       41/17/–/2/60      6/4/–/2/12        95
Run-out-of-Time   12/15/–/5/32      18/16/–/–/34      –/2/–/2/4         70
Pick-Up           2/2/–/–/4         5/4/–/–/9         47/28/3/17/95     108
Sampler           6/–/–/2/8         1/4/–/5/10        4/3/–/2/9         27
Subtotal          34/24/–/9/67      65/41/–/7/113     57/37/3/23/120    300

Question 1c – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          20.9/10.4/–/3.0/34.3     36.3/15.0/–/1.8/53.1     5.0/3.3/–/1.7/10.0
Run-out-of-Time   17.9/22.4/–/7.5/47.8     15.9/14.2/–/–/30.1       –/1.7/–/1.7/3.3
Pick-Up           3.0/3.0/–/–/6.0          4.4/3.5/–/–/8.0          39.2/23.3/2.5/14.2/79.2
Sampler           9.0/–/–/3.0/12.0         0.9/3.5/–/4.4/8.8        3.3/2.5/–/1.7/7.5
% of Condition    50.7/35.8/–/13.4/100     57.5/36.3/–/6.2/100      47.5/30.8/2.5/19.2/100

Question 2a – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          23/–/–/–/23       53/–/7/–/60       12/–/–/–/12       95
Run-out-of-Time   20/6/3/3/32       33/–/–/1/34       2/–/–/2/4         70
Pick-Up           4/–/–/–/4         9/–/–/–/9         48/9/4/34/95      108
Sampler           6/2/–/–/8         7/–/–/3/10        5/–/4/–/9         27
Subtotal          53/8/3/3/67       102/–/7/4/113     67/9/8/36/120     300

Question 2a – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          34.3/–/–/–/34.3          46.9/–/6.2/–/53.1        10.0/–/–/–/10.0
Run-out-of-Time   29.9/9.0/4.5/4.5/47.8    29.2/–/–/0.9/30.1        1.7/–/–/1.7/3.3
Pick-Up           6.0/–/–/–/6.0            8.0/–/–/–/8.0            40.0/7.5/3.3/28.3/79.2
Sampler           9.0/3.0/–/–/12.0         6.2/–/–/2.7/8.8          2.5/–/3.3/–/7.5
% of Condition    79.1/11.9/4.5/4.5/100    90.3/–/6.2/3.5/100       55.8/7.5/6.7/30.0/100

Question 2b – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          21/–/2/–/23       52/6/2/–/60       12/–/–/–/12       95
Run-out-of-Time   22/–/–/10/32      25/2/2/5/34       2/–/–/2/4         70
Pick-Up           2/–/2/–/4         7/–/2/–/9         35/9/9/42/95      108
Sampler           4/–/2/2/8         4/–/–/6/10        –/–/2/7/9         27
Subtotal          49/–/6/12/67      88/8/6/11/113     49/9/11/51/120    300

Question 2b – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          31.3/–/3.0/–/34.3        46.0/5.3/1.8/–/53.1      10.0/–/–/–/10.0
Run-out-of-Time   32.8/–/–/14.9/47.8       22.1/1.8/1.8/4.4/30.1    1.7/–/–/1.7/3.3
Pick-Up           3.0/–/3.0/–/6.0          6.2/–/1.8/–/8.0          29.2/7.5/7.5/35.0/79.2
Sampler           6.0/–/3.0/3.0/12.0       3.5/–/–/5.3/8.8          –/–/1.7/5.8/7.5
% of Condition    73.1/–/9.0/17.9/100      77.9/7.1/5.3/9.7/100     40.8/7.5/9.2/42.5/100
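Tables like the ones above (and those that follow) can be tabulated directly from per-student codes. A minimal pandas sketch, assuming a hypothetical long-format file with one row per student and columns Profile (the Engagement Profile), Condition, and Category (A through D):

    # Count and percentage-of-condition crosstabs (hypothetical column names).
    import pandas as pd

    df = pd.read_csv("engagement_codes.csv")  # hypothetical per-student codes

    # Counts: Engagement Profile rows; (Condition, Category) column blocks.
    counts = pd.crosstab(df["Profile"], [df["Condition"], df["Category"]])

    # Percentage of condition: divide each condition's block of columns by
    # that condition's N, so all cells within a condition block sum to 100.
    cond_n = df["Condition"].value_counts()
    pct = counts.div(cond_n, axis=1, level=0).mul(100).round(1)

    print(counts)
    print(pct)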
Question 2c – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          23/–/–/–/23       48/8/2/2/60       10/2/–/–/12       95
Run-out-of-Time   11/3/4/14/32      22/2/–/10/34      2/–/–/2/4         70
Pick-Up           2/–/–/2/4         6/3/–/–/9         27/4/4/60/95      108
Sampler           2/2/–/4/8         2/–/2/6/10        –/–/–/9/9         27
Subtotal          38/5/4/20/67      78/13/4/18/113    39/6/4/71/120     300

Question 2c – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          34.3/–/–/–/34.3          42.5/7.1/1.8/1.8/53.1    8.3/1.7/–/–/10.0
Run-out-of-Time   16.4/4.5/6.0/20.9/47.8   19.5/1.8/–/8.8/30.1      1.7/–/–/1.7/3.3
Pick-Up           3.0/–/–/3.0/6.0          5.3/2.7/–/–/8.0          22.5/3.3/3.3/50.0/79.2
Sampler           3.0/3.0/–/6.0/12.0       1.8/–/1.8/5.3/8.8        –/–/–/7.5/7.5
% of Condition    56.7/7.5/6.0/29.9/100    69.0/11.5/3.5/15.9/100   32.5/5.0/3.3/59.2/100

Question 3a – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          6/17/–/–/23       29/31/–/–/60      8/4/–/–/12        95
Run-out-of-Time   4/15/–/13/32      16/10/–/8/34      –/–/–/4/4         70
Pick-Up           –/2/–/2/4         2/7/–/–/9         12/18/–/65/95     108
Sampler           –/6/–/2/8         3/4/–/3/10        2/5/–/2/9         27
Subtotal          10/40/–/17/67     50/52/–/11/113    22/27/–/71/120    300

Question 3a – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          9.0/25.4/–/–/34.3        25.7/27.4/–/–/53.1       6.7/3.3/–/–/10.0
Run-out-of-Time   6.0/22.4/–/19.4/47.8     14.2/8.8/–/7.1/30.1      –/–/–/3.3/3.3
Pick-Up           –/3.0/–/3.0/6.0          1.8/6.2/–/–/8.0          10.0/15.0/–/54.2/79.2
Sampler           –/9.0/–/3.0/12.0         2.7/3.5/–/2.7/8.8        1.7/4.2/–/1.7/7.5
% of Condition    14.9/59.7/–/25.4/100     44.2/46.0/–/9.7/100      18.3/22.5/–/59.2/100

Question 3b – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          10/9/2/2/23       40/8/6/6/60       4/6/–/2/12        95
Run-out-of-Time   6/9/–/17/32       10/4/–/20/34      –/–/–/4/4         70
Pick-Up           –/–/–/4/4         –/–/–/9/9         4/12/–/79/95      108
Sampler           –/2/–/6/8         –/2/–/8/10        3/–/–/6/9         27
Subtotal          16/20/2/29/67     50/14/6/43/113    11/18/–/91/120    300

Question 3b – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          14.9/13.4/3.0/3.0/34.3   35.4/7.1/5.3/5.3/53.1    3.3/5.0/–/1.7/10.0
Run-out-of-Time   9.0/13.4/–/25.4/47.8     8.8/3.5/–/17.7/30.1      –/–/–/3.3/3.3
Pick-Up           –/–/–/6.0/6.0            –/–/–/8.0/8.0            3.3/10.0/–/65.8/79.2
Sampler           –/3.0/–/9.0/12.0         –/1.8/–/7.1/8.8          2.5/–/–/5.0/7.5
% of Condition    23.9/29.9/3.0/43.3/100   44.2/12.4/5.3/38.1/100   9.2/15.0/–/75.8/100

Question 4 – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          –/21/–/2/23       10/24/2/24/60     2/6/2/2/12        95
Run-out-of-Time   –/6/–/26/32       2/6/2/24/34       –/–/–/4/4         70
Pick-Up           –/–/–/4/4         –/–/–/9/9         –/–/–/95/95       108
Sampler           –/2/–/6/8         –/–/1/9/10        –/–/3/6/9         27
Subtotal          –/29/–/38/67      12/30/5/66/113    2/6/5/107/120     300

Question 4 – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          –/31.3/–/3.0/34.3        8.8/21.2/1.8/21.2/53.1   1.7/5.0/1.7/1.7/10.0
Run-out-of-Time   –/9.0/–/38.8/47.8        1.8/5.3/1.8/21.2/30.1    –/–/–/3.3/3.3
Pick-Up           –/–/–/6.0/6.0            –/–/–/8.0/8.0            –/–/–/79.2/79.2
Sampler           –/3.0/–/9.0/12.0         –/–/0.9/8.0/8.8          –/–/2.5/5.0/7.5
% of Condition    –/43.3/–/56.7/100        10.6/26.5/4.4/58.4/100   1.7/5.0/4.2/89.2/100

Question 5 – Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          19/4/–/–/23       26/20/2/12/60     8/2/–/2/12        95
Run-out-of-Time   –/–/–/32/32       –/–/–/34/34       –/–/–/4/4         70
Pick-Up           –/4/–/–/4         5/4/–/–/9         59/24/4/8/95      108
Sampler           2/–/–/6/8         –/2/–/8/10        3/4/–/2/9         27
Subtotal          21/8/–/38/67      31/26/2/54/113    70/30/4/16/120    300
Question 5 – Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          28.4/6.0/–/–/34.3        23.0/17.7/1.8/10.6/53.1  6.7/1.7/–/1.7/10.0
Run-out-of-Time   –/–/–/47.8/47.8          –/–/–/30.1/30.1          –/–/–/3.3/3.3
Pick-Up           –/6.0/–/–/6.0            4.4/3.5/–/–/8.0          49.2/20.0/3.3/6.7/79.2
Sampler           3.0/–/–/9.0/12.0         –/1.8/–/7.1/8.8          2.5/3.3/–/1.7/7.5
% of Condition    31.3/11.9/–/56.7/100     27.4/23.0/1.8/47.8/100   58.3/25.0/3.3/13.3/100

Quiz Question – Part A: Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          5/10/–/8/23       5/39/–/16/60      1/5/–/6/12        95
Run-out-of-Time   8/19/–/5/32       5/19/2/8/34       –/1/–/3/4         70
Pick-Up           –/3/–/1/4         –/4/–/5/9         14/35/2/44/95     108
Sampler           –/4/–/4/8         1/4/–/5/10        1/6/–/2/9         27
Subtotal          13/36/–/18/67     11/66/2/34/113    16/47/2/55/120    300

Quiz Question – Part A: Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          7.5/14.9/–/12.0/34.3     4.4/34.5/–/14.2/53.1     0.8/4.2/–/5.0/10.0
Run-out-of-Time   11.9/28.4/–/7.5/47.8     4.4/16.8/1.8/7.1/30.1    –/0.8/–/2.5/3.3
Pick-Up           –/4.5/–/1.5/6.0          –/3.5/–/4.4/8.0          11.7/29.2/1.7/36.7/79.2
Sampler           –/6.0/–/6.0/12.0         0.9/3.5/–/4.4/8.8        0.8/5.0/–/1.7/7.5
% of Condition    19.4/53.7/–/26.9/100     9.7/58.4/1.8/30.1/100    13.3/39.2/1.7/45.8/100

Quiz Question – Part B: Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          14/7/2/–/23       32/7/1/20/60      8/–/–/4/12        95
Run-out-of-Time   17/3/4/8/32       23/5/–/6/34       1/1/–/2/4         70
Pick-Up           1/–/–/3/4         3/2/1/3/9         64/12/3/16/95     108
Sampler           5/2/–/1/8         2/3/–/5/10        4/1/1/3/9         27
Subtotal          37/12/6/12/67     60/17/2/34/113    77/14/4/25/120    300

Quiz Question – Part B: Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          20.9/10.4/3.0/–/34.3     28.3/6.2/0.9/17.7/53.1   6.7/–/–/3.3/10.0
Run-out-of-Time   25.4/4.5/6.0/12.0/47.8   20.4/4.4/–/5.3/30.1      0.8/0.8/–/1.7/3.3
Pick-Up           1.5/–/–/4.5/6.0          2.7/1.8/0.9/2.7/8.0      53.3/10.0/2.5/13.3/79.2
Sampler           7.5/3.0/–/1.5/12.0       1.8/2.7/–/4.4/8.8        3.3/0.8/0.8/2.5/7.5
% of Condition    55.2/17.9/9.0/17.9/100   53.1/15.0/1.8/30.1/100   64.2/11.7/3.3/20.8/100

Quiz Question – Part C/Bonus Maximum Score: Count
Profile           Control           Problem-Solving   Self-Regulated    Total
Complete          14/1/–/8/23       20/7/6/27/60      5/2/–/5/12        95
Run-out-of-Time   9/4/3/16/32       7/–/2/25/34       1/1/–/2/4         70
Pick-Up           –/–/–/4/4         –/–/–/9/9         23/11/6/55/95     108
Sampler           1/–/2/5/8         1/–/1/8/10        1/1/–/7/9         27
Subtotal          24/5/5/33/67      28/7/9/69/113     30/15/6/69/120    300
Quiz Question – Part C/Bonus Maximum Score: Percentage of Condition
Profile           Control                  Problem-Solving          Self-Regulated
Complete          20.9/1.5/–/11.9/34.3     17.7/6.2/5.3/23.9/53.1   4.2/1.7/–/4.2/10.0
Run-out-of-Time   13.4/6.0/4.5/23.9/47.8   6.2/–/1.8/22.1/30.1      0.8/0.8/–/1.7/3.3
Pick-Up           –/–/–/6.0/6.0            –/–/–/8.0/8.0            19.2/9.2/5.0/45.8/79.2
Sampler           1.5/–/3.0/7.5/12.0       0.9/–/0.9/7.1/8.8        0.8/0.8/–/5.8/7.5
% of Condition    35.8/7.5/7.5/49.3/100    24.8/6.2/8.0/61.1/100    25.0/12.5/5.0/57.5/100

8.10 Appendix J: Multivariate (MANCOVA) analysis for Quiz Question

Independent variable: Condition (Control condition = 1.00; Problem-Solving condition = 2.00; Self-Regulated Learning condition = 3.00)
Dependent variables: Quiz Question averages (6 = Quiz Question; Comp = Completeness scale; Corr = Correctness scale; Exp = Explanation scale)
Covariate: Midterm 1

Between-Subjects Factors
Condition   N
1           66
2           113
3           119

Descriptive Statistics
Variable    Condition   Mean     Std. Deviation   N
6Comp_Ave   1           2.9242   .23963           66
            2           2.9587   .17891           113
            3           2.9692   .15028           119
            Total       2.9553   .18414           298
6Corr_Ave   1           1.5354   .40879           66
            2           1.3894   .45185           113
            3           1.4006   .47850           119
            Total       1.4262   .45603           298
6Exp_Ave    1           1.2525   .52643           66
            2           1.0265   .50226           113
            3           1.1485   .49231           119
            Total       1.1253   .50951           298

Multivariate Tests (a)
Effect                 Test                 Value   F           Hyp. df   Error df   Sig.   Partial Eta Squared
Intercept              Pillai's Trace       .903    896.542 b   3         290        .000   .903
                       Wilks' Lambda        .097    896.542 b   3         290        .000   .903
                       Hotelling's Trace    9.275   896.542 b   3         290        .000   .903
                       Roy's Largest Root   9.275   896.542 b   3         290        .000   .903
Condition * Midterm1   Pillai's Trace       .022    1.064       6         582        .383   .011
                       Wilks' Lambda        .978    1.062 b     6         580        .384   .011
                       Hotelling's Trace    .022    1.059       6         578        .386   .011
                       Roy's Largest Root   .016    1.529 c     3         291        .207   .016
Midterm1               Pillai's Trace       .071    7.339 b     3         290        .000   .071
                       Wilks' Lambda        .929    7.339 b     3         290        .000   .071
                       Hotelling's Trace    .076    7.339 b     3         290        .000   .071
                       Roy's Largest Root   .076    7.339 b     3         290        .000   .071
Condition              Pillai's Trace       .025    1.249       6         582        .280   .013
                       Wilks' Lambda        .975    1.246 b     6         580        .281   .013
                       Hotelling's Trace    .026    1.243       6         578        .282   .013
                       Roy's Largest Root   .019    1.802 c     3         291        .147   .018
a. Design: Intercept + Condition * Midterm1 + Midterm1 + Condition
b. Exact statistic
c. The statistic is an upper bound on F that yields a lower bound on the significance level.

Tests of Between-Subjects Effects
Source                 Dependent Variable   Type III SS   df    Mean Square   F          Sig.   Partial Eta Squared
Corrected Model        6Comp_Ave            .167 a        5     .033          .985       .427   .017
                       6Corr_Ave            4.073 b       5     .815          4.123      .001   .066
                       6Exp_Ave             7.806 c       5     1.561         6.579      .000   .101
Intercept              6Comp_Ave            88.514        1     88.514        2609.915   .000   .899
                       6Corr_Ave            10.909        1     10.909        55.214     .000   .159
                       6Exp_Ave             2.105         1     2.105         8.872      .003   .029
Condition * Midterm1   6Comp_Ave            .079          2     .039          1.162      .314   .008
                       6Corr_Ave            .190          2     .095          .481       .619   .003
                       6Exp_Ave             .232          2     .116          .489       .614   .003
Midterm1               6Comp_Ave            .010          1     .010          .290       .591   .001
                       6Corr_Ave            1.873         1     1.873         9.480      .002   .031
                       6Exp_Ave             5.168         1     5.168         21.776     .000   .069
Condition              6Comp_Ave            .090          2     .045          1.322      .268   .009
                       6Corr_Ave            .314          2     .157          .794       .453   .005
                       6Exp_Ave             .079          2     .040          .167       .846   .001
Error                  6Comp_Ave            9.903         292   .034
                       6Corr_Ave            57.692        292   .198
                       6Exp_Ave             69.295        292   .237
Total                  6Comp_Ave            2612.667      298
                       6Corr_Ave            667.889       298
                       6Exp_Ave             454.444       298
Corrected Total        6Comp_Ave            10.070        297
                       6Corr_Ave            61.765        297
                       6Exp_Ave             77.101        297
a. R Squared = .017 (Adjusted R Squared = .000)
b. R Squared = .066 (Adjusted R Squared = .050)
c. R Squared = .101 (Adjusted R Squared = .086)
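Each univariate row of the Tests of Between-Subjects Effects table is an ANCOVA on one scale. A minimal OLS sketch for one scale, again with hypothetical column names (Exp_Ave standing in for the quiz Explanation-scale average):

    # Univariate ANCOVA follow-up with Type III sums of squares
    # (hypothetical file and column names).
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    df = pd.read_csv("tutorial_scores.csv")  # hypothetical data file

    # Sum-to-zero coding of the factor so Type III tests match SPSS GLM output.
    model = smf.ols("Exp_Ave ~ C(Condition, Sum) * Midterm1", data=df).fit()
    print(anova_lm(model, typ=3))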
