@prefix vivo: . @prefix edm: . @prefix ns0: . @prefix dcterms: . @prefix skos: . vivo:departmentOrSchool "Science, Faculty of"@en, "Earth and Ocean Sciences, Department of"@en ; edm:dataProvider "DSpace"@en ; ns0:degreeCampus "UBCV"@en ; ns0:identifierCitation "Jolley, Alison Rae. 2010. Identifying Landscapes and their Formation Timescales: Comparing Knowledge and Confidence of Beginner and Advanced Geoscience Undergraduate Students. Undergraduate Honours Thesis. Department of Earth and Ocean Sciences. University of British Columbia."@en ; ns0:rightsCopyright "Jolley, Alison Rae"@en ; dcterms:creator "Jolley, Alison Rae"@en ; dcterms:issued "2010-04-07T16:44:45Z"@en, "2010-04-07"@en ; dcterms:description """The Landscape Identification and Formation Test (LIFT) was created in response to previous data from the Student Attitudes about Earth Science Survey (SAESS) at the University of British Columbia (Vancouver). The SAESS data suggested that upper-level students become less confident in landscape identification and formation timescales over the course of a term. The LIFT specifically probes the relationships among student confidence and knowledge in landscape identification and formation timescales and general knowledge in geologic time. The LIFT was validated with “think-aloud” interviews with students and correct answers were determined from interviews with experts. Results from the LIFT suggest that advanced students have higher conceptual knowledge, higher confidence in their knowledge, and are more self-aware than beginner students. Advanced students became more confident in landscape identification and formation timescales over the course of the term, contradicting the results seen in the previous administration of the SAESS. Students are better at identifying landscapes than assessing how long they take to form and are better with extreme timescales, two critical points that should be taken into consideration with future curricular reform."""@en ; edm:aggregatedCHO "https://circle.library.ubc.ca/rest/handle/2429/23321?expand=metadata"@en ; skos:note "IDENTIFYING LANDSCAPES AND THEIR FORMATION TIMESCALES: COMPARING KNOWLEDGE AND CONFIDENCE OF BEGINNER AND ADVANCED GEOSCIENCE UNDERGRADUATE STUDENTS by ALISON RAE JOLLEY A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF BACHELOR OF SCIENCE (HONOURS) in THE FACULTY OF SCIENCE (Geological Sciences) This thesis conforms to the required standard ……………………………………… Supervisor THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver) APRIL 2010 © Alison Rae Jolley, 2010 ii ABSTRACT The Landscape Identification and Formation Test (LIFT) was created in response to previous data from the Student Attitudes about Earth Science Survey (SAESS) at the University of British Columbia (Vancouver). The SAESS data suggested that upper-level students become less confident in landscape identification and formation timescales over the course of a term. The LIFT specifically probes the relationships among student confidence and knowledge in landscape identification and formation timescales and general knowledge in geologic time. The LIFT was validated with “think-aloud” interviews with students and correct answers were determined from interviews with experts. Results from the LIFT suggest that advanced students have higher conceptual knowledge, higher confidence in their knowledge, and are more self-aware than beginner students. 
Advanced students became more confident in landscape identification and formation timescales over the course of the term, contradicting the results seen in the previous administration of the SAESS. Students are better at identifying landscapes than assessing how long they take to form and are better with extreme timescales, two critical points that should be taken into consideration with future curricular reform. iii TABLE OF CONTENTS TITLE PAGE...................................................................................................................................i ABSTRACT....................................................................................................................................ii TABLE OF CONTENTS...............................................................................................................iii LIST OF FIGURES........................................................................................................................v LIST OF TABLES..........................................................................................................................vi LIST OF APPENDICES...............................................................................................................vii ACKNOWLEDGEMENTS..........................................................................................................viii INTRODUCTION...........................................................................................................................1 PREVIOUS WORK.........................................................................................................................2 Novice and Expert Perceptions of Science..................................................................................3 Measuring Student Perceptions and Confidence.........................................................................4 Conceptual Knowledge in Geology.............................................................................................5 DEVELOPMENT OF THE LANDSCAPE IDENTIFICATION AND FORMATION TEST.......................................................................................................................6 Deploying the Validated LIFT...................................................................................................12 RESULTS......................................................................................................................................12 Average Scores for 2 nd Year vs. 4 th Year Students....................................................................12 Average Scores for Beginner vs. 
Advanced Students...............................................................13 Student Responses for Different Landscapes............................................................................15 Student Confidence in Their Answers.......................................................................................16 Student Confidence When Identification is Correct..................................................................17 Responses to the SAESS Statement...........................................................................................18 DISCUSSION................................................................................................................................19 Students are Better at Landscape Identification than Formation Timescales............................20 Overall student knowledge.....................................................................................................20 Student experience with landscape formation timescales......................................................20 Long term evolution of landscape knowledge........................................................................21 Variety in Success with Landscape Identification and Formation Timescales..........................21 Geologically and regionally specific landscapes are harder to identify...............................21 iv Students are most comfortable with extreme timescales........................................................21 Experts are also most comfortable with extreme timescales.................................................22 Implications for future versions of the LIFT..........................................................................23 The Correlation between Confidence and Knowledge..............................................................23 Separation by geologic time scores.......................................................................................23 Advanced Students are more Self-Aware than Beginner Students............................................24 LIFT Results Do Not Support Previous SAESS Data...............................................................25 Difficulties with Development and Administration of the LIFT...............................................27 Classroom observations.........................................................................................................27 Problematic images...............................................................................................................27 Future Recommendations..........................................................................................................29 Curricular implications.........................................................................................................29 Revision of the LIFT...............................................................................................................29 Further administration..........................................................................................................30 Characteristics of expertise...................................................................................................30 Consideration of the SAESS data...........................................................................................31 Gender studies.......................................................................................................................31 Studies with high school 
students..........................................................................................31 Studies with graduate students...............................................................................................31 CONCLUSION..............................................................................................................................32 REFERENCES CITED..................................................................................................................34 APPENDICES...............................................................................................................................36 v LIST OF FIGURES Figure 1. Previous SAESS results on the statement “When I look at a landscape, I have an idea of how long it took to form.”……………………………………………………………..1 Figure 2. Outline of steps involved in development and administration of the LIFT………….....7 Figure 3. Range of responses from expert interviews…………………………………………....11 Figure 4a. Score on identification and formation timescale sections for 2 nd and 4 th year students………………………………………………………………………………………......14 Figure 4b. Score on identification and formation timescale sections for beginner and advanced students……...…………………………………………………………………….......14 Figure 5a. Distribution of right and wrong answers on the identification section for all students…………………………………………………………………………………………..15 Figure 5b. Distribution of right and wrong answers on the formation timescale section for all students……………………………………………………………………………………15 Figure 6a. Average confidence and score of both the beginner and advanced groups on the identification section……………………………………………………………………...17 Figure 6b. Average confidence and score of both the beginner and advanced groups on the formation timescale section…………………………………………………………........17 Figure 7a. Confidence of beginner students in formation timescale answers when landscape was correctly identified…………………………………………………………..…...18 Figure 7b. Confidence of advanced students in formation timescale answers when landscape was correctly identified…………………………………………………………….…18 Figure 8a. Agreement with the expert of beginner students on the SAESS statement at the start and end of term…………………………………………………………………….....19 Figure 8b. Agreement with the expert of advanced students on the SAESS statement at the start and end of term…………………………………………………………………….....19 Figure 9. Distribution of responses to the sand dunes question on the LIFT…………………....29 vi LIST OF TABLES Table 1. Average scores on the LIFT sections for the second and fourth year geology classes...............................................................................................................................13 Table 2. Average scores on the LIFT sections for the beginner and advanced groups.................14 Table 3. List of question numbers and corresponding landscapes/features……………………...15 Table 4. Average confidence on the LIFT sections for the beginner and advanced groups..........16 Table 5. Distribution of expert responses on the LIFT………………………………..…………22 Table 6. Problematic images, landscapes, and features encountered throughout the development process of the LIFT………………………………………………………………..28 vii LIST OF APPENDICES Appendix 1. Version 1.0 of the LIFT (used in this study).............................................................36 Appendix 2. Image slides with credits...........................................................................................45 Appendix 3. 
Interview consent form.............................................................................................51 Appendix 4. Changes made during the interview process.............................................................54 Appendix 5. Suggested image revisions........................................................................................56 viii ACKNOWLEDGEMENTS I would like to take the time to thank all of the individuals who helped make this study possible. Many were willing to sit down with me and discuss ideas for development and general best practices. This includes Jamil Rhajiak (research topics, insights on process, and use of the Geologic Time Assessment Tool), Dr. Wendy Adams (research topics, development ideas, think aloud interviews, presentation of data, and statistics), Dr. Louis Deslauriers (assessing student confidence), and Dr. Carl Wieman (research topics and development ideas). Thank you also to Dr. Julie Libarkin and Emily Ward, for providing answers to the Geoscience Concept Inventory (GCI). I acknowledge the Carl Wieman Science Education Initiative (CWSEI) and the Department of Earth and Ocean Sciences and its head, Dr. Greg Dipple, for assistance with the funding of validation interviews and travel to conferences. Those valuable learning experiences helped to shape both this study and myself as a person. To all of the students who participated in interviews or took the LIFT in the classroom, I appreciate your willingness to help improve education in the Department of Earth and Ocean Sciences at UBC. Amey, Dean, Scott, and Rebecca Jolley – thanks for being so supportive and for assisting in the transcription of 96 student forms into a digital format. Without your help, this process would have taken much longer and been much less enjoyable! To Bruce Thomson and Drs. Mike Church, Brett Eaton, Oldrich Hungr, and Jim Mortensen, thank you for taking time to participate in expert interviews for the LIFT. Your answers and suggestions were invaluable in building and assessing the final version. Thank you also to Dr. Josh Caulkins for participating in an expert interview. Along with Brett Gilley and Erin Lane, your willingness to brainstorm with me was truly appreciated. You all provided me with such valuable feedback on research topics, development of the LIFT, interpretation/presentation of data, and a little laughter too. To Dr. Stuart Sutherland, my appreciation for providing an updated version of Rhajiak’s (2009) Geologic Time Assessment Tool, class results from the pre-post test, participating in an expert interview, and sharing lecture time for the administration of the LIFT. ix Thank you to Dr. Mary Lou Bevier for participating in an expert interview, sharing lecture time for administration of the LIFT, answering questions, and facilitating the earth and ocean sciences undergraduate thesis course, alongside Drs. Douw Steyn and Tara Ivanochko. You all were so wonderful in candidly sharing your past experiences, helping to guide us both in our theses and our future plans. Finally, Dr. Sara Harris and Francis Jones. I can’t thank you enough. Your continued encouragement, as well as honest and timely feedback, helped to make this such an enjoyable experience. If only a small fraction of your passion for teaching and learning rubbed off on me, I will be substantially better for it. 1 INTRODUCTION Geologic time is a key concept that should be understood by any person holding a degree in an earth and/or ocean science field. 
Professional registration organizations, such as The Association of Professional Engineers and Geoscientists of British Columbia (APEGBC), recognize geologic time as vital to professional training (http://www.apeg.bc.ca/ reg/docs/2006geologysyllabus.pdf, 2006). Familiarity with the concept of deep time is important for understanding paleontology and stratigraphy (Zen, 2001), as well as the concept of landscape formation (Allen, 2008). A sense of geologic time allows those creating geologic histories to understand the scope of the content, and realistically interpret the timescale and involved processes. Many other concepts in geology require a sense of deep time. To successfully teach the concept of geologic time, one must understand how beginning learners transition to expert learners (Petcovic and Libarkin, 2007). Qualities relevant to this study include how students perceive the discipline and how confident are they in their own knowledge of geologic time. Results from pre- and post-term attitude surveys used in the Department of Earth and Ocean Sciences at the University of British Columbia (Vancouver) suggest that upper-level students (≥ 3rd year) become less confident with the timescales of landscape formation by the end of term, whereas lower-level students (≤ 2nd year) become more confident (Figure 1). This study aims to better understand this result, and determine if student confidence accurately reflects knowledge. Figure 1. Upper-level majors (“3rd Yr. Maj.”) become less confident with timescales of landscape formation by the end of term, whereas lower-level majors and non-majors (“1st Yr. Serv.”, “2nd Yr. Maj.”, and “3rd Yr. Serv.”) become more confident. Blue (bottom) = agree with expert, grey (middle) = neutral, red (top) = disagree with expert. Left bar = beginning of term, right bar = end of term. 2 On the path from beginner to geoscience expert, an individual‟s confidence in his/her abilities to work with geologic time should increase overall, but this increase may not be linear. There could be certain moments or periods of time where confidence does not increase as knowledge builds. As students become aware of the complexity of their studies, their confidence may decrease. These moments of realization could either catalyze or inhibit the learning process. The results from this study should increase our understanding of why upper-level students showed a decrease in confidence, and provide a validated test to measure students‟ conceptual knowledge and confidence in landscape formation. In this study students‟ conceptual knowledge is correlated with their self-reported confidence levels. A new assessment instrument has been developed to quantify conceptual knowledge about the timescales of landscape formation and to measure confidence. A previously validated tool, the Student Attitudes about Earth Science Survey (SAESS), is used to measure student perceptions of geoscience in terms of overall expert-like behavior (Lane et al., in preparation). The Landscape Identification and Formation Test (LIFT) was created in response to the SAESS statement (#13), “When I look at a landscape, I have an idea of how long it took to form.” The instrument was developed to realistically probe students‟ abilities and confidence in interpreting landscapes. These newly developed landscape identification and formation timescale questions are combined with a subset of questions from Rhajiak‟s Geologic Time Assessment Tool (2009) to characterize a student‟s understanding of geologic time. 
These data are used to assess general student knowledge of landscape formation, investigate the correlation between confidence and knowledge and whether or not it supports previous data from the SAESS, and compare the ability to self-assess confidence levels between beginner and advanced students. PREVIOUS WORK It is often difficult to extend thoughts or decisions beyond the scale of a human lifetime, especially for those who have not studied geologic time. Kastens et al. (2009) identifies two main facets of temporal thinking that geoscientists have that the general population does not: 1) “a long view of time” and 2) “an expect[ation] of low frequency, high impact events”. These skills dealing with deep time are crucial for geologists, and are 3 important even for those who are not within the geoscientist population, such as landscape architects (Clary et al., 2009). Because a long view of time is not common in today‟s world, and because deep time is an abstract perception, the role of educational institutions in teaching this concept is paramount. Novice and Expert Perceptions of Science Hammer (1994) discusses the differences between novice and expert perceptions in terms of 1) the structure of knowledge, 2) the content of knowledge, and 3) the source of knowledge. First, experts see knowledge within a discipline as comprising one coherent unit, and can usually re-derive things if forgotten, or solve them in a different manner. Novices do not see this coherence, and think of the discipline as a grouping of separate pieces. They believe that to know these pieces is simply to remember them (Chi et al., 1981). Second, expert scientists generally are comfortable relating knowledge, concepts and equations, using mathematics much like another language. They find this language fully translatable, and common sense. Problems are solved by conceptualization. Novices tend to think that knowledge is made up of facts and equations which are not a representation of concepts or words. Problem solving is guided by searching for the correct equation and mathematically computing the answer (Larkin, 1983). Finally, experts believe that gaining knowledge is a highly independent process (which can be achieved without an instructor), and making sense of ideas is extremely important. Knowledge represents the real world. Novices believe that knowledge is handed down from positions of authority (or don‟t actively seek out alternative sources), learning is only remembering what they have been told, and that knowledge is isolated from its representation of the real world (Kitchener & King, 1981). The above qualities greatly affect the capabilities of a student, including their understanding of geologic timescales. As students tie their knowledge into a larger, conceptual framework that represents the real world, concepts become increasingly intuitive and natural. In terms of landscape formation, this framework begins with geologic identification of landscapes/features. It further progresses to link with processes, rates of these processes, total time for these processes, and how these processes physically result in landforms in the real world. Understanding how students create a framework can aid in the teaching and learning process. For example, if it is known that students who do poorly do not 4 commonly think of landscape formation in terms of the real world, teaching can be altered to include context and real examples. 
The perception of a discipline itself impacts significant portions of the student‟s learning (Gal, 1997). Student beliefs about the nature of a discipline impact how students approach learning – “process considerations”, how they translate their understanding to the world outside of the classroom – “outcome considerations”, and whether or not they choose to continue on in their studies of a discipline – “access considerations” (Gal, 1997). Identifying landscapes and thinking about the timescales on which they form affects all of these considerations, most significantly the outcome considerations. Measuring Student Perceptions and Confidence Few studies, either quantitative or qualitative, focus on the nature of perceptions of experts and novices in earth and ocean sciences. The Student Attitudes about Earth Science Survey (SAESS) was designed at the University of British Columbia (UBC) (Vancouver) to measure these perceptions (Lane et al., in preparation). The SAESS applies to all sub- disciplines in the field of earth and ocean science (geology, geophysics, oceanography, and atmospheric science), and is interpreted consistently by all people with varying degrees of knowledge about the discipline, such as majors and non-majors. As mentioned above, one very important concept in the world of geologic time is landscape formation. Experts understand the timescale on which landscapes form, requiring a sense of the deep nature of time. In the SAESS at UBC (Vancouver), students self-reported their ability to judge landscape formation timescales. Over the course of a term, upper-level geology majors reported a decrease in agreement with the expert on the statement “When I look at a landscape, I have an idea of how long it took to form.” Their perceptions on this concept are different from those of lower-level geology majors and non-majors, who typically report a slight increase in agreement over the course of a term. This response is likely a function of confidence. There is a general trend of increase in agreement with the expert from 1 st year to the first term of 3 rd year; however, the second term of 3 rd year is the first time a decrease in agreement is evident. The 3 rd year students did begin with a higher percentage agreement with the expert, however. 5 Confidence is a function of an individual‟s personality, as well as their training. The study of confidence is a young field, with limited research on university-level students (e.g., Besterfield-Sacre et al., 1998), especially geology students (e.g., Jordan et al., 2009). Studies of elementary school students that attempt to gauge student confidence found that confidence of the younger students was high regardless of achievement, but the two were tightly coupled with the older students (grades 3-6) (Kloosterman and Cougan, 1994). They also found that the confidence of “moderate achievers” is no different than that of “high achievers”, and that high achievers are more perceptive of their own achievement level. A student‟s confidence, i.e. self-assessment of their ability, has been suggested to be a domain general skill dependent upon their own metacognitive knowledge (Schraw, 1997). Conceptual Knowledge in Geology Many methods are commonly used to teach and/or assess student knowledge of geologic time. 
These often involve the scaling of events in geologic time in terms of a smaller framework, such as a calendar (Hidalgo and Otero, 2004), or the construction of a timescale of an adult‟s life to understand the difference between relative and absolute dating (e.g. Hermann and Lewis, 2004). The understanding of student perceptions, confidence, and knowledge towards geologic time can better inform these methods, and help develop new ones. The majority of previous studies in conceptual knowledge of geology focus more on younger students (e.g., Dodick and Orion, 2003), and broader geological understanding (e.g., Libarkin and Anderson, 2005). Rhajiak (2009) specifically addressed geologic time concepts with post-secondary geology students, targeting beginner and advanced majors in the discipline. His Geologic Time Assessment Tool focuses on vital concepts specifically identified by instructors, and distinguishes between beginner and advanced students. However, it does not explicitly address the timescale of landscape formation, though it was specifically identified by instructors as a key concept that students should know. This study focuses on landscape formation, and extends considerations to the perceptions and confidence of beginner and advanced students. All three of the components mentioned above; novice and expert perceptions of science, measuring student perceptions of confidence, and conceptual knowledge, help 6 elucidate the intricacies of student understanding of geologic time. The combination of these three components builds a robust assessment tool that can be used to study conceptual knowledge and confidence of landscapes and their formation timescales. DEVELOPMENT OF THE LANDSCAPE IDENTIFICATION AND FORMATION TEST The LIFT asks students to identify landscapes from images, select the best answer for the timescale of formation of the landscape pictured, and rate their confidence on both parts (Appendix 1). The basic steps of its development are outlined in Figure 2. Images and formation timescales featured within the LIFT were selected to specifically focus the conceptual background of the test, to maintain simplicity and clarity, and to contain a range of timescales of formation. Instead of having students picture a landscape based on a single word or phrase, they view pictures of unidentified landscapes. This simulates the “real world”, where they encounter a landscape and start to think about how long it takes to form. This type of situation is what statement #13 on the SAESS is referring to, looking at a landscape and having an idea of how long it takes to form. Maintaining consistency with the survey question allows for a realistic interpretation and correlation with the attitude and confidence levels of students Initially sixteen different landscapes and images were selected by considering common landscapes with various timescales of formation. Many of these ideas were guided by perusing Trenhaile‟s (2007) geomorphology textbook, which is used at UBC. This large pool of images was initially chosen in order to be able to weed out any that were not understood by students during validation interviews, with the plan to select ten for the final test. Images were all found online (credits listed in Appendix 2). Individual student interviews were conducted in the month prior to test administration. 
Ten paid student volunteers, five beginner (first year of geology) and five advanced (second or third year of geology), participated in a one-on-one think-aloud interview with the researcher (W. Adams, personal communication, 2009). All interviews were recorded and annotated. Figure 2. Outline of steps involved in development and administration of the LIFT: survey of UBC's geomorphology textbook; search for images online; development of test questions; student interviews; expert interviews; creation of answer key; administration of the LIFT; continual revision; final revision. Interviews were structured as follows: first the student signed the consent form (Appendix 3), a copy of which was emailed to them prior to the interview day. Next they answered the statement "When I look at a landscape, I have an idea of how long it took to form" from the SAESS (strongly disagree to strongly agree on a five-point Likert scale), which had previously been validated by eighteen interviews with a more varied population (Lane et al., in preparation). Because the SAESS validation interviews contained this statement among thirty others, the question was repeated in the interviews to give it an appropriate focus for the context, and to match the student's answer to their conceptual scores on the test questions to follow. Next they were presented with the landscape images, and were asked to first go through the test silently. Afterwards they went back through the images, verbalizing the thought processes used to determine their answers. If any questions were asked of the researcher, they were not answered until the end of the interview. In version one (six interviewees), students were asked to estimate how long the landscape in the image took to form and to rate their confidence in that answer. In later versions (four interviewees), they were first asked to identify the landscape, rate their confidence in the identification, and then estimate the time it took to form and rate their confidence in their time estimate (see Appendix 4 for changes made during the interview process). This was a more effective way to gauge a student's knowledge, as the SAESS statement implies two components (identifying the landscape and having an idea of how long it took to form). Finally, the students were asked to answer the SAESS statement a second time. The SAESS statement was consistently understood and interpreted as intended, as shown with interviews conducted for the validation of the SAESS. Based on the think-aloud comments on the statement, students had varied confidence in their ability to identify landscapes and the time they take to form. During these validation interviews, some seemed to recognize the scope of their knowledge as a limiting factor in their confidence (being aware that there were likely many things in the world that they hadn't encountered), but still either agreed or strongly agreed with the statement. Others felt unfamiliar with landscapes and not very confident at the start of the interview, but actually increased in their self-reported confidence at the end of the interview, noting that they were surprised by the number of landscapes or features that they could recognize. These comments may hint at reasons why upper-level students seem to be less confident with landscape identification and formation timescales, or why they may change their answer after a test (or even at the end of a term). Based on the interview process, four images were removed due to obscurity or lack of clarity.
Most of the landscapes were fully recognized, or students were at least able to hypothesize the processes that formed them, and hence, how long they took to form. Recognizing a landscape is a prerequisite for determining how long it takes to form, but images shown should not be too obscure, otherwise students would spend too much time and effort identifying them (if they were even able to). The landscapes that were removed due to their obscurity were karst, stromatolite mounds, Olympus Mons on Mars, and a mesa. These were almost never recognized in the interviews. Salt flats were also not included in the final version, as clear photos could not be located. Many students thought that they were ice, but this may be an artifact of the clarity of the photograph, not of a lack of knowledge. One student suggested using the term “feature” in relevant images, instead of “landscape” for all of the images. They felt that this improved the clarity of the questions, as this would allow them to view the image in a more specific context. All other students were 9 asked about this idea, and agreed that it made things significantly clearer. This change was made in later interview versions, and was maintained with the final version of the LIFT. Two versions of the confidence rating scale were tested, as studies in the Department of Physics at UBC indicate that percentage scales are preferred by students (L. Deslauriers, personal communication, 2009). Half of the questions used a 5 point Likert-scale, and the other half used a percentage scale in fifths. One of the students said they preferred the Likert- scale, whereas each of the other nine either said they preferred the percentage scale or found the two to be equivalent. For this reason, the percentage scale was selected. Students also indicated that they found fifths (<20%, 20-40%, 40-60%, 60-80%, and >80%) clearer than quarters. Two display options were tested in the interviews: color images printed on paper, and color images projected using a slide show. Nine out of the ten students preferred to view the images on a slide show, as opposed to printed out on a piece of paper, as they felt the images were more clear this way. Seven experts in the fields of geography and geology took the test themselves (six in an interview format, and one over email). Their answers were used to build the answer key, and to assist in developing various answer choices for the questions. Questions were asked in an open-ended format, only requesting an order of magnitude for the answer. After all interviews had been completed, the answers were tallied and analyzed. If a certain order of magnitude was provided as an answer by only one or two experts, it was removed from the answer pool. This narrowed down ten of the twelve images to an answer specific to two orders of magnitude and one specific to three orders of magnitude (volcano). The final one did not have a best answer as the range of expert responses was too large (sand dunes) (Figure 3). This last question was left on the test to investigate the pattern of student responses. Experts were asked to comment on the types of landscapes included, noting any unclear, extraneous, or missing images. All felt that the test was sufficient at covering the range of landscapes/features that a graduating geology student should be familiar with, and they noted a few landscapes that could be added. Those that were mentioned were deemed to be either too specific (e.g. drumlin field) or too difficult to display in an image (e.g. 
Basin and Range province). In the validation interviews, questions asking both student and expert participants to (1) identify a landform and (2) estimate how long it took to form were both open-ended, i.e., no answer choices were provided. In the final test, the questions regarding identification remained open-ended. For estimates of how long the landscape took to form, scales were developed for each image, based on both student and expert interview responses. The scales were designed to include common student responses, to include the common expert response, and to ensure there was only one best answer. Eight additional multiple-choice questions were added to the test to determine students' conceptual knowledge of deep time, and to distinguish between lower- and upper-level students. Questions were selected based on results of a pre-post test given to a geologic time-focused class for non-majors ("Earth and Life through Time") in fall 2009 (S. Sutherland, personal communication, 2009). The questions on this test were either taken directly from the Geologic Time Assessment Tool (Rhajiak, 2009), slightly modified from that form, or newly created (S. Sutherland, personal communication, 2009). Questions for the LIFT geologic time section were selected based upon a comparison of the results before and after the time-focused portion of the class (approximately halfway through the term). Questions which showed strong gains over the course of the term should be more useful for distinguishing between "beginner" and "advanced" students, as they are evidence of gains in knowledge that take place after learning about the topic of geologic time.

[Figure 3 is a matrix (not reproduced here) of expert timescale responses for each landscape/feature (impact crater, fault, landslide, lava flow, mud cracks, sand dunes, alluvial fan, hoodoos, river, U-shaped valley, volcano, and mountains) against timescale answer choices ranging from seconds to tens of millions of years.]

Figure 3. Range of responses from expert interviews (those in green are the consensus after answers given by two or fewer experts were removed). The range of responses is larger in the middle of the timescale. Answer choices on the LIFT are marked with an X, and the correct answers are marked with an asterisk. Note that only one answer choice was used from the consensus range of experts. On each question, the lowest order of magnitude was given the suffix "or less" and the highest order of magnitude was given the suffix "or more"; e.g., the lowest choice for the impact crater is "minutes or less" and the highest is "millions of years or more." There are no questions on the LIFT that represent timescales between weeks and 1000s of years.
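The consensus rule used to build the answer key (any order of magnitude offered by only one or two experts was removed from the answer pool) can be illustrated with a short sketch. This is a hypothetical reconstruction for illustration only: the timescale labels follow Figure 3, but the data layout and the helper function are assumptions, not material from the thesis.

```python
# Illustrative sketch of the answer-key consensus rule: orders of magnitude
# offered by two or fewer of the interviewed experts are dropped, and the
# remaining range defines the best answer(s) for that landscape/feature.
from collections import Counter

# Timescale answer choices in increasing order of magnitude (after Figure 3).
TIMESCALES = ["seconds", "minutes", "hours", "days", "weeks", "months", "years",
              "10s of yrs", "100s of yrs", "1000s of yrs", "10s of 1000s of yrs",
              "100s of 1000s of yrs", "millions of yrs", "10s of millions of yrs"]

def consensus_range(expert_answers, min_experts=3):
    """Keep only the timescales chosen by at least `min_experts` experts,
    returned in increasing order of magnitude (an empty list means no
    consensus, as happened for the sand dunes question)."""
    counts = Counter(expert_answers)
    return [t for t in TIMESCALES if counts[t] >= min_experts]

# Hypothetical expert responses for one image (not real interview data):
print(consensus_range(["days", "days", "days", "weeks", "weeks", "weeks", "months"]))
# -> ['days', 'weeks']
```

Applied to the seven expert responses collected for each image, a rule of this form yields the narrow consensus ranges shown in Figure 3, and an empty range for the sand dunes question.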
Deploying the Validated LIFT One lower-level class ("Introductory Mineralogy", n=71) and one upper-level class ("Advanced Paleontology", n=25) were selected for the formal administration of the test, in order to obtain responses from two different groups of majors in the geology program. Instructors contributed approximately half an hour of class time for administration of the test. Students had a paper copy of the LIFT in front of them (Appendix 1). They answered the SAESS statement on the first page, then tore off this page and handed it in so they could not change their answer later on. After this they were given 45 seconds per image to view it and answer the corresponding questions. After seeing all images, ten minutes were given for completion of the eight geologic time questions, and the SAESS statement was answered again. Tests were filled out by hand and marked by the researcher. All questions had five possible answers and one correct answer, except for the open-ended landscape identification questions. Demographic information collected included gender, age, major, and specific courses taken in the field of geology. Those particularly relevant were Geologic Time (2nd year), Earth Science for Engineers (2nd year), Field Schools (one 2nd year and one 3rd year), Geomorphology (3rd year), Tectonic Evolution of North America (3rd year), and Advanced Paleontology (4th year). To produce a large data set in which ability and confidence could be explicitly matched, an additional question was placed after both the landscape identification and the formation timescale questions, asking students how confident they were in their answer in terms of a percentage. The confidence element was incorporated to address the fact that the SAESS statement likely reflects confidence, which may or may not align with students' answers to the conceptual questions.

RESULTS Average Scores for 2nd Year vs. 4th Year Students Students in the 4th year class scored significantly higher than those in the 2nd year class on the geologic time questions; however, there is no significant difference between the class averages on either the identification or the formation timescale sections (Table 1). Students score higher on the landscape identification section than on the formation timescale section in both groups.

Table 1. Average scores on the LIFT sections for the second and fourth year geology classes (standard deviations in parentheses).
                 Identification Score   Formation Timescale Score   Geologic Time Score*
2nd Yr. (n=71)   7.24 (2.07)            4.38 (1.78)                 3.87 (1.48)
4th Yr. (n=25)   7.80 (2.08)            4.52 (2.12)                 5.04 (1.70)
*=p<0.01.

Average Scores for Beginner vs. Advanced Students Students in these two classes have varying degrees of geologic knowledge, as indicated by the demographic information collected at the end of the LIFT. These differences are apparent even within the two classes, due to the prerequisite structure of the classes. For example, the 4th year class in this study has only one prerequisite: an introductory class on geologic time and basic paleontology. This class has two versions: one for non-majors and one for majors. Therefore, it is possible to take this 4th year class as a non-major who has taken only one other geoscience course. Because of these differences, the students are re-grouped based upon their score on the eight-question geologic time section of the LIFT, which better represents their expertise.
Those who scored a five or higher are considered to be 'advanced' students, and those who scored less than five are considered to be 'beginner' students. This division is an arbitrary cut-off, chosen because it splits the total group (n=96) into roughly equal halves (beginner n=55, advanced n=41). This eliminates any skewing of the data by students who are taking upper-level courses outside of their field of expertise, or who are more expert-like but are taking a lower-level class (e.g., a transfer student or a graduate student lacking a prerequisite). This method separates the groups by a more accurate measure, evident in the change in distribution of the landscape identification vs. formation timescale score (Figure 4). Additionally, the standard deviations are lower for the beginner/advanced groups than for the 2nd/4th year groups in five out of the six cases (Table 2). This separation also reduces the disparity in standard deviation between the 2nd and 4th year groups.

Table 2. Average scores on the LIFT sections for the beginner and advanced groups (standard deviations in parentheses).
                  Identification Score*   Formation Timescale Score**   Geologic Time Score**
Beginner (n=55)   6.87 (2.10)             3.80 (1.77)                   3.05 (0.97)
Advanced (n=41)   8.07 (1.86)             5.24 (1.68)                   5.68 (0.93)
*=p<0.01, **=p<0.001.

Figure 4. Score on identification vs. score on formation timescale for 2nd/4th yr. (a) and beginner/advanced (b). The advanced students plot more distinctly at the upper end of the graph than the 4th year students. The spread of the advanced group is less than that of the 4th year group.

Student Responses for Different Landscapes Students from both groups have difficulty identifying landscapes/features that are more specific to geology (e.g., mud cracks or a u-shaped valley) (Figure 5a) and/or to a certain region (e.g., hoodoos). Students from both groups also have higher percentages of correct answers for the formation timescales of short timescale landscapes/features (e.g., the impact crater) and long timescale landscapes/features (e.g., mountains) (Figure 5b). Eleven of the twelve questions show a higher percentage of correct answers for identification than for the corresponding formation timescales (Figure 5). Number 5, the fault, is the anomaly.

Table 3. List of question numbers and corresponding landscapes/features: 1. Alluvial fan; 2. Lava flow; 3. Impact crater; 4. Hoodoos; 5. Fault; 6. Mountains; 7. Sand dunes; 8. Volcano; 9. River; 10. Mud cracks; 11. Landslide; 12. U-shaped valley.

Figure 5. Distribution of right and wrong answers on the identification section (a) and formation timescale section (b) for all of the students. Identification is lower with geologically and regionally specific images, and extreme formation timescales have a higher percentage of right answers. Questions are ordered by increasing formation timescale (left to right).

Student Confidence in Their Answers In order to perform calculations with the confidence data, each confidence range that a student selected is represented by its midpoint. For example, the <20% range is represented by 10% confidence, and the 40-60% range is represented by 50% confidence. Average individual student confidence in the identification section ranges from 21.76% to 88.33% for the beginner group and 29.17% to 85.00% for the advanced group.
Average confidence in the formation timescale section ranges from 10.00% to 84.55% for the beginner group and 30.00% to 77.27% for the advanced group. There are significant differences both between the identification and formation timescale sections within each group and between the groups within each section (Table 4). In both groups, average confidence in identification is higher than in the formation timescale.

Table 4. Average confidence on the LIFT sections for the beginner and advanced groups (standard deviations in parentheses).
                    Identification Confidence^   Formation Timescale Confidence*
Beginner** (n=55)   60.44 (17.72)                48.02 (19.14)
Advanced** (n=41)   67.60 (11.92)                57.67 (12.09)
^=p<0.05, *=p<0.01, **=p<0.001.

In both sections of the LIFT, advanced students have both higher scores and higher average confidence. The advanced cluster is more tightly grouped than the beginner cluster in both cases (Figure 6).

Figure 6. Average confidence and score of both the beginner and advanced groups on the identification section (a) and the formation timescale section (b). The advanced students have higher confidence and scores and are more tightly grouped than the beginners in both sections.

Student Confidence When Identification is Correct When students misidentify a landscape/feature, their answers regarding the formation timescale for that landscape/feature are not necessarily reliable, since they were thinking of a different landscape/feature when answering the formation timescale question. Responses for all eleven formation timescale questions have therefore been combined into one pool, based solely upon instances where a landscape/feature was identified correctly by a student. Each response is then coupled with that student's confidence in their answer to the corresponding formation timescale question, and whether they answered this question right or wrong. Among students who identified a landscape correctly, those who also answered the formation timescale correctly have higher average confidence than those who answered the formation timescale incorrectly (Figure 7). Within the advanced group, there is a larger separation in average confidence between those who got the formation timescale right and those who got it wrong (p = 3.6x10^-7). The advanced students who got the formation timescale right are more right-skewed than the beginners who got the formation timescale right (p = 0.01). However, the two groups have similar distributions for those who got the formation timescale wrong (p = 0.04).

Responses to the SAESS Statement Both the 2nd and 4th year classes that took the LIFT also took the SAESS at the beginning and end of term. However, there was a low response rate at the end of term, especially in the case of the 2nd year class (only 26% of the class completed both the pre- and post-survey). Because the LIFT was given during the last week of class (the same timing as the post-survey) and included the statement directly from the SAESS, the data from the statement at the beginning of the LIFT were used as the post-survey data.

Figure 7. Confidence of beginner students (a) and advanced students (b) in formation timescale answers when the landscape was correctly identified. There is a greater distinction between confidence in right and wrong answers for the advanced students. The advanced students who got the formation timescale right are significantly different from the beginner students who got the formation timescale right.
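The pooled confidence comparison described under "Student Confidence When Identification is Correct" above (converting each confidence range to its midpoint, pooling formation timescale responses for correctly identified landscapes, and comparing confidence in right versus wrong answers within the beginner and advanced groups) can be summarized with a short sketch. This is an illustrative reconstruction rather than the analysis code used in the thesis: the record fields are hypothetical, and the thesis does not state which statistical test produced the reported p-values, so Welch's t-test is assumed here purely for illustration.

```python
# Illustrative sketch (not the thesis's analysis code) of the pooled confidence
# comparison. Assumed input: one record per student per question, holding the
# student's geologic time score, whether the landscape was identified correctly,
# whether the formation timescale answer was correct, and the confidence range.
from scipy import stats

# Midpoints of the percentage confidence ranges used on the LIFT.
CONF_MIDPOINT = {"<20%": 10, "20-40%": 30, "40-60%": 50, "60-80%": 70, ">80%": 90}

def group_of(geo_time_score, threshold=5):
    """Beginner/advanced split used in the thesis: geologic time score >= 5."""
    return "advanced" if geo_time_score >= threshold else "beginner"

def pooled_confidences(records, group):
    """Confidence midpoints for correctly identified landscapes in one group,
    split by whether the formation timescale answer was right or wrong."""
    right, wrong = [], []
    for r in records:
        if group_of(r["geo_time_score"]) != group or not r["id_correct"]:
            continue  # keep only this group's correctly identified landscapes
        conf = CONF_MIDPOINT[r["confidence_range"]]
        (right if r["timescale_correct"] else wrong).append(conf)
    return right, wrong

def compare_group(records, group):
    """Mean confidence for right vs. wrong timescale answers, with a p-value
    from Welch's t-test (an assumed choice of test, for illustration only)."""
    right, wrong = pooled_confidences(records, group)
    _, p = stats.ttest_ind(right, wrong, equal_var=False)
    return sum(right) / len(right), sum(wrong) / len(wrong), p
```

A comparison of this form, run separately for the beginner and advanced groups, would produce the kind of right-versus-wrong confidence separation reported above.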
Both beginner and advanced students increase in their agreement with the expert over the course of the term (Figure 8). Overall, advanced students have a higher percentage agreement with the expert, and increased by a larger amount. These results are inconsistent with those seen on the SAESS in previous administrations (Figure 1).

Figure 8. Agreement with the expert of beginner students (a) and advanced students (b) on the SAESS statement, "When I look at a landscape, I have an idea of how long it took to form," both at the start ("Pre") and the end ("Post") of term. Both groups of students increased in their agreement, though the advanced students had a higher agreement to begin with.

DISCUSSION On a previous administration of the SAESS, upper-level students appeared to become less confident in recognizing landscapes and having an idea of how long they took to form over the course of a term (Figure 1). Data from the LIFT suggest that though this is possible, advanced students still have both higher conceptual knowledge and higher confidence than beginner students, and they are more aware of their knowledge. Confidence and knowledge correlate positively with each other in both the beginner and advanced groups. Students are familiar with most of the landscapes/features contained in the LIFT, though they score lower on those which are more specific to geology or to a certain region or environment unlike the one in which UBC is situated. They score much higher on timescales at the extreme low end (minutes or less) and the extreme high end (tens of millions of years).

Students are Better at Landscape Identification than Formation Timescales Overall student knowledge Students score higher on the identification section than on the formation timescale section. Most students are more familiar with identifying landscapes than they are with determining how long they took to form. The average score of both the 2nd year and 4th year classes is a failing score on the landscape formation timescale section, whereas the average score of both groups on the landscape identification section is a passing score. The clusters are tighter in the identification section (Figure 6a), further supporting the interpretation that students are much more comfortable and knowledgeable in the identification of landscapes/features than in the formation timescales of the same landscapes/features.

Student experience with landscape formation timescales Timescales are taught and learned at a more expert level than identification of landscapes/features. Moreover, landscapes/features can be picked up in passing when outdoors, or by looking at pictures and photographs (which are widely used in all levels of geology classes). There is a greater chance of a person happening to ask what a feature is (and getting answers from friends, family, etc.) than how long it took to form. Results from the LIFT suggest that timescales of formation are not emphasized at either the lower or upper levels of the geology curriculum, except, perhaps, for those landscapes at the extreme ends of the range of timescales.

Long term evolution of landscape knowledge It is surprising that scores in the identification and formation timescale sections would be similar between a 2nd year class and a 4th year class. This could indicate that knowledge of landscapes/features does not evolve much during a geologist's time as an undergraduate, but rather later on throughout a career.
This is most likely true in the case of formation timescales, where large gains in knowledge have to be made, as the average score is a failing score on this section. Instructors in the geosciences do display a more expert-like level of knowledge and a high level of confidence in this section. Tracking this transformation between undergraduates, graduates, and experts would be a valuable use of the LIFT in the future. Variety in Success with Landscape Identification and Formation Timescales Geologically and regionally specific landscapes are harder to identify The percentage of students who answered questions right/wrong can reveal a lot about an instrument. It can point out questions that are too easy or too hard, or highlight interesting results that would benefit from further investigation. Landscapes/features that were identified by a high percentage of students were: the impact crater, fault, mountains, sand dunes, volcano, and river. These are all things that are not too difficult to recognize, and are likely features that a person would be acquainted with at a young age. Those that were difficult for students to identify were: the hoodoos, mud cracks, and u-shaped valley. The landslide also had a low percentage of correct identifications, but this is likely due to the difficulties that students had with the image used. These features that were difficult for students to identify are things that are geologically specific, and in the case of the hoodoos (which only 20% of students identified correctly), regionally specific as well. Students are most comfortable with extreme timescales Nearly all of the landscape/feature formation timescales have a low percentage of correct answers. Those that were higher were: impact crater, fault, and mountains (Figure 5). This echoes what was observed (and what many students explicitly stated) in the interviews. They are more comfortable and knowledgeable on those landscapes/features that take short or long timescales to form. It is those that are in the middle range of the timescale that students have trouble with. 22 Even the experts displayed the largest amount of consensus on the extreme timescales (Figure 3). Quite possibly the middle range timescales aren‟t taught due to a general sense of ambiguity in the geoscience community. One of the questions (number 5, the fault) actually has a larger percentage of people that answered the formation timescale correctly than identified the landscape correctly. This is explained when verbatim responses that students gave for their identification of the landscape are examined. Some students said “devastated landscape” or “city” or similar words/phrases when attempting to identify the landscape. Though it is true that the fault affected a city, these answers were marked as incorrect. This is because they do not explain the landscape/feature involved. Students who gave these answers were always those who indicated having little experience with taking courses in the geosciences. Experts are also most comfortable with extreme timescales After the final version of the LIFT was deployed, instructors who participated in interviews were invited to take the test once more. An online version of the test was created for this purpose. Instructors showed greater consensus with extreme timescales, though few responses were obtained (Table 5). 
The correct answer, as based upon their original interview responses, was unanimously selected on four of the eleven questions, selected by the majority (but not unanimously) on five, and selected by a minority on two (number 8, the volcano, and number 9, the river).

Table 5. Distribution of expert responses on the LIFT (n=4).*

Answer Choice    Question Number (Increasing Order of Magnitude)
                  3    5   11    2   10    1    4    9   12    8    6
A                 4    4    3    0    0    0    2    0    0    0    0
B                 0    0    0    4    3    0    1    0    2    1    0
C                 0    0    0    0    1    1    1    2    1    1    0
D                 0    0    1    0    0    2    0    1    1    2    4
E                 0    0    0    0    0    1    0    1    0    0    0

*Correct answers are bolded.

Implications for future versions of the LIFT

The distributions of right/wrong answers suggest that all of the questions in the LIFT (with the exception of the landslide) are in good standing, though the sand dunes question should be modified or removed due to the lack of a single correct answer. Questions with unusually high or low percentages of correct answers all have justifiable explanations, and are left in because those reasons are sound.

The Correlation between Confidence and Knowledge

There is a positive correlation between confidence and knowledge for all students, beginner or advanced. Advanced students do plot at the higher end of the graph (Figure 6a), with larger values of confidence and knowledge than the beginner students.

Separation by geologic time scores

Answers to the geologic time section of the LIFT show a significant difference between the 2nd year and 4th year classes, suggesting that the use of this section for separating beginner and advanced students is justified. The geologic time section also benefits from being a measure independent of the rest of the LIFT data, and thus does not create analyses that loop back upon themselves. When separated by class the data are more variable, but a difference in geologic time score would still be expected, as the majority of the students in each class are at the expected level. Additionally, the fourth year class tested was an advanced paleontology class with a lower-level paleontology prerequisite, so even those who aren’t geology majors should still have a greater knowledge of geologic time than second year geology majors who have not taken this lower-level paleontology prerequisite. However, the non-geology majors in the 4th year class who are familiar with geologic time are likely not familiar with landscape formation, as none of them indicated having taken a geomorphology class on the demographics section of the LIFT.

When beginner and advanced groups are generated based upon their scores on the geologic time section of the LIFT, the average scores in the identification and formation timescale sections separate to a larger extent than when the data are analyzed between the two classes. This suggests further validity in the use of the geologic time section as a measure of expertise, and further use of the section for this purpose is highly recommended. These data suggest that there is some change in the level of formation timescale knowledge, but it is not evident when analyzed by class. The class that a student is enrolled in is not necessarily an accurate indication of experience, at least not in the Department of Earth and Ocean Sciences at UBC. This change in formation timescale knowledge is only evident when the students are separated by an alternate measure of expertise, the geologic time section; a minimal sketch of this grouping approach is given below.
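To make the grouping procedure concrete, the following sketch shows one way it could be reproduced. This is an illustration only, not the analysis code used in this study: the file name, column names, and the cut-off score on the geologic time section are all assumptions made for the example.

```python
import pandas as pd

# Hypothetical input: one row per student with per-section scores and a mean
# confidence rating. The real LIFT data set is not distributed with this thesis.
lift = pd.read_csv("lift_responses.csv")

# Assume the geologic time section is scored out of 8 and that a cut-off of 5
# separates beginner from advanced students (the actual cut-off is an assumption).
CUTOFF = 5
lift["group"] = lift["geologic_time_score"].apply(
    lambda score: "advanced" if score >= CUTOFF else "beginner"
)

# Average identification and formation timescale scores for each group.
print(lift.groupby("group")[["identification_score", "timescale_score"]].mean())

# Pearson correlation between mean confidence and total score within each group.
for name, grp in lift.groupby("group"):
    r = grp["mean_confidence"].corr(grp["total_score"])
    print(f"{name}: confidence vs. knowledge, r = {r:.2f}")
```

A median split on the geologic time score would serve the same purpose as a fixed cut-off; what matters is that the expertise measure is independent of the identification and formation timescale sections being compared.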
Even with this separation, the average score for the advanced group is still a failing score, so this is an area where undergraduates are lacking in knowledge, even at the level required to graduate with a Bachelor’s degree. This may be something that evolves in everyone through increases in experience level, or the level of knowledge may remain roughly the same unless a person chooses to specialize in geomorphology, field geology, or another specialty resulting in significant exposure to landscapes/features in time-related contexts.

Advanced Students are more Self-Aware than Beginner Students

The tightness of the clusters (Figure 6) is a useful means to compare groups, as it reflects the students’ ability to accurately think about their knowledge and confidence. A cluster covering a large area (such as that of the beginner group in Figure 6a) suggests two things: a greater range in scores and confidence, and, more specifically, an inability to accurately judge the level of knowledge (when the range of confidence at a low score reaches high values, for example). Both groups show differences between their right and wrong histograms (Figure 7); however, the right histograms are where the biggest differences between groups appear. The wrong histograms for the beginner and advanced groups are only marginally significantly different (p = 0.04), whereas the right histograms for the beginner and advanced groups are significantly different (p = 0.01). Within both the beginner and advanced groups, the right and wrong histograms are significantly different; however, there is a greater difference in right/wrong mean confidence for the advanced group than for the beginner group (p = 3.6 × 10^-7 vs. p = 0.001). These data imply that beginner and advanced students have similar capabilities of determining when they are wrong, but advanced students are more capable and confident at pinpointing when they are right. (One way to run such a comparison is sketched at the end of the next subsection.)

LIFT Results Do Not Support Previous SAESS Data

All of the data obtained from in-class administrations of the LIFT are inconsistent with results seen on the SAESS. On the SAESS, third and fourth year students decreased in their agreement with the expert on the statement “When I look at a landscape, I have an idea of how long it took to form”. On the SAESS statement administered in conjunction with the LIFT during fall 2009, advanced students increased in their agreement with the expert on this statement. In addition, LIFT results show that advanced students have higher levels of confidence, which correlate with higher scores on the test. There are a few possible explanations for this discrepancy:

1) The separation of the two classes into beginner and advanced groups may have eliminated this result.
2) The students may only display this decrease in confidence over the course of a term (not at a single point in time).
3) The students may only display this decrease in confidence in second term (January-April). This is where the result was seen on the SAESS, but the LIFT was given at the end of first term (September-December) in this study.
4) The SAESS results do not accurately represent the population.
5) The advanced students sampled in this study were not analogous to those displaying the drop in confidence on the SAESS.
6) The students still maintain an overall high level of confidence (and advanced students have higher confidence than beginner students) despite the decrease.
The following discussion shows that some of these possible causes can be ruled out based on the data, while others remain possibilities.

1) The separation into beginner and advanced groups could not have eliminated the result. Much of the data has been analyzed with class as the determining factor, and though using an alternate measure of expertise does improve the resolution of the clusters, it does not drastically change the position of the two groups on the plot (e.g. the fourth year class plots at the higher confidence-higher score end and the second year class plots at the lower confidence-lower score end). In both cases the graph still has a positive slope (score increases with confidence).

2) Student responses on the LIFT could change over the course of the term. The time available for this study only allowed for an analysis of data at a single point in time, so this is still a possible contributor. However, it does not explain the change in the SAESS responses.

3) The students could possibly have different results in the second term. Again, the time permitted for this study only allowed for data collection in the first term, so a second term change is possible. However, it does not seem likely that things would change so drastically, based on results from this and other SAESS statements. No class type (e.g. 1st year service class, 2nd year majors class) has changed from increase to decrease on the given statement (#13) between the first and second term in three years of administration of the SAESS, though small changes have been observed. However, a 3rd year geology majors class has never taken part in a first term administration of the SAESS.

4) It is not likely that the SAESS results fail to represent the population accurately. Extensive preparation and validation of the SAESS has occurred over the course of three years, including monitoring population data and interviewing students with a variety of backgrounds to ensure consistent interpretation of all of the statements (Lane et al., in preparation). All results suggest that the population is accurately represented, and that statements are interpreted in the same way by all students. The LIFT was developed with similar rigor. Thus it is very unlikely that this would be the cause of the differences between LIFT and SAESS results.

5) It is possible that the advanced students in this study are not analogous to those who previously displayed the decrease in confidence on the SAESS. This study groups the students in a different way (by score on the geologic time section), and the upper-level class included in the study does not include all of the upper-level geology majors. It also includes a small group of non-majors (n=9).

6) Of all possible explanations, it is most likely that the change in the students’ perception on the statement “When I look at a landscape, I have an idea of how long it took to form” is notable, but does not reflect the correlation between their overall knowledge and level of confidence in landscape/feature identification and formation timescales. It is possible that though students could become less expert on this statement over the course of a term, they still have positive correlations of confidence and knowledge. Furthermore, their response to a general statement like the one on the SAESS could be inconsistent with how they respond to specific questions involving real scenarios and numbers.
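Returning to the right/wrong confidence comparison described above, the sketch below shows one straightforward way such a comparison could be run. The thesis does not state which statistical test produced the reported p-values, so a Mann-Whitney U test is used here as one reasonable choice for binned confidence ratings; the arrays contain placeholder values, not LIFT data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical per-response confidence ratings (1-5 bins), split by whether the
# accompanying answer was right or wrong. Illustrative values only.
advanced_right = np.array([5, 4, 5, 4, 3, 5, 4])
advanced_wrong = np.array([3, 2, 3, 2, 4, 2])
beginner_right = np.array([4, 3, 3, 4, 2, 3])
beginner_wrong = np.array([3, 2, 3, 3, 2, 2])

# Within-group comparison: is confidence higher on right answers than on wrong ones?
for label, right, wrong in [("advanced", advanced_right, advanced_wrong),
                            ("beginner", beginner_right, beginner_wrong)]:
    stat, p = mannwhitneyu(right, wrong, alternative="greater")
    print(f"{label}: right vs. wrong confidence, p = {p:.3g}")

# Between-group comparison on right answers only, where the thesis reports the
# largest difference between beginner and advanced students.
stat, p = mannwhitneyu(advanced_right, beginner_right, alternative="two-sided")
print(f"advanced vs. beginner (right answers), p = {p:.3g}")
```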
Difficulties with Development and Administration of the LIFT

Results from the administration of the test, in addition to observations in the classroom and future interview data, can be used to further refine the LIFT. Being aware of possible issues at all stages of development is important, as in some cases parts of the test are acceptable in an interview setting but do not fare as well in larger scale test administration. Listening to the sounds in the classroom and recognizing the physical setting of test administration can also assist in future fine tuning.

Classroom observations

A problem observed in the classroom was the student reaction to question 11, the landslide, and more specifically to the image that was used. This image was cropped during the interview process, and no interviewed students had trouble recognizing what it was. However, in the classroom there was a noticeable change in the noise level when this image was shown. As soon as it was projected, students began whispering “What is that?”, “What angle are we looking at this from?”, etc. It appeared that while this image was completely clear in the interviews, it was not effective with a larger population of students in a classroom setting. A different image should be used with future versions of the LIFT. Otherwise, students were quiet, and did not ask each other questions during the test. There were no complaints of being rushed with the times given for the images, or for the individual questions and sections.

Problematic images

Table 6 lists all landscapes/features that were removed from the test as an outcome of one or more interviews. What has been learned is of particular use to those who use images to teach, either in lectures or assessments. Most of these landscapes/features were unclear even to upper-level students who demonstrated a high level of knowledge in landscape/feature identification. The issue with the sand dunes did not arise until interviews were conducted with experts. All experts identified the landscape correctly, but when their answers for the formation timescales were tallied, there was an abnormally large range of answers (five orders of magnitude). This result is likely because some were stating the timescale for a desert to form, others were stating the timescale for a single dune to form, and still others were stating the timescale for a dune to change shape. A method of eliminating this confusion was not devised, but the question was left on the test in order to compare the student range of responses with the expert range of responses.

Table 6. Problematic images, landscapes, and features encountered throughout the development process of the LIFT.

Landscape/Feature                                    Reason for Removal (or Reason for Exclusion from Data Set)
Karst                                                Obscurity
Mesa                                                 Obscurity
Salt Flats                                           Lack of clarity (two images tested)
Sand Dunes                                           No consensus in timescale among experts
Stromatolite Mounds                                  Obscurity
Volcano (Bird’s Eye View of Olympus Mons, Mars)      Obscurity; lack of clarity with vantage point of the image

On the sand dunes formation timescale question, students have a wider range of answers than the experts, and the spread increases from beginner to advanced (Figure 9). Many beginner students (25%) chose the longest timescale available (1 000 000s of years or more).
Based on interview and identification responses, it is likely that students who chose the shorter timescales were thinking about individual dunes, and those who chose the longer timescales were thinking about the whole desert. This lack of consistency and clarity suggests that the whole question needs to be reworked, either by selecting a different image (say, a single dune) or by clarifying the context.

Figure 9. Distribution of responses to the sand dunes question on the LIFT (#7c). The answer spread increases as level of expertise decreases.

Future Recommendations

Curricular implications

Average scores are better in the landscape identification section than in the landscape formation timescale section. In fact, the average score in the formation timescale section is below 50%, even for the advanced students. This highlights a gap in the current curriculum: a lack of instruction on the timescales of the processes that form landscapes. This concept has already been identified by instructors at UBC (Vancouver) as a facet of geologic time that even beginner students should have knowledge of, and was a particularly motivating statement for this study (Rhajiak, 2009).

Revision of the LIFT

The LIFT should no longer be administered with the current image for the landslide question (#11), as it was unclear to students in the administration of the test. A proposed new image for this question has been located (Appendix 5). The sand dunes question (#7) can be removed from the test, as it did not provide any useful information in its current form. Alternatively, other images could be considered for this question and it could be left on the test (Appendix 5). Further clarification of images should always be considered, though caution should be exercised, and subsequent validation of these images by think-aloud interview should be conducted wherever possible. Clarity must always be ensured before using these images in a test format.

The SAESS statement asked at the beginning and end of the test need only be asked at the beginning, as data from this study showed that the vast majority of students did not change their answer to this statement by the end of the test. However, this may be because tearing off the page actually assisted them in remembering their answer to the statement. If the SAESS statement is only asked at the beginning of the test, it still needs to be torn off before the landscape questions begin, as the possibility of students changing their answer during the test remains.

Thoroughness of the LIFT could be improved with the addition of further images of landscapes/features that have formation timescales in the intermediate range. This would allow for further studies regarding the idea that students and experts are more comfortable with extreme timescales. It would also increase the resolution in this intermediate range, where the current version of the LIFT could be improved (Figure 3). A new version of the LIFT could be created which tests the formation timescales in a less restrictive sense, e.g. through the use of a timescale that can be bracketed (on paper or with an online applet), or a question that lists all possible orders of magnitude and has a range of correct answers. However, significant effort was taken to include only one best answer on the version used in this study, based upon responses from experts in interviews.

Further administration

The LIFT can be administered either with or without the confidence rating scales and SAESS statement.
All of the questions can stand alone without these additions. In other words, the LIFT could be used solely as a conceptual test on landscapes/features and landscape/feature formation timescales. It has the potential to be used as either a one-time assessment or a pre-post assessment to track conceptual knowledge and/or confidence in that knowledge over the course of a given time period.

Characteristics of expertise

It is recommended that the LIFT not be administered without the geologic time section. These questions are extremely useful as an independent measure of expertise, and provided a more valuable separation of the data set than the enrolled class alone. Further use of age or program year as the sole measure of a student’s expertise should be avoided whenever possible in any study.

Consideration of the SAESS data

As the SAESS uses the enrolled class as the measure of student expertise, the outcomes of this study should be considered. It may be more effective to place a small number of general conceptual questions at the end of the survey when the group’s expertise in the earth and ocean science program is of importance. When the SAESS is used to track changes within a class, this information is unnecessary.

Gender studies

Data from Jordan et al.’s (2009) study involving post-secondary non-science majors in a geology class suggest that the confidence of female students is higher when they are correct, whereas that of male students is always high regardless of their correctness. Males also rate their confidence higher than females overall. The LIFT could be used to further investigate this notion, and to see if it applies to geology majors at a beginner vs. an advanced level.

Studies with high school students

The LIFT could be given to high school students enrolled in and not enrolled in a geology class, to compare their answers with those of post-secondary geology majors. The landscape identification section would be of particular importance, as 2nd year and 4th year students have similar scores on this section. It would be useful to see whether or not there is a change between high school and post-secondary, and if so, how significant the change is.

Studies with graduate students

Graduate students could take the LIFT, and these data could be combined with data from high school students, undergraduate students, and experts to detail the trajectory of student knowledge and confidence over the course of their studies. This type of longitudinal study could even be conducted with a specific cohort of students.

CONCLUSION

The LIFT was developed to explore an unexpected result observed with the SAESS at UBC (Vancouver), where upper-level students became less confident in their ability to look at a landscape and have an idea of how long it took to form. The LIFT was created through an extensive validation process that included selection of images, think-aloud interviews with open-ended questions conducted with both students and experts, and the generation of multiple choice selections and an answer key incorporating student and expert responses. The test was administered to two classes at UBC (Vancouver), one at the lower-level and one at the upper-level. Usable data were obtained from a total of 96 students: 71 in the lower-level class and 25 in the upper-level class. Students were separated based on their score in the geologic time section of the LIFT, which focuses on general conceptual knowledge of geologic time.
This created two groups: beginner students and advanced students. This method of separating data based on an independent measure of expertise was crucial to its interpretation, as it refined the clusters of data and eliminated skew. The utility of this method points to the need to include questions that gauge expertise on other surveys and assessments, instead of relying solely on the class a student is enrolled in as a measure of experience.

The results of the LIFT are not consistent with those previously found on the SAESS. The LIFT results suggest that both beginner and advanced students’ confidence positively correlates with their knowledge, but advanced students do display greater conceptual knowledge and confidence. They are also more self-aware than beginner students, who are less capable of gauging when they are right or wrong. All of the students score higher on the landscape identification section than on the formation timescale section, implying a lack of focus on landscape/feature formation timescales in the content of classes taught at UBC (Vancouver). Students are also less competent at recognizing landscapes/features that are geologically or regionally specific, and are more competent with formation timescales that are on the extreme ends of the range. Some types of landscapes were found to be too specific, and went mostly unrecognized by all of the students in the interviews (e.g., karst, stromatolite mounds). Others were not clear enough to be consistently recognized and interpreted (e.g., sand dunes, salt flats). Careful design and validation are important for any assessment development, and require substantial consideration and effort.

There are many ways that the LIFT can be used in the future, including further assessment of curricula and student learning through a one-time or pre-post test, and studies with demographically different groups (e.g., high school or graduate students). More data in this relatively new field of confidence studies in the geosciences will only continue to increase understanding of the student learning process, and of how it can be improved with the help of teaching and/or curricula.

REFERENCES CITED

Allen, P.A., 2008, From landscapes into geological history, Nature, v. 451, p. 274-276.
Association of Professional Engineers and Geoscientists of British Columbia, 2006, Geology uniform syllabus, http://www.apeg.bc.ca/reg/docs/2006geologysyllabus.pdf (16 March 2010).
Besterfield-Sacre, M.E., Amaya, N.Y., Shuman, L.J., Atman, C.J., and Porter, R.L., 1998, Understanding student confidence as it relates to first year achievement, Frontiers in Education Conference, p. 258-263.
Chi, M.T.H., Feltovich, P.J., and Glaser, R., 1981, Categorization and representation of physics problems by experts and novices, Cognitive Science, v. 5, p. 121-152.
Clary, R.M., Brzuszek, R.F., and Wandersee, J.H., 2009, Students’ geocognition of deep time conceptualized in an informal educational setting, Journal of Geoscience Education, v. 57, p. 275-285.
Dodick, J., and Orion, N., 2003, Measuring student understanding of geological time, Science Education, v. 87, p. 708-731.
Gal, I., Ginsburg, L., and Schau, C., 1997, Monitoring attitudes and beliefs in statistics education, In: Gal, I., and Garfield, J.B., editors, The Assessment Challenge in Statistics Education, Fairfax, VA, IOS Press, p. 37-51.
Hammer, D., 1994, Epistemological beliefs in introductory physics, Cognition and Instruction, v. 12, p. 151-183.
Hermann, R., and Lewis, B., 2004, A formative assessment of geologic time for high school earth science students, Journal of Geoscience Education, v. 52, p. 231-235.
Hidalgo, A.J., and Otero, J., 2004, An analysis of the understanding of geological time by students at the secondary and post secondary level, International Journal of Science Education, v. 26, p. 845-857.
Jordan, S., Libarkin, J.C., and Clark, S.K., 2009, Too much, too little, or just right? An investigation of confidence, demographics, and correctness [abstract], Geological Society of America Abstracts with Programs, v. 41, p. 667.
Kastens, K.A., Manduca, C.A., Cervato, C., Frodeman, R., Goodwin, C., Liben, L.S., Mogk, D.W., Spangler, T.C., Stillings, N.A., and Titus, S., 2009, How geoscientists think and learn, Eos Trans. AGU, v. 90, p. 265-266.
Kitchener, K.S., and King, P.M., 1981, Reflective judgment: concepts of justification and their relationship to age and education, Journal of Applied Developmental Psychology, v. 2, p. 89-116.
Kloosterman, P., and Cougan, M.C., 1994, Students’ beliefs about learning school mathematics, The Elementary School Journal, v. 94, p. 375-388.
Lane, E., Jolley, A., Kennedy, B., and Frappé-Sénéclauze, T-P., in preparation, Untitled Student Attitudes about Earth Science Survey (SAESS) Manuscript.
Libarkin, J.C., 2001, Development of an assessment of student conception of the nature of science, Journal of Geoscience Education, v. 49, p. 435-442.
Libarkin, J.C., and Anderson, S.W., 2005, Assessment of learning in entry-level geoscience courses: results from the Geoscience Concept Inventory, Journal of Geoscience Education, v. 53, p. 394-401.
Petcovic, H.L., and Libarkin, J.C., 2007, Research in science education: the expert-novice continuum, Journal of Geoscience Education, v. 55, p. 333-339.
Rhajiak, J.A.N., 2009, Understanding geological time: a proposed assessment mechanism for beginner and advanced geology students at the University of British Columbia (Vancouver) [undergraduate honors thesis], Vancouver (BC), University of British Columbia, 64 p.
Schraw, G., 1997, The effect of generalized metacognitive knowledge on test performance and confidence judgments, The Journal of Experimental Education, v. 65, p. 135-146.
Trenhaile, A.S., 2007, Geomorphology: a Canadian Perspective, Oxford, Oxford University Press, 498 p.
Zen, E., 2001, What is deep time and why should anyone care?, Journal of Geoscience Education, v. 49, p. 5-9.

Appendix 1. Version 1.0 of the LIFT (used in this study).

Please answer the question below. When you have, tear off this page and pass it to the right end of your row. Do not forget to write your student number on this page and page 2 of this booklet! It will be used to match up this page with the rest of your booklet and attitude survey responses, as well as for participation purposes. If you completed a paid validation interview with me in October or November, please draw a star beside your student number on both this sheet and page 2 of this booklet.

Student Number: _________________________________

I. When I look at a landscape, I have an idea of how long it took to form. Circle a number.
Strongly Disagree 1 2 3 4 5 Strongly Agree

Student Number: _________________________________

Part 1 - Landscape Formation: You will have 45 seconds to look at each photograph (on the projected slides) and write down what type of landscape/feature is shown in the image, as well as how long the depicted landscape/feature took to form (NOT how old it is).
Following each question, please answer parts b and d (your confidence in your answer). If you are unsure of the type of landscape, try to hypothesize the process or processes that created it. Please fill out your answers directly on the test booklet, there is NO scantron. By no means are you expected to know the answers to all of these questions, and your score on this test will not contribute to your grade in the course, or even be communicated to your professor/s. Your professor/s will only have access to the student numbers of those who completed the test, for participation purposes. 1. a) What type of feature is this? _______________________________________________ b) How confident are you that you recognized the type of feature that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this feature take to form? Choose the BEST answer. a) days or less b) years c) 100s of years d) 10s of 1000s of years e) 100s of 1000s of years or more d) How confident are you in your estimation of the time the feature took to form? <20% 20-40% 40-60% 60-80% >80% 2. a) What type of landscape is this? _______________________________________________ b) How confident are you that you recognized the type of landscape that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this landscape take to form? Choose the BEST answer. a) minutes or less b) days c) years d) 100s of years e) 10s of 1000s of years or more d) How confident are you in your estimation of the time the landscape took to form? <20% 20-40% 40-60% 60-80% >80% 38 3. a) What type of feature is this? _______________________________________________ b) How confident are you that you recognized the type of feature that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this feature take to form? Choose the BEST answer. a) minutes or less b) 100s of years c) 10s of 1000s of years d) 100s of 1000s of years e) 1 000 000s of years or more d) How confident are you in your estimation of the time the feature took to form? <20% 20-40% 40-60% 60-80% >80% 4. a) What type of landscape is this? _______________________________________________ b) How confident are you that you recognized the type of landscape that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this landscape take to form? Choose the BEST answer. a) 10s of 1000s of years or less b) 100s of 1000s of years c) 1 000 000s of years d) 10s of 1 000 000s of years e) 1 000 000 000s of years or more d) How confident are you in your estimation of the time the landscape took to form? <20% 20-40% 40-60% 60-80% >80% 5. a) What type of landscape is this? _______________________________________________ b) How confident are you that you recognized the type of landscape that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this landscape take to form? Choose the BEST answer. a) minutes or less b) days c) months d) years e) 100s of years or more d) How confident are you in your estimation of the time the landscape took to form? <20% 20-40% 40-60% 60-80% >80% 39 6. a) What type of landscape is this? _______________________________________________ b) How confident are you that you recognized the type of landscape that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this landscape take to form? Choose the BEST answer. 
a) 1000s of years or less b) 10s of 1000s of years c) 100s of 1000s of years d) 10s of 1 000 000s of years e) 1 000 000 000s of years or more d) How confident are you in your estimation of the time the landscape took to form? <20% 20-40% 40-60% 60-80% >80% 7. a) What type of feature is this? _______________________________________________ b) How confident are you that you recognized the type of feature that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this feature take to form? Choose the BEST answer. a) years or less b) 100s of years c) 10s of 1000s of years d) 100s of 1000s of years e) 1 000 000s of years or more d) How confident are you in your estimation of the time the feature took to form? <20% 20-40% 40-60% 60-80% >80% 8. a) What type of feature is this? _______________________________________________ b) How confident are you that you recognized the type of feature that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this feature take to form? Choose the BEST answer. a) years or less b) 1000s of years c) 100s of 1000s of years d) 10s of 1 000 000s of years e) 1 000 000 000s of years or more d) How confident are you in your estimation of the time the feature took to form? <20% 20-40% 40-60% 60-80% >80% 40 9. a) What type of feature is this? _______________________________________________ b) How confident are you that you recognized the type of feature that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this feature take to form? Choose the BEST answer. a) months or less b) years c) 100s of years d) 10s of 1000s of years e) 100s of 1000s of years or more d) How confident are you in your estimation of the time the feature took to form? <20% 20-40% 40-60% 60-80% >80% 10. a) What type of feature is this? _______________________________________________ b) How confident are you that you recognized the type of feature that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this feature take to form? Choose the BEST answer. a) minutes or less b) days c) years d) 100s of years e) 1000s of years or more d) How confident are you in your estimation of the time the feature took to form? <20% 20-40% 40-60% 60-80% >80% 11. a) What type of feature is this? _______________________________________________ b) How confident are you that you recognized the type of feature that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this feature take to form? Choose the BEST answer. a) minutes or less b) days c) months d) years e) 100s of years or more d) How confident are you in your estimation of the time the feature took to form? <20% 20-40% 40-60% 60-80% >80% 41 12. a) What type of feature is this? _______________________________________________ b) How confident are you that you recognized the type of feature that is present in the image? <20% 20-40% 40-60% 60-80% >80% c) How long did this feature take to form? Choose the BEST answer. a) 100s of years or less b) 10s of 1000s of years c) 100s of 1000s of years d) 1 000 000s of years e) 10s of 1 000 000s of years or more d) How confident are you in your estimation of the time the feature took to form? <20% 20-40% 40-60% 60-80% >80% Part 2 – Geologic Time: You will have 10 minutes to complete this section. Please answer the questions below to the best of your abilities. 13. Carefully examine the relative positions of the lettered arrows on the timeline below and estimate the ages represented by each arrow. 
Identify which letter corresponds most closely to the extinction of the dinosaurs. (a) A (b) B (c) C (d) F (e) G 14. What age does D correspond to on the above timeline? (a) 2.25 years (b) 2 250 000 years (c) 2 500 000 years (d) 2 250 000 000 years (e) 2 500 000 000 years 15. Which method was primarily used to establish the Geologic Time Scale? (a) Correlation of magnetic signatures in rocks (b) Calculation of alpha decay of isotopes (c) Calculation of beta decay of isotopes (d) Correlation of fossils in rock units across vast distances (e) Correlation of rock types across vast distances 42 16. During fieldwork in western Canada, an experienced geologist sketched the cross section below showing three different units of tilted rocks and their relative ages. What could you best infer from this diagram? (a) Mountain building has overturned the rock units. (b) Rock layer C formed first and so lies on the bottom. (c) Rock layers B and C must be igneous rocks. (d) The sequence of rock layers conforms to the principle of superposition. (e) The sequence of rock layers conforms to the principle of original horizontality. 17. What do we call the feature left by a cycle involving deposition, then removal of previously- deposited sediment by erosion, then a return to deposition? (a) A cross-cutting relationship. (b) An inclusion. (c) A turbidite sequence. (d) A fault. (e) An unconformity. 18. If you were to date this rock (in the picture at right) with radioactive isotopes, which of the following would be a reasonable determination you would be able to make? You be able to…. (a) tell when this rock was erupted as a lava. (b) tell what age the fossils are that are contained within the rock. (c) tell at what time heat and pressure caused the rock to metamorphose. (d) determine how old the Earth is. (e) determine when this rock was originally deposited as a sediment. 43 19. Which of the following labeled events would correlate to the base of the Mesozoic and the base of the Cenozoic? (a) A, G (b) I, J (c) A, H (d) H, J (e) E, I 20. What is the best estimate of the age of F if A is 100 million years old and D is 70 million years old? (a) 55 Myrs (b) 85 Myrs (c) 110 Myrs (d) 170 Myrs (e) same age as A 44 II. When I look at a landscape, I have an idea of how long it took to form. Circle a number. Strongly Disagree 1 2 3 4 5 Strongly Agree Lastly, some demographic information please: Gender: ________________________________________________ Age: ___________________________________________________ Major: _________________________________________________ Year: __________________________________________________ Courses Currently Taking (check the box to the left of the course number): EOSC 210 EOSC 220 EOSC 330 EOSC 425 Courses Previously Taken (check the box to the left of the course number): EOSC 210 EOSC 220 EOSC 222 EOSC 328 EOSC 330 EOSC 332 EOSC 425 The end – thanks for your help! Please hand in your booklet at the front of the class. *For an answer key please contact the author (ajolley@eos.ubc.ca) or the Department of Earth and Ocean Sciences at the University of British Columbia (Vancouver). 45 Appendix 2. Image slides with credits. Question 1. How long did this feature take to form? Copyright © Marli Miller, University of Oregon; Image source: Earth Science World Image Bank http://www.earthscienceworld.org/images Question 2. How long did this landscape take to form? Copyright © Daniel Mayer, Image source: Wikimedia Commons http://commons.wikimedia.org 46 Question 3. 
How long did this feature take to form? Copyright © B.P. Snowder; Image Source: Western Washington University Planetarium http://www.wwu.edu/depts/skywise/a101_meteors.html Question 4. How long did this landscape take to form? Copyright © CBC; Image source: CBC Seven Wonders of Canada http://www.cbc.ca/sevenwonders/wonder_drumheller.html 47 Question 5. How long did this landscape take to form? Image Courtesy United States Geological Survey; Image source: Earth Science World Image Bank http://www.earthscienceworld.org/images Question 6. How long did this landscape take to form? Copyright © Travel Adventures; Image source: http://www.traveladventures.org/continents/asia/mount-everest-north-face.shtml 48 Question 7. How long did this feature take to form? Copyright © Mike Baird; Image source: Wikimedia Commons http://commons.wikimedia.org Question 8. How long did this feature take to form? Copyright © Shmuel Spiegelman; Image source: Wikimedia Commons http://commons.wikimedia.org 49 Question 9. How long did this feature take to form? Copyright © Richard Muller, University of California at Berkeley; Image source: http://muller.lbl.gov/travel_photos/AmazonWebPages/AmazonWebPages.html Question 10. How long did this feature take to form? Copyright © Vinod Panicker; Image source: Wikimedia Commons http://commons.wikimedia.org 50 Question 11. How long did this feature take to form? Copyright © Michael Collier; Image source: Earth Science World Image Bank http://www.earthscienceworld.org/images Question 12. How long did this feature take to form? Copyright © Luca Galuzzi; Image source: Imaging Earth's Surface: University of Vermont http://www.uvm.edu/~geomorph/gallery/ 51 Appendix 3. Interview consent form. T H E U N I V E R S I T Y O F B R I T I S H C O L U M B I A Carl Wieman Science Education Initiative (CWSEI) University of British Columbia Wesbrook Bldg. 300-6174 University Blvd. Vancouver, BC Canada V6T 1Z3 Tel: (604) 827-3119 Fax: (604) 827-3118 Consent Form #1 Development and Validation of Surveys Probing Students’ Understanding of Concepts in Science Principal Investigator: Dr. Carl Wieman Carl Wieman Science Education Initiative (CWSEI) 604-822-1732 Co-Investigator: Dr. Sarah Gilbert Carl Wieman Science Education Initiative (CWSEI) 604-822-3193 Study Team Members/Researchers: Sara Harris, Earth and Ocean Sciences, 604-822-9651 Francis Jones, Earth and Ocean Sciences, 604-822-2138 Alison Jolley, Earth and Ocean Sciences, alisonjolley@gmail.com UBC undergraduate students are invited to participate in a study aimed at improving the learning and appreciation of science. This study is conducted by science departments at UBC and the Carl Wieman Science Education Initiative (CWSEI). Conclusions and the survey instruments based on composite interviews of many students may be published in some form and/or presented publicly, but without any information that could be used to identify the participants. Purpose: The purpose of this study is to develop and validate surveys that examine university students’ understanding of concepts in particular science disciplines. This study will test the survey questions through interviews with a range of students that span the populations for which the surveys will be used. Interviews are an important “validation” step in the development of a survey; they test whether survey questions are worded so that the students interpret them as intended. Study Procedures: Your participation will involve filling out a survey on some key concepts in a science discipline. 
Then the interviewer will ask questions in order to understand how you 52 interpret each question and the reasons for your particular responses to the survey question. They will proceed in this fashion through the entire survey, or, in later work, just those selected questions that need further revision and testing. Typical interviews will last no more than one hour. The interviews examining subsets of questions for validation will also take less than an hour in length. All interviews will be audio recorded and may be video recorded with your permission. Participants will be selected based on their grades to ensure an equal representation of students from each grade band. In the event that there are more participants than needed, subjects may be selected based on gender, ethnic background, and age for the sole purpose of ensuring the best match to the demographic representation of the overall UBC student population. At the time of recruitment, researchers will ask students for their ID numbers and permission to review their grades and background information for this purpose of participant selection. Potential Risks: There are no known risks to participation in this study. Potential Benefits: The benefits to you are indirect; the surveys developed as a result of these interviews are part of a major UBC initiative to improve science education. Input from the surveys is an essential component in understanding what changes in educational approaches are working well and where further improvements are needed. This may result in improvements to science courses you take in future semesters. The benefits to society in general will be improved science education that most students will find more interesting and relevant to their lives. Confidentiality: Your confidentiality will be respected. Interviews will be transcribed and no one except the researchers will have access to your identity. The interviewer will be a faculty or staff researcher from each participating department with expertise in the respective discipline and will not be an instructor of any course in which you are currently enrolled. Any written or printed out materials with identifiable information will be stored in a locked filing cabinet and will not be available to any of your current instructors. Any information in electronic format will be stored on password protected computers. No individual student identifiers will be used in any published or publicly presented work. Remuneration/Compensation: For your participation, you will receive the monetary compensation of $20. 53 Contact for information about the study: If you have any questions or would like further information about this study, you may contact the appropriate researcher for the particular survey instrument as follows: Earth and Ocean Sciences – Sara Harris (604-822-9651) Francis Jones (604-822-2138) Alison Jolley (alisonjolley@gmail.com) Contact for concerns about the rights of research subjects: If you have any concerns about your treatment or rights as a research subject, you may contact the Research Subject Information Line in the UBC Office of Research Services at 604-822-8598 or if long distance e-mail to RSIL@ors.ubc.ca. Consent: Your participation in this study is entirely voluntary and you may refuse to participate or withdraw from the study at any time without jeopardy to your class standing. *If you require additional time to review this consent form, please feel free to do so and return the signed form to the appropriate researcher by tomorrow. 
Your signature below indicates that you have received a copy of this consent form for your own records. Your signature indicates that you consent to participate in this study. ____________________________________________________ Participant’s Signature Date ____________________________________________________ Printed Name of the Participant signing above 54 Appendix 4. Changes made during the interview process. Version Number Interview Number* Images Used Specific Changes Made 1 1 1. Lava, 2. Alluvial Fan, 3. Impact Crater, 4. Hoodoos, 5. Fault, 6. Mountains, 7. Karst, 8. Stromatolites, 9. Volcano/Lahar, 10. Mesa, 11. Salt Flat, 12. Sand Dunes, 13. Volcano, 14. River, 15. Mud Cracks, 16. Landslide N/A 2 2 Same as version 1. - added boxes and arrows to specify area of image to focus on 3 4 1. Lava, 2. Alluvial Fan, 3. Impact Crater, 4. Hoodoos, 5. Fault, 6. Mountains, 7. Mesa, 8. Salt Flat, 9. Sand Dunes, 10. Volcano, 11. River, 12. Mud Cracks, 13. Landslide, 14. U- shaped valley - removed karst, stromatolite, and volcano/lahar images - added u-shaped valley image - put all images in powerpoint format - cropped images instead of using boxes/arrows - changed rating scale to fifths (as opposed to a five point Likert-scale) - added identification confidence rating after viewing image 4 8 Same as version 2. - changed the volcano, mud cracks, sand dunes, and salt flat images - un-cropped the images and tried using cartoons to represent the before image (unsuccessful) 55 - added in specific question that asks for identification of the landscape - used the word “feature” in some questions (those that are more of a feature as opposed to a landscape) 5 11 (first expert interview) 1. Lava, 2. Alluvial Fan, 3. Impact Crater, 4. Hoodoos, 5. Fault, 6. Mountains, 7. Sand Dunes, 8. Volcano, 9. River, 10. Mud Cracks, 11. Landslide, 12. U-shaped valley - removed mesa and salt flat images - re-cropped the images *Where change first came into effect. 56 Appendix 5. Suggested image revisions. Question 7. How long did this feature take to form? Copyright © Luca Galuzzi; Image source: Imaging Earth's Surface, University of Vermont https://www.uvm.edu/~geomorph/gallery Question 11. How long did this feature take to form? Copyright © Marli Miller; Image source: Imaging Earth's Surface, University of Vermont http://www.uvm.edu/~geomorph/gallery *Note that neither of these images have undergone sufficient validation with students. "@en ; edm:hasType "Graduating Project"@en ; edm:isShownAt "10.14288/1.0053590"@en ; dcterms:language "eng"@en ; ns0:peerReviewStatus "Unreviewed"@en ; edm:provider "Vancouver : University of British Columbia Library"@en ; dcterms:rights "Attribution-NonCommercial-NoDerivatives 4.0 International"@en ; ns0:rightsURI "http://creativecommons.org/licenses/by-nc-nd/4.0/"@en ; ns0:scholarLevel "Undergraduate"@en ; dcterms:isPartOf "University of British Columbia. EOSC 449"@en ; dcterms:title "Identifying Landscapes and their Formation Timescales: Comparing Knowledge and Confidence of Beginner and Advanced Geoscience Undergraduate Students"@en ; dcterms:type "Text"@en ; ns0:identifierURI "http://hdl.handle.net/2429/23321"@en .