TECHNOLOGY IN MIDDLE SCHOOL SCIENCE: THE EFFECT OF STUDENT-CENTERED PEDAGOGY ON METACOGNITION AND CONCEPTUAL UNDERSTANDING IN SMALL GROUP AND WHOLE CLASS SETTINGS

by

Diane Gillis

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in The Faculty of Graduate and Postdoctoral Studies (Curriculum Studies)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

August 2015

© Diane Gillis, 2015

Abstract

Despite the frequent use of classroom response systems (CRS) in university courses, there is a lack of research to support the effectiveness of these handheld electronic devices (clickers) at the middle school (fifth grade) level. Furthermore, there is a scarcity of research comparing CRS-enhanced pedagogy in the middle school science context with traditional teaching methods. Additionally, research investigating how middle school science students think about their own thinking (metacognition) and how this correlates with the Self-Efficacy and Metacognition Learning Inventory - Science (SEMLI_S) measurement system (Anderson & Nashon, 2007) and Bloom's taxonomy categories (Bloom, 1965) is equally scarce. My research explores how CRS-enhanced, student-centered pedagogy affects metacognition and conceptual understanding in the context of small group and whole class settings in middle school science. Overall results show that: CRS-enhanced pedagogy is a more beneficial instructional method for concept review and reinforcement compared with traditional teaching methods; SEMLI_S and Bloom's taxonomy results were similar at lower Bloom's taxonomy levels, although traditional teaching method results were higher when more challenging concepts were introduced; learning can be enhanced if SEMLI_S dimensions are utilized prior to Bloom's taxonomy; and following up use of SEMLI_S dimensions with Bloom's taxonomy instruction provides the most effective teaching practices.

Preface

This thesis is original, unpublished, independent work by the author, D. Gillis. All thesis work and associated methods were approved by the University of British Columbia's Research Ethics Board [protocol # H13-00587].

Table of Contents

Abstract
Preface
Table of contents
List of tables
List of figures
Acknowledgements
Dedication
Chapter 1: Introduction
    1.1 Problem statement
    1.2 Research questions
    1.3 Clickers, STEM education, and metacognition
    1.3.1 Metacognition
    1.3.2 Social context
    1.4 Researcher's position and significance of the study
    1.5 Theoretical framework
Chapter 2: Literature review
    2.1 Beginning with a story (contrasting perspective):
    2.2 Situating the exemplar
    2.3 What is Scaffolding?
    2.4 Technology parallels/background information
    2.5 From holistic perspective to context
    2.6 Whose voice?
    2.7 Triangulation & validity
    2.8 Discourse: How much is too much?
    2.9 Seeking best practices utilizing CRS
    2.10 Attention to metacognition
    2.11 Metacognitive coding strategies
    2.12 Interviews, video and/or audio-taping
    2.13 Rival propositions
    2.14 Summary
Chapter 3: Methodology
    3.1 Participants
    3.2 Student preparation: POOP & QUEST
    3.3 Data collection
    3.2.1 Treatment of lab group work handouts
    3.2.2 Treatment of clickers
Chapter 4: Results
    4.1 Analysis of SEMLI_S questions and sets
    4.2 Analysis of Bloom's taxonomy questions 4 & 5
    4.2.1: Summary comparing Bloom's taxonomy questions 4-5 & SEMLI_S questions 1-3
    4.2.2: Summary comparing set 1 Bloom's taxonomy KCA & SEMLI_S application
    4.2.3: Summary comparing set 2 Bloom's taxonomy KCA & SEMLI_S knowledge
    4.2.4: Summary comparing set 3 Bloom's taxonomy KCA & SEMLI_S application
    4.2.5: Summary for research question 2
    4.3 Analysis of student exit questionnaire (survey)
    4.4 Analysis of quiz results
    4.5 Main findings
Chapter 5: Discussion
    5.1 Discussion of findings
Chapter 6: Conclusions, limitations, implications and recommendations
    6.1 Conclusions
    6.2 Limitations
    6.3 Implications and recommendations
    6.3.1 Implications and recommendations for practice
    6.3.2 Implications and recommendations for curriculum
    6.3.3 Implications and recommendations for research
References
Appendices
    Appendix A: Clicker use curriculum calendar
    Appendix B: PowerPoint Presentations
        Chemistry Power Point 1
        Chemistry Power Point 3
        Chemistry Power Point 4
        Chemistry Power Point 5
        Chemistry Power Point 6
        Chemistry Power Point 7
    Appendix C: Student interview questions & transcripts
    Appendix D: A taxonomy of Socratic questions
    Appendix E: Excel spreadsheet sample
    Appendix F: Student exit questionnaire ratings
    Appendix G: Histogram results for ppt#5, question #3

List of Tables

Table 1: Five dimensions of metacognition for the SEMLI_S instrument
Table 2: SEMLI_S metacognition rubric for small group work handout questions #1-3
Table 3: Bloom's taxonomy rubric for small group work handout questions #4 & 5
Table 4: SEMLI_S items
Table 5: Clicker Technology (CT) – Non-Clicker (NC) review schedule
Table 6: Knowledge by clicker question 1-3
Table 7: Knowledge by clicker set 1-3
Table 8: Comprehension by clicker question 1-3
Table 9: Comprehension by clicker set 1-3
Table 10: Application by clicker question 1-3
Table 11: Application by clicker set 1-3
Table 12: Bloom's taxonomy by question 4-5
Table 13: Bloom's taxonomy by set 1-3
Table 14: Summary of percentages for effectiveness categories (exit questionnaire results)

List of Figures

Figure 1: Clicker hand-held device, instructor device & frequency transmitter
Figure 2: Sample histogram
Figure 3: Triangulation components for increased metacognitive awareness
Figure 4: Metacognitive awareness flowchart
Figure 5: Peer Instruction (PI) pedagogy flowchart
Figure 6: Lab group work handout
Figure 7: Lab group work student sample
Figure 8: Lab question/comment handout
Figure 9: QUEST handout
Figure 10: Bloom's taxonomy
Figure 11: Lab group work handout, completed student sample
Figure 12 (a): Graph of knowledge by clicker question 1-3
Figure 12 (b): Graph of average SEMLI_S knowledge question 1-3
Figure 13 (a): Graph of knowledge by clicker set 1-3
Figure 13 (b): Graph of average SEMLI_S knowledge set 1-3
Figure 14 (a): Graph of comprehension by clicker question 1-3
Figure 14 (b): Graph of average SEMLI_S comprehension question 1-3
Figure 15 (a): Graph of comprehension by clicker set 1-3
Figure 15 (b): Graph of average SEMLI_S comprehension set 1-3
Figure 16 (a): Graph of application by clicker question 1-3
Figure 16 (b): Graph of average SEMLI_S application question 1-3
Figure 17 (a): Graph of application by clicker set 1-3
Figure 17 (b): Graph of average SEMLI_S application set 1-3
Figure 18 (a): Graph of Bloom's taxonomy question 4-5, clicker & no clicker
Figure 18 (b): Graph of Bloom's taxonomy averages, question 4-5, clicker & no clicker
Figure 18 (c): Graph of KCA only, Bloom's taxonomy question 4-5, clicker & no clicker
Figure 18 (d): Graph of KCA averages, Bloom's taxonomy question 4-5, clicker & no clicker
Figure 19 (a): Graph of Bloom's taxonomy set 1-3, clicker & no clicker
Figure 19 (b): Graph of Bloom's taxonomy averages, set 1-3, clicker & no clicker
Figure 19 (c): Graph of KCA only, Bloom's taxonomy set 1-3, clicker & no clicker
Figure 19 (d): Graph of KCA averages, Bloom's taxonomy set 1-3, clicker & no clicker
Figure 20: Student questionnaire sample – clickers, whole class study
Figure 21: Graph of student exit questionnaire ratings
Figure 22: Quiz results for clicker and non-clicker use

Acknowledgements

I would like to acknowledge my graduate advisors, Dr. Samson Nashon and Dr. Marina Milner-Bolotin, for their valuable feedback and continuous guidance. As consultants, teachers and mentors for my M.A. thesis, both supervisors offered unwavering support and encouragement throughout my academic journey. Their courses motivated me to think critically about meaningful curriculum design in science education and inspired me to pursue this research study. Also, thank you to Dr. Sandra Scott, committee member extraordinaire, whose graciousness and thoughtful insights helped me revisit and rethink chapter details.

My sincere thanks also go to John Brooks and Paul Curtis for their valuable statistical advice. Their patience, kindness and insights were truly appreciated. I am also grateful to Dr. Guillermo Algaze and Susan Algaze for time spent proofreading and listening as I grappled with research ideas. Gratitude also goes to Dr. Maria Tillmanns for sharing her philosophical wisdom during our engaging discussions of Socratic questioning methods. I would also like to thank Dr. Peter Cole and Dr. Pat O'Riley for convincing me that thinking differently and infusing creativity into the research process provides rich rewards. I am also grateful to Basia Zurek for her professionalism and administrative support. I would like to acknowledge all my students, who inspire and amaze me every day. I am also obliged to my family for their encouragement and tireless efforts in assisting me in ways they may never fully realize.
Finally, thanks to my husband, José Enrique, for his savvy technical skills and unrelenting reassurance that no problem regarding data input and analysis is insurmountable.

Dedication

This study is dedicated to my family. I thank my husband, José Enrique, and our son Quito for their love, motivation, and words of encouragement in completing this fulfilling journey. A special thanks to my parents, Jack and Stella Gillis, for instilling a strong work ethic and for being wonderful role models for perseverance and tenacity. Last, but not least, to my in-laws Teresita and Joaquin de Sequera for their unconditional support and unwavering faith.

You have brains in your head. You have feet in your shoes. You can steer yourself any direction you choose. You're on your own. And you know what you know. And YOU are the guy who'll decide where to go.
Dr. Seuss, Oh, the places you'll go! (Random House, 1990)

Chapter 1: Introduction

1.1 Problem statement

Although electronic CRS such as clickers are used in university settings for large-enrolment classes (Milner-Bolotin, 2004; Milner-Bolotin, Antimirova & Petrov, 2010), my experiences as an educational researcher and classroom teacher indicate that clicker use is not prevalent in middle school science classrooms. There is a current nationwide emphasis in the United States (U.S.) on Science, Technology, Engineering and Math (STEM) education (2013). Also, the Program for International Student Assessment (PISA) statistical analyses (2009, 2012) ranked the U.S. thirteenth out of thirty-three and twenty-first out of thirty-four Organization for Economic Co-operation and Development (OECD) countries, respectively, for science literacy. Hence, there is clearly a need to examine the potential for effective pedagogies incorporating clicker technology in the middle school science setting. Stakeholders such as scientists, science educators, and community members stand to benefit from students who have better metacognitive abilities, who can demonstrate higher-level reasoning skills, and who participate in educational settings where technology fosters metacognitive or self-regulated learning (SRL) (self-monitoring and goal-oriented learning) (Azevedo & Aleven, 2013).

According to Anderson and Nashon (2007), metacognition also incorporates 'self-perceptions of one's capacity to learn (self-efficacy) and includes how confident an individual is about the effectiveness of his/her learning process and the results of the learning process' (p. 300). Based on clicker use in higher-education settings in concert with peer instruction (PI) pedagogy (Mazur, 2009) and a focus on metacognition, effective use of this technology along with student-centered pedagogy could potentially promote improved metacognitive awareness and conceptual understanding at the middle school level. Exploring this potential is the focus of my research. To determine the effectiveness of PI pedagogies in middle school science teaching and learning utilizing clicker technology, this study examines how PI pedagogy affects metacognition and conceptual understanding, and how results compare between clicker use and traditional teaching methods, using the SEMLI_S instrument (Thomas, Anderson, & Nashon, 2008) and Bloom's taxonomy (1965) in whole class settings. Dialogue, communication, and collaboration between students in small groups and whole class settings support the focus on student-centered pedagogy in this study.
As the most crucial stakeholders, students will ultimately gain the greatest advantage from this research. Many studies indicate more positive student attitudes, better engagement with the subject matter, and increased test scores through the use of clicker-enhanced pedagogy (e.g. Boyle & Nicol, 2003; Caldwell, 2007; Crouch & Mazur, 2001; Draper & Brown, 2004; Jones, 2012; Lasry, Guillemette & Mazur, 2014; Milner-Bolotin et al., 2010). A CRS such as clickers aids the instructor in posing multiple-choice questions and polling students' answers during class. After the instructor displays a question on a screen, students submit their responses by using handheld clickers. The instructor then has the opportunity to display the answers with a histogram showing the number of students selecting each response (Mazur, 1997; Milner-Bolotin, Antimirova & Petrov, 2010; Milner-Bolotin, 2004; www1.iclicker.com, 2014).

Mazur (2009) suggested PI as a pedagogy of inclusive student engagement where a structured questioning process involves every member of the class. According to Mazur, PI incorporates science activities into the classroom which require students to focus on concepts presented to them; in turn, students explain concept details to classmates.

Currently, there is a gap in the relevant research regarding the effect of student-centered pedagogy utilizing clicker technology on the processes necessary to enhance and improve learning opportunities for students younger than the postsecondary level (Milner-Bolotin et al., 2013). In an effort to narrow the gap, I conducted a mixed methods research study within my own middle school science classroom to examine the effect of clicker use on metacognition and conceptual understanding amongst my middle school science students. This study involved data collection using SEMLI_S dimensions, distribution of questionnaires, observations of small group interactions, and student interviews. To support my research, I drew upon the idea that if students become more aware of methods that enhance their own learning process (metacognition) and can determine underlying meanings of concepts (conceptual understanding), they may become more independent and empowered learners. For the purposes of this research, terms such as "clicker-enhanced pedagogy", "peer instruction (PI) pedagogy utilizing clickers", "PI-enhanced pedagogy utilizing clicker technology" and "classroom response system (CRS)-enhanced pedagogy" may be used interchangeably.

CRS provides both instructor and students with immediate, continuous, and accurate formative (on-going) feedback and helps them be "reflective" in their teaching/learning practice. Individual student responses to conceptual questions can be shared with the class anonymously in the form of histogram results (Figure 2). Data about student participation on individual multiple-choice questions can be stored electronically, which gives instructors more evidence to support and potentially enhance learning processes among students (Petrov, Antimirova & Milner-Bolotin, 2010). The common practice of asking questions during a lecture (traditional teaching style) typically engages only a small portion of students. Use of clicker-enhanced pedagogy, however, provides an opportunity for each student to actively contribute in class through more structured questioning tactics (Mazur, 1996). The setting for this research study is four fifth grade (middle school) science classes with 17-20 students per class.
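To make the poll-and-display cycle described above concrete, the sketch below simulates tallying a class's A-E clicker votes into the kind of anonymous histogram an instructor would project. It is an illustration only, not the actual iClicker software; the vote data and the text-based display are invented for the example.

```python
# Minimal sketch (Python) of the CRS poll-and-display cycle: tally A-E votes
# and print an anonymous histogram. Illustrative only, not the iClicker software.
from collections import Counter

def show_histogram(votes, options="ABCDE"):
    """Print one bar per answer choice, with counts and percentages."""
    tally = Counter(votes)
    total = len(votes)
    for option in options:
        count = tally.get(option, 0)
        pct = 100 * count / total if total else 0.0
        print(f"{option}: {'#' * count:<20} {count:2d} ({pct:4.1f}%)")

# Example: one class of 18 students responds to a multiple-choice question.
class_votes = list("CACCBCACDCCACBCACC")
show_histogram(class_votes)
```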
Clicker-enhanced PI provides students with the opportunity to re-think topics/concepts through discussion (helping them integrate the concepts better) and to strengthen their reflective skills as a result of immediate feedback. Utilizing this strategy, students' critical thinking skills may be developed and enhanced along with their problem-solving (higher order cognitive) skills and metacognitive skills. Students' confidence levels may improve, and they may be more engaged in the learning process. PI empowered by use of clicker technology serves to meaningfully engage students in constructing knowledge within a collaborative learning environment (Dufresne et al., 2000).

The interest in incorporating clicker-enhanced pedagogy has been triggered by the necessity of promoting student participation and enhancing student dialogue in large classes in a university setting (MacArthur, 2008). However, Smith's study (2010) of a university biology course revealed that clickers can be very effective in small classes (N=25) in terms of increased student engagement, participation, and the advantages of immediate feedback. Also, one study by Black and William (2001) revealed promising results about the effectiveness of clicker-enhanced pedagogy in K-12 classrooms. This study revealed that clicker use improved learning by giving students a chance to reflect on their thinking through displayed histograms. Student surveys supported these findings. Drawing upon the results of the studies by Mazur, Smith, and Black and William, I suggest that pedagogy could be effectively supported by clickers and may be a viable strategy to promote middle school students' improved metacognitive awareness and conceptual understanding.

The goals of my research were to: (1) investigate how student use of clickers in a whole class setting might allow students to construct and reconstruct conceptual ideas (Anderson & Nashon, 2007); (2) investigate the effectiveness of student clicker use on (a) idea-sharing within small lab group dynamics (Anderson & Nashon, 2007), and (b) levels of metacognition; and (3) propose a framework for consideration by middle school science educators, stakeholders and other decision-makers for broader use of clicker-enhanced pedagogy in middle school science to enhance student metacognition and conceptual understanding.

This mixed methods Action Research (AR) study (Graham, 2013; Levy, 2003) was conducted over a five-month period at a private school in Southern California (U.S.). The qualitative component involved open-ended student interviews and observations of student interactions, and the quantitative component used the SEMLI_S instrument (Thomas, Anderson & Nashon, 2008) to investigate students' levels of metacognition in the form of numerical data. The qualitative data provided deeper interpretation of the effect of PI pedagogy on students' conceptual understanding and metacognition. Under the umbrella of the mixed methods approach, data analysis involved: identification of variables; identification of relationships between variables and the questions posed; organization of data; integration of data at different stages of research; and incorporation of practices of both quantitative and qualitative research methodologies. Research findings determined the effectiveness of clickers amongst 5th grade science students.
Based on the results of the analysis, I make recommendations regarding the usefulness of clickers as a technological tool to promote pedagogy in the middle school science classroom.

1.2 Research questions

The following research questions were investigated in this study:
1. How does clicker-enhanced pedagogy affect student metacognition and conceptual understanding in the context of small (lab group) interactions and whole group middle school science classes?
2. How do SEMLI_S dimensions and Bloom's taxonomy results compare during clicker-enhanced pedagogy and traditional teaching methods?

1.3 Clickers, STEM education, and metacognition

CRS are frequently used at the university level, and there is much research suggesting that devices such as clickers improve metacognition and conceptual understanding among students (e.g., Boyle & Nicol, 2003; Jones et al., 2012; MacArthur, 2008; Mazur, 2009). However, there is little research available that examines clicker-enhanced pedagogy use in educational levels lower than postsecondary institutions (Milner-Bolotin et al., 2013). With the current emphasis on STEM education, teachers, educators, and leaders in business and industry are all stakeholders in improving science, technology, engineering and math teaching and learning in schools. From environmental awareness to solving future energy problems, a stronger foundation in STEM is essential for career skills and for students to understand the increasing complexity of the world around them (National Science Foundation, 2011). This research study offers an opportunity to examine how clicker technology could potentially improve learning in keeping with the tenets and recommendations of STEM.

Additionally, most students in the middle school classroom in the United States are already tech-savvy and gravitate naturally towards technology. The majority of middle school students in the U.S. today have grown up surrounded by technology, are open-minded and enthusiastic about new technology, and have a high level of comfort when introduced to new devices such as clickers. Bring Your Own Device (BYOD) policies in schools also allow students to access clicker functionality directly from smartphones. Plickers (https://www.plickers.com/), another technology option used in tandem with the instructor's smartphone, allows quick checks for understanding without the need for student devices. This variety of technology options creates an ideal environment in which to focus on STEM-based education, enhance learning opportunities amongst 5th grade students, raise confidence levels, and nurture independent lifelong learners. The quick setup and ease of use also permit a shallow learning curve for teachers.

Clickers are hand-held devices with a numeric keypad that allow students to send answers to teacher-created multiple-choice questions via a radio frequency transmitter (Figure 1). Once all students respond, question results can be displayed as a colored histogram (Caldwell, 2007; www1.iclicker.com, 2014) (Figure 2).

Figure 1: Clicker hand-held device, instructor device & frequency transmitter

Figure 2: Sample histogram

For the purposes of my study, questions were created using a PowerPoint program and projected onto a screen using an LCD device.
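Tying the device mechanics to the pedagogy: in Mazur-style PI, each projected question sits inside a vote-discuss-revote cycle (the small group "side conversations" described below). The sketch that follows is schematic; the 30-70% "worth discussing" band is a commonly cited PI heuristic and an assumption here, not a figure taken from this study.

```python
# Schematic sketch of one Peer Instruction (PI) question cycle. The
# discussion band (30-70% correct on the first vote) is an assumed
# heuristic, not a parameter reported in this thesis.

def pi_question_cycle(poll, band=(0.30, 0.70)):
    """Run one PI cycle; `poll` returns the fraction of correct votes."""
    first_vote = poll()                     # individual clicker vote
    low, high = band
    if first_vote < low:
        print("Few correct: re-teach the concept before moving on.")
        return first_vote, None
    if first_vote > high:
        print("Most correct: show the histogram and move on.")
        return first_vote, None
    print("Mixed results: discuss with your neighbors, then revote.")
    return first_vote, poll()               # revote after peer discussion

# Example with canned poll results: 55% correct, then 85% after discussion.
results = iter([0.55, 0.85])
print(pi_question_cycle(lambda: next(results)))
```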
Multiple-choice question construction is usually strategic, and the number of questions is typically limited to between two and five for a fifty-minute period (e.g., Beatty, 2004; Burnstein and Lederman, 2001; Caldwell et al., 2006; Elliot, 2003; Jackson and Trees, 2003). In this study, students engaged in interactive science activities during small group (lab) situations as well as whole class discussions while utilizing clickers to answer multiple-choice PowerPoint questions. PI pedagogy also included side conversations between students (small groups of 2-3) promoting discussion of concepts and/or justification of answers to multiple-choice questions during PowerPoint question sessions utilizing clickers. Benefits of classroom clicker use include, but are not limited to: increased engagement by a greater number of students; the ability to poll student responses and attitudes; diagnostic assessment to reveal concept understandings and misunderstandings; improvement of student involvement in a lesson; as well as the ability to assess lesson recall, guide student thinking during reviews, allow students to examine their own level of understanding at the end of a class, and make a lecture fun (Caldwell, 2007).

Although their study was situated in a Teacher Education Program at a large Canadian university, Milner-Bolotin, Fisher and MacDonald (2013) mirror this sentiment by stating: "It (PI pedagogy) utilizes Classroom Response Systems (clickers) to engage students in interactive activities and discussions through conceptual multiple-choice questions that target student difficulties, often referred to as misconceptions. PI has been found to be very effective in college STEM classrooms when students use either clickers (Hake, 1998; Milner-Bolotin, Antimirova & Petrov, 2010) or flashcards (Lasry, 2008)" (p. 524). As a science educator committed to STEM education and student-centered pedagogy, I believe that it is important to explore how clickers might be utilized in the middle school science classroom as a pedagogical tool. Through this study, I wanted to: understand what clickers were; examine the level of comfort students exhibit during clicker use; investigate potential connections with stakeholders in a broader sense; explore how student-centered pedagogy might affect students' metacognitive awareness and conceptual understanding; examine how student-centered pedagogy might empower middle school science students in a STEM-based learning environment; and understand the potential long-term benefits and greater implications for students of STEM-based education and student-centered pedagogy utilizing clickers.

In How People Learn: Brain, Mind, Experience, and School (2002), Bransford, Brown and Cocking state that "a metacognitive approach to instruction can help students learn to take control of their own learning by defining their goals and monitoring their progress in achieving them" (p. 18). Incorporating strategies that allow students to reflect on how they think about their own thinking, along with clicker-enhanced pedagogy, may enhance learning opportunities in the middle school science classroom.

1.3.1 Metacognition

Currently, there is a lack of uniform definitions for metacognition in the research literature (Anderson & Nashon, 2007; Larkin, 2006). This is not surprising given that the theoretical construct of metacognition has gone through an evolutionary process.
From an ambiguous, ill-defined concept to a construct with refined elements and organized subsystems, metacognition has evolved to include complex relationships with knowledge, metamemory, learning, cognition and reflection (Tarricone, 2011). Metacognition is employed to discuss a variety of learning behaviors. The definitions that exist reflect these varied perspectives on metacognition and include: learner motivation and willingness to engage with the learning process (White, 1993; White & Gunstone, 1998); self-regulatory behaviors such as interpreting task demands, selecting, adapting or inventing strategies, self-assessment and decision-making (Butler, Beckingham & Novak-Lauscher, 2005); and self-awareness and efficacy (Nashon, Anderson & Nielsen, 2005). Flavell (1976) described metacognition as the action of the learner while in the process of solving problems.

As busy educators, many teachers often spend more time focusing on content than on how students acquire knowledge and process information. With this in mind, metacognitive awareness allows students to reflect on their own thinking and to develop and use practical problem-solving skills to overcome challenges and resolve learning difficulties (Joseph, 2010). Pintrich (2002) suggests that students become more knowledgeable about cognition as they develop, and they also become more aware of how they think about their own thinking. Bransford, Brown and Cocking (2002) add that as students become more aware of their own thinking, their learning improves. I believe that increasing metacognitive awareness may help students become more aware of their own learning, improve peer-to-peer collaboration within a lab team (small group) dynamic, engage students more effectively in the learning process, and promote independent learning. I also believe that increasing students' metacognitive awareness will in turn increase their confidence and opportunities for self-directed learning.

Metacognition concerns "thinking about one's own thoughts" (Hacker, Dunlosky & Graesser, 1998) and involves "what one knows (i.e., metacognitive knowledge), what one is currently doing (i.e., metacognitive skill), or what one's current cognitive or affective state is (i.e., metacognitive experience)" (p. 3). Flavell (1981) furthers this idea by describing metacognition as a "fuzzy concept" (p. 37) that needs clarification. He explains that intentional, deliberate awareness of cognitive actions allows individuals to improve their metacognitive abilities. Based on the above statements, I suggest that if educators can bring clarity to how content is being taught and can give students tools to increase metacognitive awareness, then more authentic science learning can take place.

Piaget wrote that cognitive development (acquiring knowledge and understanding through thought and experience) should be viewed "as a continuum involving the interaction of four influences: maturation, active experience, social interaction, and a general progression of equilibrium" (Piaget, 1961, as stated in Ewing, Foster & Whittington, 2011, p. 69) and that teachers could influence active experiences in the classroom by offering opportunities for students to explore and discover. Metacognition includes learning behaviors and skills that increase a student's understanding of his or her cognitive processes, resulting in greater self-directed learning opportunities (Baird, 1986; Butler, 2002; White, Bakopanos & Swan, 1988).
This, in turn, may empower students to be more confident, engaged learners. Figure 3 describes how promotion of metacognitive principles, application of strategic questioning in small groups, and incorporation of questioning strategies in whole class settings triangulate to improve students' metacognitive awareness. The flowchart in Figure 4 describes in greater detail potential pathways towards increased metacognitive awareness in whole class and small group settings. In contrast, the flowchart in Figure 5 describes connections that may exist in whole class and small group settings when PI pedagogy utilizing clicker technology is incorporated into the science classroom setting. Although the outcome of enhanced conceptual understanding shown in Figure 5 (stated as 'greater concept comprehension' in Figure 4) may be similar, the differences outlined in Figure 5 include: use of clicker technology; emphasis on greater time and intellectual investment by teachers; and greater comfort level and pedagogical ownership of available STEM-based materials utilized by teachers.

Figure 3: Triangulation components for increased metacognitive awareness

Figure 4: Metacognitive awareness flowchart

Figure 5: Peer Instruction (PI) pedagogy flowchart

The various dimensions of metacognition serve as a starting point for interpreting and understanding learning behaviors and how they manifest in individuals who learn together in a collaborative small group setting. Anderson and Nashon's (2007) operational definitions for five dimensions of metacognition are adopted here. These are as follows: (1) Constructivist Connectivity (CC): examining students' perceptions of whether they make connections between information and knowledge; (2) Awareness (AW): the learner's character of consciousness about learning; (3) Control (CO): self-regulation and conscious control over the learning process; (4) Self-Efficacy (SE): perceptions of the capacity to learn and use knowledge; and, considered as one category (MEP), (5) Monitoring (M): the ability to keep track of the learning process so things make sense; Evaluation (E): the ability to assess the fruitfulness of the adopted learning strategies; and Planning (P): conscious awareness and deliberate strategizing to manage learning. These five dimensions comprise a framework for how students enter and engage in learning situations (Table 1).
Table 1: Five dimensions of metacognition for the SEMLI_S instrument

CC (Constructivist Connectivity): explores students' perceptions of whether they make connections between information and knowledge.
AW (Awareness): the learner's character of consciousness about learning.
CO (Control): self-regulation and conscious control over the learning process.
SE (Self-Efficacy): perceptions of the capacity to learn and use knowledge.
MEP (Monitoring, Evaluation, Planning): Monitoring is the ability to keep track of the learning process so things make sense; Evaluation is the ability to assess the fruitfulness of the adopted learning strategies; Planning is conscious awareness and deliberate strategizing to manage learning.

(Source: Anderson & Nashon, 2007)

Studying individual cognitive behavior or activity, including metacognition, is difficult because, in many ways, it is "covert behavior" (Garafalo & Lester, 1985). Brown (1987) and Gunstone (1994) both make a valid point when they suggest that it might be difficult to identify (and thus study) metacognition unless the learner is given something to be metacognitive about. In earlier studies, learners were typically given problem-solving tasks by researchers hoping to observe and explicate learning behaviors (Barron, 2000). An awareness of one's own thinking is necessary (Paris & Winograd, 1990) for the individual to share with others what is largely an internal and private process. Also, individual motivation is important, so that the learner is willing to take on a problem-solving task (Baird, 1992). More broadly, Salomon et al. (1991) question whether cognitive skills can be generalized as they discuss the context-specific nature of many types of cognitive skills. Similar questions may be asked about metacognitive skills, particularly in a classroom context where students are asked to use cognitive and/or metacognitive skills while working in groups.

Problem-solving tasks are often assigned in science classrooms as students work in groups on laboratory activities, projects or other collaborative work (Akana & Yamauchi, 2004; Davier & Halpin, 2013; DeChurch & Mesmer-Magnus, 2010; Fischer et al., 2013; Hafner & Culp, 1996; Hmelo-Silver et al., 2013; Kirschner et al., 2011; Mintzes, Wandersee & Novak, 1998; Nickerson, 1988). Tasks that draw upon and value students' prior knowledge can help them build deeper understandings and connect these to a wider structure of understanding (Piaget, 1978). Within the middle school science classroom, teachers may provide rich opportunities for problem-solving contexts. Therefore, I posit that the middle school science classroom could provide a fruitful place to consider how students bring their own metacognitive skills and behaviors to the group learning process.

1.3.2 Social context

The relationship between the learner (student) and more experienced learning partners (parents and teachers) forms the center of the social dimension of a student's cognitive functioning. The relationship itself plays a fundamental role in that student's life, which is vital to his or her learning process (Chaiklin, 2003; Karpov, 2003; Lave & Wenger, 1991; Vygotsky, 1978). Through the process of working with more experienced and knowledgeable partners, the student grows and matures. Individual positioning, then, within the social community mediates and directs conceptions about how the world works (Gergen, 1994).
Additionally, student-to-student (or peer) support strategies such as cooperative learning situations can also improve student achievement levels (Carter, Cushing, et al., 2008). From this perspective, I view the social grouping, and not the individual, as a suitable unit of analysis for examining a social setting. The teacher has a mediating role in a science classroom setting (Driver, Asoko, Leach, Mortimer & Scott, 1994). The role of mediator is enacted through activities in the classroom where students are offered opportunities to verbalize ideas or share thoughts. Through this process, students can build new understandings. Consistent interaction with authentic information on a meaningful level is key (Ausubel, 1968), such as the crafting of meaningful multiple-choice PowerPoint review questions during clicker use. Careful construction of multiple-choice questions using scaffolding strategies can maximize learning (Beatty, 2009; Connor, Jakobsons, et al., 2009). As previously stated in this chapter, active engagement in postsecondary STEM classrooms through PI pedagogy and conceptual multiple-choice questions that target misconceptions has been effective utilizing clickers or flashcards (Milner-Bolotin, Fisher & MacDonald, 2013). Similarly, strategically created multiple-choice questions can help nurture active, intellectual engagement amongst students in the learning process (Beatty, 2009). Therefore, these strategies may also be effective in middle school science classrooms; hence this study with my own fifth grade classes.

Individual thinking and processing become visible as the individual comes to define himself or herself in relation to significant others (Mead, 1934). Peer interactions, interactions with parents or teachers, and classroom activities offer the possibility for this type of interactional dialogue. Dewey (1916, 1966) also offers advice on activities, noting that they "not only direct the natural tendencies of youth, but they involve intercourse, communication and cooperation…all extending the perception of connections" (p. 358). Learning, then, is the "acquisition of many specialized abilities for thinking about a variety of things" (Vygotsky, 1978, p. 53). Thinking and learning activities are actions that happen within an activity system. In order to effectively examine students' thinking and learning processes during classroom activities, it is important to recognize the essential components of metacognition and conceptual understanding amongst students in the context of self-efficacy and students' awareness of situational perception (Anderson, Nashon & Thomas, 2008). In this thesis, science activities are presented as a means to examine the group interactions within which student metacognition can be interpreted and understood. As mentioned above, this study was conducted with my own students (four classes) in a fifth grade (middle school) science classroom; this research took place at a private school in Southern California.

1.4 Researcher's position and significance of the study

After teaching both high school and middle school science for many years, I am intrigued by how students acquire, understand, and retain information.
Over the years, I have contemplated how students interpret concepts, how authentic understanding of science concepts occurs, how background experiences influence learning, how students recognize and improve metacognitive abilities through a carefully crafted science curriculum, and how they can engage more meaningfully in the learning process during interactive, teacher-guided lessons. With each curriculum goal and objective, I endeavor to create dynamic lessons that enhance learning by being aware of learning differences amongst students and by implementing strategies that help all learners reach their potential. Also, by upgrading my skills and seeking professional growth opportunities, I have increased my awareness of current technology options. Crafting effective clicker questions by incorporating scaffolding strategies to maximize learning is essential (Connor, 2009), as is nurturing active, intellectual engagement in the learning process (Vygotsky, 1978).

Additionally, taking a radical constructivist approach to curriculum design, von Glasersfeld (1989) explored the idea of focusing not solely on knowledge acquisition but also on supporting the "art of learning" (p. 192). von Glasersfeld emphasized that offering help to students, as opposed to instruction, is a more valuable learning experience, and that when students become intrinsically motivated and can "conceptually grasp a situation" (p. 188), learning improves. Of equal importance is an emphasis on PI pedagogy and an investigation of metacognition and conceptual understanding amongst students in a middle school science classroom (Flavell, 1971; Milner-Bolotin, Fisher & MacDonald, 2013).

With a passion for science education, and a desire to instil strong foundational science process skills and a love of learning within my students, I believe that through effective implementation of clicker-enhanced pedagogy, metacognition and conceptual understanding may be improved amongst students. Actively engaging students in the learning process through the enrichment of whole class and small group activities is key to capturing student interest. Enhancing metacognitive awareness and conceptual understanding amongst middle school students and striving to increase knowledge transfer by incorporating a metacognitive approach to instruction (Bransford, Brown, & Cocking, 2000) are also essential components of meaningful curriculum construction. Based on my experience, lessons designed with this in mind may allow students to take ownership of the learning process more readily and may improve their confidence as they approach science challenges and critical thinking exercises. Additionally, helping students articulate ideas and modify their questioning strategies within small groups, in an effort to motivate meaningful, interactive discussions (Beatty & Gerace, 2009), may be enhanced by PI. In a technological age, educators can also "speak their students' language" by incorporating clickers, iPads, cell phones, or other electronic devices that promote active student engagement into middle school science lessons.
The majority of American middle school students have been raised in a technological age, and although computers and software are already prevalent in classrooms, clicker technology is not typically utilized in science education beyond large-class university lecture settings. Stakeholders who will potentially benefit from my investigation of clicker use in the middle school science classroom are: students, teachers, parents, colleagues, department heads, administrators, engineers and design technicians for technology and clicker device companies, IT departments, school boards, and businesses that employ scientists. Key decision-makers such as administrators and technology supervisors need to be aware of available tools such as clickers that could potentially transform classroom learning; they must be convinced that these technological tools are useful and must be made aware of their potential. Parents need to support school decision-makers to maximize opportunities that will ultimately benefit their children.

1.5 Theoretical framework

Through the lens of constructivism as conceived by von Glasersfeld (1983), and motivated by an examination of active engagement pedagogy in the form of PI (Milner-Bolotin, Fisher & MacDonald, 2013), this study examined the effectiveness of incorporating clickers into middle school science classrooms. Espousing the integration of student-centered pedagogy and technology, this study is also framed by the TPACK (technology, pedagogy and content knowledge) model (Mishra & Koehler, 2006). Relationships between technology use and pedagogy were investigated using peer instruction in both whole class settings and small group (lab) settings. The TPACK model describes the interaction between teachers' foundational content knowledge, practices of teaching and learning as they relate to student knowledge construction, and a deep understanding of how technological tools can be incorporated effectively into the classroom. Understanding the complex interaction between TPACK components and employing this model as a framework for clicker research brought authenticity to this study.

According to Hewson and Hewson (1988), "learning is not an accumulation of bits of information," but a dynamic process that requires the active engagement of the learner (Cobb, 1994; Driver et al., 1994; Ernest, 1995; Posner et al., 1982; Vygotsky, 1978) within effective and systematized instruction (Ausubel, 1963). Therefore, effective pedagogy promotes the interaction of curriculum content, in context, with the learner; used with clickers, this approach reflects the philosophies within the TPACK framework (Koehler & Mishra, 2006). It also reinforces that, to achieve authentic pedagogical practices, technology such as CRS should be used in concert with meaningful teaching practices and not as a separate entity.

In this study, the TEFA (Technology-Enhanced Formative Assessment) model was also used to examine clicker-enhanced pedagogy (Beatty & Gerace, 2009). TEFA espouses four interlocking pedagogical principles: (1) question-driven instruction; (2) dialogical discourse; (3) formative assessment; and (4) meta-level communication, all connecting with and relating to PI pedagogy. According to these principles and student-centered pedagogies, educational objectives include: investigating how students evaluate and critically analyze problem-solving methods; allowing students to practice self-regulation skills and improve metacognition; and encouraging teachers to improve instructional efficiency by refining the communication process between teacher and student. This ultimately influences the learning process within the classroom.
The TEFA model helped create a framework for PI pedagogy and clicker question design and construction. The TEFA-based question design and assessment model led to more thoughtful student interactions and more authentic data collection, and served as a modified version of PI for this research study. As a result, the tenets of TEFA dovetailed suitably with the goals of this study.

Considering PI pedagogy along with the TPACK and TEFA models, this theoretical framework guided my investigation of metacognition and the learning process amongst 5th grade science students in whole class and small group settings by making connections between students' conceptual understanding of content with and without clicker technology, evaluating the learning process with and without clickers, and utilizing clicker technology to assess learning. Furthermore, the framework allowed an in-depth analysis of CRS use as it relates to learning awareness amongst students, self-regulation of the learning process, and the effectiveness of formative assessment. TPACK and TEFA provided the framework for the construction of meaningful clicker questions and an understanding of effective PI pedagogy during CRS use in both whole class and small group settings, and allowed for a deeper understanding of student metacognition and content knowledge.

Chapter 1 presented the problem and research questions and situated this study in terms of metacognitive awareness, PI pedagogy and the SEMLI_S measurement instrument. Chapter 2 provides a review of the relevant research and presents a critical review of a case study. Chapter 3 describes the methodology developed for this mixed methods Action Research (AR) study as well as the data collection procedures. In Chapter 4, research study findings are revealed, leading to a discussion of the findings in Chapter 5. Conclusions, limitations, implications and recommendations are presented in Chapter 6.

Chapter 2: Literature review

This chapter provides an in-depth look at Hannafin and Kim's (2010) case study, which describes a technology-based scaffolding exercise in a sixth grade science classroom. This study was chosen based on similarities of grade level, technology use, peer-to-peer and student-teacher interactions, as well as a focus on metacognition and inquiry. In Hannafin and Kim's (2010) case study, parallels were drawn between their classroom setting, technology use, interactive student-teacher dynamic, and topic (science), and my own classroom situation.

2.1 Beginning with a story (contrasting perspective):

Imagine for a moment that schools were represented as world famous museums such as the Prado in Spain, the Louvre in Paris, France, the Guggenheim in Bilbao, or New York's Metropolitan Museum of Art. As you stroll through the galleries and enter the modern art wing (middle school), locate a single, wildly ornate masterpiece (one particular class) and consider every speck of exploding color (individual students) in this magnificent collage. Take a breath, open your eyes wider and then open your mind. Try to appreciate for a moment the wonder of the piece and the significance of each deliberate, masterful brushstroke. Consider the artwork's importance; the depth, thoughtfulness, meaning and technique in every shape, color and texture. Colors morph, adapt and evolve with the changing light, offering new meaning from every distance and from each varied angle.
Contemplate the unexpected movement within this piece and consider: Is this an expression of life experiences? Does this piece push boundaries? Is there structure within this piece or just chaos? However you view this magnificent work of fictitious modern art, you are only limited by your imagination. Welcome to the remarkable, creative world of middle school students.

The paragraph above illustrates how I view my middle school science students. Themes within the story make up part of the multifaceted perspective I use when considering each fifth grade child. Students enter the lab every day expecting to be enlightened by my instruction, by my passion for science, and by my enthusiasm for sharing information. They often comment about the world around them and are seated with eyes glued to me for direction, ready to absorb some nugget of new scientific knowledge. I see each one as a unique individual; each one with a different level of ability; each one with a varying degree of tenacity and drive; each one a different hue in a modern art painting. Separately, each student represents a distinct tone on an artist's palette, but as a whole they blend and merge on canvas to create an extraordinary group collage. Although I began this chapter with artistic imagery, the research I conducted is a more science-focused study involving CRS and metacognition in whole class and small group settings. To me, there is artistry in case study research and in a qualitative approach to examining connections between clicker technology and metacognition. This is where science and art fuse together and become the lens through which I view my classes and curricula; a lens that allows me to study and interpret the nuances of each science student I teach.

2.2 Situating the exemplar

To determine how technological tools such as clickers may influence metacognition in whole class and small group settings, it is important to consider the focus of my inquiry and to determine if there are valid parallels between Hannafin and Kim's (2010) work and the research I intended to conduct. In Hannafin and Kim's (2010) case study of problem-solving in 6th grade technology-enhanced classrooms, computers were utilized to determine best practices towards metacognition and inquiry-based pedagogy. In my study, I examined how clicker technology might influence metacognition in whole class and small group settings. Within both settings, it is important to recognize the impact of effective, student-focused question construction; intrinsic to productive clicker use are strategically designed, teacher-created multiple-choice questions, which require a clear curricular intent. Teachers can use clicker questions to introduce students to new concepts, to check for immediate understanding of current concepts, to review curriculum material over time, or to assess the effectiveness of a particular lesson.

Due to the functional properties of the clicker remote (i.e., A-E letter selection), instructor-designed multiple-choice PowerPoint questions accommodate the operation of this device. PowerPoint (or other) questions can be effectively crafted by: incorporating scaffolding strategies into their design to maximize learning (Connor, 2009); nurturing in learners an active, intellectual engagement in the learning process (Vygotsky, 1978); and taking a constructivist approach to curriculum design (von Glasersfeld, 1989).
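To make the mechanics of such a review question concrete, the minimal sketch below shows, in Python, how a CRS might aggregate A-E keypresses for one instructor-designed multiple-choice question. The question text, answer options, and response data are hypothetical placeholders, not items or data from this study or from any particular clicker vendor's software.

```python
from collections import Counter

# A hypothetical clicker review question (illustrative only; not from this study).
question = "Which state of matter has a definite volume but no definite shape?"
options = {"A": "solid", "B": "liquid", "C": "gas", "D": "plasma", "E": "not sure"}
correct = "B"

# Simulated A-E keypresses collected from a class of clicker remotes.
responses = ["B", "B", "A", "C", "B", "E", "B", "B", "C", "B", "A", "B"]

# Tally responses and print a simple text histogram, marking the correct option.
tally = Counter(responses)
for letter in sorted(options):
    count = tally.get(letter, 0)
    marker = " <- correct" if letter == correct else ""
    print(f"{letter} ({options[letter]}): {'#' * count} {count}{marker}")

percent_correct = 100 * tally[correct] / len(responses)
print(f"Percent correct: {percent_correct:.0f}%")
```

In PI pedagogy, a response histogram of this kind is typically displayed to the class to prompt peer discussion before students vote a second time.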
The article I chose to review, Hannafin and Kim's (2010) "Scaffolding 6th graders' problem solving in technology-enhanced science classrooms: a qualitative case study," resonates with me because this research study: has a real-world focus (Yin, 2014); describes a contemporary phenomenon (p. 16); has an inquiry-based problem-solving perspective; uses scaffolds to examine student metacognition; embeds inductive research strategies (Merriam, 1998) in its design; uses fuzzy generalizations (Merriam, 1998) to indicate that a result may have happened; and reflects the authors' fundamental roots in constructivism. Other, more obvious reasons worth mentioning include the similar grade level of the article's case study: since I teach fifth grade students, I relate to the use of technology in this study and to its focus on determining how students might overcome metacognitive roadblocks as they utilize technological tools to learn new science concepts.

As Hannafin and Kim's (2010) study was a qualitative case study design, the authors were also concerned with the essence of a phenomenon (Merriam, 1998; Yin, 2014). They grounded their data in a real-world context (Merriam, 1998; Yin, 2014) and utilized an embedded case study (rooted in a larger case study) method (Yin, 2014). The authors investigated how peer, teacher, and technology-based scaffolds influenced students' inquiry processes. Using an interactive online program called WISE (Web-based Inquiry Science Environments) in two 6th grade science classes, Hannafin and Kim (2010) explored students' problem-solving strategies as they implemented, discovered, reconstructed and analyzed scientific problems concerning wolves in a real-world scenario. The WISE program complements the scaffolding framework by asking students to connect prior knowledge to the tasks, giving verbal prompts, and allowing students to receive scaffolding assistance from their peers, teachers and the program itself. As indicated, although specific curricular concepts differ, Hannafin and Kim's (2010) article dovetails with my research mainly because both studies focus on student inquiry in a technology-based setting within the middle school science classroom. However, my study examined the impact of PI in middle school science, not web-based science inquiry. The current study also explored PI-enhanced pedagogy utilizing clicker technology and its influence on metacognition, not its influence on scaffolding.

2.3 What is scaffolding?

To describe the relationship between teacher and student within group settings, Vygotsky (1978) wrote that there is a cultural and social component to learning, which he described as the Zone of Proximal Development, or ZPD. Considering the tenets of the scaffolding process such as student-centered learning, small group collaboration and peer interaction, connections can be made between scaffolding techniques and ZPD. In scaffolding, teacher-student interaction is frequent as new concepts are introduced; however, in subsequent stages teacher-student interaction is less frequent and teachers "encourage interpretation through metacognition and guided practice" (Vacca, 2008, p. 653). Ultimately, students work both independently and within small groups to process and extrapolate ideas. With less dependence on teacher support, the scaffolding process allows students to take more responsibility for their learning and to problem-solve with less teacher intervention.
Self-evaluation is also a key component of scaffolding lesson designs, and teachers often implement rubrics as a template to guide students through the scaffolding process. Hannafin and Kim (2010) utilize technology-enhanced scaffolds to optimize conceptual understanding and facilitate learning in a science-based unit on wolves. Advantages of scaffolding exercises may include: instruction differentiation, greater collaboration between students in small groups, stronger engagement in the learning process, less teacher-driven instruction, independence amongst learners, greater momentum towards more effective learning, and increased responsibility amongst students. In their study, Hannafin and Kim (2010) examined how scaffolds influence student inquiry during an investigation of wolf environments, feeding habits, and controversies surrounding wolf populations. Within this unit, students examined and determined management options, built opinions, and discussed and revised ideas. In the exemplar, Hannafin and Kim (2010) investigated how a 6th grade science teacher utilized peer, teacher and technology-based scaffolding to provide a framework for students to construct ideas and overcome challenges during scientific, inquiry-based problem-solving activities.

2.4 Technology parallels/background information

Considering my own research path, the technology theme is similar to Hannafin and Kim's (2010) case study. It differs, however, in the following ways: technology-enhanced scaffolds were not used; there was no web-based program; and the lesson construction and curriculum design differed from my research. Similarities between my study and Hannafin and Kim's included attention to metacognition, promotion of critical thinking opportunities, and examination of the authenticity of student learning situations. I also concur with the authors' stance that promoting active engagement as opposed to passive instruction through whole class and small group activities is key to captivating student interest. Additionally, enhancing metacognition amongst middle school students and striving to increase knowledge transfer by incorporating a metacognitive approach to instruction (Bransford, Brown, & Cocking, 2000) are essential components of meaningful curriculum construction. Based on my experience, lessons designed with this in mind may allow students to take ownership of the learning process more readily and may improve their confidence as they approach science challenges and critical thinking exercises. Additionally, helping students articulate ideas and modify their questioning strategies within small groups in an effort to motivate meaningful, interactive discussions (Beatty & Gerace, 2009) may be enhanced by CRS such as clickers.

By comparison, Hannafin and Kim (2010) build a case for the positive attributes of scaffolding by elaborating on pattern and inquiry use by students, teachers, and technology. During problem-solving phases of student work, for example, the exemplar authors observed that peers validated scaffold use by checking and confirming answers (during an exploration phase) and by offering feedback on the contents of a peer's online brochure (reflection/negotiation phase). In a second example, the authors described successful use of teacher scaffolds throughout class demonstrations as teachers helped students become familiar with a task and online tool (clarification), and as the teacher motivated students during challenging exercises.
Thirdly, the authors described the successful use of technology-based scaffolds as metacognitive, processing, modeling and visualization tools. Although scaffolding success varied depending on student job descriptions within the study (inquirers, students who reason and rationalize, negotiators, trial-and-error students and right-answer students), there was some degree of success at each level. In a technological age, educators can "speak their students' language" by incorporating technology into middle school science lessons. Middle school students today have been raised in a technological age and, although computers and software are already prevalent in most classrooms, clicker technology is not typically utilized in science education beyond large class university lecture settings (Milner-Bolotin et al., 2013).

2.5 From holistic perspective to context

Looking at the Hannafin and Kim (2010) case study holistically, I am satisfied that the authors successfully met several criteria, including data collection, triangulation, and emphasis on naturalistic generalizations (applying case study representations to personal situations) (Stake, 1995), and that they employed appropriate strategies to ensure trustworthiness (valid, reliable qualitative research) in this study. Multiple data samples were used, for example, which included individual interviews, participants' electronic notes from the software program, observational field notes, review of online student brochures, surveys, and videotapes of group activities. Strong teacher involvement in the project preparation was evident, discussion frameworks between teacher and student were explained, and execution of the technology-based lessons was clear. Hannafin and Kim (2010) also offered detailed accounts of the methodology involved in inscribing key pieces of data, which included notations made on hard copies of transcribed interviews related to specific evidence, notations concerning student reflections on relevant prior experiences, and notations about reading challenges students might have faced during the technology-based activities. The researchers also offered a thorough discussion of both peer and teacher scaffolding patterns.

Again, as I review Hannafin and Kim (2010) within the holistic perspective, critiques of context, situationality (Stake, 1995), and voice arise. While the authors provide clarity to the context by describing (generally) a 6th grade science classroom participating in a six-day study in a third semester class, this information is not provided in a timely manner, as it is offered only after a lengthy introduction, a theoretical framework overview, and a research design section. The study context is not discussed until much later in the article under 'teacher scaffolds.' Although some journal articles mirror this format, I found that articles that laid a contextual framework almost immediately allowed me as a reader to situate myself more quickly and comfortably. As a result of what I perceived as a situational delay or a sense of positional limbo, unanswered contextual questions arose about classroom configuration and the immediate student environment, such as: How many computers are available to students? How big are the groups? Did groups rotate during the research process? If so, how often did groups rotate? What does the physical setting of the room look like
(i.e., lab benches, tables, individual seats)? What was the timeframe for each exercise? And what types of interruptions occurred, such as administrative announcements or hallway noise? These details are important in terms of what the students' environment "looks like". Without the article explicitly addressing these issues, the reader must use his or her imagination to create this space, to fill in details and to situate students in the learning environment. Therefore, I find that this format weakens the reliability of the study somewhat and reduces transferability opportunities for the reader.

According to Stake (1995), situationality deserves to be recognized in case studies. He shares that meaning should be drawn largely from a case's unique circumstance and that "given intense interaction of the researcher with persons in the field and elsewhere, given constructivist orientation to knowledge, given the attention to particular intentionality and sense of self, however descriptive the report, the researcher ultimately comes to offer a personal view" (p. 42). In terms of context, this foundation should be laid early on in the article so the reader has a clear reference of time, space and place; closer attention to this issue would bring clarity to the case study. Situationality takes context to the next level by offering "thick descriptions" (Geertz, 1973, as stated in Stake, 1995), which are not "complexities objectively described," but "the particular perceptions of the actors" (p. 42). According to Stake, this increases subjectivity and offers the "experiential understanding and multiple realities" necessary in the case study process (p. 43). The Hannafin and Kim (2010) case study offers a basic contextual framework for the class sizes, ethnic diversity amongst students (percentages) and gender ratios, but I found that the rich or "thick" descriptions about physical space and the classroom dynamic that provide sophistication and depth to the contextual framework were lacking.

2.6 Whose voice?

By voice, I mean the students' perspective, with attention to what they may be saying, doing and thinking as they interact and engage in the technology-based activity in Hannafin and Kim's (2010) study. As students navigated through the WISE program, they utilized metacognitive tools (used an index, retrieved notes, achieved activity goals) and communicated online with peers by participating in chat room discussions. Students also conducted online research related to their assigned topic (wolves) and designed a computer-generated brochure based on their findings. During this time period, researchers made observations, took notes, reviewed online student journals, and conducted three interviews per student (a pre-study interview, a mid-study interview, and a post-study interview). As I contemplated the study's frequent peer-to-peer interactions, I considered the following questions: What is behind the (student's) comment? What is not being said? What instruction are students receiving to learn how to formulate valid questions? How comfortable do students feel asking questions (of each other, in front of the class, or of the teacher)? In the case study, student quotes and cursory statements are shared. However, greater attention could be paid to the discrete meaning of student interactions as it relates to classroom activities and technology use, as well as to the resolution of challenges students grapple with as they proceed through activities.
For example, some students cited challenges in completing the technology-related tasks in a timely manner, some had little interest in the subject matter, and others "did not engage deeply in inquiry related to the problems under study" (Hannafin & Kim, 2010, p. 275). To deepen and enrich this portion of the qualitative investigation, the researchers could have posed questions during post-interviews such as: What motivated you to respond in that particular manner? What specific component of the task gave you the most difficulty? How do you think adjustments could be made in this exercise or to the online program to change the learning experience for you? And how would you describe your interaction with peers during the activity? Without asking questions such as these, Hannafin and Kim (2010) limit their insight into student learning and miss an opportunity to probe further into students' inquiry patterns, a goal expressed by the authors at the beginning of the article.

This omission leaves me wondering why, when the opportunity presented itself, the researchers did not consider more effective questioning strategies with the intention of revealing more authentic student metacognitive processes. Why was greater attention not paid to the subtleties of student dialogue? Whose voice are they listening to? Whose voice did they not hear? In this exemplar, the 'etic' perspective (usually the researcher's perspective) is well defined, resulting in less emphasis on the more important insider, or 'emic,' view (an insider's viewpoint or a cultural perspective) (Merriam, 1998; Stake, 1995). Hannafin and Kim (2010) stated initially that their intention was, among other listed goals, to investigate "students' metacognitive challenges, including task understanding and planning, monitoring and regulation, and reflection" (p. 256). Had the authors implemented more carefully designed questioning tactics, I posit that their research results would have been more meaningful, rich, and deep. Since part of Hannafin and Kim's (2010) research focus was based on student perspectives regarding technology-based scaffolding, their questioning strategies deserved more than just superficial treatment.

2.7 Triangulation & validity

In the case study, triangulation (a three-part process utilized to demonstrate interactions and relationships between components) is evident in at least two thematic areas. Utilizing this strategy, Hannafin and Kim (2010) effectively describe the axial merge between multiple data sources. One triangulation example specifically identifies documents, direct observations, interviews and surveys as multiple sources for data collection. In a second example, peer-, teacher- and technology-enhanced scaffolds are explained; in this instance, the authors expand on how scientific inquiry is facilitated during problem-solving activities. Underlying both examples is the authors' foundational commitment to the study's purpose: to examine how scaffolds influence student inquiry in problem-based 6th grade science classes. The authors have successfully considered a variety of triangulation options under various themes and take a meaningful approach to this process. Their use of triangulation as a qualitative strategy is meaningful in this study, as it increases its reliability and adds to the robustness of their research.

Hannafin and Kim's (2010) article raises questions, however, concerning validity.
Yin's (2014) definition of external validity seems to apply to this case study, since implications for future consideration are stated clearly in the article's concluding paragraphs. These generalizations include how inquiry can be supported, the nature and quality of group interaction, how core inquiry goals align with scaffolding, and the effectiveness of mutual student-teacher goals. Generalizations within the article also align with Stake's (1995) view of validity, where he explains that the focus should be on meaning rather than location. Although external validity receives sufficient attention in the case study because the researchers make generalizations from their study (see above), specific tactics for the study's internal validity are weak. Yin explains that internal validity describes "how and why event X led to event Y" (p. 47), and that inferences occur frequently in case study research. Yin adds that once inferences are made, researchers must investigate whether they are correct and whether or not rival explanations exist. In Hannafin and Kim (2010), although the authors successfully investigate how peer, teacher and technology-enhanced scaffolds influence students' inquiry, the why component of this equation is not evident. As a result, I find that the authors are not asking the right questions regarding inferences that emerge from their research. Although this case study's treatment of internal validity is weak, its attention to external validity is relevant and meaningful; therefore, apart from this component, it is a valid study.

In contrast, a different internal validity marker, pattern-matching logic, is evident in Hannafin and Kim's (2010) study. In Yin (2014), pattern matching is described as "logic (that) compares an empirically based pattern - that is, one based on the findings from your case study - with a predicted one made before you collected your data" (p. 143). In this case study, the researchers found that students implemented distinct inquiry patterns when conducting problem-solving exercises and that technology-based scaffolds were beneficial to learning. The researchers also determined that through scaffold use, students were more likely to confirm answers, confront conflicts, and manage cognitive responsibilities. This reveals a direct connection to the study's purpose of examining how scaffolds influence students' inquiry in problem-based contexts and adds credibility to their research methods. The case study's strategies also fit with Patton's (2002, as stated in Yin, 2014) "quasi-experimental" (where the experimental design includes nonrandomized groups) pattern-matching methods due to the nature of its qualitative focus.

Another essential marker of internal validity, according to Yin (2014), is to test the plausibility of rival explanations. He states that "the more rivals that your analysis addresses and rejects, the more confidence you can place in your findings" (p. 142). In Hannafin and Kim's (2010) case study, however, the authors offer only one rival explanation for their inquiry-based case. It concerns a three-year systemic study of Detroit (Michigan, USA) public schools involving inquiry-based technology without scaffolding strategies. Here, technology-supported curricula resulted in higher post-test scores amongst students. For the reader, this raises questions about the validity of technology-based scaffolds in inquiry-based learning and the necessity of them in problem-solving exercises.
Hannafin and Kim's case study offers no explanation of how this rival Detroit study was conducted: the authors do not mention the grade level of the students involved, and they give only cursory attention to the reason for that study's success (its timeframe). Although little detail is provided, I consider this to be a legitimate rival hypothesis.

2.8 Discourse: How much is too much?

According to Grbich (2013), social groupings, culture and constructs are essential components of critical discourse analysis (CDA). With this in mind, it is critical to conduct and review qualitative case studies with sensitivity to discourse and discursive practices. As interviews are conducted and text is compiled, condensed, and analyzed, discussions between researchers need to take place and questions need to be asked, such as: What are the social dynamics of the group (students, teachers, administrators, parents)? How can power struggles between groups be identified and considered during research studies? Are there hierarchies between individuals or across groups? Are there varying degrees of literacy amongst research participants? And what role do participants' past life experiences play in their approach to research studies?

From an 'emic' perspective (Merriam, 1998; Stake, 1995), dialogue must also occur to clarify how the reader can be situated within a study. The reader needs to feel present and part of the action; 'why' questions need to be asked regarding discourse to bring authenticity to every aspect of a research study. The researcher must become the detective and investigate, probe and decipher material, including written language and conversations. It may be possible to overdo discourse analysis, but Hannafin and Kim's (2010) article is an example of what constitutes not enough. CDA needs to be embraced and should be an integral part of all qualitative case studies. Hence, my study seeks to illuminate the how and why related to the middle school science classroom dynamic and small group interactions.

2.9 Seeking best practices utilizing CRS

As stated earlier, whether it is in a whole class or small group (lab) setting within a middle school science classroom, each student approaches learning situations differently and, as each student waits for a lesson to begin, he or she is impacted differently by the information presented. Although CRS are frequently used in university lectures, this type of technology is relatively new to North American students younger than the postsecondary level (Milner-Bolotin et al., 2013). Clickers play a role in engaging students in the concept review process by increasing students' level of engagement during lessons (Fike, Fike & Lucio, 2012). However, simply using clickers to answer PowerPoint review questions does not mean that students develop better metacognitive skills. According to Joseph (2006), educators need to concentrate less on content and more on teaching critical thinking skills. Through coaching, encouragement, and self-awareness, Joseph explains, reflexive (metacognitive) thinking allows students to become more independent, confident learners and to maximize their learning experiences (p. 100). According to Joseph (2006), students are hindered by low metacognitive awareness and lack sophisticated thinking and learning skills.
Contrasted with self-regulated learners, Joseph adds that unsophisticated, less successful learners "fail to establish and maintain an effective focus, becoming frustrated and confused by unproductive learning strategies" (p. 33). Another educational expert, Bassey (1993), mirrors this sentiment by explaining that students have not been taught the practical thinking skills that would enable them to gain more from a classroom lesson, therefore preventing them from becoming effective learners. In Chapter 7 of his book Case Study Research in Educational Settings, Bassey (1993) explains that educational research should stimulate professional discourse and that this, in turn, should generate rich conversations about meaningful educational topics. In addition, Bassey (1993) shares his ideas on "fuzzy generalizations" (p. 52), which leave the application of research conclusions open to interpretation. This vulnerability may not sit well with the entrenched strategies of staunch positivists (where the researcher controls the research process and is external to the research site); however, it is a necessary part of the qualitative research growth process.

Insecurity, nervousness, and lack of confidence often affect learning, and many students do not have the skills or tools to maximize their learning experience. For example, at the beginning of a lab activity, when students have been given a thorough explanation of experimental goals and procedures, have received details regarding a lab's purpose, and all necessary paperwork and materials have been distributed to lab benches, some students may still not comprehend the exercise and will ask questions such as: What materials are we going to use? How do I work the equipment? Do we have to collect data? Where do we record data? What are we doing? Why are we doing this? How does this relate to what we're studying? Or, students may simply comment, "I don't get this at all". Even when students are given opportunities to ask questions prior to the task during whole class settings or within small groups at lab stations, these questions may continue to arise. These kinds of questions reflect a lack of understanding, the absence of metacognitive skills, and the inability to process information effectively.

Individual circumstances that may influence learning situations include ability, level of attention, life experiences, personal distractions, atmosphere, environment, subject interest, and mood or energy level. For students to understand, comprehend, and process information, they need to take charge of their learning and realize what kind of learners they are. With improved metacognitive abilities developed in whole class and small group settings, students may be empowered to learn more effectively and may become more confident, independent students. As educators, we also need to speak the student language of technology and embrace its use in the classroom. Clicker use has a place in the classroom by increasing student engagement during pre-test reviews (Fike, Fike & Lucio, 2012). This, along with strategies to improve student metacognition, may lead to improved learning in the middle school science classroom. The relevant research has shed very little light on how clicker use can reflect metacognitive changes amongst students in the middle school science classroom (Milner-Bolotin et al., 2013). Mazur (2009) conducted extensive research regarding clicker use in college undergraduate classes with more than 200 participants.
His study showed promising results regarding student interaction and engagement. The following questions guided my study's methods and analyses: How can student interactions within small groups improve metacognition amongst individuals? Can use of clicker technology reflect changes in metacognitive levels amongst students? And why do certain questioning strategies work better than others within the small group dynamic? With that in mind, the focus will be not only on the 'how' but also the 'why' in terms of contemporary events within the case study methodology (Yin, 2014).

Based on my experiences as a researcher and educator, I find that technological tools such as clickers must be useful, relevant, and easy to use for both students and teachers, and they must be incorporated thoughtfully into the curriculum as opposed to being added for the sake of using technology. Clickers should also add an element of fun to lessons. Additionally, some students work diligently at staying out of the limelight; they enjoy anonymity, avoid participating during discussions, and 'tune out' during class. Other students are reluctant to participate in group discussions due to lack of confidence or shyness, or are embarrassed to take risks and possibly get an answer wrong. Clickers may empower these students to take on challenges more readily while reducing the anxiety associated with whole group participation. Students may also want to, as opposed to have to, participate in lessons, and may find their voice during homework reviews, concept reinforcement and whole class conversations. Within a small group dynamic, clicker technology may allow students to gain confidence, learn to collaborate more effectively, take risks more frequently, and improve metacognition and conceptual understanding.

Teaching does not occur in a vacuum, and the traditional 'sage on the stage' and related non-interactive methods do not meet the complex needs of middle school learners. Knowledge acquisition and the skills connected to understanding science form an interactive, experiential, and cognitive process. During engaging classroom lessons, educators may want to know what students are doing in any given moment and how they are accomplishing their assigned tasks. Educators may also seek to determine what students are thinking and how each student's thinking may translate into a better understanding of how they learn and how they can monitor their own learning, using clicker technology to enhance understanding. This way, broader questions can be answered, such as: How can I design or re-design curricular components to enhance this process?

2.10 Attention to metacognition

Piaget wrote that cognitive development (acquiring knowledge and understanding through thought and experience) should be viewed "as a continuum involving the interaction of four influences: maturation, active experience, social interaction, and a general progression of equilibrium" (Piaget, 1971, as stated in Ewing, Foster & Whittington, 2011), and that teachers could influence active experiences in the classroom by offering opportunities for students to explore and discover. Metacognition concerns "thinking about one's own thoughts" (Hacker, Dunlosky & Graesser, 1998) and involves "what one knows (i.e., metacognitive knowledge), what one is currently doing (i.e., metacognitive skill), or what one's current cognitive or affective state is (i.e., metacognitive experience)" (p. 3).
Flavell takes this idea one step further by describing metacognition as a "fuzzy concept" (Flavell, 1981, as stated in Hacker, Dunlosky & Graesser, 1998) that needs clarification. He explains that intentional, deliberate awareness of cognitive actions allows individuals to improve their metacognitive abilities. Flavell stated that intelligence and memory development are connected and fuse to form "metamemory" (Flavell, 1971, as stated in Hacker, Dunlosky & Graesser, 1998), and added that metamemory is a crucial component in monitoring and acquiring metacognitive skills. The constructivist perspective is also evident in Flavell's ideas about the connection between metacognitive knowledge and the impact of real-life experiences on learning; he believed that life experiences and personal actions are incorporated into metacognitive functions (p. 5). Foundational constructivist ideas are also reflected when students interact socially, and when they reflect and collaborate during problem-solving activities (p. 71).

In an effort to examine the inquiry process amongst eighth grade science students, Rand (2005) used various methods to 'enculturate' students into examining each step of a lab activity. He created strategic thinking questions to guide students as they performed experiments, held open-forum discussions about the scientific problem-solving process, and modeled how each part of the experiment should be approached. By explicitly sharing lists of questions designed to foster student understanding about the scientific method process, Rand (2005) was able to create a classroom culture where individuals were encouraged to 'think about what you are thinking' (p. 61). Along with group collaboration and peer editing, students were more likely to discuss their thought processes with one another, ask questions, and justify statements. I suggest that the following questions be asked: Can clickers influence metacognition amongst students in whole class settings? Can clickers influence metacognition amongst students in small group settings? If so, how can lessons be strategically designed to influence metacognition in whole class and small group settings? And how can clicker technology be utilized to influence metacognition in whole class and small group settings? I drew upon these questions to inform my study.

2.11 Metacognitive coding strategies

One particular research source (Whitebread et al., 2009) describes a coding instrument called the 'CHILD 3-5 checklist' (pp. 74-75), in which a numbered system was used to assess learner metacognition, self-regulation, and independence during student observations. In the study, the coding scheme involved the following categories: metacognitive knowledge; metacognitive regulation; and emotional and motivational regulation. Whitebread et al. also described their use of "verbal and non-verbal indicators" (p. 69). Although the study was conducted with very young children, it might be useful to examine it (and others like it) for ideas on coding strategies and methods for observing verbal and non-verbal characteristics within small groups.

2.12 Interviews, video and/or audio-taping

In the case study reviewed in this chapter, Hannafin and Kim (2010) described their interview process, in which individuals (19 students and one classroom teacher) were interviewed for 15-20 minutes during a pre- and post-activity exercise. Advanced planning, according to Stake (1995), is essential to gathering valid qualitative research data.
Towards this goal, the author suggests compiling a "short list of issue oriented questions" (p. 65), "trying out the questions in pilot form" (p. 65), and "keeping the record of the interview" (p. 66). Avoiding a strict interview structure and maintaining a more fluid interview process (Stake, 1995; Yin, 2014) allows the interviewer to evoke authentic responses and helps raise the comfort level of the interviewee throughout the process. Additionally, teachers can seek alternate opinions from colleagues regarding the validity of interview questions and can modify questions based on that input to improve questioning strategies and reduce bias. Interviews can take place during breaks between classes, before or after school, or at the convenience of both student and teacher. In an effort to address biases during interviews (or other data collection methods), instructors need to be sensitive to the information being gathered (Merriam, 1998), remain reflective and sceptical throughout the data collection process (Stake, 1995), and maintain consistently high ethical standards (Yin, 2014). Seeking the advice and opinions of at least two or three colleagues (Yin, 2014) throughout this process is also critical. Educators need to implement these strategies to reduce research bias; in the context of interviews, bias refers to the tendency of an interviewer's expectations, phrasing, or reactions to shape participants' responses and the interpretation of their answers.

In their case study, Hannafin and Kim (2010) addressed issues of trustworthiness in a thorough and comprehensive manner; however, the attention paid to researcher bias was cursory. Apart from stating that "the researcher's assumptions and potential biases were identified prior to data collection," and that "multiple sources of data were triangulated," specifics regarding strategies employed to avoid bias were absent. Similarly, Hannafin and Kim (2010) neglected to elaborate on how they approached issues surrounding sensitivity during student or teacher observations and interviews. Including more detail about how the authors minimized bias would help strengthen the reliability of their study and would eliminate any scepticism concerning this issue. Contrasted with interviewing adults, interviews with children should take place in a setting that is relaxed and comfortable for the child, and where "chitchat could describe the spirit of an interview" (p. 90) during an interactive dialogue between adult and child (Engel, 1995; Karlssonin, 2004, as stated in Kyronlampi-Kylmanen & Maatta, 2011).

Standardized interviews (in-depth interviews using the same questions in the same order), from Van den Hoonaard's (2012) perspective, pose questions in a "manner that is neutral and consistent across multiple interviews" but fail to reach the respondent's deeper perspective. In an effort to delve further into the interview design process and overcome these problems, the author elaborates on in-depth interview strategies. Offering open-ended, conversational questions (Van den Hoonaard, 2012; Yin, 2014), actively listening during the interview process, and providing opportunities to reveal the "inner views of the respondent" (p. 80) are tactics more likely to provide rich, meaningful data. In research situations, small groups (i.e., lab teams) could provide valuable insight regarding how students might think about their own thinking. Three to four students could be chosen by a "purposeful process" (decisions made by the instructor with a specific strategy in mind) (Denscombe, 2002, p. 146), or intentionally selected, based on their particular learning styles.
Although specific details regarding which exact learning styles might best suit this study were still to be determined at the design stage, students could be chosen partly based on the following criteria: whether he or she is a visual, auditory or kinaesthetic learner; level of enthusiasm; level of responsibility; and/or leadership traits. Random sampling (an equal chance of selection for all participants), seen more often in quantitative research methods, may not be suitable in situations where a small number of individuals are involved.

Although audiotaping was a data collection method in Hannafin and Kim's (2010) case study, videotaping was not. Excerpts from interviews were provided, and an analysis of both teacher and student interviews was offered. However, videotaping may have offered a window into a more thorough understanding of student perspectives. In particular, an examination of student gestures and facial expressions captured by videotaping may have provided greater insight into the nuances of specific student responses and allowed for a deeper appreciation of individual, spontaneous student reactions during scaffolding exercises. By including videotaping as a data collection strategy, Hannafin and Kim (2010) could have provided further insight into visual student responses related to scaffolding exercises, thereby enhancing their research study.

In the exemplar, artefacts included online student journal entries, answers to assignment questions, and computer records of student interactions during WISE program activities. These samples were analyzed for evidence of scaffolding strategies amongst individuals and within peer groups. Data were then documented in tables within the exemplar, which highlighted specific scaffolding patterns. Although instructors may not have access to the WISE computer software program used in Hannafin and Kim's (2010) study, or may not conduct a wolf-based study, clickers could be utilized to examine the scaffolding process. For example, clickers could be used at the beginning of the school year to complement an investigation of the scientific method. While students learn the scientific method steps, they could work in small groups to construct a foot-long rubber band car as part of a physics lab. Throughout this hands-on activity, students could discuss ideas, complete handouts, follow procedural lab steps, and collect and analyze data. Students could also take a pre- and post-activity quiz or could be asked questions using clicker technology. Lab materials such as the rubber band car could be a helpful artefact during interviews, since the artefact might take attention away from the task at hand (answering questions), calm nerves, and allow students to elaborate on ideas more easily. This kind of interview is referred to as a task-based interview strategy and is particularly effective when interviewing children. In task-based interview strategies, the interviewee interacts not only with the interviewer, but also with a carefully designed task environment, which may include manipulatives such as toys, blocks, models or puzzles (Goldin, 2000, as stated in Maher & Sigley, 2014).

Equally important in the audio or videotaping process is the set-up and use of this equipment during research studies. Is it a distraction to interviewees? Are students surprised that the equipment is being used? Creating a comfortable setting for students is essential when gathering authentic data.
In order to accomplish this, instructors should maintain transparency by: making students aware of the audio (iPhone) or video equipment prior to the interview; giving students an opportunity to view interview questions ahead of time; and giving students a chance to become accustomed to the equipment by setting it up in the lab in advance of the study (i.e., positioned off to the side of the lab). Additionally, students should receive interview questions in advance (10 minutes or so) so they can consider their answers prior to the introduction of any recording equipment (as mentioned earlier). Although not suggested in the educational literature, my decision to distribute interview questions in advance was solely to compensate for student nervousness while being recorded during the interview process. My belief was that allowing students to contemplate some of the questions more easily (by giving them the question handout in advance) would increase fairness related to this portion of the study and would elicit more authentic answers to interview questions. Open-ended questions should be asked so that a conversational tone is created during the interview process. This may relax students and promote more authentic responses during actual data collection.

In an effort to analyze coded field notes from small group and whole class observations, it is necessary to study the data and look for noticeable "patterns, insights, or concepts that seem promising" (Yin, 2014, p. 135). Yin's (2014) idea of categorizing data in a matrix, creating data flowcharts, or examining evidence for the frequency of events seems like a logical, manageable system. Analysis of audio and videotaped sessions may include coding software such as NVivo for data interpretation and management; I investigated and tested this software prior to using it for this study. In an effort to avoid becoming overwhelmed, data analysis should occur "in conjunction with data collection" (Merriam, 1998, p. 177). Being aware that there is a partnership between data collection and data analysis may help prevent frustration and anxiety throughout the organic process of case study research.
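To illustrate Yin's (2014) matrix idea in concrete terms, the minimal sketch below shows one way coded observation events could be tallied into a frequency matrix. It is an illustration under stated assumptions only: the codes and observation records are hypothetical placeholders, not the coding scheme or data from this study, and in practice this kind of bookkeeping was handled within NVivo rather than by hand-written code.

```python
from collections import defaultdict

# Hypothetical coded events from field notes: (setting, code) pairs.
# These codes loosely echo the study's areas of interest; they are
# illustrative, not the actual coding scheme used in this research.
observations = [
    ("small group", "asks clarifying question"),
    ("small group", "explains reasoning to peer"),
    ("whole class", "asks clarifying question"),
    ("small group", "monitors own progress"),
    ("whole class", "explains reasoning to peer"),
    ("small group", "asks clarifying question"),
]

# Build a frequency matrix: rows are codes, columns are classroom settings.
matrix = defaultdict(lambda: defaultdict(int))
for setting, code in observations:
    matrix[code][setting] += 1

settings = ["small group", "whole class"]
print(f"{'code':32}" + "".join(f"{s:>14}" for s in settings))
for code, counts in sorted(matrix.items()):
    print(f"{code:32}" + "".join(f"{counts.get(s, 0):>14}" for s in settings))
```

A matrix of this kind makes it easy to see, at a glance, whether particular coded behaviors cluster in small group versus whole class settings, which is the kind of pattern-spotting Yin describes.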
2.13 Rival propositions

According to Yin (2014), examination of rival explanations enhances the robustness of a research study. By considering alternate views of propositions, researchers can build confidence in their findings and increase the validity of the study. In this way, data will be more reliable and the research study will be more authentic. Examples of rival explanations for my study could include: clicker technology does not influence student metacognition in whole class or small group settings; clicker PowerPoint questions are not an effective barometer of changes in student metacognition in whole class or small group settings; and small group interactions do not influence changes in metacognition during lab activities. Once a research study has been completed, instructors can determine whether these rival propositions can be rejected or whether they need to be addressed. As stated earlier in this chapter, the exemplar gives only one rival explanation. Other alternative views for Hannafin and Kim (2010) to consider could include: technology-based scaffolding does not influence problem-solving abilities amongst 6th grade science students; scaffolding does not influence peer support in technology-based, inquiry-focused classrooms; and inquiry patterns between individuals and groups cannot be distinguished through the use of scaffolds during problem-solving exercises.

2.14 Summary

Qualitative researchers focus not on cause but on interpretive, holistic, naturalistic perspectives (Stake, 1995). Case studies are also empirical (field oriented, noninterventionist), emic in nature, and empathic (tending to participant intentionality) (pp. 47-48). Some research studies fit within the parameters of qualitative methodology; however, quantitative data (such as clicker technology data) may also need to be recognized and incorporated in some cases. Merriam (1998) explains that case study research can be a combination of quantitative and qualitative, and that case studies can be labeled holistic, life-like, grounded, and exploratory. She adds that the uniqueness of a case study lies more in the questions asked and the relationship with the end product: "How and why questions guide case study research" (p. 59), and "more art than rules is involved in (qualitative) research" (p. 57).

After careful review of the case study examined in this chapter, I find that Hannafin and Kim (2010) sufficiently answer questions about 'how' scaffolds influence inquiry-based problem-solving within 6th grade science classrooms and, although they pose several suggestions for future consideration at the end of their research study, they do not adequately address 'why.' In his section on questioning strategies, Yin (2014) explains that arriving at answers is not as important as the questions you generate in case study research (p. 74). Yin's positivist perspective emerges here because of the similar strategy used in science, where new hypotheses are generated from experimental results. Reflecting on the museum tour I described at the beginning of this chapter, art as a central theme is revealed once again. There is art in qualitative and quantitative research, which includes appreciating the artistry within the context of a middle school science research study, bringing the importance of the student to the forefront of the research process (seeking the student 'voice'), and providing a rich, robust study which incorporates the reader into the research process. Similarly, there is intrinsic beauty in fusing art and science in inquiry-based learning and research methodology.

The parallels between Hannafin and Kim's (2010) case study and my research topic and middle school environment guided my research path. In addition, my study follows case study parameters by addressing the tenets of case study protocol and its real-life context (Yin, 2014), reflecting my interest in how students function in daily classroom pursuits, and demonstrating a willingness to put aside preconceived ideas during this study (Stake, 1995). Additionally, the qualitative component of my study was characterized by a quest for meaning and understanding, with myself as the primary data collector and data analyzer (Merriam, 1998). The following chapter describes the methodology used in this study.

Chapter 3: Methodology

Chapter 3 describes the research methods used in this study as they relate to the research questions.
This chapter also explains the grading system used for the small group handout and offers detailed rubrics for the SEMLI_S subscales and the LOTS, MOTS and HOTS Bloom's taxonomy scales. Included here is also a description of the research participants and the school/classroom environment, preparation methods prior to commencement of data collection, as well as data collection strategies. Additionally, AR is utilized as a component of my methodology that actively involves my students and me in classroom-based, problem-solving research offering a "powerful tool for improving the quality of teaching and learning within a school community" (Tillotson, 2000). AR is the best fit for this study since it incorporates important components of the research process such as generating questions about a problem, collecting and analyzing data, potentially modifying current teaching practices, and creating new strategies to study educational problems (pp. 31-32).

There is much research suggesting that classroom response systems such as clickers improve metacognition and conceptual understanding amongst students in university settings (e.g., Boyle & Nicol, 2003; Jones et al., 2012; MacArthur, 2008; Mazur, 2009). However, there is a lack of research investigating clicker use in middle schools (Milner-Bolotin et al., 2013). With the current nationwide (U.S.) emphasis on STEM-based learning (www.nextgenscience.org/), an examination of clicker technology reflects current trends in science education. Clicker-enhanced pedagogy in the context of STEM education has the potential to improve metacognition. Flavell (1976) described metacognition as the action of the learner while in the process of solving problems. Metacognition, then, includes learning behaviors and skills that enable the learner to hold his or her own cognitive processes in mind as an object of thinking, resulting in attributes and dispositions for self-directed learning (Baird, 1986; Butler, 2002; White, Bakopanos & Swan, 1988). Given the emphasis on STEM education, this research attempts to explore clicker technology and metacognition amongst middle school science students and addresses an important gap in the literature.

As previously mentioned, the current study was conducted to investigate the following research questions through traditional teaching methods and the use of clicker technology, while investigating levels of metacognition using Bloom's taxonomy categories (Bloom, Krathwohl, et al., 1956) amongst middle school science students:

1. How does the use of clicker-enhanced pedagogy affect student metacognition and conceptual understanding in the context of small (lab group) interactions and whole group middle school science classes?

2. How do SEMLI_S dimensions and Bloom's taxonomy results compare during clicker-enhanced pedagogy and traditional teaching methods?

To investigate the use of clickers and how this affects student metacognition in the science classroom (Research Question 1), students were given background information on metacognition and Bloom's taxonomy categories, and class discussions allowed students to grapple with ideas about how they could think about their own thinking. Then, as students participated in small group chemistry activities, they were asked to individually complete handouts addressing the following three questions (Lab Group Work handout questions #1-3, Figure 6) regarding how the concepts learned might help them: construct knowledge; think at a higher level; and transfer knowledge to a new situation.
A sample (completed) student Lab Group Work handout is included below in Figure 7. During this type of small group work, students were encouraged to collaborate, reflect on their learning, communicate, and be actively engaged during lab activities. Students practiced these skills over an 18-week time period in science class prior to commencing this study by completing an introductory version (Figure 8) of the lab group work handout (Figure 6), so lab group members could become accustomed to listing questions and comments during lab collaborations. Students were also familiar with small group strategies such as listening to others, respecting team members, and knowing his or her particular lab group responsibility. These strategies were practiced in class throughout the 18-week timeframe prior to beginning this research study. Additionally, students were familiar with clicker use from science lessons during this 18-week time period.

Figure 6: Lab group work handout

LAB GROUP WORK
5th Grade Science - Mrs. D

Instructions: Consider the task at hand carefully & work with your team to find solutions to the problem presented. This exercise is also meant to improve the way you 'think about thinking' (= metacognition). As you communicate and collaborate with your group members, record your questions and comments below.

Topic: ________________________________________________________________

1. Constructing knowledge: What do you already know about this concept?

2. Thinking at a higher level: How do I explain what I think about this concept?

3. Transferring knowledge: What skills can I transfer to a new situation?

QUESTIONS I ASKED:

COMMENTS I MADE:

Figure 7: Lab group work student sample

Figure 8: Lab question/comment handout

LAB QUESTION/COMMENT HANDOUT
5th Grade Science - Mrs. D

Name:_______________________ TG:_____ Date:_____________________

Instructions: Fill in the following 'Question' & 'Comment' boxes as you work through the activity at your lab bench. Write 3-4 bullet points in each section below.

QUESTIONS I ASKED          COMMENTS I MADE

Two additional handout questions shown in Figure 6 (although they are not numbered, for the sake of simplicity I will call them Questions #4 & 5) asked students to fill in spaces under "questions I asked" and "comments I made". Samples were collected from research participants, and answers to each handout question were analyzed. In order to explore SEMLI_S and Bloom's taxonomy during clicker use and traditional teaching methods, two out of four classes reviewed for quizzes using clicker technology and two out of four classes reviewed using traditional teaching methods. Once the study was complete, participants anonymously completed a follow-up clicker questionnaire. A teacher colleague administered this questionnaire independently, and I was not in the classroom during this time. Five students voluntarily participated in video interviews at the end of this research study. These students gave verbal answers to written clicker and metacognition questions, and their responses were transcribed and included in this research study. Together with quantitative data, interview data informed my findings by revealing students' conscious understanding of metacognition and Bloom's taxonomy levels. It also supported qualitative data showing a high comfort level with technology and increased engagement in the learning process utilizing technology.
These data helped support my conclusions by revealing consistent use of metacognitive skills and understanding of Bloom's taxonomy levels throughout the data collection period in this study (Appendix C). This mixed methods research incorporated both quantitative and qualitative data gathered at three intervals from February 2014 through mid-May 2014; however, clicker PowerPoint lessons related to this study began in late January 2014. A score of one or two was assigned for Lab Group Work handout questions #1-3 (Figure 6), reflecting the criteria in the following SEMLI_S rubric (Table 2).

Table 2: SEMLI_S metacognition rubric for small group work handout questions #1-3

For every category below, a Level of Evidence of 2 was assigned when the student demonstrated one or more of the listed criteria, and a Level of Evidence of 1 when none of the criteria were demonstrated.

SEMLI_S subscale criteria:

CC (Constructivist Connectivity): connecting information to previous knowledge; connecting what the student learns to out-of-class activities; making connections between science and other subjects/disciplines. (SEMLI-S: CC1, 3, 4, 6)

MEP (Monitoring, Evaluation, Planning): assessing a necessary learning task; evaluating his/her learning process; understanding the aim or goal of a task; predicting possible learning problems related to a concept or task. (SEMLI-S: MEP4, 7, 8, 9)

SE (Self-Efficacy): mastering the skills being taught; confidence to do a good job on assignments and tests; confidence in understanding complex material; confidence in understanding basic course concepts. (SEMLI-S: SE2, 3, 5, 6)

AW (Learning Risks Awareness): awareness of learning challenges; of losing track of a learning task; of not understanding an idea; of not concentrating. (SEMLI-S: AW1, 2, 3, 5)

CO (Control of Concentration): adjusting his/her level of concentration depending on the learning situation. (SEMLI-S: CO1)

Bloom's taxonomy criteria, applied within each of the five SEMLI_S subscales above (evidence with respect to the concept, task or skill taught):

Knowledge: information recall; stating a science observation; listing related science information; defining a science term related to the assigned task/skill.

Comprehension: understanding; explaining; summarizing.

Application: using and applying knowledge; identifying connections between ideas; problem solving; designing; experimenting.

Analysis: identifying and analyzing patterns and relationships; organizing ideas; recognizing trends.

Synthesis: inferring; predicting; modifying; inventing; composing.

Evaluation: comparing ideas; evaluating outcomes; solving; judging.

Within the SEMLI_S rubric, a numerical designation of one was assigned if there was little or no evidence of the criteria in the following categories: Constructivist Connectivity (CC); Monitoring, Evaluation, Planning (MEP); Self-Efficacy (SE); Learning Risks Awareness (AW); and Control of Concentration (CO). A numerical designation of two was assigned if one or more of the established criteria were demonstrated. Embedded within these SEMLI_S categories, evidence of Bloom's taxonomy categories received a designation of one if no criteria were present, and a designation of two if one or more criteria were present. For example, in question #1 on "constructing knowledge" for Set 1 (designing a useful data table for an Alka-Seltzer lab), a value of one was assigned if the student wrote an unrelated comment or a confusing answer, such as "The tablet took longer to dissolve than the crushed tablet" or "It smelled like Sprite". A value of two was assigned for comments such as: "observations and trials are necessary (in data table construction)"; "specific units are necessary for all numbers"; or "different forms (of data tables) may be needed depending (on your) hypothesis and what data you need to collect". A designation of one was also assigned if questions were not answered.
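For concreteness, the tallying implied by this rubric can be sketched in a few lines of code. This is a minimal illustration, not the scoring tool actually used in the study: the `designate` helper and the example evidence coding are hypothetical assumptions, and in practice each response was hand-coded against the criteria in Table 2.

```python
# A minimal sketch, not the study's actual scoring procedure: mapping
# hand-coded evidence flags to the two-level designations of Table 2.
SUBSCALES = ("CC", "MEP", "SE", "AW", "CO")
BLOOM = ("Knowledge", "Comprehension", "Application",
         "Analysis", "Synthesis", "Evaluation")

def designate(evidence):
    """Return the rubric's 1/2 designation for each coded category.

    evidence maps (subscale, bloom_category) -> bool, True when one or
    more rubric criteria were demonstrated in the student's answer.
    """
    return {key: (2 if shown else 1) for key, shown in evidence.items()}

# Hypothetical coding of one student's Question #1 answer for the
# Alka-Seltzer data-table lab:
coded = {
    ("CC", "Knowledge"): True,       # e.g., connected units/trials to prior labs
    ("CC", "Comprehension"): False,  # no explanation or summary offered
}
print(designate(coded))
# {('CC', 'Knowledge'): 2, ('CC', 'Comprehension'): 1}
```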
To investigate how SEMLI_S and Bloom's taxonomy results compare between clicker use and traditional teaching methods (Research Question 2), Small Group Work handout questions #4 & 5 (Figure 6) were assigned scores of one through three, reflecting the criteria in the following Bloom's taxonomy rubric (Table 3).

Table 3: Bloom's taxonomy rubric for small group work handout questions #4 & 5

Scores: 3 = HOTS (Higher Order Thinking Skills), strong evidence of one or more traits satisfied for the category; 2 = MOTS (Middle Order Thinking Skills), adequate evidence of one or more traits satisfied for the category; 1 = LOTS (Lower Order Thinking Skills), limited or no evidence of trait development for the category.

Knowledge
- 3 (HOTS): strong evidence of information recall; accurate term definitions; stating facts accurately; making accurate observations
- 2 (MOTS): adequate recall of information; adequate term definitions; adequate factual statements; accurate observations made
- 1 (LOTS): limited or no recall of information; weak or missing term definitions; weak factual statements; observations inaccurate, not made, or unrelated to the task

Comprehension
- 3 (HOTS): strong understanding of concepts and/or skills; solid explanation, confirmation of ideas, and/or summary of the concept
- 2 (MOTS): adequate understanding of the concept; adequate explanation, confirmation of ideas, and/or summary of the concept
- 1 (LOTS): limited or no understanding evident, or no comment/question listed; weak or no explanation; weak or no confirmation of ideas; summary of the concept insufficient or not given

Application
- 3 (HOTS): strong evidence of knowledge application and/or problem solving; of designing and/or experimenting; of identifying connections and relationships between ideas
- 2 (MOTS): adequate knowledge application and/or problem solving; adequate designing and/or experimenting; adequate identification of connections and relationships between ideas
- 1 (LOTS): limited or no evidence of application and/or problem solving; of designing and/or experimenting; of identifying connections and relationships between ideas

Analysis
- 3 (HOTS): strong evidence of identifying and analyzing patterns; proficient organization of ideas; proficient recognition of trends
- 2 (MOTS): adequate ability to identify and analyze patterns; adequate organization of ideas; adequate ability to recognize trends
- 1 (LOTS): limited or no evidence of pattern identification and analysis; limited or no organization of ideas; limited or no ability to recognize trends

Synthesis
- 3 (HOTS): strong ability to use old ideas to create new ones; to design and invent; to infer, modify and predict; to combine ideas
- 2 (MOTS): adequate ability to use old ideas to create new ones; to design and invent; to infer, modify and predict; to combine ideas
- 1 (LOTS): limited or no ability to use old ideas to create new ones; to design and invent; to infer, modify and predict; to combine ideas

Evaluation
- 3 (HOTS): strong ability to assess theories; strong evidence of judging the outcome of a situation; strong evidence of idea comparisons
- 2 (MOTS): adequate ability to assess theories; adequate evidence of judging the outcome of a situation; adequate evidence of idea comparisons
- 1 (LOTS): limited or no ability to assess theories; to judge the outcome of a situation; to compare ideas

For the Bloom's taxonomy rubric, a numerical value of one was assigned for lower order thinking skills (LOTS), two for middle order thinking skills (MOTS), and three for higher order thinking skills (HOTS). These three designated thinking skill levels reflect the six Bloom's taxonomy categories, with a designation of one (LOTS) representing little or no evidence of traits within a particular category, two (MOTS) representing adequate evidence of one or more traits, and three (HOTS) representing strong evidence of one or more traits satisfied.

For example, in question #4 for Set 1 (designing a useful data table for an Alka-Seltzer lab), a value of one was assigned if the student wrote an unrelated or confusing question, such as "How long does the observation paragraph have to be?" or "What would happen if you didn't clean a cup fully?" A value of two was assigned for questions such as: "Can you make data tables with one subject in them?" or "How were data tables used long ago?" A value of three was assigned for questions such as: "Why do different experiments require different style data tables?" or "How do you simplify a data table with so many complicated things in it?" A designation of one was also assigned if question #4 was not answered. SEMLI_S and Bloom's taxonomy results were compared based on the rubric criteria.

3.1 Participants

This study was conducted between January 2014 and May 2014 at a private school in Southern California (USA), under an approved ethics protocol (protocol #: H13-00587), to answer the research questions. The school is a co-educational N-12 independent school situated in an urban setting, with a student population of approximately 1,150. In terms of ethnicity, 77.57% of students are White, 9.47% Hispanic, 10.46% Asian/Pacific, and 2.50% African American. Class size in the middle school is approximately 20 students per class. Classes meet three times a week for 45 minutes, with a 90-minute block class occurring once a week; in total, students receive 225 minutes (3 x 45 + 90) of science instruction per week.
The middle school consists of grades five through eight, and teachers at the fifth and sixth grade levels teach four academic classes per day, with some teaching additional elective classes. Of the four fifth grade classes participating in this study, two were even in gender distribution. In the other two classes, however, there were three to five more boys than girls: one of these classes had twelve boys and seven girls, and the other consisted of eleven boys and eight girls. Abilities varied between students, and each class demonstrated an approximately equal mix of academic abilities.

Clicker technology was used over the 2013-2014 school year during whole class instruction. Frequency of clicker use was one to two times per week throughout the entire study, which took place over a five-month period beginning in late January 2014. The rationale for this frequency of clicker use was: 1) to allow students a reasonable amount of time to become accustomed to clicker technology; and 2) to allow enough clicker sessions to accommodate any unpredictable technical difficulties (including misuse of clickers by students) or software glitches.

All science classes are predominantly student-centered, and learning is inquiry-based: attention is paid to the knowledge and beliefs students bring to an assignment or activity, this knowledge is pivotal for formulating new concepts, and a student's changing ideas are tracked by the teacher throughout the learning process (Bransford, Brown & Cocking, 2002). A focus on inquiry-based learning is maintained throughout this study. Hands-on activities are frequent, students are required to complete projects (e.g., inventions), and discussions are often student-led. As I have several years of teaching experience, my classroom is well organized, and students adhere to high behavioral standards. Due to safety concerns in a lab situation, there are strict rules of conduct, and overall students maintain high levels of responsibility during labs, written exercises, and the lesson itself. Expectations are high for both students and teacher. The learning curve is steep in the college prep atmosphere of the private school where this research was conducted, and most students are hard working and motivated.

As a professional educator with several years of experience, I approached data collection and analysis in as unbiased and fair a manner as possible. As their teacher, I was aware that bias might exist and, to the best of my ability, I acted and conducted this study in a fair and unbiased manner. All participants were informed about the nature of the study, and all aspects of their participation were voluntary and confidential. My students understand that fifth grade is a foundational year for science process skills and that they should be able to apply what they learn in this class to their sixth, seventh and eighth grade science classes. As a seasoned educator, I view every student as a unique individual. Each student has something remarkable to contribute, and each one has his or her own potential. At the beginning of the year, students quickly become accustomed to a routine that includes how to handle and use lab equipment and where to locate supplies. They realize the importance of knowing their lab job. A nurturing atmosphere has been established in the classroom, and students are able to have fun while they explore, discover, and investigate during curricular activities.
Along with a generally high level of enthusiasm, abundant energy, and a positive attitude towards learning, all fifth grade science students brought a variety of experiences and insights to this study. The effort by participants enriched the research data and enhanced lessons, discussions, lab work, questionnaires, and post-study video interviews.

Although students were asked to complete group work handouts on five occasions for this research study, sets of data from three intervals were analyzed and reviewed. These three intervals took place February 19, April 11, and May 5, 2014, and reflected three different chemistry topics: creation of an Alka-Seltzer lab data table, balancing chemical equations, and analyzing data in a chemistry of chocolate lab activity. All 75 students consented to participate in this study and, after fielding some parent questions regarding my research, all parent consent forms were signed. Alphabetical letters were assigned to students in each of the four classes, and names stored in the clicker software were concealed. All students' names were removed from handouts. A calendar of clicker use by class, lesson topics, curriculum details, and instructional methods (Appendix A) was kept for the duration of this study.

3.2 Student preparation: POOP & QUEST

My students were well versed in classroom routines by the time this study began. Prior to the initiation of this study, they were proficient at discussion protocol during science lessons and understood how to divide lab jobs amongst team members during experimentation. Since research for this study began in late January 2014, students were given several classroom opportunities from late August 2013 to January 2014 to develop their small group collaboration skills. For example, a system called 'POOP in a group' (an acronym for People Of Outstanding Procedures) was implemented to engage students more effectively and increase participation in active learning situations during small group lab activities. Lab procedures in this case include experimental activities where students work in small groups of 3-4 and have specific jobs such as investigator, equipment monitor, and recorder or technician (responsible for classroom computer use). Students keep their particular job for short lab activities and rotate through the job descriptions for longer labs, as sketched below. 'POOP in a group' was meant to be humorous while, at the same time, focusing students' attention on meaningful collaborative strategies. While participating in lab activities, students were required to complete a two-question worksheet (Figure 8) where team members practiced documenting questions they asked and comments they made during lab procedures prior to research data collection.

As students followed pre-set lab procedures and conducted experiments, I circulated throughout the classroom and visited lab stations frequently. If students struggled with ideas on worksheet questions or comments, examples from previous lab activities were given, and they were offered encouragement and support. Lab goals were consistently reinforced, and students who veered off-task were reminded of their specific lab job and group responsibility. One or two students who still struggled with this task were given one-on-one help and were supervised more closely until they felt more comfortable with the task. By participating in this exercise, students became accustomed to sharing ideas, asking questions, and taking risks in a small group setting prior to formal data collection.
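The job rotation just described is essentially a round-robin. A minimal sketch follows; the rotation order and the code itself are illustrative assumptions, since the study does not specify how jobs were cycled in class.

```python
# A minimal sketch, assuming a simple round-robin; the study does not
# specify the actual rotation order used in class.
from collections import deque

JOBS = ["investigator", "equipment monitor", "recorder", "technician"]

def assignments(members, rounds):
    """Yield one job assignment per round for a group of 3-4 students."""
    jobs = deque(JOBS[: len(members)])
    for _ in range(rounds):
        yield dict(zip(members, jobs))
        jobs.rotate(1)  # each member moves on to the next job

for rnd, who_does_what in enumerate(assignments(["A", "B", "C", "D"], 2), start=1):
    print(rnd, who_does_what)
```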
Several weeks prior to data collection, students also became familiar with a simpler way of learning the scientific method, called QUEST (Figure 9). The purpose of QUEST was to introduce an easier, less cumbersome way of sharing important scientific method features while, at the same time, reinforcing the importance of: establishing a big question; communicating with lab partners and working well as a team; understanding individual lab jobs; being familiar with lab equipment for a specific task; being precise during data collection; and reflecting carefully on scientific results. QUEST was also meant to reduce anxiety related to the formality of the scientific method, which might at times seem intimidating or complex to young middle school students. Class discussions involved review of QUEST terms, and reinforcement of QUEST components occurred prior to each lab activity. A new lab activity commenced only after students felt comfortable with the QUEST concept.

Figure 9: QUEST handout

QUEST - Simplifying the Scientific Method
5th Grade Science - Mrs. D

Name: ____________  TG: ____  Date: ____________

Introduction: In science, we're constantly on a QUEST to seek information and gain knowledge about the world around us. Regardless of the topic being covered, learning about concepts and practicing science process skills is at the core of every investigation. QUEST can help guide you towards achieving these goals by offering an easy acronym to remember and by outlining a simpler version of the scientific method steps. Keep QUEST in mind as you complete your investigation.

Q = Question. This refers to 2 things: 1) What is the big question you're trying to answer by conducting the experiment? 2) What is your experimental hypothesis (= educated prediction)? Do research and consult partners first with any questions that come to mind throughout the activity.

U = Understand. Understand what's being asked of you prior to beginning the experiment. Make sure you have all necessary materials and read the lab procedure thoroughly before beginning. You'll get confused quickly if you don't know what you're doing, why you're doing it, or how you should proceed. Understand your lab job and rotate through lab responsibilities as instructed by your teacher.

E = Experiment. Conduct the experiment. Collect and record data as you work through your investigation. Use all lab equipment safely!

S = Summarize. Reflect on and analyze your data. Were the results predictable or unexpected? Were patterns apparent? What about inconsistencies in the data? What conclusions can you make? Ask more questions, consult your group (see below), discuss the data, and reflect on your hypothesis. Write up a lab summary if necessary or answer the required questions on the lab handout.

T = Teamwork! Collaborate with group members every step of the way as you experiment. Be respectful, polite, and work cooperatively throughout the investigation. Know your lab job, share ideas, and be a good listener. Present your team's results to the class.

For several weeks prior to data collection, Socratic questioning strategies (Appendix D) were also introduced. During whole class discussions, concept reviews, and small group work, students practiced using simple Socratic questions. These questioning strategies encouraged students to probe assumptions, seek concept clarification, establish perspectives, and investigate implications. Sample questions included: Why did you say that?
(questions about clarification); How can you prove that assumption? (questions that probe assumptions); What caused __________ to happen, and why? (questions that probe reasons and evidence); What would be an alternative to this situation? (questions that probe viewpoints and perspectives); How does X affect Y? (questions that probe implications and consequences); and What was the point of asking this question? (questions about questions). Questions appropriate for young middle school students were drawn from the Taxonomy of Socratic Questions list (Appendix D) and were modified to suit a particular lab activity or discussion topic.

The project was described to all participants, and definitions of the following terms were reviewed: metacognition, knowledge, comprehension, application, analysis, synthesis, and evaluation. Students were given a copy of the Bloom's taxonomy categories in the typical "pyramid" format (Figure 10); however, all references to the title "Bloom's Taxonomy" were removed. To eliminate confusion over terminology and to reduce the number of terms students were introduced to throughout the study, the Bloom's taxonomy handouts were referred to as "metacognition pyramids." This reinforced the new terminology and reduced confusion over what students might perceive as difficult or complex language. It also allowed students to see metacognition as a common thread, not exclusive to one particular activity, but woven throughout class discussions, handouts, and small group work.

Figure 10: Bloom's taxonomy [pyramid diagram; Google Image, 2014; not reproduced here]

3.3 Data collection

This mixed methods research incorporated both quantitative and qualitative data that were gathered at three intervals from late January 2014 through May 2014: Set 1: February 19 & 20, 2014; Set 2: April 11, 2014; and Set 3: May 12, 2014. The following information describes the instruments and methods (quantitative and qualitative) used to gather data for this research.

To investigate elements of students' metacognition in science learning contexts, the Self-Efficacy and Metacognition Learning Inventory - Science (SEMLI-S) instrument was used to provide background on how students enter and engage in learning situations (Anderson, Nashon & Thomas, 2008). The instrument initially consisted of 35 parts, with five subscales and 30 specific items; the five subscale categories were maintained, but, due to relevance and suitability for the age group studied, the SEMLI-S items were reduced from 30 to 17. The five subscales, each of which reflects a dimension of students' self-perceived metacognitive science learning orientation (Anderson, Nashon & Thomas, 2008), are: Constructivist Connectivity (CC): perceptions of the capacity to construct connections between information and knowledge across various science learning locations and with students' background knowledge; Monitoring, Evaluation & Planning (MEP): the ability to keep track of the learning process so things make sense, the ability to assess the fruitfulness of the adopted learning strategies, and the conscious awareness and deliberate strategizing to manage learning; Self-Efficacy (SE): perceptions of the capacity to learn and use knowledge; Learning Risks Awareness (AW); and Control of Concentration (CO): self-regulation and conscious control over the learning process. Table 4 (below) outlines the 17 SEMLI-S items, by subscale, used for this study.
Table 4: SEMLI_S subscales and items chosen for study

Constructivist Connectivity (CC):
CC4  I seek to connect the information in science class with what I already know
CC1  I seek to connect what I learn from what happens in the science classroom with out of class science activities (e.g. field trips or science visits)
CC3  I seek to connect what I learn in my life outside of class with science class
CC6  I seek to connect what I learn in other subject areas with science class

Monitoring, Evaluation & Planning (MEP):
MEP9  I try to predict possible problems that might occur with my learning
MEP7  I evaluate my learning processes with the aim of improving them
MEP8  I try to understand clearly the aim of a task before I begin it
MEP4  I consider whether or not a plan is necessary for a learning task before I begin that task

Self-Efficacy (SE):
SE6  I am confident of understanding the basic concepts taught in this course
SE2  I know I can master the skills being taught in this course
SE3  I am confident I can do a good job on the assignments and tests in this science class
SE5  I am confident of understanding the most complex material presented by the teacher in this course

Learning Risks Awareness (AW):
AW1  I am aware of when I am about to have a learning challenge
AW5  I am aware of when I am not concentrating
AW3  I am aware of when I don't understand an idea
AW2  I am aware of when I am about to lose track of a learning task

Control of Concentration (CO):
CO1  I adjust my level of concentration depending on the learning situation

Bloom's taxonomy categories (Figure 10) were used to interpret metacognition levels. During discussions with research participants, additional examples were given to further reinforce each category; this also helped students understand the category terminology. For example, the foundational knowledge level was described during lessons not only as "recall of information" but as "information gathering". Questions were posed to students regarding how they might gather information before beginning a lab activity, and examples were shared regarding how students could incorporate prior experiences into foundational knowledge.

Similar conversations shed light on comprehension, for example: "Generally, how do you confirm, explain, understand or demonstrate in science? Give examples." To enhance understanding of the application level, students were asked: "What problem solving methods did you use in your last experiment? Describe these methods." Or, "How did you apply science knowledge to a new situation here?" For analysis, students were asked to break ideas down to determine how one factor might affect another. They were also asked to analyze relationships, for example: how does habitat affect wild animal populations? To reinforce the synthesis category, students were asked to hypothesize prior to investigations, and examples of creative thinking in science were shared; class discussions took place regarding the accomplishments of famous creative thinkers familiar to the majority of students, including Stephen Hawking, Albert Einstein, Benjamin Franklin, and Charles Darwin.

The evaluation category proved difficult for students to understand, and they did not engage readily in conversations about this term. In an attempt to simplify the definition of evaluation, however, students were asked to engage in a discussion that involved comparing ideas about their most recent science lab.
To guide students through small group discussions during lab activities, students were given a file folder containing Bloom's taxonomy for reference (Figure 10), along with enough copies of the Lab Group Work handout for each group member (Figure 6).

3.3.1 Treatment of lab group work handouts

As stated earlier, handouts were created for lab group work (small groups of 3-4 students), and individuals within lab teams completed the handouts anonymously (Figure 6). Students were asked to record the lesson topic and were instructed to carefully consider the task at hand and work as a team to find solutions to the science problem presented. This exercise was also meant to improve the way students "think about thinking" (metacognition) and to encourage them to communicate, collaborate, and record answers to the handout questions. During small group activities, students conducted investigations, reflected on concepts learned, collaborated throughout exercises, and communicated experimental results. The handout consisted of five questions divided into two parts: the first part included Questions #1-3 (Figure 6), which were considered the SEMLI_S questions; the second part consisted of Questions #4 & #5, which were considered the Bloom's taxonomy section. A completed sample student handout is shown in Figure 11 below.

Figure 11: Lab group work handout, completed student sample [scanned student work; image not reproduced here]

Bullet points made by students under each category for the particular concept taught were analyzed based on SEMLI-S subscales CC, MEP, SE, AW & CO. Under each SEMLI-S subscale, a value of '1' was given for no evidence of satisfying a particular Bloom's taxonomy category, and a value of '2' was given for evidence of satisfying a particular Bloom's taxonomy category. If multiple bullet points were offered for Question #1, 2 or 3, then these multiple ideas were considered together and interpreted either as a '1' or a '2.' On the Excel spreadsheet sample (Appendix E), shaded values indicated clicker use for those particular classes; non-shaded values represented non-clicker use. Questions #4 & 5 were given values of '1' for no evidence of a specific Bloom's taxonomy category, '2' for some evidence of a Bloom's taxonomy category, and '3' for strong evidence of a specific Bloom's taxonomy category. The rubrics in Tables 2 & 3 describe the distribution of numerical values for the SEMLI-S and Bloom's taxonomy categories.

3.3.2 Treatment of clickers

Students became familiar with clickers during November and December 2013, prior to data collection. They enthusiastically approached activities involving clickers, and all students either were already, or quickly became, proficient in the use of this technology. Clicker use centered on concept review sessions and included five to ten PowerPoint questions at a time. Seven PowerPoint sets were created for this study throughout a chemistry unit (see Appendix B). Table 5 below reflects the Clicker Technology (CT) and Non-Clicker (NC) review schedules for each class during the data collection intervals. Individual scores for PowerPoint sessions were recorded in the clicker software. Each class received a designation of A, B, C or D to maintain student anonymity, and these designations were maintained for each class for the duration of the study. Over a five-month period, all classes received the same curricular information; however, two classes at a time received traditional lesson reviews prior to quizzes while the other two classes participated in clicker review sessions prior to quizzes.
Table 5: Clicker Technology (CT) - Non-Clicker (NC) review schedule

Data Interval | Class          | Topic                               | Type of review
Set 1         | Classes 5A, 5B | Chemistry: data table construction  | Clicker technology (CT) = PowerPoint #1 & #2
Set 1         | Classes 5C, 5D | Chemistry: data table construction  | Non-clicker (NC) = traditional
Set 2         | Classes 5C, 5D | Chemistry: balancing equations      | Clicker technology (CT) = PowerPoint #3 - #6
Set 2         | Classes 5A, 5B | Chemistry: balancing equations      | Non-clicker (NC) = traditional
Set 3         | Classes 5A, 5B | Chemistry: theobromine lab          | Clicker technology (CT) = PowerPoint #6 & #7
Set 3         | Classes 5C, 5D | Chemistry: theobromine lab          | Non-clicker (NC) = traditional

Questions were included in all PowerPoint presentations to determine student confidence levels regarding the review material. When the majority of the class answered a PowerPoint question incorrectly, students were sometimes asked to have side conversations connected with that particular topic; the same PowerPoint question was then asked again. Traditional teaching methods used in my science classes rely on teacher-guided instruction combined with whole class lessons and small group interactions. In whole class settings, students might participate in content-driven discussions and take turns answering review questions. Students may also answer textbook questions on chapter material, complete content-specific handouts, read aloud pertinent chapter information, and review previously completed handouts currently in students' binders. Students typically volunteer in class by raising their hands to make comments or ask questions throughout traditional teaching sessions.

Chapter 3 presented the methodology developed for this study as well as the data collection procedures and participants. In addition to providing detailed SEMLI_S and Bloom's taxonomy rubrics, Chapter 3 also elaborated on the treatment of student handouts and clickers. Chapter 4 offers an in-depth look at research results for both SEMLI_S and Bloom's taxonomy questions and sets, as well as an analysis of the student exit questionnaire and quiz results with and without clicker-enhanced pedagogy.

Chapter 4: Results

Chapter 4 offers a detailed analysis of SEMLI_S and Bloom's taxonomy questions and sets. This chapter also includes descriptions of specific SEMLI_S and Bloom's taxonomy scores, as well as descriptions of mean values and comparisons between scores. Summaries comparing Bloom's taxonomy and specific SEMLI_S categories are also discussed, along with an analysis of the student exit questionnaires and quiz results.

Results were tallied for Questions #1-3 based on the SEMLI_S metacognition rubric for small group work (Table 2), and for Questions #4 and #5 based on the Bloom's taxonomy rubric for small group work (Table 3). The following information offers an analysis of comparisons between specific SEMLI_S and Bloom's taxonomy categories by question and by set. Results also include a summary of the student exit questionnaire (Table 15) and quiz results for clicker-enhanced pedagogy and non-clicker use. To better understand the results, it is beneficial to examine each originally proposed research question in conjunction with the data tables and figures presented in this section.

Research question 1: How does clicker-enhanced pedagogy affect student metacognition and conceptual understanding in the context of small (lab group) interactions and whole group middle school science classes?
4.1 Analysis of SEMLI_S questions and sets

For the following analysis of knowledge, comprehension and application results, comparisons for questions 1-3 and sets 1-3 are based on a clicker value of 100%; that is, percent differences are expressed relative to the clicker mean. Differences between clicker and non-clicker values are shown in Tables 6-11. For the knowledge category, questions #1-3, mean SEMLI_S results for the Bloom's taxonomy categories indicated that students using clickers scored lower for all three questions. As Table 6 indicates, students who did not use clickers for question 1 scored 3.78% higher than students who used clickers. Mean scores for question 2 indicated students who did not use clickers scored 2.69% higher than students who used clickers. Mean scores for question 3 indicated students who did not use clickers scored 0.87% higher than students who used clickers. Graphs of knowledge by clicker questions 1-3 reflect individual SEMLI_S category results in Figure 12 (a) and average SEMLI_S results in Figure 12 (b).

Table 6: Knowledge by clicker question 1-3

SEMLI-S     | Q1 Clicker | Q1 No Clicker | Q2 Clicker | Q2 No Clicker | Q3 Clicker | Q3 No Clicker
CC          | 1.695      | 1.794         | 1.758      | 1.820         | 1.933      | 1.947
MEP         | 1.704      | 1.794         | 1.758      | 1.820         | 1.941      | 1.947
SE          | 1.695      | 1.794         | 1.758      | 1.809         | 1.932      | 1.947
AW          | 1.979      | 1.934         | 1.752      | 1.776         | 1.923      | 1.947
CO          | 1.695      | 1.783         | 1.749      | 1.787         | 1.914      | 1.936
AVG SEMLI-S | 1.753      | 1.820         | 1.755      | 1.802         | 1.929      | 1.945
% Diff      |            | 3.78%         |            | 2.69%         |            | 0.87%

Figure 12 (a): Graph of knowledge by clicker question 1-3 [bar chart of the Table 6 values by SEMLI_S category; not reproduced here]

Figure 12 (b): Graph of average SEMLI_S knowledge question 1-3 [bar chart of the Table 6 averages; not reproduced here]

Results for knowledge by sets 1-3 (Table 7) show mean SEMLI_S values. These indicate that, for set 1, students who did not use clickers scored 4.77% lower than students who used clickers. In set 2, students who did not use clickers scored 10.16% higher than those who used clickers. The greatest differences in set 2 occurred within four specific SEMLI_S categories: CC, MEP, SE and CO; students in these four categories who did not use clickers scored 12.39% higher than students who used clickers. In the AW category, non-clicker users scored 2.69% higher than clicker users, indicating that the non-clicker advantage in set 2 was greater for CC, MEP, SE and CO than for AW. In set 3, students who did not use clickers scored 3.05% higher than students who used clickers. Graphs of knowledge by set 1-3 reflect individual SEMLI_S category results in Figure 13 (a) and average SEMLI_S results in Figure 13 (b).
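Stated explicitly (this is my reconstruction, as the thesis does not spell out the formula), the clicker-as-100% convention amounts to:

```latex
\[
\%\,\mathrm{Diff}
  = \frac{\bar{x}_{\mathrm{no\;clicker}} - \bar{x}_{\mathrm{clicker}}}
         {\bar{x}_{\mathrm{clicker}}} \times 100\%,
\qquad\text{e.g.}\qquad
\frac{1.820 - 1.753}{1.753} \times 100\% \approx 3.8\%
\]
```

The worked example uses the question 1 knowledge averages from Table 6 above; the slight mismatch with the tabled 3.78% presumably reflects rounding of the underlying means before tabulation.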
Table 7: Knowledge by clicker set 1-3

SEMLI-S     | S1 Clicker | S1 No Clicker | S2 Clicker | S2 No Clicker | S3 Clicker | S3 No Clicker
CC          | 1.964      | 1.884         | 1.598      | 1.796         | 1.823      | 1.882
MEP         | 1.981      | 1.884         | 1.598      | 1.796         | 1.823      | 1.882
SE          | 1.963      | 1.873         | 1.598      | 1.796         | 1.823      | 1.882
AW          | 1.964      | 1.861         | 1.896      | 1.947         | 1.795      | 1.849
CO          | 1.936      | 1.839         | 1.598      | 1.796         | 1.823      | 1.871
AVG SEMLI-S | 1.962      | 1.868         | 1.658      | 1.826         | 1.818      | 1.873
% Diff      |            | -4.77%        |            | 10.16%        |            | 3.05%

Figure 13 (a): Graph of knowledge by clicker set 1-3 [bar chart of the Table 7 values by SEMLI_S category; not reproduced here]

Figure 13 (b): Graph of average SEMLI_S knowledge set 1-3 [bar chart of the Table 7 averages; not reproduced here]

For the comprehension category, questions #1-3, mean SEMLI_S results for the Bloom's taxonomy categories indicated that students who did not use clickers scored lower for all three questions. As Table 8 shows, students who did not use clickers for question 1 scored 3.81% lower than students who used clickers. Within question 1, Table 8 shows the greatest difference occurred in SEMLI_S category AW, reflecting that students who did not use clickers scored 10.78% lower than students who used clickers. In SEMLI_S categories CC, MEP, SE and CO for question 1, students who did not use clickers on average scored 2.13% lower, indicating that students who did not use clickers scored lower in AW than in CC, MEP, SE, and CO. Mean scores for question 2 indicate students who did not use clickers scored 12.05% lower than students who used clickers. Within question 2, the greatest difference occurred in SEMLI_S category AW, reflecting that students who did not use clickers scored 14.10% lower than students who used clickers. Mean scores for question 3 indicate that students who did not use clickers scored 5.92% lower than students who used clickers (Table 8). Graphs of comprehension by question 1-3 reflect individual SEMLI_S category results in Figure 14 (a) and average SEMLI_S results in Figure 14 (b).

Table 8: Comprehension by clicker question 1-3

SEMLI-S     | Q1 Clicker | Q1 No Clicker | Q2 Clicker | Q2 No Clicker | Q3 Clicker | Q3 No Clicker
CC          | 1.600      | 1.541         | 1.513      | 1.335         | 1.436      | 1.340
MEP         | 1.565      | 1.541         | 1.496      | 1.335         | 1.420      | 1.340
SE          | 1.565      | 1.541         | 1.496      | 1.303         | 1.402      | 1.340
AW          | 1.850      | 1.670         | 1.499      | 1.303         | 1.423      | 1.331
CO          | 1.565      | 1.541         | 1.488      | 1.314         | 1.410      | 1.321
AVG SEMLI-S | 1.629      | 1.567         | 1.498      | 1.318         | 1.418      | 1.334
% Diff      |            | -3.81%        |            | -12.05%       |            | -5.92%

Figure 14 (a): Graph of comprehension by clicker question 1-3 [bar chart of the Table 8 values by SEMLI_S category; not reproduced here]

Figure 14 (b): Graph of average SEMLI_S comprehension question 1-3 [bar chart of the Table 8 averages; not reproduced here]

Results for comprehension by sets 1-3 (Table 9) show mean SEMLI_S values. These indicate that, for set 1, students who did not use clickers scored 20.61% lower than students who used clickers. All SEMLI_S categories showed higher values for clicker use compared with non-clicker use amongst students in set 1.
The greatest difference in set 1 occurred within SEMLI_S category CC, indicating that students who did not use clickers scored 30.27% lower than students who used clickers. In the other SEMLI_S categories (MEP, SE, AW, CO), non-clicker users scored 24.95% lower than clicker users, indicating that the non-clicker deficit in set 1 was greater for CC than for MEP, SE, AW, and CO. In set 2, students who did not use clickers scored 8.38% higher than those who used clickers. In set 3, students who did not use clickers scored 4.70% lower than students who used clickers. Graphs of comprehension by set 1-3 reflect individual SEMLI_S category results in Figure 15 (a) and average SEMLI_S results in Figure 15 (b).

Table 9: Comprehension by clicker set 1-3

SEMLI-S     | S1 Clicker | S1 No Clicker | S2 Clicker | S2 No Clicker | S3 Clicker | S3 No Clicker
CC          | 1.859      | 1.427         | 1.263      | 1.414         | 1.426      | 1.375
MEP         | 1.792      | 1.427         | 1.263      | 1.414         | 1.426      | 1.375
SE          | 1.774      | 1.427         | 1.263      | 1.404         | 1.426      | 1.353
AW          | 1.792      | 1.427         | 1.563      | 1.555         | 1.418      | 1.321
CO          | 1.774      | 1.427         | 1.263      | 1.384         | 1.426      | 1.364
AVG SEMLI-S | 1.798      | 1.427         | 1.323      | 1.434         | 1.425      | 1.358
% Diff      |            | -20.61%       |            | 8.38%         |            | -4.70%

Figure 15 (a): Graph of comprehension by clicker set 1-3 [bar chart of the Table 9 values by SEMLI_S category; not reproduced here]

Figure 15 (b): Graph of average SEMLI_S comprehension set 1-3 [bar chart of the Table 9 averages; not reproduced here]

For the application category, questions #1-3, mean SEMLI_S results for the Bloom's taxonomy categories indicated that students who did not use clickers scored lower for all three questions. As Table 10 shows, students who did not use clickers for question 1 scored 14.27% lower than students who used clickers. Within question 1, the greatest difference occurred in SEMLI_S category MEP, indicating that students who did not use clickers scored 18.27% lower than students who used clickers. In SEMLI_S categories CC, SE, AW and CO for question 1, students who did not use clickers on average scored 16.21% lower, indicating that students who did not use clickers scored lower in MEP than in CC, SE, AW, and CO. All question 1 SEMLI_S categories reflected greater differences between clicker and non-clicker use than question 2 or question 3. Mean scores for question 2 indicated students who did not use clickers scored 5.95% lower than students who used clickers. Mean scores for question 3 indicated students who did not use clickers scored 7.31% lower than students who used clickers. Graphs of application by question 1-3 reflect individual SEMLI_S category results in Figure 16 (a) and average SEMLI_S results in Figure 16 (b).
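These "greatest difference" comparisons amount to a simple search over subscale means. A minimal sketch follows, using the clicker mean as the 100% baseline as stated at the start of this section; the subscale means in it are illustrative placeholders, not the study's actual data.

```python
# A minimal sketch of the "greatest difference" comparisons made throughout
# this section; the subscale means below are illustrative, not real data.
means = {
    # subscale: (clicker_mean, non_clicker_mean)
    "CC":  (1.20, 1.03),
    "MEP": (1.20, 1.02),
    "SE":  (1.20, 1.02),
    "AW":  (1.17, 1.02),
    "CO":  (1.17, 1.01),
}

def pct_diff(clicker, no_clicker):
    """Percent difference of the non-clicker mean relative to clicker = 100%."""
    return (no_clicker - clicker) / clicker * 100.0

gaps = {sub: pct_diff(*pair) for sub, pair in means.items()}
widest = min(gaps, key=gaps.get)  # most negative = largest non-clicker deficit
print({sub: round(gap, 2) for sub, gap in gaps.items()})
print("Greatest non-clicker deficit:", widest)
```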
Table 10: Application by clicker question 1-3

APPLICATION by Question 1-3
SEMLI-S       Q1 Clicker   Q1 No Clicker   Q2 Clicker   Q2 No Clicker   Q3 Clicker   Q3 No Clicker
CC            1.195        1.026           1.135        1.083           1.165        1.047
MEP           1.204        1.018           1.152        1.075           1.157        1.047
SE            1.196        1.018           1.149        1.056           1.089        1.047
AW            1.173        1.018           1.133        1.075           1.079        1.038
CO            1.167        1.009           1.115        1.057           1.139        1.038
AVG SEMLI-S   1.187        1.018           1.137        1.069           1.126        1.044
% Diff                     -14.27%                      -5.95%                       -7.31%

Figure 16 (a): Graph of application by clicker question 1-3

Figure 16 (b): Graph of average SEMLI_S application question 1-3

Results for application by set 1-3 (Table 11) show mean SEMLI_S values. All SEMLI_S categories showed higher values for clicker use compared with non-clicker use. For set 1, students who did not use clickers scored 12.31% lower than students who used clickers. In set 2, students who did not use clickers scored 0.29% lower than those who used clickers; set 2 reflected the most similar results overall, with a maximum difference of 0.29% between clicker and non-clicker use for CC, MEP, AW and CO and a minimum difference of 0.098% for SEMLI_S category SE. In set 3, students who did not use clickers scored 13.89% lower than students who used clickers, the largest difference among the three sets. Within set 3, the greatest differences occurred in SEMLI_S categories CC, MEP and SE, where students who did not use clickers scored 18.32% lower than students who used clickers. In the other SEMLI_S categories, AW and CO, non-clicker users scored on average 12.88% lower than clicker users, indicating that the gap between clicker and non-clicker use in set 3 was wider for CC, MEP and SE than for AW and CO. Graphs of application by set 1-3 reflect individual SEMLI_S category results in Figure 17 (a) and average SEMLI_S results in Figure 17 (b).

Table 11: Application by clicker set 1-3

APPLICATION by Set 1-3
SEMLI-S       S1 Clicker   S1 No Clicker   S2 Clicker   S2 No Clicker   S3 Clicker   S3 No Clicker
CC            1.253        1.105           1.034        1.031           1.208        1.021
MEP           1.271        1.088           1.034        1.031           1.208        1.021
SE            1.205        1.079           1.022        1.021           1.208        1.021
AW            1.214        1.079           1.034        1.031           1.137        1.021
CO            1.219        1.053           1.034        1.031           1.168        1.021
AVG SEMLI-S   1.232        1.081           1.032        1.029           1.185        1.021
% Diff                     -12.31%                      -0.29%                       -13.89%

Figure 17 (a): Graph of application by clicker set 1-3

Figure 17 (b): Graph of average SEMLI_S application set 1-3

Higher Bloom's taxonomy categories, including analysis, synthesis, and evaluation, were not considered in this study since only two individuals amongst 75 total students demonstrated conceptual understanding at the analysis level.
No students achieved scores in the synthesis or evaluation categories; as a result, values from these two students were excluded because they could not support statistically meaningful analysis.

Research Question 2: How can SEMLI_S and Bloom's taxonomy results be compared amongst clicker use and traditional teaching methods?

4.2 Analysis of Bloom's taxonomy questions 4 & 5

For the following analysis of Bloom's taxonomy questions 4 and 5 (Figure 6) in clicker versus non-clicker settings, "non-clicker" refers to traditional classroom teaching methods for review sessions, as compared with teaching methods that incorporated clicker (CRS) technology. Question 4 asked students to write "Questions I Asked" during lab activities; question 5 asked students to write "Comments I Made" during the same activities. Results were analyzed based on LOTS, MOTS and HOTS ratings of 1-3 (Table 3). Correlations can be made between the SEMLI_S scoring system and the Bloom's taxonomy LOTS, MOTS and HOTS scoring system since each student's answer, comment or question was assessed against the same gradient standard. This standard reflected careful attention to whether the evidence demonstrated that the traits or characteristics of each category's components were satisfied or not satisfied (Table 3). Since analysis of SEMLI_S categories for questions 1-3 included the first three Bloom's taxonomy levels (knowledge, comprehension and application), comparisons for questions 4 and 5 were made between these three categories only.

For questions 4 and 5, mean results for Bloom's taxonomy categories indicated that students who did not use clickers scored lower in the knowledge, comprehension and application (KCA) categories for both questions. As Table 12 shows, students who did not use clickers for question 4 scored 6.36% lower than students who used clickers. Mean scores for question 5 (KCA) indicated that students who did not use clickers scored 6.12% lower than students who used clickers. Graphs of Bloom's taxonomy questions 4-5 reflect clicker and no-clicker results for: knowledge through evaluation categories (Figure 18 (a)), Bloom's taxonomy averages for knowledge through evaluation categories (Figure 18 (b)), individual KCA-only categories (Figure 18 (c)), and KCA-only averages (Figure 18 (d)).
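As a worked illustration of the two average rows in Table 12 (below), the following sketch assumes, as the table layout suggests, that Bloom AVG is the unweighted mean of all six per-category means and KCA AVG is the mean of the knowledge, comprehension and application means only; the dictionary values are transcribed from the Q4 Clicker column, and the formulas themselves are not stated in the thesis.

# Minimal sketch (assumed formulas): the two average rows of Table 12,
# computed from the Q4 Clicker column of per-category means.
q4_clicker = {
    "Knowledge": 1.667, "Comprehension": 1.459, "Application": 1.197,
    "Analysis": 1.027, "Synthesis": 1.009, "Evaluation": 1.000,
}

bloom_avg = sum(q4_clicker.values()) / len(q4_clicker)  # mean over all six levels
kca_avg = sum(q4_clicker[k] for k in
              ("Knowledge", "Comprehension", "Application")) / 3

print(f"Bloom AVG = {bloom_avg:.3f}")  # approx. 1.226 (Table 12)
print(f"KCA AVG = {kca_avg:.3f}")      # approx. 1.441 (Table 12)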
Table 12: Bloom's taxonomy by question 4-5

BLOOM'S TAXONOMY by Question 4-5
               Q4 Clicker   Q4 No Clicker   Q5 Clicker   Q5 No Clicker
Knowledge      1.667        1.611           1.581        1.449
Comprehension  1.459        1.351           1.312        1.225
Application    1.197        1.085           1.084        1.060
Analysis       1.027        1.000           1.018        1.000
Synthesis      1.009        1.000           1.008        1.000
Evaluation     1.000        1.000           1.000        1.000
Bloom AVG      1.226        1.175           1.167        1.122
% Diff                      -4.22%                       -3.85%
KCA AVG        1.441        1.349           1.326        1.245
% Diff                      -6.36%                       -6.12%

Figure 18 (a): Graph of Bloom's taxonomy question 4-5, clicker & no clicker

Figure 18 (b): Graph of Bloom's taxonomy averages, question 4-5, clicker & no clicker

Figure 18 (c): Graph of KCA only, Bloom's taxonomy question 4-5, clicker & no clicker

Figure 18 (d): Graph of KCA averages, Bloom's taxonomy question 4-5, clicker & no clicker

Within question 4, the greatest difference occurred in the application category, where students who did not use clickers scored 10.32% lower than students who used clickers, indicating that, for question 4, the gap between clicker and non-clicker use was wider in application than in knowledge or comprehension. Within question 5, the greatest difference occurred in the knowledge category, where students who did not use clickers scored on average 9.11% lower than students who used clickers, indicating that, for question 5, the gap was wider in knowledge than in comprehension or application. Most question 4 categories reflected greater differences between clicker and non-clicker use than question 5; however, the knowledge category for question 5 indicated a greater point differential than any other category for question 4 or 5.

4.2.1: Summary comparing Bloom's taxonomy questions 4-5 & SEMLI_S questions 1-3

Comparing average differences between Bloom's taxonomy questions 4 and 5 with SEMLI_S questions 1-3 for knowledge, comprehension and application, Bloom's taxonomy non-clicker results show the greatest similarity to SEMLI_S non-clicker application question 2 results, with a 0.41 percentage point difference between values. The greatest average difference between clicker and no-clicker results occurred between Bloom's taxonomy question 4 and SEMLI_S knowledge question 1, which indicated a 10.14 percentage point difference between values.

4.2.2: Summary comparing set 1 Bloom's taxonomy KCA & SEMLI_S application

For set 1, the greatest similarity in average results was shown between the Bloom's taxonomy KCA and SEMLI_S application categories, indicating that students who received traditional teaching methods (no clicker) scored only 0.15 percentage points lower for SEMLI_S than for Bloom's taxonomy results (Table 13).
The greatest difference occurred between Bloom's taxonomy non-clicker users and SEMLI_S comprehension category non-clicker users, indicating that students who received traditional teaching methods scored on average 8.45 percentage points lower for SEMLI_S comprehension than for Bloom's taxonomy.

4.2.3: Summary comparing set 2 Bloom's taxonomy KCA & SEMLI_S knowledge

For set 2, the greatest similarity in average results was shown between the Bloom's taxonomy KCA and SEMLI_S knowledge categories, indicating that students who received traditional teaching methods (no clicker) scored only 5.6 percentage points higher for Bloom's taxonomy results than for SEMLI_S (Table 13). The greatest difference occurred between Bloom's taxonomy non-clicker results and the SEMLI_S application category, indicating that students who received traditional teaching methods scored 16.05 percentage points higher for Bloom's taxonomy than for SEMLI_S application.

4.2.4: Summary comparing set 3 Bloom's taxonomy KCA & SEMLI_S application

For set 3, the greatest similarity in average results was shown between the Bloom's taxonomy KCA and SEMLI_S application categories, indicating that students who received traditional teaching methods (no clicker) scored only 3.43 percentage points lower for Bloom's taxonomy than for SEMLI_S results (Table 13). The greatest difference occurred between Bloom's taxonomy no-clicker results and SEMLI_S knowledge category no-clicker results, which indicated that students who received traditional teaching methods scored on average 20.37 percentage points higher for SEMLI_S knowledge than for Bloom's taxonomy.

Table 13: Bloom's taxonomy by set 1-3

BLOOM'S TAXONOMY by Set 1-3
               S1 Clicker   S1 No Clicker   S2 Clicker   S2 No Clicker   S3 Clicker   S3 No Clicker
Knowledge      1.767        1.566           1.330        1.521           1.775        1.504
Comprehension  1.514        1.308           1.121        1.382           1.520        1.175
Application    1.213        1.073           1.018        1.113           1.192        1.031
Analysis       1.067        1.000           1.000        1.000           1.000        1.000
Synthesis      1.026        1.000           1.000        1.000           1.000        1.000
Evaluation     1.000        1.000           1.000        1.000           1.000        1.000
Bloom AVG      1.264        1.158           1.078        1.169           1.248        1.118
% Diff                      -8.43%                       8.45%                        -10.38%
KCA AVG        1.498        1.315           1.156        1.339           1.496        1.237
% Diff                      -12.16%                      15.76%                       -17.32%

Figure 19 (a): Graph of Bloom's taxonomy set 1-3, clicker & no clicker

Figure 19 (b): Graph of Bloom's taxonomy averages, set 1-3, clicker & no clicker

Figure 19 (c): Graph of KCA only, Bloom's taxonomy set 1-3, clicker & no clicker

Figure 19 (d): Graph of KCA averages, Bloom's taxonomy set 1-3, clicker & no clicker

4.2.5: Summary for research question 2

From the descriptive statistical analysis in Tables 6-13 and Figures 12-19, results indicated that, for percent differences in clicker values for knowledge, comprehension and application, Bloom's taxonomy results for sets 1-3 were most similar to results obtained in sets 1-3 for SEMLI_S comprehension.
In both these examples, sets 1 and 3 indicated that, on average, students who used clickers received higher scores; in set 2, however, higher scores were obtained, on average, by students taught with traditional methods in both Bloom's taxonomy and SEMLI_S comprehension. For set 1, SEMLI_S knowledge, comprehension and application results were similar to Bloom's taxonomy results, with clicker use receiving higher scores compared with traditional teaching methods.

Bloom's taxonomy results differed from SEMLI_S application category results in set 2: traditional teaching methods scored higher in Bloom's taxonomy, whereas clicker use scored higher in the SEMLI_S application category. Indeed, clicker use scored higher in the SEMLI_S application category for all three sets.

Bloom's taxonomy results also differed from SEMLI_S knowledge category results in set 3: clicker use scored higher in Bloom's taxonomy, whereas traditional teaching methods scored higher in the SEMLI_S knowledge category.

4.3 Analysis of student exit questionnaire (survey)

At the conclusion of this research study, students were asked to complete an exit questionnaire (survey) based on clicker use in a whole class setting (Figure 20). The questionnaire consisted of five questions with a rating scale of 1-10, where a value of 1 represented an ineffective or low score for a particular question and a value of 10 represented an extremely effective or high score. Half points were also permitted for all questions. Students were asked to rate their overall whole class clicker experiences and, in most cases, explanations accompanied the ratings for each question. In numerical order from question one to five, students were asked to: 1) give an overall rating for clicker lessons (compared with traditional lessons where clickers were not used for review sessions); 2) rate the use of shared histogram (bar graph) results during clicker use; 3) rate the level of focus/attention during clicker use; 4) rate the usefulness of clickers as a technological device; and 5) rate the effectiveness of side conversations during clicker use. Question #1 reflected an overall average score of 7.8 (rating clicker lessons); the overall average for Question #2 (sharing histogram results) was 7.3; Question #3 reflected an overall average of 8.2 (focus/attention during clicker use); Question #4 indicated the highest average score, 8.4 (clickers as useful technology); and the overall average for Question #5 (side conversation effectiveness) was the lowest score, 6.1 (Figure 21). See Appendix F for a complete list of student response ratings.

Figure 20: Student questionnaire sample – clickers, whole class study

Figure 21: Graph of student exit questionnaire ratings

Survey values were further analyzed by assigning effectiveness categories within the 1-10 rating scale for each question. A student response in the range 1.0-3.5 was designated as non-effective; 4.0-6.5 as somewhat effective; 7.0-8.5 as effective; and 9.0-10.0 as extremely effective. Percentages for each of these designations are summarized in Table 14.
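A minimal sketch of this binning step follows, using hypothetical ratings rather than the study's actual response data; the bin boundaries are those defined above.

# Minimal sketch (hypothetical ratings, not the study's data): bin one
# question's 1-10 ratings (half points allowed) into the four
# effectiveness categories and report each bin as a percentage of responses.
bins = [
    ("non-effective",       1.0, 3.5),
    ("somewhat effective",  4.0, 6.5),
    ("effective",           7.0, 8.5),
    ("extremely effective", 9.0, 10.0),
]

ratings = [8.0, 9.5, 7.0, 6.5, 10.0, 8.5, 3.0, 9.0]  # hypothetical responses

for label, low, high in bins:
    count = sum(1 for r in ratings if low <= r <= high)
    print(f"{label}: {100 * count / len(ratings):.1f}%")
print(f"average rating: {sum(ratings) / len(ratings):.1f}")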
Table 14: Summary of percentages for effectiveness categories (exit questionnaire results)

             1.0-3.5 rating    4.0-6.5 rating         7.0-8.5 rating   9.0-10.0 rating
             (non-effective)   (somewhat effective)   (effective)      (extremely effective)
Question 1   2.7%              15.0%                  47.0%            36.0%
Question 2   8.2%              27.4%                  26.0%            38.4%
Question 3   4.1%              12.3%                  28.8%            54.8%
Question 4   4.1%              12.3%                  21.9%            61.6%
Question 5   20.5%             31.5%                  19.2%            28.8%

Question #4 reflected the highest overall average student rating, 8.4 (Figure 21), and, correspondingly, the highest percentage of 9.0-10.0 scores, 61.6% (Table 14), indicating that, overall, students found clickers a useful form of technology. Question #5 reflected the lowest average student rating, 6.1 (Figure 21), and the highest percentage of 1.0-3.5 scores, 20.5% (Table 14), indicating that, overall, students found side conversations during clicker use ineffective. This finding complements histogram results for multiple-choice clicker questions, where fewer students chose the correct answer after engaging in peer-to-peer side conversations (see Appendix G).

Although Question #2 had only the fourth highest overall rating (7.3), Table 14 shows that it ranked third among the five survey questions in the percentage of 9.0-10.0 scores (38.4%, compared with 36% for Question #1). The percentage of 1.0-3.5 scores for Question #2 (8.2%) was also over twice that for Question #1 (2.7%). In other words, even though Question #2 drew both more extremely high and more extremely low ratings than Question #1, the distribution across the middle categories, where the 7.0-8.5 percentage for Question #1 (47.0%) was almost twice that for Question #2 (26.0%), resulted in Question #1 having the higher overall average (7.8 compared with 7.3 for Question #2).

4.4 Analysis of quiz results

Quizzes were administered to all classes (clicker and non-clicker) and results indicated that there was no significant difference in scores between clicker and non-clicker users (Figure 22).

Figure 22: Quiz results for clicker and non-clicker use

             Quiz 1 (2/4/14)   Quiz 2 (3/10/14)   Quiz 3 (4/14/14)   Quiz 4 (5/9/14)   All quizzes
Clicker      88.56%            80.49%             73.42%             78.97%            80.36%
No Clicker   88.77%            80.14%             67.20%             85.94%            80.51%

Quizzes 1 and 2 reflected a 0.21% lower and a 0.35% higher score, respectively, for students using clickers as a method of curriculum review compared with students who participated in a review session based on traditional teaching methods. Traditional teaching methods included teacher-guided whole class question/answer sessions, asking volunteers to come up to the whiteboard at the front of the class and write answers to practice questions, having students reflect on notes, handouts or textbook material, and inviting students to volunteer questions in a whole class setting.
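Before turning to the remaining quizzes, a minimal sketch of this comparison (scores transcribed from Figure 22; the variable names are mine, and the differences are percentage points between the two groups' percent scores):

# Minimal sketch: per-quiz and overall differences between clicker and
# non-clicker review sessions, using the averages shown in Figure 22.
quizzes = ["Quiz 1", "Quiz 2", "Quiz 3", "Quiz 4", "All quizzes"]
clicker = [88.56, 80.49, 73.42, 78.97, 80.36]
no_clicker = [88.77, 80.14, 67.20, 85.94, 80.51]

for name, c, n in zip(quizzes, clicker, no_clicker):
    print(f"{name}: clicker {c - n:+.2f} points vs. no clicker")
# Quiz 1: -0.21, Quiz 2: +0.35, Quiz 3: +6.22, Quiz 4: -6.97, All: -0.15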
Quiz 3 reflected a higher score (6.22% greater) for students using clickers compared with non-clicker users; however, Quiz 4 reflected a higher score (6.97% greater) for non-clicker users compared with students who used clickers during review sessions. Overall, there was a 0.15% lower score for clicker users compared with students who participated in review sessions based on traditional teaching methods. This result indicates an almost negligible difference in quiz scores between students who used clickers for review and those who participated in traditional review sessions.

4.5 Main findings

Comparing averages of the SEMLI_S knowledge category by question and set, findings indicate that clicker-enhanced pedagogy resulted in lower values overall when compared with non-clicker use; the exception is set 1, for which values were somewhat higher.

Averages of the SEMLI_S comprehension category by question and set indicate that clicker-enhanced pedagogy scored slightly higher for all of questions 1-3, and set 1 reflects significantly higher values for clicker-enhanced pedagogy. Set 3 indicates a negligible (slightly higher) difference for clicker-enhanced pedagogy compared with non-clicker use.

Comparing averages of the SEMLI_S application category by question and set, findings indicate that clicker-enhanced pedagogy resulted in higher overall values for all three questions. Sets 1-3 indicate higher values for clicker-enhanced pedagogy, although the set 2 difference is negligible (non-clicker use only slightly lower).

In Bloom's taxonomy results for questions 4 and 5 across the knowledge, comprehension, application, analysis, synthesis and evaluation categories (Figure 10), findings indicate that values were significantly higher for clicker-enhanced pedagogy compared with non-clicker use. Results for KCA (knowledge, comprehension and application) only are summarized here since most students' scores did not reflect achievement at the analysis, synthesis or evaluation levels (Figure 10).

Comparing Bloom's taxonomy results by set 1-3, findings indicate that values were significantly higher for clicker-enhanced pedagogy for sets 1 and 3. For set 2, however, values for clicker-enhanced pedagogy were significantly lower than for non-clicker use.

Comparing Bloom's taxonomy questions 4 and 5 to SEMLI_S questions 1-3 for knowledge, comprehension and application, findings indicate that the greatest similarity in values occurred between Bloom's taxonomy non-clicker use and SEMLI_S question 2 non-clicker application categories. The greatest difference occurred between non-clicker values for Bloom's taxonomy question 4 and SEMLI_S knowledge question 1.

For set 1, comparing Bloom's taxonomy and SEMLI_S, findings indicate that the greatest similarity in values occurred between the SEMLI_S application and Bloom's taxonomy KCA (knowledge, comprehension and application) categories for non-clicker use. The greatest difference in values occurred between the SEMLI_S comprehension and Bloom's taxonomy KCA categories for non-clicker use.

For set 2, comparing Bloom's taxonomy and SEMLI_S, findings indicate that the greatest similarity in values occurred between the SEMLI_S knowledge and Bloom's taxonomy KCA categories for non-clicker use. The greatest difference occurred between SEMLI_S application values and Bloom's taxonomy KCA categories for non-clicker use.
For set 3, comparing Bloom's taxonomy and SEMLI_S, findings indicate that the greatest similarity in values occurred between the SEMLI_S application and Bloom's taxonomy KCA categories for non-clicker use. The greatest difference occurred between SEMLI_S knowledge values and Bloom's taxonomy KCA categories for non-clicker use.

Findings for the student exit questionnaire (Figure 21) indicate that "clickers as a useful technology tool" received the highest overall rating and "side conversation effectiveness" received the lowest rating.

Quiz results, overall, indicate a negligible difference between clicker-enhanced pedagogy and non-clicker use. Within individual quiz scores, results vary amongst classes between clicker-enhanced pedagogy and non-clicker use.

Chapter 4 included an in-depth analysis of SEMLI_S and Bloom's taxonomy questions and sets, an analysis of the student exit questionnaire and quiz results with and without clicker-enhanced pedagogy, and a summary of main findings. Chapter 5 provides a detailed discussion of the study's findings.

Chapter 5: Discussion

Chapter 5 includes a detailed discussion of findings related to the SEMLI_S and Bloom's taxonomy research results, takes an in-depth look at factors that may have influenced those results, and sheds light on the effectiveness of traditional teaching methods compared with clicker-enhanced pedagogy in terms of concept reinforcement and metacognitive awareness.

5.1 Discussion of findings

After the data were analyzed for this study, findings comparing the SEMLI_S knowledge category by question and set may mean that, at this particular Bloom's taxonomy level (Figure 10), clicker-enhanced pedagogy has little or no influence on fifth grade students' information recall, observation, or naming skills. However, specific data for Learning Risks Awareness (AW) (Table 1) in question 1 (Figure 6) reflected a higher value for clicker-enhanced pedagogy than for non-clicker use. This may indicate that, for questions related to knowledge construction (Figure 6, question 1) involving AW, students using clickers might be less likely to lose track of a learning task, and that clickers may enhance concentration. Results for SEMLI_S knowledge by set (Figure 13 (b)) reflected changes over a four-month period. A higher value for set 1 may indicate that, despite familiarity with clickers earlier in the school year, students demonstrated greater enthusiasm for technology use at the outset of data collection.

Additionally, AW values are higher for clicker-enhanced pedagogy in all SEMLI_S categories for knowledge by clicker set 1-3 (Figure 13 (a)), suggesting that competency in basic skills such as information recall, observation and defining science terms was strong. Also, results for the SEMLI_S set 1 comprehension category showed significantly higher clicker-enhanced pedagogy values compared with sets 2 and 3 (Figure 15 (b)), and results for the SEMLI_S set 1 application category showed somewhat higher clicker-enhanced pedagogy values compared with sets 2 and 3 (Figure 17 (b)). This supports my hypothesis that students may have been more enthusiastic about utilizing clickers initially and then, over time, become more complacent regarding technology use. Later in this study, student response to clicker technology may have been less emotional and more part of a regular daily routine.
If this is the case, data collected later in this study may be more authentic, and therefore more meaningful, than findings from earlier in the study. Findings indicated that, for the sets 2 and 3 knowledge category, values for traditional teaching methods (non-clicker use) were higher, indicating that, over time, students may have been influenced more by the quality of the activities and by the instructor than by the technology utilized during science lessons. Interestingly, traditional-method values for knowledge in set 1 were higher than those of most other SEMLI_S categories in any set. Factors responsible for this result could include greater interest in the particular concept being taught at that time, greater enthusiasm for a specific activity, or greater engagement in a hands-on lab activity during data collection for set 1.

Unlike results for SEMLI_S knowledge questions 1-3, SEMLI_S comprehension questions 1-3 show higher average values for clicker-enhanced pedagogy in all three questions (Figure 14 (b)). However, these values are only slightly (negligibly) higher for clicker-enhanced pedagogy in the comprehension category for questions 1-3, which may mean that there is very little difference between clicker-enhanced pedagogy and traditional teaching methods in comprehension for questions 1-3 (Figure 6). AW also reflected a significantly higher clicker-enhanced pedagogy value for question 1 (Figure 6) compared with all other SEMLI_S categories (Figure 14 (a)). Interestingly, clicker-enhanced pedagogy in comprehension for all SEMLI_S set 1 categories is significantly higher. Given the study's timeframe, this could mean that students' enthusiasm for clicker use may have been higher at the beginning of data collection than later in the research process.

Findings comparing SEMLI_S application by question and set for this Bloom's taxonomy level (Figure 10) show higher values overall for clicker-enhanced pedagogy. This may mean that, for the application category, clicker-enhanced pedagogy is more effective than traditional teaching methods. Results for SEMLI_S application questions (Figure 16 (b)) reflect significantly higher values for question 1, which may mean that students found it easier to apply knowledge in this question (Figure 6); question 2 (the smallest difference between values) may indicate that students had the most difficulty applying knowledge, using problem-solving methods, or manipulating information, whether under clicker-enhanced pedagogy or traditional teaching methods.

Despite an almost negligible difference between set 2 values for clicker-enhanced pedagogy and traditional teaching methods in SEMLI_S application by set (Figure 17 (b)), where clicker-enhanced pedagogy is slightly higher, students may be somewhat more engaged in the learning process when utilizing clickers than when not using them. These findings may also indicate that teaching styles can vary over time from one topic to another and that, although students may see learning as more fun when using clickers (thereby increasing student engagement), higher scores for clicker-enhanced pedagogy may depend more on the type of activity presented and on the teaching strategies and methodologies employed during lessons.

Findings may also indicate that the Bloom's taxonomy application level (Figure 10), which requires students to apply knowledge, use problem-solving methods, manipulate information, and design an experiment, may be supported or improved by clicker-enhanced pedagogy.
Considering SEMLI_S and Bloom's taxonomy set 2 results, where data indicate higher values for traditional teaching methods, similarities in these scores may indicate that students found the concepts taught during this timeframe more challenging. This may suggest that clickers are less effective instructional tools when students are struggling with higher-level concepts. In turn, traditional teaching methods may be more effective than clicker-enhanced pedagogy in terms of concept reinforcement and increased metacognitive awareness for topics students find more challenging.

Findings for averages of Bloom's taxonomy by questions 4-5 for all levels (Figure 18 (b)) and for KCA (knowledge, comprehension and application) (Figure 18 (d)) may indicate that clicker-enhanced pedagogy is more effective than traditional teaching methods. Values were significantly higher for question 4 compared with question 5 (Figures 6 and 8), meaning that, in small group peer interactions, students may find it easier to ask questions than to make and document comments during collaborative interactions. This may be because fifth grade students are more comfortable with, or more accustomed to, asking questions in general than they are with responding to and documenting comments made during collaborative interactions. Since making comments during PI discussions involves listening attentively to points raised by peers, lower values for question 5 may reflect a lack of proficient listening skills amongst students in this age group, or lower confidence in interpreting and documenting comments made.

Bloom's taxonomy averages by set (Figures 19 (b) and 19 (d)) reflect higher values for clicker-enhanced pedagogy for sets 1 and 3, whereas higher values for traditional teaching methods are reflected for set 2. There may have been several reasons for this (some of which are mentioned above): students found lesson concepts more worthwhile during traditional teaching methods in set 2; peer instruction was more beneficial during traditional-method sessions; students were able to focus more on lesson content and lab work without the distraction of clicker-enhanced pedagogy; or teaching strategies differed during traditional teaching sessions.

Comparing Bloom's taxonomy and SEMLI_S findings for sets 1-3, similarities in results for sets 1 and 3 may be due to similarities in teaching strategies or methods during those periods, while differences between results may be attributed to differences in teaching strategies or methods during data collection. Reasons for differences could include: special schedules, routine interruptions, assemblies or school events that disrupt the daily flow; teacher absence due to workshops or conferences; the pace or timeframe of classes; interruptions due to field trips or outdoor education programs where students are off campus; and large breaks in class schedules due to spring break.

Findings for the student exit questionnaire (survey) data (Figures 20 and 21) reflect a high comfort level amongst students with clicker-enhanced pedagogy. Comments below the ratings for each question also shed light on nuances of students' thoughts about clicker-enhanced pedagogy in comparison with traditional teaching methods.
Although students are comfortable and adept at using technology, their comments may be interpreted to mean that they still want to engage frequently in open discussions about a variety of topics; that they want to engage in casual, open dialogues with partners in a lab setting; that they want to help others who might struggle with concepts, instructions or lab directions; that clicker-enhanced pedagogy does increase focus and attention in class but the human component of interactions matters most; that they want teachers to provide exciting, engaging activities in class; and that they do not want teachers simply to lecture or "talk at" them.

Findings regarding quiz results indicate that neither clicker-enhanced pedagogy nor traditional teaching methods influenced quiz outcomes. Factors relating to overall quiz results could include: the content review timeframe; teachers' review strategies (assigned homework, in-class review sessions, online research, games, etc.); students' engagement during in-class reviews; an individual's study habits; a student's specific language or learning processing issues; or an extended absence from class. Other factors affecting quiz results may include question difficulty level, question structure, lead time given prior to the quiz date, and quiz length.

Chapter 5 offered a detailed discussion of findings regarding SEMLI_S categories and Bloom's taxonomy levels by question and by set. This chapter also highlighted similarities and differences between specific results related to clicker-enhanced pedagogy and traditional teaching methods and offered explanations of factors that may have influenced these results. In addition, Chapter 5 discussed findings related to the student exit questionnaire and quiz results. Chapter 6 describes study conclusions regarding the effectiveness of clicker-enhanced pedagogies compared with traditional teaching methods and offers final thoughts related to research questions 1 and 2. Chapter 6 also discusses limitations connected with this study and elaborates on implications and recommendations for practice, curriculum and research.

Chapter 6: Conclusions, limitations, implications and recommendations

6.1 Conclusions

This study was conducted to determine the effectiveness of clicker-enhanced pedagogies in middle school science compared with traditional teaching methods. Additionally, this study examined how clicker use affects metacognition and conceptual understanding, and how results can be compared between clicker-enhanced pedagogy and traditional teaching methods using the SEMLI_S instrument (Anderson, Nashon & Thomas, 2008) and Bloom's taxonomy in small group and whole class settings.

For research question 1, this study shows that clicker-enhanced pedagogy has little effect on initial learning (knowledge and comprehension) compared with traditional teaching methods. However, this research indicates that clicker-enhanced pedagogy is a more beneficial instructional method for concept review and reinforcement in the middle school science classroom than traditional teaching methods. Furthermore, the SEMLI_S instrument is a useful tool for identifying and assessing clicker-enhanced pedagogy and traditional teaching methods when examining metacognition and conceptual understanding in small group and whole class settings. In addition, in terms of SEMLI_S application, this study shows that, over time, metacognitive awareness and conceptual understanding improve in a clicker-enhanced pedagogy setting.
And, although student confidence levels are higher in clicker-enhanced pedagogy settings, there is no noticeable difference in quiz results between clicker-enhanced pedagogy and traditional teaching methods.

For research question 2, this study shows that, for the Bloom's taxonomy knowledge level, SEMLI_S and Bloom's taxonomy results over time were similar. Additionally, when more challenging concepts are introduced to middle school science students, utilization of traditional teaching methods results in greater student success. Results also show that clicker-enhanced pedagogy for initial learning (knowledge and comprehension) is more beneficial in Bloom's taxonomy than in SEMLI_S; for application, however, clicker-enhanced pedagogy is equally advantageous in both SEMLI_S and Bloom's taxonomy. In addition, results show that SEMLI_S dimensions enhance learning if utilized prior to incorporating Bloom's taxonomy into middle school science lessons, and that the most beneficial teaching practice is to follow up use of SEMLI_S dimensions with Bloom's taxonomy instruction.

6.2 Limitations

It is important to note that this study was limited by a number of factors. First, classes were taught at an elite independent school where students may receive, on average, more science instruction time than at non-independent institutions. Additionally, class sizes were small, resulting in a relatively small sample, and research was conducted for one subject (science) at one grade level. Given the unique sample and setting, findings may not be readily generalized to other populations and settings; nor is generalization the purpose of a case study, which is concerned with the case itself. That said, if comparisons had been made with public schools or other institutions with larger class sizes, the generalizability of the results would have been strengthened.

The researcher of this study was also the instructor and facilitator for all science lessons. Complete objectivity might have been impossible since the researcher (myself) had a vested interest in the successful implementation of the skills, strategies, and technology use related to this study. However, complete student anonymity, together with the qualitative data analysis and the quantitative design methodology of this study, did allow for neutrality in the treatment and analysis of data. I was also aware of the potential bias involved in approaching this study and acted accordingly.

6.3 Implications and recommendations

6.3.1 Implications and recommendations for practice

Educators and instructors may benefit from implementation of a broad-based study of metacognition and clicker-enhanced pedagogy with a larger student population over a longer time period. Practical applications could also be made by: comparing data collected at schools that differ socioeconomically; investigating schools and/or classes with greater ethnic diversity; analyzing data collected across different grade levels; or conducting studies across different subject areas (for example, during integrated studies where there might be crossover between science and English or science and history). Instructors could also consider implementing different types of CRS devices rather than clickers, such as iPads or cell phones. Familiarity with other devices may raise comfort levels amongst students during testing and may influence or change student performance during CRS-enhanced pedagogy.
Instructors may also benefit from monitoring the level of concept difficulty when introducing science units prior to data collection; in this way, fairness in similar research studies can be improved and maintained. Educators may also benefit from early introduction of clicker-enhanced pedagogy to ensure appropriate student use of technology in a data collection setting. Student enthusiasm is always helpful; however, distractions may occur if students consider any type of CRS technology a toy instead of a tool (see student interviews: Appendix C).

Implications for practice may also include an examination of the advantages of considering KCA only, as opposed to all Bloom's taxonomy levels, in research; an investigation into why Bloom's taxonomy results for knowledge and comprehension are more beneficial than SEMLI_S results; and an inquiry into why clicker-enhanced pedagogy is equally advantageous in both SEMLI_S and Bloom's taxonomy. Other implications may include an investigation into why SEMLI_S dimensions enhance learning if utilized prior to incorporating Bloom's taxonomy into middle school science lessons, and why the most beneficial teaching practice is to follow up use of SEMLI_S dimensions with Bloom's taxonomy instruction. Additionally, benefits to stakeholders such as community members and scientists, who ultimately stand to gain from students' improved metacognitive abilities and higher confidence in technology use, should be considered. Students who demonstrate higher-level reasoning skills, competence in technology use, and self-regulated learning (Azevedo & Aleven, 2013) may be more productive members of the workforce. For example, students who have a strong foundation in STEM-based learning, greater metacognitive awareness, and confidence as empowered learners may enter the science or engineering workforce as more proficient problem-solvers, more effective critical thinkers, and better collaborators. The community at large stands to benefit from future workers and employees who demonstrate these useful and important skills.

6.3.2 Implications and recommendations for curriculum

The findings of this study may have implications for curriculum design and middle school science instruction involving metacognition and technology. This study reinforces a need for science educators at local or district levels to consider coming together to discuss science instruction in general. More specifically, this study urges science instructors and educational practitioners to consider how increased metacognitive awareness and use of technology such as clicker-enhanced pedagogy could play an important role in increasing learning opportunities for middle school science students. With a creative approach to curriculum design and a focus on nurturing independent, confident learners through increased metacognitive awareness and clicker-enhanced pedagogy, practitioners may reconsider current traditional teaching methods. Instead, they could focus on ways to implement teaching strategies that empower students, encourage STEM-based learning opportunities, and allow students to take greater ownership of the learning process. STEM-based learning opportunities could also include an exploration of topics other than chemistry, such as robotics, biology, engineering, computer studies, earth science, invention units, and environmental studies.
Design, construction, and implementation of science handouts, lab worksheets or online material that address specific components of the SEMLI_S dimensions could be considered. These may include questions that address specific components of each dimension to determine similarities and differences between CRS-enhanced pedagogy and traditional teaching methods. My curriculum design recommendations include: greater attention to metacognitive awareness in whole class and small group settings; promotion of self-analysis and self-reflection skills amongst students during small group clicker-enhanced pedagogy; design and creation of specific lessons to highlight and promote metacognitive awareness and conceptual understanding in middle school science; and a focus on flipped classroom strategies, which create a student-centered environment instead of a traditional teacher-centric classroom setting (Mazur, 2009), to encourage independent learning situations. Such activities may take place during Pro-D days or through science learning workshops that support idea-sharing and meaningful collaboration amongst educators. Stemming from these opportunities, science educators may revise portions of a curriculum to improve clicker-enhanced pedagogy, increase metacognitive awareness amongst students, and create more authentic learning situations in the middle school science classroom. This study provides insights into the understanding of metacognition and Bloom's taxonomy in terms of the SEMLI_S dimensions (Anderson, Nashon & Thomas, 2008), which may be considered by science educators when designing and creating STEM-based middle school science curricula.

6.3.3 Implications and recommendations for research

Investigations for future research could include a study of all 30 SEMLI_S dimensions and subsequent items (Anderson, Nashon & Thomas, 2008), as opposed to the 17 SEMLI_S dimensions relevant to this specific study; this may depend on grade level or student ability. Further research could also include situations where PI pedagogy is examined with shared CRS devices as opposed to individual use of this technology. With one student representative per group acting as leader, students could discuss conceptual question choices in groups, collaborate on ideas, and ultimately report an agreed-upon answer within a predetermined timeframe.

Future research could also examine how technology-enhanced scaffolding processes (Hannafin & Kim, 2010) may relate to metacognition and conceptual understanding utilizing the SEMLI_S dimensions (Anderson, Nashon & Thomas, 2008). Researchers could also explore alternate methods of incorporating the TPACK framework (Mishra & Koehler, 2006) into a metacognition/conceptual understanding study; the purpose of this research would be to examine more closely the relationship amongst CRS-enhanced pedagogy, student knowledge construction, and foundational (curriculum) content knowledge. Additionally, an in-depth investigation could be conducted to determine how dialogical discourse and meta-level communication in the TEFA model (Beatty & Gerace, 2009) might increase students' self-regulation skills and enhance metacognition.

In addition, I propose a method of utilizing SEMLI_S dimensions for further investigation of metacognition and conceptual understanding called Facilitated Intrinsic Metacognition Awareness (FIMA).
In such a study, research could include an examination of how teachers might act as facilitators and mentors to help increase metacognitive awareness during middle school science lessons and activities. The results of this type of study may allow more effective implementation of strategies to increase metacognitive awareness and empower students to be more effective learners.

As a researcher and practitioner, I have found this study transformative in terms of my classroom teaching strategies. Firstly, I look at each small group interaction differently and constantly strive to guide students toward improved peer-to-peer interactions, so that more meaningful collaborations take place to maximize learning and enhance metacognitive awareness. Secondly, this study reflects that the SEMLI_S subscales can be modified to suit younger students, resulting in an authentic examination of conceptual understanding and metacognition in a middle school setting, and there is potential to consider further modifications under the umbrella of a FIMA study. Additionally, with a clear understanding of metacognition and of questioning strategies related to Bloom's taxonomy levels, my students have become more self-reflective, independent learners. What is most powerful is that, with adjustments made to my teaching style and curriculum design, students have responded by demonstrating improved critical thinking skills, enhanced problem-solving strategies, and growing confidence in their abilities. Experiencing this metacognitive metamorphosis and witnessing growth and success amongst my students has been part of an incredibly exciting journey!

References

Akana, K., & Yamauchi, H. (2004). Task performance and metacognitive experiences in problem-solving. Psychological Reports, 94(2), 715-722.

Anderson, D., & Nashon, S. (2007). Predators of knowledge construction: Interpreting students' metacognition in an amusement park physics program. Science Education, 91(2), 298-320.

Anderson, D., Nashon, S., & Thomas, G. (2008). Development of an instrument designed to investigate elements of science students' metacognition, self-efficacy and learning processes: The SEMLI-S. International Journal of Science Education, 30(13), 1701-1724. doi:10.1080/09500690701482493

Ausubel, D. P. (1963). Cognitive structure and the facilitation of meaningful verbal learning. Journal of Teacher Education, 14, 217-222.

Ausubel, D. P. (1968). Educational psychology: A cognitive view. New York: Holt, Rinehart & Winston.

Azevedo, R., & Aleven, V. (Eds.). (2013). International handbook of metacognition and learning technologies. New York: Springer. doi:10.1007/978-1-4419-5546-3

Baird, J. R. (1986). Improving learning through enhanced metacognition: A classroom study. European Journal of Science Education, 8(3), 263-282.

Baird, J. R. (1992). Collaborative reflection, systematic inquiry, better teaching. In T. Russell & H. Munby (Eds.), Teachers and teaching: From classroom to reflection, 33-48. London: Falmer Press.

Barron, B. (2000). Achieving coordination in collaborative problem-solving groups. Journal of the Learning Sciences, 9(4), 403-436.

Bassey, M. (1993). Case study research in educational settings. Buckingham: Open University Press.

Beatty, I. D. (2004). Transforming student learning with classroom communication systems (Research Bulletin No. ERB0403). Educause Center for Applied Research.

Beatty, I. D., & Gerace, W. J. (2009). Technology-enhanced formative assessment: A research-based pedagogy for teaching science with classroom response technology. Journal of Science Education and Technology, 18(2), 146-162.
Black, P., & Wiliam, D. (2001). Inside the black box: Raising standards through classroom assessment. London: King's College School of Education.

Bloom, B. S. (1965). Taxonomy of educational objectives: The classification of educational goals. New York: David McKay Company.

Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. New York: David McKay Company.

Bloom, B. S., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals, by a committee of college and university examiners. Handbook 1: Cognitive domain. New York: Longmans.

Bolotin, M. M., Antimirova, T., & Petrov, A. (2010). Clickers beyond the first-year science classroom. Journal of College Science Teaching, 40(2), 14-18.

Boyle, J. T., & Nicol, D. J. (2003). Using classroom communication systems to support interaction and discussion in large class settings. Association of Learning Technology Journal, 11, 43-57.

Bransford, D., Brown, A. L., & Cocking, R. R. (2002). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.

Brown, A. (1987). Metacognition, executive control, self-regulation and other more mysterious mechanisms. In F. E. Weinert & R. H. Kluwe (Eds.), Metacognition, motivation and understanding. Hillsdale: Lawrence Erlbaum.

Burnstein, R. A., & Lederman, L. M. (2001). Using wireless keypads in lecture classes. The Physics Teacher, 39, 8-11.

Butler, D. L., Beckingham, B., & Novak Lauscher, H. J. (2005). Promoting strategic learning by eighth-grade students struggling in mathematics: A report of three case studies. Learning Disabilities Research and Practice, 20, 156-174.

Butler, D. L. (2002). Qualitative approaches to investigating self-regulated learning: Contributions and challenges. Educational Psychologist, 37, 59-63.

Caldwell, J. E. (2007). Clickers in the large classroom: Current research and best-practice tips. CBE Life Sciences Education, 6, 9-20.

Caldwell, J., Zelkowski, J., & Butler, M. (2006). Using personal response systems in the classroom. WVU Technology Symposium, Morgantown, WV. Retrieved 5 January 2013.

Carter, E., Cushing, L., & Kennedy, C. (2008). Peer support strategies for improving students' social lives and learning. Baltimore, MD: Brooks Publishing Company.

Chaiklin, S. (2003). The zone of proximal development in Vygotsky's analysis of learning and instruction. In A. Kozulin, B. Gindis, V. S. Ageyev & S. M. Miller (Eds.), Vygotsky's educational theory in cultural context (1st ed., pp. 39-64). Cambridge: Cambridge University Press.

Cobb, P. (1994). Where is the mind? Constructivist and sociocultural perspectives on mathematical development. Educational Researcher, 23(7), 13-20.

Connor, C. M., Jakobsons, L. J., Crowe, E., & Meadows, J. (2009). Instruction, differentiation, and student engagement in Reading First classrooms. Elementary School Journal, 109, 221-250.

Crouch, C. H., & Mazur, E. (2001). Peer instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970-977.

Davier, A. v., & Halpin, P. F. (2013). Collaborative problem solving and the assessment of cognitive skills: Psychometric considerations. ETS Research Report, 13(41).

DeChurch, L. A., & Mesmer-Magnus, J. R. (2010). The cognitive underpinnings of effective teamwork: A meta-analysis. Journal of Applied Psychology, 95, 32-53.
Denscombe, M. (2002). Ground rules for good research: A 10 point guide for social researchers. Philadelphia, PA: Open University Press.

Dewey, J. (1916). Democracy and education: An introduction to the philosophy of education (1966 ed.). New York: Free Press.

Draper, S. W., & Brown, M. I. (2004). Increasing interactivity in lectures using an electronic voting system. Journal of Computer Assisted Learning, 20, 81-94.

Driver, R., Squires, A., Rushworth, P., & Wood-Robinson, V. (1994). Making sense of secondary science. London: Routledge.

Driver, R., Asoko, H., Leach, J., Mortimer, E., & Scott, P. (1994). Constructing scientific knowledge in the classroom. Educational Researcher, 23(7), 5-12.

Dufresne, R. J., Gerace, W. J., Mestre, J. P., & Leonard, W. J. (2000). ASK-IT/A2L: Assessing student knowledge with instructional technology (Tech. Rep. dufresne-2000ask). University of Massachusetts Amherst Scientific Reasoning Research Institute.

EDUCAUSE. Seven things you should know about clickers. http://net.educause.edu/ir/library/pdf/ELI7002.pdf. Retrieved online 29 December 2012.

Elliot, C. (2003). Using a personal response system in economics teaching. International Review of Economics Education, 1(1), 80-86.

Ernest, P. (1995). The one and the many. In L. Steffe & J. Gale (Eds.), Constructivism in education, 459-486. New Jersey: Lawrence Erlbaum Associates.

Ewing, J., Foster, D., & Whittington, M. (2011). Explaining student cognition during class sessions in the context of Piaget's theory of cognitive development. NACTA Journal. Retrieved online 12 July 2013.

Fike, D., Fike, R., & Lucio, K. (2012). Does clicker technology improve student learning? Journal of Technology and Teacher Education, 20(2), 113-126.

Fischer, F., Kollar, I., Stegmann, K., & Wecker, C. (2013). Toward a script theory of guidance in computer-supported collaborative learning. Educational Psychologist, 48(1), 56-66. doi:10.1080/00461520.2012.748005

Flavell, J. H. (1971). The uses of verbal behavior in assessing children's cognitive abilities. In D. R. Green, M. P. Ford & G. B. Flamer (Eds.), Measurement and Piaget (pp. 198-204). New York: McGraw-Hill.

Flavell, J. H. (1976). Metacognitive aspects of problem solving. In L. B. Resnick (Ed.), The nature of intelligence (p. 233). Lawrence Erlbaum Associates.

Flavell, J. H. (1981). Cognitive monitoring. In W. P. Dickerson (Ed.), Children's oral communication skills (35-60). New York: Academic Press.

Garofalo, J., & Lester, F. (1985). Metacognition, cognitive monitoring, and mathematical performance. Journal for Research in Mathematics Education, 16(3), 163-176.

Geertz, C. (1973). Thick description: Toward an interpretive theory of culture. In C. Geertz, The interpretation of cultures, 3-30. New York: Basic Books.

Gergen, K. J. (1994). Realities and relationships. Cambridge: Harvard University Press.

Glasersfeld, E. v. (1989). Constructivism in education. http://www.vonglasersfeld.com/114. Retrieved 2 January 2013.

Glasersfeld, E. v. (1996). Introduction: Aspects of constructivism. In C. Fosnot (Ed.), Constructivism: Theory, perspectives, and practice, 3-7. New York: Teachers College Press.

Grbich, C. (2013). Qualitative data analysis: An introduction. Thousand Oaks, CA: Sage.

Graham, R. D. (2013). Smart clickers in the classroom: Technolust or the potential to engage students? Canadian Journal of Action Research, 14(1), 3-20.

Gunstone, R. (1994). Metacognition and learning to teach. International Journal of Science Education, 16(5), 523-537.
Hacker, D., Dunlosky, J., & Graesser, A. (1998). Metacognition in educational theory and practice. Mahwah, NJ: Erlbaum Associates.
Hafner, R., & Culp, S. (1996). Elaborating the structures of a science discipline to improve problem-solving instruction: An account of classical genetics' theory structure, function and development. Science and Education, 5(4), 331-355.
Hannafin, M., & Kim, M. (2010). Scaffolding 6th graders' problem solving in technology-enhanced science classrooms: A qualitative case study. Springer Science+Business Media B.V. Retrieved 8 July 2013.
Hewson, P. W., & Hewson, M. G. A'B. (1988). An appropriate conception of teaching science: A view from studies of science learning. Science Education, 72(5), 597-614.
Hmelo-Silver, C. E., Chinn, C. A., O'Donnell, A. M., & Chan, C. (Eds.). (2013). The international handbook of collaborative learning (pp. 389-402). New York, NY: Routledge.
Jackson, M., & Trees, A. (2003). Clicker implementation and assessment. Information and Technology Services and Faculty Teaching Excellence Program, University of Colorado, Boulder, CO. Available at: http://comm.colorado.edu/mjackson/clickerreport.htm
Jones, M. E., Antonenko, P. D., & Greenwood, C. M. (2012). The impact of collaborative and individualized student response system strategies on learner motivation, metacognition, and knowledge transfer. Journal of Computer Assisted Learning, 28(5), 477-487. doi:10.1111/j.1365-2729.2011.00470.x
Joseph, N. (2006). Strategies for success: Teaching metacognitive skills to adolescent learners. New England Reading Association Journal, 42(1).
Joseph, N. (2010). Metacognition needed: Teaching middle and high school students to develop strategic learning skills. Preventing School Failure, 54(2), 99-103.
Karpov, Y. (2003). Vygotsky's doctrine of scientific concepts: Its role for contemporary education. In A. Kozulin, B. Gindis, V. Ageyev, & S. Miller (Eds.), Vygotsky's educational theory in cultural context (pp. 138-155). Cambridge: Cambridge University Press.
Kirschner, F., Paas, F., Kirschner, P. A., & Janssen, J. (2011). Differential effects of problem-solving demands on individual and collaborative learning outcomes. Learning and Instruction, 21, 587-599.
Krathwohl, D. R., Bloom, B. S., & Masia, B. B. (1964). Taxonomy of educational objectives: The classification of educational goals. Handbook II: The affective domain. New York: David McKay Company.
Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: An overview. Theory Into Practice, 41(4), 212-218. doi:10.1207/s15430421tip4104_2
Kyronlampi-Kylmanen, T., & Maatta, K. (2011). Using children as research subjects: How to interview a child aged 5 to 7 years. Educational Research and Reviews, 6(1), 87-93.
Larkin, S. (2006). Collaborative group work and individual development of metacognition in the early years. Research in Science Education, 36(1), 7-27.
Lasry, N., Guillemette, J., & Mazur, E. (2014). Two steps forward, one step back. Nature Physics, 10(6), 402-403. doi:10.1038/nphys2988
Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York: Cambridge University Press.
Levy, P. (2003). A methodological framework for practice-based research in networked learning. Instructional Science, 31, 87-109.
MacArthur, J. R., & Jones, L. J. (2008). A review of literature reports of clickers applicable to college chemistry classrooms. The Royal Society of Chemistry, 9, 187-195.
Mayer, C., & Sigley, R. (2014). Task-based interviews in mathematics education. Encyclopedia of Mathematics Education. doi:10.1007/978-94-007-4974-8
Mazur, E. (1996). Confessions of a converted lecturer. Retrieved 30 December 2012, from http://www.lincoln.edu/math/faculty/JPathak/Lincoln_MSPGP/IHE/Mazur.pdf
Mazur, E. (1997). Peer instruction: A user's manual. Upper Saddle River, NJ: Prentice Hall.
Mazur, E. (2009). Confessions of a converted lecturer. http://www.youtube.com/watch?v=WwslBPj8GgI. Retrieved 30 December 2012.
Mead, G. H. (1934). Mind, self and society. Chicago, IL: University of Chicago Press.
Merriam, S. B. (1998). Qualitative research and case study applications in education. San Francisco: Jossey-Bass Publishers.
Milner-Bolotin, M. (2004). Tips for using a peer response system in a large introductory physics class. The Physics Teacher, 42(4), 253. doi:10.1119/1.1696604
Milner-Bolotin, M., Fisher, H., & MacDonald, A. (2013). Modeling active engagement pedagogy through classroom response systems in a physics teacher education course. LUMAT: Research and Practice in Math, Science and Technology Education, 1(5), 525-544.
Mintzes, J. J., Wandersee, J. H., & Novak, J. D. (1998). Teaching science for understanding: A human constructivist view. Englewood Cliffs, NJ: Academic Press.
Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.
Nashon, S. M., Anderson, D., & Nielsen, W. (2005). Students' metacognitive traits as pointers to their subsequent knowledge construction. Proceedings of the Annual Meeting of the National Association for Research in Science Teaching, Dallas, TX.
National Science Foundation. (2011). New report offers roadmap for success in K-12 STEM education. Press release 11-126. http://www.nsf.gov/news/news_summ.jsp?cntn_id=120902. Retrieved 3 January 2013.
Next Generation Science Standards. (2013). Retrieved from http://www.nextgenscience.org/
Nickerson, R. S. (1988). On improving thinking through instruction. In E. Z. Rothkopf (Ed.), Review of Research in Education, 15, 3-57. Washington, DC: American Educational Research Association.
Paris, S. G., & Winograd, P. W. (1990). How metacognition can promote academic learning and instruction. In B. J. Jones & L. Idol (Eds.), Dimensions of thinking and cognitive instruction (pp. 15-51). Hillsdale, NJ: Lawrence Erlbaum Associates.
Patton, M. Q. (2002). Fieldwork strategies and observation methods. In Qualitative research and evaluation methods (3rd ed., pp. 259-332). Thousand Oaks, CA: Sage. Cited in Yin, R. K. (2014). Case study research: Design and methods. Thousand Oaks, CA: Sage.
Piaget, J. (1978). Success and understanding. Cambridge, MA: Harvard University Press.
Pintrich, P. R. (2002). The role of metacognitive knowledge in learning, teaching and assessing. Theory Into Practice, 41(4).
PISA Programme for International Student Assessment. (2009). http://www.oecd.org/pisa/. Retrieved 29 December 2012.
PISA Programme for International Student Assessment. (2012). Retrieved from https://nces.ed.gov/surveys/pisa/pisa2009highlights_4.asp
Posner, G. J., Strike, K. A., Hewson, P. W., & Gertzog, W. A. (1982). Accommodation of a scientific conception: Toward a theory of conceptual change. Science Education, 66(2), 211-227.
Rand, S. (2005). Metacognition in science. Science Scope, 59-61.
Salomon, G., Perkins, D. N., & Globerson, T. (1991). Partners in cognition: Extending human intelligence with intelligent technologies. Educational Researcher, 20, 2-9.
Smith, M. K., Trujillo, C., & Su, T. T. (2010). The benefits of using clickers in small-enrollment seminar-style biology courses. CBE Life Sciences Education, 10(1), 14-17.
Stake, R. E. (1995). The art of case study research. London: Sage Publications.
Tarricone, P. (2011). The taxonomy of metacognition. New York, NY: Psychology Press.
Tillotson, J. (2000). Studying the game: Action research in science education. The Clearing House, 74(1), 31-34.
Vacca. (2008). Using scaffolding techniques to teach a social studies lesson about Buddha to sixth graders. Journal of Adolescent and Adult Literacy, 51(8), 652-658.
Van den Hoonaard, D. (2012). Qualitative research in action: A Canadian primer. Don Mills, Ontario: Oxford University Press.
Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
White, R. T. (1993). Insights on conceptual change derived from extensive attempts to promote metacognition. Paper presented at the Annual Meeting of the American Educational Research Association, Atlanta, GA.
White, R. T., Bakopanos, V., & Swan, S. (1988). Increasing metalearning: Two classroom interventions. Paper presented at the Meeting of the Australian Association for Research in Education, Armidale, Australia.
White, R. T., & Gunstone, R. F. (1998). Metalearning and conceptual change. International Journal of Science Education, 11, 577-587.
Whitebread, D., et al. (2009). Development of two observational tools for assessing metacognition and self-regulated learning in young children. Metacognition and Learning, 4(1), 63-85.
Yin, R. K. (2014). Case study research: Design and methods. Thousand Oaks, CA: Sage.

Appendices

Appendix A: Clicker use curriculum calendar

Date | Class | Topic | Curriculum Details | Instructional Methods | Additional Information
Week of 1/27/14 | All classes | Chemistry | Sample terms: elements, compounds, mixtures | Lecture, Q/A (whole class setting) |
1/31/14 | 5A/5B (= 2/4 classes) | Chemistry | Sample terms: elements, compounds, mixtures | Clicker use: power point (ppt) review session (5 Qs), ppt presentation #1 |
Week of 2/3/14 | All classes | Chemistry | Chemical reactions + Sunset in a Bag lab activity. Sample terms: exothermic, endothermic, conduction, convection, radiation | Lab introduction (whole class); lab activity (small group) | Quiz: 2/4/14 for all classes
2/5/14 & 2/6/14 | 5A/5B (= 2/4 classes) | Chemistry | Exothermic, endothermic, conduction, convection, radiation | Clicker use: ppt review session (5 Qs), ppt presentation #2 |
2/7/14 & 2/10/14 | All classes | Chemistry | Electron orbital diagrams (rules, electron shell configuration) | Lesson on electron orbital diagrams (practice on prepared handouts, whole class interaction) |
2/11/14 | 5A/5B (= 2/4 classes) | Chemistry | Electron orbital diagrams (rules for drawing them) | Small group (lab team) work: orbital diagram practice, exchange of diagrams & peer editing; correct partner's work, discuss, and then return |
2/19/14 & 2/20/14 | 5A/5B (= 2/4 classes) | Data table construction (no template provided) | Creation/design of lab data table for chemistry experiment (Alka-Seltzer lab) | Clicker use: ppt review session (5 Qs), ppt presentation #3; small group (lab team) work, then whole class discussion (reinforcing definition of metacognition); introduction of metacognition-based handout; small group work on chemistry lab data table creation. Questions: What does that mean? What does that look like? What am I learning about my thinking process throughout this activity/challenge? | Set 1: Clicker = 5A/5B
2/21/14 | All classes | Data table construction | Revisiting data table construction | Debrief re: challenges/successes concerning data table creation; more brainstorming & dialogue (whole class discussion); idea sharing; adjustments made to some students' data tables, some re-done/re-formatted as a result (small group setting) |
2/24/14 | 5C/5D (= 2/4 classes) | Data table construction | Data table organization & meaning of related terms | Clicker use: ppt review session (6 Qs), ppt presentation #4. For Q#4 (quantitative data), students answered the question, then had a side conversation with a partner to clarify the definition of 'quantitative,' then answered Q#4 again; results differ (see histogram results) |
3/5/14 & 3/6/14 | 5C/5D (= 2/4 classes) | Chemistry | Arrangements of atoms (chemical reactions, rates of reactions); packet distributed for reading, highlighting important points, Questions #1-6 within packet | Clicker use: ppt review session (5 Qs), ppt presentation #5; small group work w/ supporting handout on questioning guidelines (for all classes); Bloom's taxonomy handout included + class discussion (all) on critical thinking question techniques | Quiz: 3/10/14 for all classes. For 5C, Q#5 asked twice; for 5D, Q#3 asked twice; students engaged in a side conversation between the first & second try
4/11/14 | 5C/5D (= 2/4 classes) | Chemistry | Balancing chemical equations | Small group work to solve balancing chemical equation questions (at lab stations); balancing chemical equations question sheet given to students; clicker use = Chemistry power point #6 (4 Qs) | Quiz: 4/14/14 for all classes. Set 2: Clicker = 5C/5D. All classes received the same balancing chemical equations question sheet. For 5D, Q#1 produced a 50/50 result when first asked; after a short time for 'side conversations' the same Q was asked again, with improved results
4/25/14 - 5/2/14 | All classes | Chemistry | Molecular structure; Handouts #1-5 | 3-4 students per group on average; observing student involvement/collaboration & interactions (videotaped) |
5/2/14 | All classes | Chemistry | Molecular structure; Handouts #4-5 | Small Group Question/Comment Handout completed by all classes |
5/5/14 | 5A/5B (= 2/4 classes) | Chemistry | Molecular structure | Clicker use = Chemistry power point #7 (5 Qs) | Quiz: 5/9/14 for all classes. Set 3: Clicker = 5A/5B. For 5B, Q#2 asked twice after an opportunity for a side conversation; a hand count indicated that more than half of the class found side conversations beneficial

Appendix B: PowerPoint presentations

Chemistry Power Point 1 (slides preserved as images; no extractable text)

Chemistry Power Point 2: Chemistry Questions

Question #1: The term exothermic means:
• A) A chemical reaction resulting in no release of heat or cold
• B) A chemical reaction resulting in heat being absorbed
• C) A chemical reaction resulting in the release of heat
• D) A chemical reaction did not occur
Confidence level for Q#1: A) Very confident about my answer; B) Somewhat confident about my answer; C) Not confident about my answer
Answer to Q#1: C) A chemical reaction resulting in the release of heat

Question #2: The term endothermic means:
• A) A chemical reaction resulting in heat being absorbed
• B) A chemical reaction resulting in no release of heat or cold
• C) A chemical reaction resulting in the release of heat
• D) A chemical reaction did not occur
Confidence level for Q#2: A) Very confident about my answer; B) Somewhat confident about my answer; C) Not confident about my answer
Answer to Q#2: A) A chemical reaction where heat is absorbed

Question #3: An example of radiation is:
• A) Feeling the cool air from a fan
• B) Heat traveling along a wire
• C) Getting hot as you work up a sweat
• D) When the sun heats the Earth
Confidence level for Q#3: A) Very confident about my answer; B) Somewhat confident about my answer; C) Not confident about my answer
Answer to Q#3: D) When the sun heats the Earth

Question #4: An example of conduction is:
• A) When heat rises and cool air falls
• B) When the sun heats the Earth
• C) When heat travels through materials such as a frying pan to the handle
• D) Only when heat travels through fluids such as water
Confidence level for Q#4: A) Very confident about my answer; B) Somewhat confident about my answer; C) Not confident about my answer
Answer to Q#4: C) When heat travels through materials such as a frying pan to the handle

Question #5: An example of convection is:
• A) Heat traveling along a piece of metal
• B) The process of water boiling in a pot
• C) Feeling the heat from the sun
• D) None of the above
Confidence level for Q#5: A) Very confident about my answer; B) Somewhat confident about my answer; C) Not confident about my answer
Answer to Q#5: B) The process of water boiling in a pot

Chemistry Power Point 3 (slides preserved as images; no extractable text)

Chemistry Power Point 4: Chemistry - Data Tables

Question #1: The purpose of a data table is to:
• A) Organize information based on the experiment you are performing
• B) Share information with others clearly
• C) Record information efficiently
• D) Make data easy to access & analyze
• E) All of the above
Answer to Q#1: E) All of the above

Question #2: A data table consists of:
• A) Rows only
• B) Columns only
• C) Both rows and columns
• D) A detailed graph
Answer to Q#2: C) Both rows and columns

Question #3: You need specific headings (titles) to describe what's represented in a data table:
• A) Yes
• B) No
Answer to Q#3: A) Yes

Question #4: Quantitative data:
• A) Consists of detailed, written explanations
• B) Consists of numbers such as time, distance or weight
• C) Both A & B
• D) None of the above
Answer to Q#4: B) Consists of numbers such as time, distance or weight

Question #5: Identical data tables can always be used for every experiment you perform:
• A) True
• B) False
Answer to Q#5: B) False

Question #6: Written observations are sometimes helpful & can be included in some data tables:
• A) True
• B) False
Answer to Q#6: A) True

Confidence level for Qs #1-6: A) Very confident about my answers; B) Somewhat confident about my answers; C) Not confident about my answers
Chemistry Power Point 5: Chemistry - "Chemical Reactions Alter Arrangements of Atoms" Packet

Question #1: In a chemical reaction:
• A) New arrangements of atoms form different substances
• B) New arrangements of atoms form the same substances
• C) Substances simply change their phase or state
• D) Atoms can never change their arrangement
Answer to Q#1: A) New arrangements of atoms form different substances

Question #2: In a chemical reaction, REACTANTS are:
• A) Substances formed at the end of a reaction
• B) Substances formed between reactions
• C) Substances present at the beginning of a reaction
• D) None of the above
Answer to Q#2: C) Substances present at the beginning of a reaction

Question #3: In this type of chemical reaction, one reactant is always oxygen & another reactant often contains carbon & hydrogen:
• A) Decomposition reaction
• B) Synthesis reaction
• C) Precipitate reaction
• D) Combustion reaction
Answer to Q#3: D) Combustion reaction

Question #4: When a large piece of material is broken down into smaller parts, this increases the ____________ of the material & therefore affects the rate of the chemical reaction.
• A) temperature
• B) surface area
• C) concentration
• D) tension
Answer to Q#4: B) surface area

Question #5: A substance that increases the rate of a chemical reaction but does not change itself during the reaction is called a(n):
• A) Molecule
• B) Element
• C) Catalyst
• D) Protein
Answer to Q#5: C) Catalyst

Confidence level for Qs #1-5: A) Very confident about my answers; B) Somewhat confident about my answers; C) Not confident about my answers

Chemistry Power Point 6 (slides preserved as images; no extractable text)

Chemistry Power Point 7 (earlier slides preserved as images; only the final slides' text was extractable)

Question #5: How many 'bonding points' does Hydrogen have?
• a) 1
• b) 2
• c) 3
• d) 4
• e) None of the above
Answer to Q#5: a) 1

Confidence level for Qs #1-5: A) Very confident about my answers; B) Somewhat confident about my answers; C) Not confident about my answers

Appendix C: Student interview questions & transcripts

Student Interview Questions - iClickers & Metacognition (Bullet point notes are OK here)

- What do you like about using iClickers during science class?
- What do you dislike about using iClickers during science class?
- Do you think using iClickers helps your ability to learn & understand concepts? If so, can you explain how?
- What are some differences in iClicker use between whole class & small group situations?
- How do you think iClickers could be used more effectively during science (during small group work or whole class situations)?
- Include any additional comments regarding iClicker use in science here.
- What is metacognition?

Student Interview Transcripts, 6/1/14

Student 1:
- I'm here to talk about iClickers and metacognition.
- What I like most about using the iClickers during science class is that Mrs. D can give us an evaluation on how we're learning w/o having to take a quiz.
- I think iClickers can be used more effectively by giving us more opportunities for small group questions and a little bit harder questions for when we are in small groups.
- I think that the (Bloom's taxonomy) pyramid idea was very good because it spelled out what each level you should use was and I think the pyramid also helped people helped more people get involved in the discussion.

Student 2:
- I liked that it (iClickers) was better technology which kids like using instead of writing down on a piece of paper.
- Um, sometimes it (writing) hurts your hand or if you break your writing hand, it's easy to push buttons instead of writing down your answers.
- (What I dislike about iClickers) is when you answer and a person asks you 'what did you answer with?' and then you're like, um, it's kind of embarrassing is like, 'I answered A. Did you answer A?' And then yah, so then you get embarrassed if you didn't answer right and the people around you maybe get the wrong answer and you get the right one. (It's like) people intruding on your screen.
- (How do you think iClickers could be used more effectively during science class?) During like a pop-quiz it would be really good. So, you can just, um, they'd see how long you would take to answer a question, um, so you would see if they knew it or not. Like if they press it immediately, they know it but if they take a long time and really think about it, sometimes that means like you're figuring out, or like maybe you're looking over at another screen and cheating and you don't know it (the answer). So the pop-quiz would be very good. And you could see who was maybe um cheating or doesn't know the question and didn't study.
- (What is metacognition?) Thinking of what you're thinking about.

Student 3:
- For my questions I thought I would do number 1, 2 & 5.
- (1. What do you like about using iClickers during science class?) Nowadays, kids are more, like, kids like using electronics. Say you broke your right arm and you have a hard time writing if you're a righty and you can just press a button for an answer.
- (2. What do you dislike about iClickers?) When you're doing a pop-quiz people get distracted and start pressing random buttons and they'll also kind of be like, 'Oh, I bet I'm going be the first one to press in it' and then there is an argument about 'oh no, I pressed in first.'
- (5. How do you think iClickers be used more effectively during science class?) Well, I think it could be used for like handouts, and… Like, if she (teacher) gives us a handout about a certain topic and you have to write in all the answers, it could be used to press in the answers and it could be used for tests, quizzes and pop-quizzes.

Student 4:
- Questions I'll be doing are numbers 1, 5 & 7.
- (1. What do you like about using iClickers during science class?) I feel like, during class, kids these days like using technology like iPads.
- (5. How do you think iClickers be used more effectively during science class?) They could be used during pop-quizzes because the teacher could see if they answered B, F, D, C.
- (7. What is metacognition?) Metacognition is thinking about thinking.

Student 5:
- Hi, this is the student interview questions for iClickers and metacognition.
- So, during science class I really enjoyed using iClickers because they made me more engaged using them and helping me learn better while still having fun.
- And, they also helped me, kind of use, think more instead of just using like normal pen and paper questions.
- (5. How do you think iClickers be used more effectively during science class?) During science, I think iClickers could be used more effectively if they were used for like a quiz, like a one on one quiz. I think that would be helpful in kind of be seeing iClickers and making people more comfortable with them.
- In small group work, I think they'd be better kind of like to organize ideas a bit so that it's easier to organize ideas for a lab report.
- Metacognition is just thinking about thinking and understanding how you're supposed to think and getting, becoming more…(philosophical?) with thinking.

Appendix D: A taxonomy of Socratic questions

Cited from: http://www.wolaver.org/teaching/socratic.htm

A Taxonomy of Socratic Questions
Richard Paul

To make the Socratic questioning method readily usable by teachers, identifiable categories of questions have been established (Paul, Richard, Critical Thinking: How to Prepare Students for a Rapidly Changing World, 1993, pp. 276-77):
• Questions of clarification
• Questions that probe assumptions
• Questions that probe reasons and evidence
• Questions about viewpoints or perspectives
• Questions that probe implications and consequences
• Questions about the question

Questions of clarification
Questions of clarification basically ask for verification, additional information, or clarification of one point or main idea. The student would be expected to provide the information, expound on an opinion, rephrase the content, or explain why he/she made that particular statement. Clarification may also be requested from others in the discussion group.
• Why do you say that?
• How does this relate to our discussion?
• "Are you going to include diffusion in your mole balance equations?"

Questions that probe assumptions
Many questions can center around the concept of assumptions. The student may be asked for clarification, verification, explanation, or the reliability of an assumption. Students may also be asked to identify another assumption which might apply to the particular case.
• What could we assume instead?
• How can you verify or disprove that assumption?
• "Why are you neglecting radial diffusion and including only axial diffusion?"

Questions that probe reasons and evidence
This category of probing questions asks for additional examples, evidence which has been discovered, reasons for making statements, the adequacy of those reasons, the process which led the student to this belief, or anything which would change the student's mind on the issue.
• What would be an example?
• What is...analogous to?
• What do you think causes...to happen? Why?
• "Do you think that diffusion is responsible for the lower conversion?"

Questions about viewpoints or perspectives
The student might be asked whether there are alternatives to this viewpoint or perspective, how other groups or people might respond, what argument a person who disagrees with this viewpoint might use, or for a comparison of similarities and differences between viewpoints.
• What would be an alternative?
• What is another way to look at it?
• Would you explain why it is necessary or beneficial, and who benefits?
• Why is...the best?
• What are the strengths and weaknesses of...?
• How are...and...similar?
• What is a counterargument for...?
• "With all the bends in the pipe, from an industrial/practical standpoint, do you think diffusion will affect the conversion?"
Questions that probe implications and consequences
The student might be asked to describe and discuss the implications of what is being done or said, the effect which would result, the alternatives which might be feasible, or the cause and effect of an action.
• What generalizations can you make?
• What are the consequences of that assumption?
• What are you implying?
• How does...affect...?
• How does...tie in with what we learned before?
• "How would our results be affected if we neglected diffusion?"

Questions about the question
The student might be asked to identify the question, the main point, or the issue at hand. In addition, the student might be asked to break the question into single concepts rather than multiple concepts, or to determine whether some type of evaluation needs to take place. The student or discussion group may also be asked to identify why this question is important.
• What was the point of this question?
• Why do you think I asked this question?
• What does...mean?
• How does...apply to everyday life?
• "Why do you think diffusion is important?"

Appendix E: Excel spreadsheet sample (spreadsheet preserved as an image)

Appendix F: Student exit questionnaire ratings

Student Exit Questionnaire - Clickers (ratings 1-10). Question #1: Rate clicker lessons; Question #2: Bar graph value; Question #3: Focus/attention during clickers; Question #4: Useful technology; Question #5: Side conversation effectiveness. Each row below lists one student's ratings for Questions #1-5.

Class 5B:
7.0  5.0  8.5  9.0  9.5
10.0  10.0  10.0  10.0  10.0
8.0  5.0  7.0  5.0  8.0
6.0  8.0  9.0  10.0  9.5
8.0  7.0  10.0  9.0  9.0
9.0  10.0  9.0  10.0  8.0
8.0  7.0  8.0  6.0  10.0
9.0  10.0  8.0  10.0  5.0
8.0  6.0  9.0  8.0  7.0
8.0  10.0  2.0  9.0  4.0
8.5  7.5  9.0  10.0  7.0
7.0  10.0  5.0  6.0  5.0
1.0  1.0  1.0  1.0  1.0
7.0  9.0  9.0  9.0  4.0
7.0  1.0  7.0  9.0  10.0
10.0  7.0  10.0  10.0  7.0
8.0  5.0  10.0  8.0  4.0
8.0  9.5  8.0  9.0  10.0
8.0  6.0  9.0  10.0  10.0

Class 5D:
4.0  2.0  3.5  4.5  6.0
8.0  6.0  10.0  9.0  6.0
7.5  7.0  9.5  10.0  9.0
10.0  7.0  9.0  10.0  10.0
9.0  10.0  10.0  10.0  9.0
6.0  8.0  5.0  9.0  8.0
9.0  10.0  10.0  5.0  8.0
10.0  10.0  10.0  10.0  10.0
8.0  8.0  10.0  8.0  3.0
7.0  10.0  8.0  9.0  6.0
9.0  5.0  10.0  10.0  5.0
10.0  5.0  10.0  10.0  6.0
10.0  7.0  5.0  10.0  10.0
10.0  9.0  10.0  9.0  9.0
9.0  6.0  8.0  8.0  4.5
8.5  6.5  7.5  9.0  8.0

Class 5C:
6.0  5.0  8.0  8.5  1.0
9.0  3.0  9.0  7.0  4.0
10.0  1.0  10.0  10.0  1.0
8.0  9.5  7.0  7.0  3.0
9.0  7.0  10.0  10.0  8.0
10.0  10.0  7.0  10.0  8.0
10.0  5.0  7.0  10.0  8.0
9.5  8.0  9.0  10.0  6.0
4.0  5.0  8.0  7.0  2.0
9.0  8.0  7.0  9.0  6.0
9.0  10.0  9.0  10.0  8.0
6.0  4.0  10.0  10.0  4.0
6.0  8.0  10.0  3.0  1.0
8.0  5.0  7.0  7.0  9.0
8.0  10.0  9.0  10.0  3.0
8.0  10.0  9.0  8.0  2.0
8.5  10.0  5.0  6.5  3.0
10.0  7.0  8.0  9.0  9.0

Class 5A:
6.0  6.0  7.0  8.0  5.0
8.5  9.0  7.5  9.0  7.0
10.0  6.0  5.0  10.0  9.0
10.0  8.0  10.0  9.0  9.0
10.0  10.0  10.0  10.0  10.0
7.0  9.0  6.0  6.0  4.0
5.0  10.0  6.0  10.0  10.0
7.0  5.0  10.0  8.0  1.0
8.0  7.0  4.0  7.0  7.0
5.0  9.0  10.0  5.0  9.0
8.0  8.0  9.0  10.0  5.0
9.0  9.5  7.0  8.5  6.0
8.0  9.0  10.0  10.0  5.0
7.0  9.0  6.0  7.0  5.0
7.0  3.0  10.0  10.0  1.0
7.0  5.0  10.0  8.0  4.0
1.0  10.0  8.0  1.0  1.0
8.0  10.0  10.0  9.0  6.0
8.0  6.0  10.0  9.0  1.0
5.0  7.0  10.0  6.0  1.0

Average (all classes): 7.8  7.3  8.2  8.4  6.1

Appendix G: Histogram results for ppt #5, question #3 (histograms preserved as images)
Before side conversation (D is the correct answer)
After side conversation (D is the correct answer)
