From campus to classroom: a study of elementary teacher candidates' pedagogical content knowledge. James Douglas Adler, 2012.

FROM CAMPUS TO CLASSROOM: A STUDY OF ELEMENTARY TEACHER CANDIDATES' PEDAGOGICAL CONTENT KNOWLEDGE

by

James Douglas Adler

B.E.S. (Hons.), The University of Waterloo, 1991
B.Ed., The University of British Columbia, 1993
M.A., The University of British Columbia, 1997

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY in THE FACULTY OF GRADUATE STUDIES (Curriculum Studies)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

April 2012

© James Douglas Adler, 2012

ABSTRACT

In this study, I explored a cohort of elementary teacher candidates' Pedagogical Content Knowledge (PCK) and how it was influenced before, during, and after participation in a science methods course and extended practicum experience in a 12-month teacher education programme at a university in Western Canada. I adopted a case study approach that employed a mixed methods perspective to investigate the effect of the science teaching methods course and extended practicum on the participants' PCK. I administered a non-disguised, self-enumerated, online questionnaire to assess and interpret the candidates' PCK. I also elicited and interpreted their views on their experiences through semi-structured interviews using phenomenographic methods.

My findings indicate that the students who entered the programme and participated in the online questionnaire had a PCK level of 7 out of a maximum of 11 (M=7.1431, SD=1.07794). The teacher candidates I interviewed believed they possessed an awareness of PCK from prior experiences, including being science students at both the elementary and secondary levels and during volunteer experiences. At the end of their science teaching methods course, teacher candidates participating in the online questionnaire scored a PCK level of 8 out of a maximum of 11 (M=8.0719, SD=1.03280); those interviewed attributed their increased PCK level to working with their science methods instructor in the teacher education programme.
At the end of the extended practicum, the same teacher candidates were surveyed again by completing the same online questionnaire. Analysis and interpretation of their responses indicated that their PCK level had dropped to 7 out of a maximum of 11 (M=7.0783, SD=0.72437). My analysis of the follow-up interviews with the teacher candidates showed that their relationship with the school advisor and the real-world teaching experience of the practicum influenced the drop in PCK.

My research provides insight into teacher candidates' PCK as they progress through a 12-month teacher education programme. It illustrates that experiences including time spent as science learners in elementary and secondary classrooms, volunteer experiences, gaining admission into a teacher education programme, interacting with experts in teaching methods, and immersion in the real-world teaching context of the classroom are powerful factors in teacher candidates' PCK.

PREFACE

This research study obtained the approval of the UBC Research Ethics Board (Behavioural Research Ethics Board; UBC BREB Number H10-02365).

TABLE OF CONTENTS

ABSTRACT
PREFACE
TABLE OF CONTENTS
LIST OF TABLES
LIST OF FIGURES
ACKNOWLEDGEMENTS

CHAPTER ONE: INTRODUCTION TO THE STUDY
    Study Background
    Problem Statement
    Research Questions
    Significance of the Study
    Brief Researcher Background
    Organization of the Thesis

CHAPTER TWO: REVIEW OF THE LITERATURE
    Pedagogical Content Knowledge
        A Historical Perspective
        PCK: An Elaboration
        Science PCK
    PCK: Theoretical Framework
    Summary

CHAPTER THREE: METHODOLOGY AND METHODS
    Context of the Study
        The Teacher Education Program
        The Elementary Science Methods Course (SCI 349)
        Teacher Candidates
    The Study
    Investigational Approach: Case Study
    Research Methodology
        Survey Methods
        Qualitative Methods
        Phenomenographic Method
        Summary of Phenomenography
    Data Collection Procedures
        Quantitative Data Collection
        Qualitative Data Collection
    Reliability and Validity
        Reliability and Validity of the Instrument
        Reliability and Validity in a Phenomenographic Study
    Data Analysis
        Quantitative Data Analysis
        Phenomenographic Data Analysis
    Ethical Considerations
    Limitations of the Study
    Summary

CHAPTER FOUR: QUESTIONNAIRE DATA ANALYSIS, RESULTS AND DISCUSSION
    Questionnaire Data Analysis
    Rounds One, Two and Three
    Retrospective
    Rounds One (T1), Two (T2) and Three (T3): Analysis
    Finding One
    Factor Analysis
    Factor Analysis of Round One (T1)
        Descriptive Statistics for Round One (T1), Factor One(one), Factor Two(one), Factor Three(one): Rounds One (T1), Two (T2) and Three (T3)
        Statistical Analysis of Round One (T1), Factor One(one): Rounds One (T1), Two (T2) and Three (T3)
    Finding Two
        Statistical Analysis of Round One (T1), Factor Two(one): Rounds One (T1), Two (T2) and Three (T3)
    Finding Three
        Statistical Analysis of Round One (T1), Factor Three(one): Rounds One (T1), Two (T2) and Three (T3)
    Finding Four
    Factor Analysis of Round Two (T2)
        Descriptive Statistics for Round Two (T2), Factor One(two), Factor Two(two): Rounds One (T1), Two (T2) and Three (T3)
        Statistical Analysis of Round Two (T2), Factor One(two): Rounds Two (T2) and Three (T3)
    Finding Five
        Statistical Analysis of Round Two (T2), Factor Two(two): Rounds Two (T2) and Three (T3)
    Finding Six
    Factor Analysis of Round Three (T3)
        Descriptive Statistics for Round Three (T3), Factor One(three), Factor Two(three), Factor Three(three): Rounds One (T1), Two (T2) and Three (T3)
        Statistical Analysis of Round Three (T3), Factor One(three), Two(three) and Three(three)
        Statistical Analysis of Round Three (T3), Factor One(three)
    Finding Seven
        Statistical Analysis of Round Three (T3), Factor Two(three)
    Finding Eight
        Statistical Analysis of Round Three (T3), Factor Three(three)
    Finding Nine
    Summary
        Round One (T1)
        Round Two (T2)
        Round Three (T3)

CHAPTER FIVE: INTERVIEW DATA ANALYSIS, RESULTS AND DISCUSSION
    Interview Data Analysis
    Theme One
        Expression of Pre-Programme Awareness of Pedagogy
    Theme Two
        Expression of Pre-Programme Awareness of Their Content Knowledge as Regards Science
    Theme Three
        Expression of Richer Post-Science Methods Course Awareness of PCK
    Theme Four
        For Some Candidates a Successful Practicum Validated or Invalidated Pre-Practicum Awareness of PCK
    Summary

CHAPTER SIX: DISCUSSION
    Teacher Candidates Understand Teaching as the Way They Were Taught
    Campus to the Classroom: The Transition
    Teacher Education
    Summary, Future Research and Conclusions
    Summary
    Recommendation for Further Research
    Conclusions

References
Appendices
    Appendix A: Email Letter Consent to Participate
    Appendix B: Questionnaire
    Appendix C: Interview Questions

LIST OF TABLES

Table 1 Schedule of Events
Table 2 Cronbach's Alpha for Round One, Round Two and Round Three
Table 3 Pedagogical Content Knowledge (PCK)
Table 4 Overall Scores: Number, Mean, Standard Deviation and Standard Error Mean
Table 5 Overall Scores: Mauchly's Test of Sphericity
Table 6 Overall Scores: Tests of Within-Subjects Effects
Table 7 Overall Scores: Partial Eta Squared
Table 8 Overall Scores: Pairwise Comparisons, Bonferroni Post-hoc
Table 9 Round One (T1): Factors/Components
Table 10 Round One (T1): Extracted Factors and Percentage of Variance Accounted for by the Factors
Table 11 Round One (T1): Factor One(one): Pedagogical Content Knowledge Items
Table 12 Round One (T1): Factor Two(one): Pedagogical Content Knowledge Items
Table 13 Round One (T1): Factor Three(one): Pedagogical Content Knowledge Items
Table 14 Descriptive Statistics for Round One (T1), Factor One(one), Factor Two(one), Factor Three(one): Rounds One (T1), Two (T2) and Three (T3)
Table 15 Round One (T1): Factor/Component One(one): Mauchly's Test of Sphericity
Table 16 Round One (T1): Factor/Component One(one): Tests of Within-Subjects Effects
Table 17 Round One (T1): Factor/Component One(one): Partial Eta Squared
Table 18 Round One (T1): Factor/Component One(one): Bonferroni Post-hoc
Table 19 Round One (T1): Factor/Component Two(one): Mauchly's Test of Sphericity
Table 20 Round One (T1): Factor/Component Two(one): Bonferroni Post-hoc
Table 21 Round One (T1): Factor/Component Three(one): Mauchly's Test of Sphericity
Table 22 Round One (T1): Factor/Component Three(one): Tests of Within-Subjects Effects
Table 23 Round One (T1): Factor/Component Three(one): Partial Eta Squared
Table 24 Round One (T1): Factor/Component Three(one): Bonferroni Post-hoc
Table 25 Round Two (T2): Factors/Components
Table 26 Round Two (T2): Extracted Factors and Percentage of Variance Accounted for by the Factors
Table 27 Round Two (T2): Factor/Component One(two): Pedagogical Content Knowledge Item
Table 28 Round Two (T2): Factor/Component Two(two): Pedagogical Content Knowledge Items
Table 29 Descriptive Statistics for Round Two (T2), Factor One(two), Factor Two(two), Factor Three(two): Rounds One (T1), Two (T2) and Three (T3)
Table 30 Round Two (T2): Factor/Component One(two): Mauchly's Test of Sphericity
Table 31 Round Two (T2): Factor/Component One(two): Tests of Within-Subjects Effects
Table 32 Round Two (T2): Factor/Component One(two): Partial Eta Squared
Table 33 Round Two (T2): Factor/Component One(two): Bonferroni Post-hoc
Table 34 Round Two (T2): Factor/Component Two(two): Mauchly's Test of Sphericity
Table 35 Round Two (T2): Factor/Component Two(two): Tests of Within-Subjects Effects
Table 36 Round Two (T2): Factor/Component Two(two): Partial Eta Squared
Table 37 Round Two (T2): Factor/Component Two(two): Bonferroni Post-hoc
Table 38 Round Three (T3): Factors/Components
Table 39 Round Three (T3): Extracted Factors and Percentage of Variance Accounted for by the Factors
Table 40 Round Three (T3): Factor One(three): Pedagogical Content Knowledge Items
Table 41 Round Three (T3): Factor Two(three): Pedagogical Content Knowledge Items
Table 42 Round Three (T3): Factor Three(three): Pedagogical Content Knowledge Items
Table 43 Descriptive Statistics for Round Three (T3), Factor One(three), Factor Two(three), Factor Three(three): Rounds One (T1), Two (T2) and Three (T3)
Table 44 Round Three (T3): Factor/Component One(three): Mauchly's Test of Sphericity
Table 45 Round Three (T3): Factor/Component One(three): Tests of Within-Subjects Effects
Table 46 Round Three (T3): Factor/Component One(three): Partial Eta Squared
Table 47 Round Three (T3): Factor/Component One(three): Bonferroni Post-hoc
Table 48 Round Three (T3): Factor/Component Two(three): Mauchly's Test of Sphericity
Table 49 Round Three (T3): Factor/Component Two(three): Tests of Within-Subjects Effects
Table 50 Round Three (T3): Factor/Component Two(three): Partial Eta Squared
Table 51 Round Three (T3): Factor/Component Two(three): Bonferroni Post-hoc
Table 52 Round Three (T3): Factor/Component Three(three): Mauchly's Test of Sphericity
Table 53 Round Three (T3): Factor/Component Three(three): Tests of Within-Subjects Effects
Table 54 Round Three (T3): Factor/Component Three(three): Partial Eta Squared
Table 55 Round Three (T3): Factor/Component Three(three): Bonferroni Post-hoc
Table 56 Questions Loaded per Factor
Table 57 Round One (T1) Factors/Components: Significant Changes in Difference Across Rounds
Table 58 Round Two (T2) Factors/Components: Significant Changes in Difference Across Rounds
Table 59 Round Three (T3) Factors/Components: Significant Changes in Difference Across Rounds

LIST OF FIGURES

Figure 1 Timetable: Elementary Teacher Candidates
Figure 2 Example Page of Items from Instrument
Figure 3 Questionnaire Rating Scale
Figure 4 Questionnaire Rating Scale
Figure 5 Round One (T1): Scree Plot
Figure 6 Round Two (T2): Scree Plot
Figure 7 Round Three (T3): Scree Plot

ACKNOWLEDGEMENTS

I would like to thank my advisor, Dr. Samson Madera Nashon, and my committee members, Dr. Anthony (Tony) Clark and Dr. Stephen (Steve) Petrina. Samson, your wisdom was a foundation of support. Tony, is there any bump in the road that you can't smooth over? Steve, you have supported me since you were a part of my Master's committee. I cannot thank all of you enough. I extend a special thank you to my mentor and friend, Dr. R.W. (Bob) Carlisle, who guided and inspired my initial post-graduate adventures.

I also wish to thank UBC staff. Thank you to Saroj Chand for her kindness and caring and her unyielding faith and support. And thank you to Bob Hapke, who was always keen to help out and engage in any discussion about science and servers. Thank you to my stats specialists, Dr. Oksana Bartosh and Dr. Ebby K. Madera, for their infinite patience in answering my questions and working with me to solve those 'numbers' issues.

Finally, I would like to thank my daughter Silesia for her patience and faith; Mac, who was always there in the rough times; and my wife Sam, who put up with so many unexpected adventures.
Thank you to my parents, Phillip and Irene Adler, without whom this thesis would not be possible.

CHAPTER ONE: INTRODUCTION TO THE STUDY

Study Background

Teacher candidates tend to bring to teacher education programmes preconceptions of science instruction (Nashon, 2005, 2006). These are in part shaped by past experiences with the teaching and learning of science (Anning, 1988; Van Zee & Roberts, 2001). As such, teacher candidates' understanding of science instruction is a consequence of a number of factors, including ideas, beliefs, and values about the teaching and learning of science (Lanier & Little, 1986; Pajares, 1992).

Shulman (1986, 1987) developed the concept of pedagogical content knowledge (PCK) to affirm the importance of the integration of subject-matter knowledge and pedagogy in teaching. According to Shulman, PCK illustrates how the subject matter of a particular discipline is transformed for communication with learners. This powerful transformation is developed through illustrations, examples, explanations, and demonstrations. Shulman offers PCK as a way of representing and formulating subject matter that makes it comprehensible to others.

Little is known about how teachers develop pedagogical content knowledge, although two ingredients are typically assumed to be necessary: subject matter knowledge and teaching experience (Magnusson, Krajcik, & Borko, 1999; van Driel, Verloop, & de Vos, 1998; van Driel, De Jong, & Verloop, 2002). Subject matter knowledge is typically perceived to be "a prerequisite, preceding the development of PCK" (van Driel et al., 1998, p. 681), and PCK is developed through classroom experience (Baxter & Lederman, 1999; Gess-Newsome, 1999; Grossman, 1990; Magnusson et al., 1999; van Driel et al., 1998). However, teacher candidates' lack of classroom experience limits their ability to develop PCK (van Driel et al., 2002; van Driel et al., 1998).
An account of the efficacy of teacher education programmes in developing PCK is found in van Driel et al.'s (1998) review of the PCK literature. They assert that "[t]eacher training programs usually do not exert a major influence on science teachers' PCK" (p. 682). Science education methods courses generally promote constructivist pedagogy, that is, a pedagogy which places a student-centred approach at the focus of instruction. This stems from the now widely acknowledged tenet of constructivism that knowledge construction on the part of students is in part shaped by what they already know (Bodner, 1986; Driver, 1983; Erickson, 1989; Kelly, 1955; Nashon & Anderson, 2004).

In my experience as a science methods instructor, I have observed that science teacher candidates' preconceptions of teaching affect their understanding and delivery of instruction. Contemporary literature continues to indicate that many science teachers conform to and implement the instructional models they were exposed to as students (Blanton, 2003; Nashon, 2006), and most candidates come to teacher education programmes inclined towards a more didactic, teacher-centred pedagogy (Hansen & Stephens, 2000). However, beginning teachers are widely expected upon graduation to implement a student-centred curriculum (British Columbia Ministry of Education, Curriculum Packages for K-12; Richmond School District, 2007/2008).

There is strong evidence of a critical relationship between subject matter knowledge and effective science teaching (McDiarmid, Ball, & Anderson, 1989), and findings indicate that many elementary teacher candidates do not have adequate knowledge of key ideas in their disciplines (Ball & Anderson, 1990; Ball & McDiarmid, 1990; Grossman, Wilson, & Shulman, 1989; Wilson & Wineburg, 1988). In addition, they typically lack an understanding of the organization and connectedness of topics within their discipline (Zembal-Saul, Starr, & Krajcik, 1999).
The majority of elementary teacher candidates lack competence in science, which may be largely attributed to not having an undergraduate science degree and is compounded by a science phobia; together these undermine their ability to teach science. Hence, there is a lack of science being taught in the elementary classroom (Nashon, 2005, 2006). The coupling of this lack of confidence with the demand on elementary teacher candidates to teach science, despite the fact that most possess non-science degrees, led me to this study. The purpose was to investigate teacher candidates’ pedagogical content knowledge before, during, and after participating in an elementary science methods course.

Problem Statement

The majority of elementary teacher candidates do not hold undergraduate science degrees, resulting in many not having adequate science knowledge to teach science. One result is a lack of science taught in elementary classrooms (Nashon, 2005). To address this problem, this study explored a cohort of elementary teacher candidates’ awareness of PCK and how their PCK was influenced before, during, and after participation in a science methods course and practicum experience in a 12-month teacher education programme at a university in Western Canada.

Research Questions

The investigation into how these candidates’ PCK was affected as they participated in the science teaching methods course and practicum during their 12-month teacher education programme was guided by the following research questions:
1. What are elementary science teacher candidates’ levels of PCK: (a) at entry into the teacher education programme, (b) immediately after completing their science methods course, and (c) after completing their school practicum?
2. How does the collective elementary teacher candidates’ pedagogical content knowledge evolve as they participate in the science teaching methods course and extended practicum?
Significance of the Study

This study provides insights into a cohort of teacher candidates’ PCK and how levels of their PCK were affected by a science teaching methods course and extended practicum. The findings will help instructors and administrators in teacher education programmes better understand and consider the role of PCK as they develop or reform curricula. The study offers theoretical insights into the nature of relationships between pedagogical and content knowledge that can assist administrators and educators with course design, implementation, instruction, and assessment. For teacher education candidates, participation in the study is likely to heighten their awareness of the connections between their content knowledge and pedagogies and how these influence and inform the ways they teach. Also, through their participation in this study, teacher candidates may become metacognitive about their PCK, which has the strong potential to help them prepare for the professional world of teaching.

Brief Researcher Background

I bring to this study of teacher candidates’ PCK during their experiences in a teacher education programme a life-long interest in and passion for science and education and many years of experience as an elementary science educator and researcher. As a classroom teacher (Kindergarten to Grade 7), I incorporated a hands-on and inquiry-focused approach where my students experienced science through predicting, observing, and explaining. When I began my graduate studies (MA), I had the opportunity to bring my love of elementary science to future teachers in my role as a science methods instructor. Each time I instruct teacher candidates in how to teach elementary science, I notice that very few of them hold undergraduate science degrees. My first-day-of-class activities include the students sharing stories of their science learning experiences.
An overwhelming number describe their classroom science experiences as “hard”, “boring”, and “not at all memorable.” Upon further discussion, I hear that many of these elementary teacher candidates have a fear of teaching science as they do not have a rich science background. This is characterised by a lack of confidence in their ability to teach science. My approach involves introducing the teacher candidates to teaching and learning science through hands-on, inquiry-focused activities. Together we experience science and its teaching. At the end of the coursework, my students’ feedback consists of statements such as “I am so excited to do science on my practicum” and “I now feel I can teach science and learn along with my students.” Yet, when I visit practicing teachers in the schools, I often observe a lack of science being taught in the elementary classroom. This background experience as an elementary science teacher, university science teaching methods instructor, and practicum supervisor, complemented by my understanding of elementary students’ curiosity to inquire into the world around them, motivated my desire to frame this study into how elementary teacher candidates’ PCK transforms as they experience a science teaching methods course and extended practicum.

Organization of the Thesis

This dissertation is organized into six chapters. Chapter 1 presented the introduction to the study, problem statement, research questions, and the study’s significance. Chapter 2 presents a literature review of Pedagogical Content Knowledge (PCK) from a historical perspective, relates PCK to the candidate/pre-service/beginning teacher, examines the commonalities and differences between elementary and secondary teachers, and concludes with the use of PCK in this research. Chapter 3 describes the study context and the participants.
It also discusses the methodology and details of the methods used to collect and analyze data in order to answer the research questions and concludes with comments on the study’s ethical considerations and limitations. Chapter 4 focuses on the questionnaire data analysis from which dominant/principal PCK components or factors indicative of key teacher candidates’ understandings of PCK were determined and interpreted. Chapter 5 presents analysis results of interview data sets and explores the correspondence between the factors determined from the quantitative data and the emergent themes from the phenomenographic analysis of the qualitative data. The analysis of questionnaire data in Chapter 4 revealed components of PCK, and the analysis of the interview data in Chapter 5 revealed three themes that correspond to the determined components of PCK. Chapter 6 presents a discussion of the quantitative and qualitative results and summarizes the findings and their implications. Also, conclusions and suggestions of areas for further research are presented.

CHAPTER TWO: REVIEW OF THE LITERATURE

This chapter provides an overview of PCK from a historical perspective and relates PCK to the teacher candidate, also referred to as a pre-service or beginning teacher. The chapter examines the commonalities and differences between elementary teacher candidates, the majority of whom do not have an undergraduate degree in science, and secondary science teacher candidates, specifically those who possess an undergraduate degree in science. This chapter concludes with a discussion of PCK in this research study.

Pedagogical Content Knowledge

This section begins with a historical perspective outlining the origins of PCK and follows its evolution through to the current incarnation.
I elaborate on the general nature of PCK and the essential inclusion of Shulman’s (1987) two key elements: knowledge of comprehensible representations and knowledge of learner difficulties (Grossman, 1990; Hashweh, 2005; Van Driel, Verloop & de Vos, 1998). I then discuss these within a science education context. Finally, PCK is presented as a theoretical framework for the study.

A Historical Perspective

In 1888 at the National Education Association (NEA) annual conference, S.S. Parr, then President of the NEA’s Department of Normal Schools, laid the foundation for PCK. Parr argued that an analysis of the process of teaching shows that there is special knowledge in each subject that belongs specifically to instruction. Parr viewed this knowledge to be distinct from other forms as it differed in purpose, in its relation to the facts, and also by how it was obtained. He discussed how an academic subject is ordered by the arrangement of ideas as determined by their relationship to one another. He then compared how the order of the same ideas, when arranged for teaching, is determined by their relationship to the learning mind. Parr explained that the purpose of academic knowledge was based on becoming acquainted with necessary dependence. The purpose of teaching-knowledge was based on becoming acquainted with the processes of the learning mind. He added that the effective teacher has a special kind of knowledge revealed through the teaching process. This teaching knowledge is developed through viewing the various subjects in the order determined by their nature. Thus, according to Bullough Jr. (2001), Parr sowed the seeds of PCK by asserting that there is a special knowledge in each subject that belongs to instruction, and when ideas are arranged for teaching, their relation to the learning mind determines their order. Several speakers at a later NEA conference held in 1907 addressed the issue of the professional preparation of high-school teachers.
Many noted that college graduates who lacked pedagogical training did not know how to manage classrooms or work with children effectively. Stratton Brooks (1907), Superintendent of Schools for Boston, observed that college graduates failed to see the pupils’ point of view. In his conference presentation, Brooks reinforced this position by outlining his belief that teacher training was largely academic training, and this professional training was incidental and superficial. He defined academic training as the study of a subject for the sake of mastering its logical and epistemological relations. In comparison, he described professional training as studying a subject with reference to its adaptability as an instrument for developing and training the student mind. A majority of the NEA speakers agreed with these charges, including George Luckey (1907), Professor of Education at the University of Nebraska. He commented that scholarship alone within a single discipline was not sufficient to teach that discipline, no matter how thorough and extended the scholarship was. Therefore, it was essential for the teacher to know the disciplines from the learner’s viewpoint. In a speech at a subsequent NEA conference in 1918, William Chandler Bagley spoke about educators making the “unfortunate” distinction between academic and professional subject matter. Bagley’s speech reflected a view that not only must teachers know the subject, they must also know how to adapt the subject to the capacities and needs of students. Bagley viewed academic and professional training as beginning with practice and then progressing to theory: to go from cases to principles and from the concrete to the abstract. The proceedings from the 1888, 1907, and 1918 NEA conferences indicate a significant step for teacher education in the direction of PCK.
Summaries of the conference presentations reveal the start of a movement away from the understanding of disciplines held by content-area specialists and toward the professionalization of academic subject matter. These historical perspectives taken from the NEA conferences informed my understanding of PCK’s progression and enabled me to better understand the underpinnings of PCK. The articulation of PCK as conveyed in the early NEA conferences is gaining acceptance among teacher educators who agree on the inclusion of PCK in the development of teacher preparation programmes (Darling-Hammond, 2000; Hatton & Smith, 1995). Nonetheless, there still remains a distinction between academic and professional courses, and, in some cases, the distinction appears to have deepened (Goodlad, 1990). Some scholars suggest that this distinction may be a defensive measure on the part of teacher educators in response to their concern about how to establish the academic legitimacy and educational value of pedagogical studies (Goodlad, 1990; Whitehead, 1961). In the past, there have been attempts by teacher educators to address these concerns by mimicking established and respectable academic studies (Monroe, 1952). In such cases, pedagogical training (Pelton & Pelton, 2005), while increasingly accepted in colleges and universities, was supplementary to academic preparation. The understanding that teacher education was an add-on, and therefore would have minimal if any effect on the work or curriculum of the academic disciplines, made it tolerable to professors in those disciplines (Bullough Jr., 2001). Then, as now, even within the community of teacher educators, the greatest status accrues to those who, as Goodlad (1990) argues, distance themselves from both the schools and from teaching practice by focusing on disciplinary knowledge.
Shulman (1986, 1987) developed the notion of pedagogical content knowledge (PCK) to affirm the importance of the integration of subject-matter knowledge and pedagogy in teaching. According to Shulman, PCK illustrates how the subject matter of a particular discipline is transformed for communication with learners. This powerful transformation is developed through illustrations, examples, explanations, and demonstrations. Shulman offers PCK as a way of representing and formulating subject matter that makes it comprehensible for others. In 1987, Shulman wrote an article titled “Knowledge and Teaching: Foundations of the New Reform,” in which he argued that the foundation of teacher education should be pedagogical content knowledge or PCK. Shulman’s article began with a portrait of an expert teacher framed by the following questions: What does this person believe and understand, and how does he/she know how to teach as he/she does? How can other teachers be prepared through teacher education to teach with this kind of skill? Shulman responded by concluding that this expert teacher possesses knowledge of content, curriculum, learners, educational contexts, and aims. The teacher also holds a unique body of knowledge called pedagogical content knowledge (PCK). Shulman described PCK as the province of teachers; it is their own special form of professional understanding. PCK represents the blending of content and pedagogy into an understanding of how particular topics, problems, or issues are organized, represented, and adapted to the diverse interests and abilities of learners and then presented for instruction. Shulman further shows how an expert teacher’s PCK includes recognition of what makes specific topics difficult to learn as well as the conceptions students bring to their learning. He emphasizes the belief that there is a unique content to teacher education.
Despite persistent claims to the contrary, this unique content reaches beyond standard academic courses, even when coupled with methods courses, to address the question of what it means to know a subject in order to teach it.

PCK: An Elaboration

Even though there is no universal conceptualization of PCK, there is general agreement regarding its nature and the essential inclusion of Shulman’s (1987) two key elements: knowledge of comprehensible representations and knowledge of learner difficulties (Grossman, 1990; Hashweh, 2005; Van Driel et al., 1998). Shulman summarized his version of PCK in his Presidential Address to the American Educational Research Association (AERA) in 1987. He stated that PCK represents the blending of content and pedagogy into an understanding of how particular topics, problems, or issues are organized, represented, and adapted to the diverse interests and abilities of learners, and then presented for instruction. This definition implies that PCK is both an external and internal construct as it is constituted by what a teacher knows and does and the reasons for the teacher’s actions (Baxter & Lederman, 1999). Hence, PCK encompasses both teachers’ understanding and their enactment of that understanding. Van Driel et al. (1998) conclude that many scholars agree that PCK refers to the teaching of particular topics and may differ considerably from subject matter knowledge. A growing number of scholars have continued to work on the concept of PCK since its inception, and many have modified and extended Shulman’s (1986, 1987) definition (Geddis, Onslow, Beynon & Oesch, 1993; Grossman, 1990; Hashweh, 2005; Loughran & Mulhall, 2006; Magnusson, Krajcik & Borko, 1999; Marks, 1990; Van Driel et al., 1998). For example, Geddis et al. (1993) define PCK as knowledge that plays a role in transforming subject matter into forms that are more accessible to students.
Grossman (1990) perceives PCK as consisting of knowledge of representations and strategies, student learning and conceptions, curriculum available for teaching, and the purposes for teaching a particular subject. Carter (1990) views PCK as what teachers know about their subject matter and how they transform that knowledge into classroom curricular events. Taken together, I suggest that it is the transformation of subject matter knowledge for the purpose of teaching that forms the basis of PCK. In other words, it is commonly stated that PCK is used to adapt subject matter knowledge for pedagogical purposes through a process which Shulman (1987) calls “transformation,” Ball et al. (1990) label “representation,” Veal and MaKinster (1999) term “translation,” Bullough Jr. (2001) names “professionalizing,” and Dewey (1902/1983) refers to as “psychologizing.” A number of researchers outline specific criteria for PCK. Grossman (1990) suggests four central components that contribute to pedagogical content knowledge development: (a) knowledge and beliefs about purpose, (b) knowledge of students’ conceptions, (c) curricular knowledge, and (d) knowledge of instructional strategies. Cochran, DeRuiter and King (1993) assume a constructivist perspective and propose the idea of pedagogical content knowing (PCKg). Their four components of PCKg include: (a) knowledge of students, (b) knowledge of environmental contexts, (c) knowledge of pedagogy, and (d) knowledge of subject matter. These authors maintain the basic elements of PCK and expand upon existing definitions by incorporating perspectives such as constructivism and ways of knowing (Shulman, 1992). One purpose of this review is to articulate a working definition of PCK for this dissertation. I find that the different versions or perspectives of PCK identified in the studies reviewed here have a number of similarities.
Commonalities that I will draw upon to formulate a definition of PCK for my study include knowledge of subject matter, knowledge of students and their possible misconceptions, knowledge of curricula, and knowledge of general pedagogy. I will now review forms of PCK specific to my study context of teacher candidates and elementary science education.

Science PCK

Some researchers and practitioners discuss PCK specifically within a science education context. Magnusson et al. (1999) view science PCK as including (a) a teacher’s orientation to teaching science, (b) knowledge of science curricula, (c) knowledge of science assessment, (d) knowledge of scientific literacy, (e) knowledge of students’ understanding of science, and (f) knowledge of instructional strategies. Gess-Newsome (1999) identifies two different epistemological views of science PCK: an integrative view and a transformative view. In the integrative view, science PCK is not seen as a separate area of teacher knowledge but rather an experiential application of other forms of knowledge, such as science content knowledge. In the transformative view, other forms of teacher knowledge are transformed into a new form of knowledge, science PCK, for the purpose of instruction. In both views, experience mediates how other forms of teacher knowledge come to bear on science instruction, whether applied directly or transformed into a new form of knowledge. This understanding of science PCK is also present in the work of Cochran et al. (1993), Grossman (1990), and Shulman (1986, 1987). These authors’ work furthered my understanding of PCK as they place PCK within a science education context. My study explores teacher candidates’ developing PCK during their science education methods course and the extended practicum. Little is known about how teachers develop pedagogical content knowledge.
Two necessary ingredients are typically assumed to be subject matter knowledge and teaching experience (Magnusson et al., 1999; van Driel et al., 1998; van Driel, De Jong & Verloop, 2002). Subject matter knowledge is often perceived to be “a prerequisite, preceding the development of PCK” (van Driel et al., 1998, p. 681), and PCK is developed through classroom experience (Baxter & Lederman, 1999; Gess-Newsome, 1999; Grossman, 1990; Magnusson et al., 1999; Van Driel et al., 2001). However, the lack of pre-service teachers’ classroom experience limits their ability to develop PCK (van Driel et al., 2002; van Driel et al., 1998). An account of the efficacy of teacher education programmes in developing PCK is found in van Driel et al.’s (1998) review of the PCK literature: “[t]eacher training programs usually do not exert a major influence on science teachers’ PCK” (p. 682). However, if carefully fostered and assessed, changes in PCK have occurred even in pre-service teachers who have limited experience with teaching and learners. In fact, some studies describe specific aspects of pre-service science teachers’ PCK development (Anderson & Mitchener, 2000; Davis, 2004; Davis & Petish, 2005; Zembal-Saul, Krajcik & Blumenfeld, 2002). Therefore, my experience with teacher candidates, including the fact that they bring experiences as students and in other teacher-like roles such as tutoring or coaching, compelled me to investigate this phenomenon among teacher candidates. I hesitated to accept the claim that teacher education programmes have no positive effect on developing teacher candidates’ PCK as there is still a lack of definitive research on the topic. Hence, this motivated my study. Several researchers suggest that the development of PCK takes time as it builds through experience (Baxter & Lederman, 1999; Gess-Newsome, 1999; Grossman, 1990; Lee & Luft, 2005; Magnusson et al., 1999; Van Driel et al., 2001).
In a study conducted by Lee, Brown, Luft, and Roehrig (2007), PCK levels of beginning and experienced teachers were compared. The study’s findings revealed that beginning science teachers had limited or basic levels of PCK whereas the experienced teachers held proficient levels of PCK. In the case of experienced teachers, their subject matter knowledge and PCK integrate to form “content knowledge complexes” (Sherin, 2002), but new teachers have not yet had the classroom experiences to promote this integration. In fact, prospective teachers often demonstrate little explicit pedagogical content knowledge for specific science topics (Baxter & Lederman, 1999; Gess-Newsome, 1999). The same study by Lee, Brown, Luft, and Roehrig (2007) also found that a science specialty and/or undergraduate science degree has a limited effect on a pre-service teacher’s PCK level. The literature indicates that possessing PCK enables teachers to develop confidence which improves their ability to plan and teach (Osborne & Simon, 1996) and leads to broad, inquiry-based, content-focused discussions with their students (Roth, Anderson & Smith, 1986; Tobin & Garnett, 1988). A study by Sanders, Borko, and Lockard (1993) showed that despite teaching in unfamiliar areas of science, experienced teachers tend to draw upon their pedagogical knowledge for a teaching framework. In contrast, teachers with weak PCK experience difficulty with the daily duties and responsibilities of a practising teacher such as monitoring students’ movement, giving appropriate feedback, planning developmentally appropriate progressions, and generating explanations and tasks that match or adapt to learners’ needs (Rovegno, 1991, 1993a, 1993b, 1994, 1995; Rovegno & Bandhauer, 1994). Few teacher candidates have access to “real” learners for an extended time other than during the final practicum.
For example, a typical final or long-term practicum is 13 weeks in the Teacher Education Programme at the University of British Columbia. However, listening to and describing learners’ ideas is critical for PCK development, and this only occurs when teacher candidates are in the classroom and have the opportunity to interact with and listen to students (Smith, 2000). In fact, Meyer (2004), in a comparative study of pre-service, first year, and expert teachers, found that the ability to predict learners’ ideas is a key difference between novice and expert teachers. Magnusson et al. (1999) state that the ability to anticipate learners’ ideas is an important part in the development of a teacher’s PCK. In a summary of studies exploring teachers’ pedagogical content knowledge of students’ ideas, Magnusson et al. (1999) write that “[t]he pattern of findings from this type of study is that although teachers have some knowledge about students’ difficulties, they commonly lack important knowledge necessary to help students overcome those difficulties” (p. 106). One possible reason that teachers struggle with students’ ideas is that their own subject matter knowledge is weak (Ball & Bass, 2000; Carlsen, 1992; Smith & Neale, 1989). Smith and Neale (1989) conducted an extensive study of elementary teachers that looked at how subject matter knowledge affected teaching. They concluded that weak subject matter knowledge often prevented teachers from predicting students’ ideas. However, even when teachers understood the scientific ideas, they did not address students’ ideas in their teaching. One reason for this result may be that pre-service and beginning teachers are often overwhelmed with workload and tend to focus on management issues or themselves (Bryan & Abell, 1999; Kettle & Sellars, 1996) rather than student learning (Fuller, 1969; LaBoskey, 1994).
Although researchers identify various reasons why pre-service and beginning teachers tend not to acknowledge and incorporate student ideas, ultimately the result is that student learning becomes less of a priority (Anderson et al., 2000; Bryan & Abell, 1999; Trumbull, 1999). Teacher candidates as well as pre-service and beginning elementary teachers often give precedence to student interest or engagement over student learning. Novelty, engagement, and interest in science are admirable goals for classroom teaching. However, these kinds of ‘wow’ strategies often overshadow science content learning, and the emphasis of instruction becomes making science ‘fun’. The literature indicates that pre-service teachers want their students to enjoy science and to develop positive feelings about science learning (Abell et al., 1998; Anderson et al., 2000; Bryan & Abell, 1999; Trumbull, 1999). Trumbull (1999) states in her work on reflective practice that many pre-service teachers want learners to have a different experience in science courses than they themselves had. Other scholars agree that pre-service teachers who report difficult or painful science experiences in their own histories want their students to enjoy science (Richmond, Howes, Kurth & Hazelwood, 1998). Pre-service teachers may believe that one way to achieve this result is to make learning ‘fun’ and direct the teaching focus away from student learning. Teacher candidates and in-service teachers can enrich their PCK. The school-based practicum (or the practical teaching experience) is widely considered to be the most important component in teacher education programmes by both teachers and teacher candidates due to the “real world” context (Collins, 2004; Howitt, 2007; Loughran, Mulhall & Berry, 2008; McIntyre, Byrd & Foxx, 1996). When it comes to practical experience, however, some researchers suggest that more is not necessarily better. As cited in McIntyre et al. (1996), “the commonly structured student teaching practice prepares teacher candidates for the loneliness of the classroom not for reflection and collegiality” (p. 173). Pre-service teachers “often observe practices in the classroom that contradict what college instructors consider appropriate practice” (pp. 173-175). This result can lead to pre-service teachers doubting the “worthiness” and value of what they are learning on campus. In their work on teacher education, Loughran, Korthagen, and Russell (2008) state that pre-service teachers are concerned with acquiring techniques and skills while their teacher-educators ask them to also think about the links between theory and practice. The authors claim that this disconnect leads to a “big gap” in the mind of the pre-service teacher. As cited in McIntyre et al. (1996, p. 174), discussions among pre-service teachers and their colleagues usually focus on pupils in the classroom rather than on the curriculum. In reference to specialist teachers with strong science backgrounds such as secondary teachers, Abell (2008) discusses the “centrality of science content” (p. 1408). For specialist teachers, Abell suggests that it may be appropriate to emphasize science content. Yet, this approach may present problems for elementary teacher candidates due to the fact that many have weak science backgrounds. Requiring elementary teacher candidates to learn more science content may not be an effective way to engage them and may be counterproductive. As an alternative to teaching more content, researchers advocate for science learning experiences that challenge didactic models of traditional science teaching. These science-learning experiences may build teacher candidates’ confidence in teaching science and contribute to their understanding of scientific concepts, in short, their PCK.
This can be achieved through modeling constructivist approaches to learning and providing teacher candidates with opportunities to experience success in teaching science. If the aim is to change attitudes about and build confidence in science teaching, enabling pre-service teachers to learn how to teach science may be more effective than requiring them to learn more content (Appleton, 2005; Baker, 1994; Hand & Peterson, 1995; Howitt, 2007; Palmer, 2006; Skamp, 1989). Appleton (2003) also noted a tendency to rely on more teacher-directed approaches by both secondary science specialists and primary teachers who lack confidence in the science area being taught. This finding may indicate that PCK is linked to confidence. Therefore, teachers who incorporate a less didactic teaching approach may have greater confidence in teaching science and consequently show increased science PCK. Typically, most primary teachers avoid teaching science as they lack confidence in the area (Appleton, 2006). Yet these same primary teachers may have a broad range of pedagogical skills that they apply successfully in other curriculum areas. Therefore, on the basis of this argument, if primary teachers are supported to engage in meaningful science teaching and to reflect on their experiences, their confidence and thus their science PCK may improve. Appleton (2006) referred to this as “experiential PCK” (p. 38). Similarly, Howitt (2007) claims that primary teachers can build confidence in science and improve their PCK by drawing upon “a range of other forms of teacher knowledge” (p. 49). Thus the development of primary teachers’ PCK may be best achieved by supporting them as they learn how to teach science in authentic learning situations (Howitt, 2007; Jones, 2008; Palmer, 2006). Research into teachers’ PCK indicates that PCK and PCK development for elementary school teachers differ from that of secondary teachers (Appleton, 2006; Smith, 1999; Smith & Neale, 1989).
For example, based on their work with secondary physics teachers, Bell, Veal, and Tippins (1999) outline a hierarchy of PCK “types” that suggests different levels of generality. They identified the following types of PCK developed by secondary teachers: a broad base of science PCK, specific discipline PCK (e.g., physics), and specific topic PCK (e.g., electric circuits). Specific discipline PCK was prevalent, and the authors identified it as emerging from repeated experiences of teaching specific topic PCK. Therefore, Bell et al. conclude that secondary school teachers tend to develop specific discipline PCK in their own discipline specialty. In some cases, secondary teachers found their PCK for teaching a different science discipline was limited or nonexistent. Chan (1998) reports that scientists with doctorates in physics hold common misconceptions about selected biology topics; hence, secondary teachers highly qualified in one science discipline may not have the science content knowledge to develop effective PCK for other disciplines. Elementary teachers work with PCK quite differently from secondary teachers, as they approach the development of science PCK with the idea that science should be activity based and stem from specific activity ideas (Appleton, 2006). Therefore, elementary teachers tend to work with specific topic PCK, as the majority do not have a science undergraduate degree or science specialization. Many elementary teachers have limited science content knowledge and limited science PCK, given that few elementary school teachers are science discipline specialists (Appleton, 2003; Goodrum, Hackling & Rennie, 2001). For instance, in the teacher education programme where I teach elementary science methods, the majority of teacher candidates hold undergraduate degrees in the humanities and consequently have limited science content knowledge.
The literature on science PCK indicates that my study participants will for the most part have limited science PCK. This led me to hypothesize that the lack of science education at the elementary level may be a reflection of the science knowledge held by elementary teachers. This literature synthesis was critical to my study design in terms of participants, data collection, and data analysis.

PCK: Theoretical Framework

For the purpose of this study, I adopted Shulman’s (1986) model to understand the teacher candidates’ PCK by interpreting any shifts the analysis might reveal, and Gess-Newsome’s (1999) approach to understand whether deficiencies exist in their PCK and whether their PCK is influenced by teacher education programme experiences. These two models are complementary, and I used this framework to assess and interpret the teacher candidates’ PCK changes during their tenure in the 12-month teacher education programme.

Summary

In summary, the literature synthesis identified science PCK as a form of teacher knowledge (Ball, 1990; Cochran et al., 1993; Grossman, 1990; Magnusson et al., 1999; Shulman, 1986, 1987). PCK is closely connected to other kinds of teacher knowledge such as the teacher’s science content knowledge. Science PCK is developed through the teacher’s own experiences as well as his/her science teaching practice and those of his/her colleagues. In developing science PCK, teachers draw upon a range of teacher knowledge such as knowledge of curriculum, context, general pedagogy, and student ideas. It is important to note that elementary teachers develop and work with science PCK quite differently than secondary teachers. Secondary science teachers are often discipline-specific in their PCK; this is in part because they repeatedly teach the same discipline, and they usually hold an undergraduate degree or specialization in that discipline.
Elementary teachers tend to develop specific topic PCK, as the majority do not hold an undergraduate degree or specialization in science and may lack the content knowledge associated with a specific discipline. In the next chapter of this thesis, Chapter 3, I address the context of the study and the participants as well as the methodology and methods used to collect and analyze data in order to answer the research questions. I conclude with ethical considerations, the study’s limitations, and my researcher role.

CHAPTER THREE: METHODOLOGY AND METHODS

This study inquired into elementary teacher candidates’ PCK and how this was influenced during the candidates’ experiences in a science methods course and extended practicum. In this chapter I describe the study context and the participants. I also discuss the methodology and details of the methods used to collect and analyze data to answer the research questions. I conclude with comments on ethical considerations and limitations of the study and then reflect on my researcher role.

Context of the Study

In this section I discuss the study context, which involves the Teacher Education Programme, the science methods course (SCI 349), and the teacher candidates.

The Teacher Education Programme

The Teacher Education Programme is located in the Education Building on the main campus of a Western Canadian university. The programme offers two options, which teacher candidates select before entry, leading to a Bachelor of Education degree (B.Ed.). Each option is based on time spent in the programme: either a 12-month intensive programme or a two-year academic-calendar programme. The Secondary Option is a 12-month intensive programme that prepares the teacher candidate to teach specialty areas in Grades 8 to 12 in the province’s schools.
The Middle Years Option is a 12-month intensive programme that prepares the teacher candidate to teach Grades 6 to 9 in both school and alternative settings. The Elementary Option offers teacher candidates a choice of two programmes: a 12-month intensive programme that prepares teacher candidates to be generalist teachers for Kindergarten to Grade 7, or a two-year academic-calendar programme that prepares generalist teachers for K-7 with the added advantage of specializing in Kindergarten/Primary, English as a Second Language, Expressive Arts, Humanities, Math & Science, or Special Education. This study was conducted at the aforementioned university and focused on all 284 students enrolled in the 12-month Elementary Teacher Education Programme in the academic year 2009/2010. A typical teacher candidate’s timetable at the university shows the methods courses scheduled in Term 2 (January to March), with the science methods course labelled SCI 349, as seen in Figure 1. According to the posted timetable, SCI 349 met every Thursday for three hours.
[Figure 1. Timetable: Elementary Teacher Candidates. The weekly grid (8:00 to 17:30, Sunday through Saturday) shows scheduled sessions for LANG 300, EDUC 310, EDUC 314, EDUC 316C, EDUC 331, EDUC 349, and SCI 349, with SCI 349 meeting in Room 127.]

The Elementary Science Methods Course (SCI 349)

According to the university Calendar, SCI 349 is described as “SCI 349 Science Elementary: Curriculum and Pedagogy.” Course instructors create their own course definition and course intentions. The following is an excerpt from the SCI 349 course outline for students:

Course Description: “No special knowledge of science is required.” The main purpose of the course is to develop a working knowledge of curriculum organization in elementary science and to develop confidence and skill in the creation and organization of specific science themes.
Course Intentions: to encourage active participation in the teaching and learning of science, fostering positive, inquiring attitudes; to examine, explore, and practice strategies and methods of teaching elementary science, with emphasis on “hands-on”, inquiry-based, child-centred methodologies; to plan and develop learning activities and evaluation procedures for elementary science students; to become aware of the variety of learning resources and materials available in the school and community; to integrate science content with other curriculum areas; and to use the Ministry of Education Integrated Resource Package K-7 Science Curriculum (2005 Edition) in the creation of science lessons/units for specific grades.

Teacher Candidates

The participants in this study were recruited from the 12-month Elementary Teacher Education Programme at the university. Each teacher candidate possesses a four-year degree (120 credits) or the equivalent (e.g., a three-year degree and 30 additional academic credits) and has a minimum academic average of 65% (2.5 on a four-point scale) over the 36 prerequisite credits. The breakdown of the 36 credits is as follows: six credits of English literature and composition; three credits of a laboratory science or physical geography; three credits of mathematics; three credits of Canadian history or Canadian human geography; and three credits of Canadian studies.
A further 18 senior credits are required from one of the following categories. A: 18 senior credits (third or fourth year) from one or a combination of: Art, Biology, Chemistry, Earth Science/Geology, English, French, Geography, History, Mathematics, Music, Physics. B: 18 senior credits (third or fourth year) from a combination of the following, with no more than 12 credits from any one discipline: Anthropology, Asian Studies, Astronomy, Biochemistry, Botany, Canadian Studies, Chinese, Classical Studies, Creative Writing, Dance, Drama, Economics, Family Studies, First Nations Studies, German, Italian, Japanese, Kinesiology, Linguistics, Microbiology, Oceanography, Philosophy, Physical Education, Physiology, Political Science, Psychology, Punjabi, Russian, Sociology, Spanish, Statistics, Women’s Studies, Zoology. C: 18 senior credits (third or fourth year) from any combination of courses from Lists A and B, with no more than 12 credits from any one discipline from List B. All teacher candidates are fully competent in both written and spoken English. In addition to the academic requirements, each teacher candidate has more than 75 hours of experience working with school-age children/youth. In the majority of cases, this experience involves working with children in a group setting who are within the age range the teacher candidate plans to teach, such as elementary or secondary. Based on the admission requirements for the Elementary Teacher Education Programme, an applicant requires one post-secondary laboratory science course. A physical geography course can also be considered a laboratory science course.

The Study

Newman, Ridenour, Newman, and DeMarco (2003) argue that selection of research methods should start with understanding one’s research purpose and selecting the right research questions. The purpose of this research is to examine teacher candidates’ PCK with respect to science during their one-year experience in the university.
To address this purpose, the following questions focused the study’s design and implementation:

1. What are the elementary science teacher candidates’ levels of PCK: (a) at entry into the teacher education programme, (b) immediately after completing their science methods course, and (c) after completing their school practicum?
2. How does the elementary teacher candidates’ collective pedagogical content knowledge evolve as they participate in the science teaching methods course and extended practicum?

To investigate these questions, I adopted a case study approach that used a mixed methods perspective (Newman et al., 2003). I employed quantitative methods to assess the effect of the science teaching methods course and extended practicum experiences on the participants’ PCK. I used an adapted instrument to assess the candidates’ levels of PCK at various times during their experience in the 12-month teacher education programme. I also examined student views on their experiences using qualitative methods that included semi-structured interviews. The literature indicates that there are both supporters and opponents of mixed methods research. The most vivid debate focuses on whether mixing of quantitative and qualitative paradigms is actually possible. Briefly summarized, the quantitative paradigm, based on a positivist worldview, has been characterized as regarding reality as single, objective, and fragmented (Creswell, 1994, 2003). This paradigm often concentrates on confirming theory, testing hypotheses, and what can be measured and analyzed statistically. Facts and events are seen as objective reality, independent of the researcher and learner. In contrast, the qualitative paradigm, based on a naturalistic worldview (Lincoln & Guba, 1985), is interested in exploring the meanings of people’s experiences and how people make sense of their lives and the world. In this paradigm, reality is regarded as being socially constructed.
The researcher, who is actively involved in the research process, builds abstractions, concepts, and theories from observations, interviews, and interactions using complex, interwoven descriptions and variables that are difficult to measure. While these paradigms are based on different worldviews, logic, and views of reality, Caracelli and Greene (1997) and Johnson and Onwuegbuzie (2004) identify several parallels between quantitative and qualitative research. These authors argue that the criteria of trustworthiness (or validity) in qualitative research (Guba & Lincoln, 1989; Lincoln & Guba, 1985) include the notions of credibility and transferability, which are similar to the concepts of internal and external validity used by quantitative researchers (Johnson & Christensen, 2000, 2008). Furthermore, both quantitative and qualitative researchers attempt to minimize confirmation bias and other sources of invalidity (or lack of trustworthiness) that exist in every study (Sandelowski, 1986). Finally, both quantitative and qualitative methodologies “describe their data, construct explanatory arguments from their data, and speculate about why the outcomes they observed happened as they did” (Sechrest & Sidani, 1995, p. 78). Tashakkori and Teddlie (2003) and Johnson and Onwuegbuzie (2004) argue that there is a third paradigm that is linked to mixed methods research. This paradigm is grounded in a compatibility thesis as well as in the philosophy of pragmatism (Johnson & Onwuegbuzie, 2004; Tashakkori & Teddlie, 2003). Proponents of this third research paradigm argue that quantitative and qualitative methods are compatible and that both can be used in a single research study. Supporters of the philosophy of pragmatism suggest using the combination of methods that works best in a real-world situation, regardless of philosophical or paradigmatic assumptions.
One limitation of mixed methods research, however, is that the approach requires a researcher to be proficient in both quantitative and qualitative methods (Creswell & Plano Clark, 2007). The researcher must have the ability to move between the paradigms, playing the role of objective observer during quantitative data collection events and then becoming a participant observer who actively interacts with participants during qualitative data collection. Finally, as Creswell and Plano Clark (2007) indicate, quantitative and qualitative studies often involve different sample sizes and different sampling strategies, and their convergence may present challenges for researchers as they interpret the results. My perspective on the qualitative-quantitative discussion is based on the “dialectic” stance as described by Tashakkori and Teddlie (2003). I do not prioritize one paradigm over the other but rather prefer to build upon the strengths of each method. In this study, I achieved this by analyzing each data set using methods best suited to the data type. The integration of methods occurred primarily at the data analysis and interpretation stages, where I merged my findings to provide a more comprehensive and in-depth analysis (Greene & Caracelli, 1997). To fully exploit the strengths of each underlying paradigmatic position, I chose Triangulation Design (Creswell, Clark, Gutmann & Hanson, 2003) to obtain complementary quantitative and qualitative data on the same topic. I followed the approach recommended by Morse (1991), who states that the purpose of this design is “to obtain different but complementary data on the same topic” (p. 122) to better understand the research problem. I used the convergence model variant of Triangulation Design, as it allowed me to collect and analyze quantitative and qualitative data separately on the same phenomenon and then converge the different results by comparing and contrasting them during the interpretation.
The advantage of this variant is that quantitative results can be validated, confirmed, or corroborated with qualitative findings, with the purpose of developing valid and well-substantiated conclusions (Creswell, 2006). This study used a sequential explanatory mixed methods design consisting of two distinct rounds (Creswell, 2002, 2003; Creswell et al., 2003). In the first round, quantitative numeric data were collected using an on-line questionnaire, and the data were subjected to a discriminant function analysis. The goal of the quantitative round was to identify dominant factors and to allow for purposeful selection of informants for the second round. In the second round, a qualitative case study approach was used to collect data through individual semi-structured interviews, documents, and elicitation materials to help explain why certain factors tested in the first round may be significant indicators of the teacher candidates’ PCK. The rationale for this approach is that the quantitative data and results provided general insight into answering the research questions, while the qualitative data and its analysis further validated, refined, and explained the statistical results as discerned from the analysis of the participants’ views.

Investigational Approach: Case Study

In this study, case study was employed as an investigational approach. I begin this section with a discussion of case study and outline the ways my study was suited to case study design. Stake (1995), Merriam (1998), and Yin (2003b) agree that the intent of the case study is to represent the case, the object being studied, and that conducting case study research involves the search for emergent patterns. The case study approach is well suited to in-depth and descriptive studies of bounded systems such as social groups, events, and programmes within real-life contexts.
I viewed the teacher education programme as a bounded system as defined by Stake (1995), given the boundaries in the duration of the methods course (10 weeks), the practicum (13 weeks), the number of enrolled students (284), and the length of the programme (12 months). The case study researcher documents information using multiple methods and data sources and is rigorous in data collection and analysis. Stake (1995), Merriam (1998), and Yin (2003b) confirm that a case study approach is suited to both qualitative and quantitative research designs as well as a mixed methods approach. My research questions included ‘how’ and ‘why’, as recommended by Merriam (1998) and Yin (2003b) for studies that explore phenomena and processes. As Merriam (1998) points out, case study research can be a useful approach when studying professional practice and problems of practical significance, such as my work on teacher candidates’ experiences. Multiple evidence sources (surveys, interviews) are involved in case study research to ensure triangulation through corroboration of data to produce more convincing and accurate findings (Patton, 2002; Yin, 2003b). As recommended, I drew upon multiple data sources including surveys (Attewell & Rule, 1991) and interviews (Noor, 2008). According to Stake (1995), a case study can ‘catch the complexity’ of a situation, which gives it particular relevance here. To ‘catch this complexity’, Stake (1995) suggests that case studies may be either intrinsic or instrumental. Intrinsic case studies focus on a case because it is of interest in its own right; in other words, the case is ‘intrinsically’ interesting. In contrast, instrumental case studies (whether of a single case or ‘cumulative’ cases) seek primarily to highlight what can be learned about and applied to other like cases.
While the results of a case study are not generalizable in the classical sense (Novak, 2007), it is argued that findings from a single case can be ‘related to’ (Bassey, 1999), ‘transferred to’ (Guba & Lincoln, 1989), or ‘recontextualised’ (Morse, 1991) in other like contexts. I view my study as an instrumental case study, as this investigation of students’ experiences and PCK can be applied to other cases such as teacher candidates and teacher education programmes. A research strategy is often described as an activity that involves the search for and assessment of information (Patton, 1990; Taylor & Bogdan, 1998). A research design is often referred to as a proposal that outlines the study’s logistics and research process, including a problem statement, hypothesis, research questions, and the means and methods of investigation (Strauss & Corbin, 1990). In his typology of case study, Yin (1981, 1984) discusses case study as both a strategy and a design. One design strategy that Yin advocates is replication logic, that is, replicating and affirming the results of previously conducted studies. He stresses that this is an essential strategy for multiple case analysis. Yin also addresses concerns of validity and reliability in experimental research and emphasizes the need to incorporate both into case study research design. Other scholars focus on case study’s role as a research strategy. Miles and Huberman (1984) focus on understanding the dynamics present within single settings. They conclude that case study as a research strategy is well suited to capturing the knowledge of practitioners and developing theories. Barnes, Christensen, and Hansen (1994), acknowledged authorities on case method teaching within a business context, present case study as a field strategy. Their interpretation of case study involves a practitioner-led trial-and-error process deemed necessary for knowledge accumulation.
The authors emphasize that it is incumbent upon scientists to formalize this knowledge and recommend that a case study strategy be employed to document the experiences of practice. For the purposes of my study, I define case study as a research strategy due to the focus on documenting the teacher candidates’ practice in both their science methods course and the extended practicum. Yin (1994) divides case study research into different categories based on the study’s purpose. He designates case studies as being exploratory, descriptive, or explanatory. The explanatory case study is appropriate when seeking answers to questions regarding presumed causal links in real-life interventions that are too complex for survey or experimental strategies; in evaluation language, the explanations would link program implementation with program effects (Yin, 2003). The exploratory case study, in contrast, is an intuitive investigation made prior to defining a research question. In an exploratory case study, fieldwork and data collection may be undertaken prior to the definition of the research questions and hypotheses, and survey questions may be dropped or added based on the outcome of the pilot study (Yin, 1989). An example of an exploratory case study is Joia’s (2002) research on a web-based e-commerce learning community. A descriptive case study is a complete description of a phenomenon in its context (Tellis, 1997; Yin, 2003). Descriptive cases require the investigator to begin with a descriptive theory and then proceed to describe an intervention or phenomenon and the real-life context in which it occurred. If a theory is not present at the outset of the study, the researcher may encounter problems during the project (Yin, 2003).
An example of a descriptive case study is found in Pyecha’s (1998) work on special education, where researchers employed a descriptive case study to identify, describe, and compare theoretical patterns. Hakim (1987) further classifies descriptive case studies as typical or selective. Typical descriptive case studies may be illustrative of aspects thought to be representative or typical. The selective case study may focus on a particular issue or aspect of behaviour with the objective of refining knowledge in a particular area. The explanatory case study aims to explain cause and effect relationships and thus is well suited to causal studies (Tellis, 1997). This type of study is used to explore those situations in which the intervention being evaluated has no clear single set of outcomes (Yin, 2003). In very complex and multivariate cases, the analysis can make use of pattern-matching techniques. Yin and Moore (1988) conducted an explanatory case study to examine why some research findings gain entry into practical usage. They used a funded research project as the unit of analysis, where the topic was constant but the project varied. The utilization outcomes were explained by three rival theories: a knowledge-driven theory, a problem-solving theory, and a social-interaction theory. Yin (2003) also differentiates case studies as being either single or multiple-case. Multiple case studies enable the researcher to explore differences within and between cases and to replicate findings across cases (Yin, 2003). Because comparisons will be drawn, it is imperative that the cases for a multiple case study be chosen carefully so that the researcher can predict similar results across cases or contrasting results based on a theory (Yin, 2003). A research study may involve both single and multiple case studies.
For example, in a study of an educational programme, a single case study approach could explore individual participants’ experiences, whereas a multiple case design could look across a number of different classes for differences and commonalities. Stake (1995) further delineates the case study by addressing the purpose of the inquiry. He describes the intrinsic case study as enhancing understanding of a particular case and suggests that researchers who have a genuine interest in the case should use this approach when the intent is to better understand the case. It is important to remember that the study is not undertaken primarily because the case represents other cases or illustrates a particular trait or problem, but because, in all its particularity and ordinariness, the case itself is of interest. Stake asserts that the purpose of the intrinsic case study is not to understand some abstract construct or generic phenomenon or to build theory, although that is an option (Stake, 1995). Drawing upon the case study categorization presented above, I characterize my study as a descriptive case study, as teacher candidates’ experiences were explored within the real-life context of the teacher education programme. Prior to commencing the study, I drew upon definitions of PCK as presented in the literature to inform the research process. My study followed a selective case study approach, as I investigated participants’ PCK with the intent to build upon current research in the field. Based on Yin’s (2003) work, my study was also an exploratory case study, as the outcomes were not clear prior to commencing the study. I incorporated an intrinsic case study design to illuminate the case of elementary teacher candidates’ PCK as they experienced their science methods course and extended practicum.
I conducted a multiple case study as I explored the teacher candidates’ experiences as well as the teacher education programme, specifically the science methods course and extended practicum. In sum, I employed case study methods and processes in order to collect data. My approach intertwines a descriptive and explanatory multiple case approach that is intrinsically interesting, with the intent of drawing conclusions and insights that I believe are of wider relevance.

Research Methodology

The study employed mixed quantitative and qualitative methods. Quantitative data were collected using survey methods, and qualitative data were collected using phenomenographic methods.

Survey Methods

To collect the requisite information, survey methods were employed. I chose to administer a questionnaire as it is considered an efficient tool for collecting primary data (Statistics Canada, 2010). Surveys can be useful when a researcher wants to collect data on phenomena that cannot be directly observed (such as opinions on library services). According to Statistics Canada (2010), a survey is any activity that collects information in an organised and methodical manner about characteristics of interest from some or all units of a population, using well-defined concepts, methods, and procedures, and compiles such information into a useful summary form. In a survey, researchers sample a population. Busha and Harter (1980) define a population as any set of persons (or objects) who possess at least one common characteristic. To collect the data, I chose self-enumeration to allow the respondent to complete the survey tool without the assistance of an interviewer, to proceed at their own pace, and to have the time and opportunity to consult personal records. I also used Computer-Assisted Self Interviewing (CASI) due to its relative ease of administration and because interviewer-assisted methods can be time consuming.
Questionnaires play a central role in the data collection process (Statistics Canada, 2010). Some of the key advantages of a questionnaire are very pragmatic: data entry, tabulation, and analysis of a questionnaire can be done using a variety of software packages. After researching types of questionnaires, I determined that a structured, non-disguised questionnaire was the best fit for the data collection. I formulated the items in advance and listed them in a pre-arranged order (Siniscalco & Auriat, 2005), which I found to be ideal for a computerized survey. I chose this method because the computer asks the same questions created by the researcher without the varying characteristics of different interviewers and contexts. The computer then records the answers (data) in a form that is easy to edit, tabulate, and use in a statistical software package (Beri, 2007). I chose closed-ended items that involved the respondents selecting one of the possible options. My intention in implementing a non-disguised questionnaire was to inform respondents about the relevance of the study, thus enabling them to provide the desired information accurately (Crisp, 1958). I chose to use a Likert scale, a method of summated ratings (Edwards & Kilpatrick, 1948) consisting of a series of statements to which the respondent indicates their degree of agreement or disagreement. Each response is given a numerical score, which I used to measure responses to the closed-ended questions. In summary, the quantitative data were collected using survey methods. The questionnaire was the tool used to collect the data. The questionnaire in this study was non-disguised and self-enumerated. It was conducted through Computer-Assisted Self Interviewing and consisted of a series of closed-ended items. Participants indicated their responses on a Likert scale.
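To make the summated-ratings scoring concrete, the following sketch shows how a set of closed-ended Likert responses could be converted into per-respondent scores and group statistics of the kind reported for the PCK instrument (a mean and standard deviation). The response values, scale width, and variable names are hypothetical illustrations, not data from the actual questionnaire.

```python
import statistics

# Hypothetical responses: each inner list holds one respondent's
# ratings on a 5-point Likert scale (1 = strongly disagree,
# 5 = strongly agree) across four closed-ended items.
responses = [
    [4, 5, 3, 4],
    [2, 3, 3, 2],
    [5, 4, 4, 5],
]

# Summated rating: each respondent's score is the sum of their
# item ratings (the Edwards & Kilpatrick method of summated ratings).
scores = [sum(person) for person in responses]

mean_score = statistics.mean(scores)
sd_score = statistics.stdev(scores)  # sample standard deviation

print(scores)                                      # [16, 10, 18]
print(round(mean_score, 2), round(sd_score, 2))    # 14.67 4.16
```

In practice, each closed-ended item maps onto one column of such a matrix, and negatively worded items would be reverse-coded before summing.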
Qualitative Methods

The study employed aspects of phenomenographic methodology for qualitative data collection and analysis. The primary phenomenographic method employed to collect qualitative data was the interview. For the purposes of this study, phenomenographic methods were considered suitable since the aim of my study was to understand the various ways in which different participants experienced, perceived, or understood the same phenomenon. Although Sonnemann (1954) first used the term phenomenography, the impetus for the development of phenomenography as a research approach did not occur until the 1970s (Hasselgren & Beach, 1997). Since the 1970s, phenomenography has developed into a distinctive qualitative approach to understanding not only learning but also a broad range of phenomena. The approach has been described by Marton (1981) as "research which aims at description, analysis, and understanding of experiences; that is, research which is directed towards experiential description" (p. 180). Different people will not experience a given phenomenon in the same way; rather, there will be a variety of ways in which people experience or understand that phenomenon (Marton, 1981). As the researcher, I sought to identify the multiple conceptions, or meanings, that this particular group of people, teacher candidates, had for a particular phenomenon or a number of phenomena. Thus, the objects of study in phenomenographic research are the qualitatively different ways in which people experience or make sense of different phenomena in the world around them. The outcome of phenomenographic research is therefore a list or description of the qualitative variation in the ways the sample participants (e.g., students) experience an object of study, a phenomenon, a concept, or an activity (e.g., the study of physics) (Marton, 1986).
In my study, the outcome was the ways teacher candidates experienced their science methods course and the extended practicum, viewed through the lens of PCK. A way of experiencing a phenomenon or concept can be characterized by the dynamic structure of an individual's awareness, and that awareness has both a structural and a referential aspect. Therefore, categories describing variations in how something is experienced will have both structural and referential components, and the categories differ from one another depending on the critical aspects which are discerned and kept in focal awareness simultaneously. Marton and Booth (1997) state, "A way of experiencing something springs from a combination of aspects of the phenomenon being both discerned and presented in focal awareness simultaneously. An aspect is … a dimension of variation" (p. 136). The highest hierarchical category will consist of discerned key aspects which are in focal awareness simultaneously. Whereas low categories may correspond to few or no aspects being discerned, intermediate categories relate to more aspects being discerned and perhaps being used in sequence (Stephanou, 1999). However, in phenomenographic analysis there is no attempt to 'fit' the data into predetermined categories. Some phenomenographic researchers consider that the categories are constructed from the data, and others believe that they are constituted within the data and are therefore discovered (Hasselgren & Beach, 1997; Walsh, 2000). The latter corresponds to my own view, although I began by assuming that a limited number of conceptions and approaches could be found. The data were examined as a whole, and during the analysis I endeavoured to incorporate all aspects of the data. This will be discussed in further detail within the analysis section of this chapter.

Phenomenographic Method

Interviews can be developed according to both the interviewee's conversation and his or her responses to the predetermined questions.
According to Ashworth and Lucas (1998), "(T)he researcher and researched must begin with some kind of (superficially) shared topic, verbalized in terms which they both recognize as meaningful" (p. 299). In this study, when interviewees wanted to further explain their understanding of their experiences, I encouraged them to do so. I asked probing questions such as "Could you explain that further?" or "What is an example?" These types of questions were necessary and useful prompts when respondent explanations were unclear, when I required clarification of meaning (Bruce, 1992; Patton, 1990), or to confirm particular responses. I made it clear to the respondent that the interview was open, and I encouraged the interviewee to think aloud, be doubtful, and also pause. I did not evaluate the answers as being right or wrong, and I conveyed through verbal and non-verbal cues that I was interested in the teacher candidates expressing themselves clearly (Sjöström & Dahlgren, 2002). My goal for the interviews was to focus on the world of the interviewee and to reveal their beliefs, values, reality, feelings, and experience. I employed semi-structured individual interviews, as this format is the preferred method to elicit a broad range of understanding (Ornek, 2008). The aim of the interview was to have the participant reflect on his/her experiences and then relate those experiences in such a way that we came to a mutual understanding about the meanings and the account of the experiences (Orgill, 2002). My intention was to focus on the phenomenon as experienced by the interviewee and to foster a liberal interview structure, a relaxed interpersonal relationship, and a feeling of individual freedom. To achieve the goal of a phenomenographic interview, I adopted an accepting attitude and a relaxed and friendly interview style, and I demonstrated a genuine interest in what the participant had to say.
Summary of Phenomenography

From this methodological perspective, it is irrelevant whether conceptions are considered "correct" or "incorrect" by current standards; the aim is simply to elucidate the different possible conceptions that people have for a given phenomenon in a specified situation. However, the work involves more than simply identifying these conceptions and 'outcome spaces'. The analysis involves looking for their underlying meanings and the relationships between them (Entwistle, 1997). As Åkerlind (2005b) states: "The aim is to describe variation in experience in a way that is useful and meaningful, providing insight into what would be required for individuals to move from less powerful to more powerful ways of understanding a phenomenon" (p. 72). As this study illustrates, one might conduct phenomenographic research to study the qualitatively different ways students approached their learning, and the different ways they perceived their learning environment. In each case an outcome space is developed. The researcher can then examine the two outcome spaces to find the relationship between how students approach their learning and how they perceive their learning environment. This type of relational phenomenographic study has been carried out by a number of researchers (Biggs, 1979; Marton & Säljö, 1997; Ramsden, 1992). In summary, phenomenography describes ways of experiencing a phenomenon in terms of both the features of individual awareness and the dimensions of variation that are discerned and simultaneously focused upon. However, the qualitative part of my study is not a "pure" phenomenography, as it does not fulfil all the conditions of a true phenomenography. For instance, it does not generate categories in a hierarchical order but instead uses the procedures to generate themes.
In other words, phenomenography as used in this thesis serves as a source of tools that I deemed a useful addition to the mixed methods toolkit, enabling my investigation and understanding of the teacher candidates' levels of PCK.

Data Collection Procedures

This study commenced in September 2009 and continued until June 2010 (see Table 1). A combination of quantitative and qualitative approaches was used to gather data from the teacher candidates. The quantitative data were collected using an on-line questionnaire; the qualitative data were gathered through interviews.

Table 1
Schedule of Events

Email sent to One Year Elementary Teacher Candidates inviting participation: September 2, 2009
Round #1 Survey activated: September 14, 2009
Round #1 Survey de-activated: September 28, 2009
Round #2 Students invited to participate: March 5, 2010
Round #2 Survey activated: March 12, 2010
Round #2 Survey de-activated: March 26, 2010
Students Interviewed: April 9 - April 30, 2010
Round #3 Survey activated: June 18, 2010
Round #3 Survey de-activated: July 2, 2010
Students Interviewed: July 15 - August 20, 2010

Quantitative Data Collection

To determine the teacher candidates' levels of PCK, to establish a quantitative baseline, and to monitor change through two interventions, a diagnostic questionnaire was employed. All of the teacher candidates in the 12-month programme were invited via an email (Appendix A) to participate in the study on September 2, 2009. To ensure only the teacher candidates were able to participate, a password for access was included in the invitational email. Each teacher candidate who logged into the survey when it opened on September 14, 2009 for Round One was given a unique identity that was used solely for tracking question responses. Upon closure of the survey on September 28, 2009, the total number of participants was 114 (N=114).
These same 114 candidates were invited through email to respond to the same questionnaire on two more occasions: Round Two on March 5, 2010 and Round Three on June 11, 2010. A key strength of questionnaires is that they allow the researcher to ask participants about their activities directly rather than relying on outside observation (Ulmer & Wilson, 2003). The diagnostic tool used in this study was an adaptation of an instrument created by Schmidt, Baran, Thompson, Koehler, Shin and Mishra (2009). The original examined Technological Pedagogical Content Knowledge (TPCK) and was adapted for this study through the substitution of "science" for "technology." A copy of the instrument is provided in Appendix B. The instrument is based on a Likert scale assessment consisting of 57 items, with each response scored on a range of -5 to +5. The questionnaire was web-based and accessed through a URL. Figure 2 is an example of a page of items from the instrument.

Figure 2. Example page of items from the instrument.

According to Likert (1932, 1967), Lankford (1994) and Veal (1998), the advantages of the Likert method are as follows:
1. the method is based entirely on empirical data regarding subjects' responses rather than the subjective opinions of judges;
2. the method produces more homogeneous scales and increases the probability that a unitary attitude is being measured; as a result, validity (construct and concurrent) and reliability are reasonably high; and
3. the method is easier to prepare.
An 11-point Likert scale was suitable because odd-numbered scales allow respondents to express a neutral option, which is impossible in even-numbered scales (Callero, 1992). The 11-point scale corresponded to values between -5 and +5. Thus, neutral characteristics are assigned a value of 0, while characteristics that are strongly identified with the role are assigned values approaching -5 or +5.
Because these items were used to assess students' PCK, which is a non-zero construct, I transformed this scale into one without a zero. In terms of PCK, a negative value or a zero is illogical: it makes sense to consider the teacher candidates (TCs) as having some awareness and knowledge of PCK upon entering the programme. This argument is based upon the fact that these candidates have experienced some form of teaching (e.g., peer teaching and tutoring) as well as being taught by teachers throughout their schooling. The choice of values was arbitrary but consistent. To simplify the statistical calculations, the range of -5 through 0 to +5 was transformed into the values shown in Figure 3.

-5 Strongly Disagree = 1
-4 = 2
-3 = 3
-2 = 4
-1 = 5
 0 Agree/Disagree = 6
+1 = 7
+2 = 8
+3 = 9
+4 = 10
+5 Strongly Agree = 11

Figure 3. Questionnaire rating scale.

A brief explanation of the questionnaire and consent to participate (Appendix A) were included in the questionnaire. The data were automatically stored in a database and were analyzed using the Statistical Package for the Social Sciences (SPSS), Version 19 (IBM, 2010). The results are presented and discussed in Chapter 4. Ulmer and Wilson (2003) identify a number of conditions for the use of statistical analysis in research, including:
1. using quantitative analysis as a precursor to qualitative study of social processes;
2. recognizing that quantitative analysis is not the same thing as empiricism;
3. recognizing that actors have agency, and that variables do not;
4. recognizing that causality lies in interpretive processes, not in variables or statistical models; and
5. maximizing validity by minimizing the conceptual distance between the quantitative measurements and the phenomena being measured.
This research project adheres to these principles in the following ways.
The first round of the study allows the quantitative round to provide input to a richer, qualitative round in which nuanced understandings are developed. Thus, statistical analyses of the quantitative findings from the first round do not attempt to establish causality but rather identify areas for further empirical investigation during the qualitative round. Likewise, using the quantitative round to seed the qualitative round allows agency to be investigated qualitatively rather than attributed to the outcomes of statistical tests.

Qualitative Data Collection

To gain a more in-depth understanding of the teacher candidates' understandings of pedagogical content knowledge, semi-structured 45-minute interviews were conducted with selected teacher candidates 14 days after the de-activation of Round Two, which followed their methods course, and 13 days after the de-activation of Round Three, which followed the extended practicum, as shown in Table 1. All of the teacher candidates in the Elementary Teacher Education Programme were invited to participate in the interviews at the beginning of the year, and 14 teacher candidates were randomly selected from the 114 who volunteered. The interviews were recorded and transcribed. The interview questions are included in Appendix C.

Reliability and Validity

In my research, I employed triangulation (Mathison, 1988) of multiple data sources to increase the credibility and validity of the results. According to O'Donoghue and Punch (2003), triangulation is a "method of cross-checking data from multiple sources to search for regularities in the research data" (p. 78). I employed Denzin's (1978) methodological triangulation, which involves using more than one method to gather data. The data in my research came from a survey deployed on three separate occasions and from the qualitative interviews. Throughout the study I emphasized my independent status as researcher.
All teacher candidates were informed through an email (Appendix A) that I was not involved in any of their courses, course assessment, or evaluation. Participants were therefore encouraged to contribute ideas and talk about their experiences without fear of losing credibility in the eyes of course instructors, other faculty, or administration in Education at the university. It was made clear to all participants in an email (Appendix A) that they had the right to withdraw from the study at any point and were not required to disclose an explanation to the investigator.

Reliability and Validity of the Instrument

Reliability and validity are the fundamental criteria for judging the quality of measures used in collecting research data (Thomas & Hult, 2010). The definitions of reliability and validity in quantitative research reveal two strands: the first addresses reliability and whether the result is replicable; the second addresses validity and whether the instruments' measurements are accurate and whether the instruments are actually measuring what they are intended to measure. Kirk and Miller (1986) state that there are three types of reliability referred to in quantitative research: (1) the degree to which a measurement given repeatedly remains the same, (2) the stability of a measurement over time, and (3) the similarity of measurements within a given time period. However, Joppe (2000) notes a problem that can make the test-retest method, to a certain degree, unreliable: the test-retest method may sensitize the respondent to the subject matter and hence influence the responses given. The degree of sensitization is thought to be small given the time frame between survey rounds and the intensity of the teacher education programme.
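Internal consistency of a multi-item instrument of this kind is commonly summarized with Cronbach's alpha, reported for each survey round later in this chapter. The following is a minimal sketch of that computation on invented item scores; the function name and data are illustrative, not the study's.

```python
# Sketch of Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances /
# variance of total scores). Toy data only; not the study's responses.
from statistics import pvariance

def cronbach_alpha(rows):
    """rows: list of respondents, each a list of item scores."""
    k = len(rows[0])                         # number of items
    cols = list(zip(*rows))                  # item-wise columns
    item_var = sum(pvariance(c) for c in cols)
    total_var = pvariance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - item_var / total_var)

# Four hypothetical respondents answering four items; scores that move
# together across respondents yield a high alpha.
data = [
    [4, 5, 4, 5],
    [2, 3, 2, 3],
    [5, 5, 4, 4],
    [1, 2, 2, 1],
]
alpha = cronbach_alpha(data)  # close to 1.0 for these consistent rows
```

An alpha in the region of .8 or above is conventionally read as acceptable internal consistency, which is the interpretation applied to the values reported below.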
Hammersley (1989) views validity as "(a)n account that is valid or true if it represents accurately those features of the phenomena that it is intended to describe, explain, or theorize" (p. 69). Validity of the questionnaire was ensured by adapting an existing survey conceived and tested by Schmidt, Baran, Thompson, Koehler, Shin and Mishra (2009). However, the concepts of reliability and validity are viewed differently by qualitative researchers, who consider these concepts, as defined in quantitative terms, inadequate. In other words, these terms as defined in quantitative terms may not apply to the qualitative research paradigm. The question of replicability of results does not concern qualitative researchers (Glesne & Peshkin, 1992). Precision (Winter, 2000) and credibility (Hoepfl, 1997) together provide the lens to evaluate the findings of qualitative research. To track the effect of the science methods course and the extended practicum on the teacher candidates, the same questionnaire was administered on three separate occasions (rounds): on admission to the programme to establish baseline knowledge, after the science methods course, and after the extended practicum. At each round, the Cronbach's alpha (α) reliability of the questionnaire scores was determined, as shown in Table 2. The reliability of a questionnaire refers to its ability to be consistent, stable, and reproducible. Internal consistency was measured by statistical procedures.

Table 2
Cronbach's alpha for Round One, Round Two and Round Three

Survey Round: Cronbach's alpha (α)
Round One: .94
Round Two: .95
Round Three: .85

Reliability of the survey was ensured by adapting the existing validated survey by Schmidt, Baran, Thompson, Koehler, Shin and Mishra (2009). According to Schmidt et al.
(2009), their instrument scored a Cronbach's alpha of .83, comparable to the .94 scored on the instrument in Round One, .95 in Round Two, and .85 in Round Three of this study. The study by Chai, Koh, and Tsai (2010), which used an adaptation of the Schmidt et al. (2009) instrument, scored a Cronbach's alpha of .93. Based on the comparable Cronbach's alpha values, the instrument used in this study is reliable.

Reliability and Validity in a Phenomenographic Study

Reliability in phenomenographic studies often revolves around the researchers' interpretive awareness, or how interpretations have been controlled and checked throughout the research process (Sandberg, 1997, 2005). Åkerlind (2005) recognizes two types of reliability checks that are commonly used in qualitative research involving answers to open questions: (1) coder reliability (interjudge or interrater reliability), and (2) dialogic reliability. Coder reliability is the reliability measure used most often in phenomenographic research. In principle, it is an indicator of the agreement between independent judges. The procedure is to allocate categories to the transcripts independently, after which the categorizations of the two judges are compared and the level of agreement is determined. Dialogic reliability is a measure of agreement between researchers "reached through discussion and mutual critique of the data and of each researcher's interpretive hypotheses" (Åkerlind, 2005, p. 331). Sandberg (1997, 2005) proposes three phases in the phenomenographic process where communicative validity is relevant: (i) within the interviews, communicating with the subjects; (ii) in the analysis process, communicating with the text; and (iii) in communicating the results to other researchers and professionals.
For the first phase of communicative validity, within the interviews, subjects are informed prior to the interview that the researchers are interested in their experiences of the aspect of the world under investigation. They are also informed that there are no right or wrong answers, and that no personal judgments will be made about what is discussed. Beginning with this phase develops a joint understanding between subject and researcher about what is discussed in the interview (Sandberg, 1997, 2005). The second phase, during the analysis process, involves focusing on each transcript as a whole rather than extracting parts of a transcript and analyzing each part out of context. This focus is maintained by looking at the similarities and differences between whole transcripts, especially where a particular statement taken out of the transcript may appear to fit into one category but, when seen within the whole transcript, fits into another category (Sandberg, 1997, 2005). The third phase of communicative validity involves obtaining feedback from other researchers and professionals. Hamer and van Rossum (2010) expand this to include three phases of validity: communicative, pragmatic, and convergence of outcomes. Communicative validity involves the persuasiveness of researchers regarding the appropriateness of their research methods and final interpretations as judged by the relevant research community (Hamer & van Rossum, 2010). Pragmatic validity is favoured by many phenomenographic researchers (Hamer & van Rossum, 2010) as the most important test for their research results: "The extent to which the research outcomes are seen as useful… and… are meaningful to their intended audience" (Åkerlind, 2005, p. 331).
Convergence of outcomes validity is, as Åkerlind (2005) says, "widely regarded as the extent to which a study is seen as investigating what it aimed to investigate, or the degree to which the research findings actually reflect the phenomenon being studied" (p. 330). Reliability was achieved in this study through the employment of two colleagues who acted as judges and independently reviewed the transcripts and allocated categories. After each judge's review, once the categories were determined, a consensus was reached through discussion and critique of each person's position; the categorizations of the two judges were compared, and the level of agreement was presented (Hamer & van Rossum, 2010). Validity of this study was supported by asking for volunteers for interviews and providing an outline of the potential questions to be asked. During the analysis process, transcripts were treated with a focus on similarities and differences. The pragmatic and convergence forms of validity are future dependent.

Data Analysis

This study's design is a pre-post study: it examines whether the teacher candidates change during the course of an intervention (the methods class and the practicum) and then attributes any such improvement or deterioration to the intervention.

Quantitative Data Analysis

The data from the quantitative instrument were analyzed using the statistical software SPSS Version 19 through descriptive statistics, one-way analysis of variance (ANOVA), repeated measures, and factor analysis.
The components were characterized by clusters of items, and these clusters were tracked in subsequent questionnaire data sets to determine changes which could be attributed to the possible effect of the interventions. Descriptive statistics (mean, standard deviation, and correlation coefficient) were used to describe the data collected. Because the same subjects were measured with the same instrument on three occasions, a repeated measures analysis of variance (ANOVA) was used to test whether or not change had occurred over time. Like other ANOVAs, a repeated measures ANOVA requires that the dependent variable be normally distributed. This is tested by computing skewness and kurtosis, each of which should fall between -1.0 and +1.0 to satisfy the assumption. There is widespread consensus that violations do not seriously affect the result (Hinkle, Wiersma & Jurs, 2003; Howell, 2002; Pagano, 2004). In repeated measures ANOVA, the same subjects are used at each time period, so there is an expectation that the measures will have similar correlations across subjects and across time periods. This is the principle of sphericity. For this study, Mauchly's test of sphericity and its significance were examined. If the sphericity assumption is violated, the degrees of freedom are corrected using the Greenhouse-Geisser estimate of sphericity. Eta squared indicates the magnitude of the effect, partialling out other factors; Cohen (1992) suggests that 0.1 is a small effect, 0.25 a medium effect, and 0.4 a large effect. To investigate the underlying factor structure of the scale at the different times, exploratory factor analysis was performed on the data (Kim & Mueller, 1978; SPSS, 1998; Stevens, 1996; Tabachnik & Fidell, 2001). I expected the factors to be correlated with one another; therefore, I employed a promax (oblique) rotation (Marcoulides & Hershberger, 1997; Tabachnik & Fidell, 2001).
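The repeated measures procedure just described can be sketched numerically. The following is a minimal, hand-computed illustration of a one-way repeated measures ANOVA on invented data (four subjects, three rounds); the function name and the toy scores are my own assumptions, and it omits the Mauchly sphericity test and Greenhouse-Geisser correction that SPSS applies.

```python
# Sketch of a one-way repeated-measures ANOVA: partition total variation
# into subject, time, and error components, then compute F for the time
# effect and partial eta squared as an effect size. Toy data only.
def rm_anova(data):
    """data: list of subjects, each a list of scores, one per time point."""
    n, k = len(data), len(data[0])
    grand = sum(sum(s) for s in data) / (n * k)
    subj_means = [sum(s) / k for s in data]
    time_means = [sum(col) / n for col in zip(*data)]
    ss_total = sum((x - grand) ** 2 for s in data for x in s)
    ss_subj = k * sum((m - grand) ** 2 for m in subj_means)
    ss_time = n * sum((m - grand) ** 2 for m in time_means)
    ss_error = ss_total - ss_subj - ss_time
    df_time, df_error = k - 1, (n - 1) * (k - 1)
    f = (ss_time / df_time) / (ss_error / df_error)
    eta_sq = ss_time / (ss_time + ss_error)   # partial eta squared
    return f, eta_sq

# Four hypothetical subjects measured on three occasions (e.g., rounds).
scores = [
    [6, 7, 8],
    [7, 8, 8],
    [5, 7, 9],
    [6, 8, 9],
]
f_stat, eta = rm_anova(scores)
```

Because each subject serves as their own control, the subject-to-subject variation (ss_subj) is removed from the error term, which is what gives the repeated measures design its sensitivity to change over time.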
I applied four criteria suggested in the literature to extract factors:
1. only factors with an eigenvalue higher than 1.0 were accepted as common factors;
2. only factors that appeared above the "elbow" of the scree plot were retained; the rest were rejected;
3. the proportion of variance accounted for by a given factor had to be greater than 5%; and
4. each factor had to have at least three items with loadings greater than 0.10, and these items had to measure the same construct.
A scree plot was used to display the eigenvalues for successive factors. Cattell (1966) proposed that a scree plot can be used to graphically determine the optimal number of factors to retain. The scree test involves finding the place where the smooth decrease of eigenvalues appears to level off to the right of the plot. To the right of this point, presumably, we find only "factorial scree"; "scree" is the geological term referring to the debris that collects on the lower part of a rocky slope. Thus, no more than the number of factors to the left of this point should be retained. Scree plots can be difficult to assess, and it is not clear from the literature where the cut-off should be (Pett, Lackey & Sullivan, 2003). Cattell (1966) originally suggested that the first factor on the straight line be the cut-off factor; later, he revised this rule to the number of factors that appear prior to the beginning of the straight line (Cattell & Jaspars, 1967). Several authors (Gorsuch, 1983; Tabachnick & Fidell, 2001) suggest that if the cut-off for the number of factors is unclear, the researcher might find it useful to undertake several factor analyses with different numbers of factors specified.

Phenomenographic Data Analysis

During data analysis in phenomenographic research, the researcher identifies qualitatively separate categories that describe the ways in which different people experience a given concept. Sjöström (2002) stated that the analysis includes seven steps.
1. Familiarization: the researcher becomes familiar with the material by reading through the transcripts; this step is also important for making corrections to the transcripts.
2. Compilation of participants' answers to a certain question: the researcher identifies the most significant elements in the participants' answers.
3. Condensation, or reduction, of the individual answers to find the central parts of a dialogue.
4. Preliminary grouping or classification of similar answers.
5. Preliminary comparison of categories.
6. Naming of the categories.
7. Contrastive comparison of categories, including a description of the character of each category as well as the similarities between categories.
Another view, from Orgill (2002), is that in analyzing data the researcher examines the transcriptions of participants' responses, looking for not only similarities but also differences between them. During this process, the researcher develops initial categories that describe different people's experiences of the given phenomenon. After covering multiple aspects of that phenomenon, the researcher develops categories that explain different variations in the data. Then, based on the initial categories, the researcher re-examines the transcripts to determine whether the categories are sufficiently descriptive and indicative of the data. This second review of the data modifies, adds, or deletes category descriptions, and the next examination of the data is reviewed for internal consistency of the categories of description. This process of modification and data review continues until the modified categories seem to be consistent with the interview data. In phenomenographic research, data are generated using methods that allow openness and variation in responses (Bowden, 2000).
In this study, data were obtained by means of audio-taped individual interviews using phenomenographic interview technique (Dahlgren, 1993). Each transcript was read and re-read multiple times to enable me to become familiar with the data. Responses from a number of participants were then pooled (Marton, 1981, 1986, 1988). Key elements that characterize teacher candidates' experiences of the phenomenon were identified across all transcripts (Bowden, 2000). The ways of experiencing the relationships among the key elements for each teacher candidate were compared among and within transcripts through an ongoing iterative process to derive the variation of categories of understanding and the underlying meaning of the phenomenon (Bowden, 2000). With each iteration through the transcripts, I identified and refined the most meaningful ways of experiencing and understanding. I then reduced the variation to a limited set of categories that depicted significant differences in ways of construing the phenomenon (Marton, 1981, 1986, 1988). Rigour was established in the study by following the tenets of the phenomenographic approach (Sandberg, 1997; Bowden, 2000). I incorporated a decision trail, allowing the reader to follow the steps taken during data collection, analysis, and the development of categories of description (Sandberg, 1997). I took into consideration that in phenomenographic research, replication of categories of description as findings is not considered necessary (Marton, 1988). Another researcher, or in the case of this study, an instructor working with teacher candidates, may recognize the categories of description once they have been identified (Marton, 1988).

Ethical Considerations

To ensure my study followed appropriate ethical guidelines, I used individual committee members' feedback, in consultation with my principal advisor, and successfully applied to the UBC Behavioural Research Ethics Board (BREB) for approval.
In accordance with BREB procedures, all participants received a "Consent to Participate" letter identifying the principal investigator(s) and outlining the conditions for participating in and withdrawing from the study. To ensure anonymity and to maintain privacy and confidentiality, pseudonyms were used for all participants. An e-mail (Appendix A) inviting the students to participate in the survey was sent to the Teacher Education Programme Planning Manager with a request to forward the email to all One Year Programme students in the 2009/2010 Elementary Teacher Education Programme. The email contained a link to the online survey and the letter of consent. The PCK Study Proposal information e-mail contained a contact email and phone number so prospective respondents could request additional information about the study. The online questionnaire (Appendix B) was installed on my home Department's secure server located in the Faculty of Education. During data collection, data were retained on the secure server http://www.xxxxxx.ca (disguised). All responses were confidential and anonymous, and IP addresses were not collected. All data were stored on a password-protected computer, and documents were stored in a locked filing cabinet.  Limitations of the Study  Throughout this research study, I made decisions about what to do and when, what data to collect, and what lens to bring to my analyses. I acknowledge that in this type of research, personal bias is inevitable (Glaser, 1992; Janesick, 2004; Strauss & Corbin, 1998). Hence my study's findings are influenced by my own beliefs and experiences associated with the teacher candidates. While I was aware of these influences and recognized this limitation, I endeavoured to make my account of the teacher candidates as authentic and unbiased as possible. Teacher candidates were invited to volunteer for the interviews, and the interviewees were selected from these volunteers.
This approach may have introduced a selection bias: teacher candidates who volunteered may have been particularly interested in the topic of the study and thus may differ in characteristics from the overall study sample. Bias may also have occurred amongst those selected, as the teacher candidates may have answered the interview questions the way they believed they should in their role as study participants. Known as the Hawthorne Effect, this refers to how participants' behaviours can change when they become engaged in a research study. However, the Hawthorne Effect has been shown to be minimal when feedback is absent, as it was in this study (Steele-Johnson, Beauregard, Hoover & Schmidt, 2000). Keeping my researcher limitations in mind, I made every attempt, to the best of my ability, to remain unbiased and neutral regarding the choice of questions and the interpretation of the questionnaire results. I acknowledge that the questionnaire does not take into account the 'uniqueness' of each participant's ability to interpret their experiences and construct their own meanings. The complexity of individual experience is a limitation of this study, specifically regarding the questionnaire, because the 'human' variable cannot be eliminated completely. I also recognize that my interpretation of the questionnaire results is specific to this study and may not be generalized to other contexts. As I constructed, administered, and analysed the questionnaire, I made every attempt to remain as unbiased and neutral as possible, keeping my personal and researcher biases in mind. This study was conducted over one year, with only elementary teacher candidates, in one language, at one university. Therefore the findings may not be representative of teacher candidates in other years or of those at similar levels or institutions.  Summary  This research project was conducted in three rounds using both quantitative and qualitative methods.
The case study was designed to generate both quantitative data, which were analyzed using factor analysis complemented by descriptive statistics, and qualitative data, which were elicited through interviews and analyzed phenomenographically. Elaboration of the analyses is presented in Chapter 4.

CHAPTER FOUR: QUESTIONNAIRE DATA ANALYSIS, RESULTS AND DISCUSSION

This chapter reports on the analyses of questionnaire data in answer to the research questions:

1. What are the elementary science teacher candidates' levels of PCK: (a) at entry into the teacher education programme, (b) immediately after completing their science methods course, and (c) after completing their school practicum?
2. How does the collective elementary teacher candidates' pedagogical and science content knowledge evolve as they participate in the science teaching methods course and extended practicum?

These questions were investigated through the administration of a specially adapted, validated PCK questionnaire and individual interviews at significant points during the teacher candidates' experience in their science methods course and extended practicum. The quantitative questionnaire data were collected at three points during the teacher candidates' experience in the teacher education programme. These points, or "Rounds," have been defined as: Round One – questionnaire and related interview data gathered at the point of entry into the Teacher Education Programme; Round Two – questionnaire and related interview data gathered at the end of the science methods course; and Round Three – questionnaire and related interview data gathered at the end of the practicum experience. This chapter therefore focuses on the questionnaire data analysis from which dominant/principal PCK factors or components were determined and interpreted.
These factors were later compared for correspondence with the emergent themes from the phenomenographic analysis of the qualitative data in Chapter 5.

Questionnaire Data Analysis

The questionnaire shown in Table 3, which was adapted from Schmidt et al. (2009), comprised 57 items with ratings ranging from Strongly Disagree (-5) through Neither Disagree nor Agree (0) to Strongly Agree (+5).

Table 3 Pedagogical Content Knowledge (PCK)

1. I know science.
2. I have the scientific skills I need to use science.
3. I keep up with important new science discoveries.
4. I know how to solve my own problems using scientific processes.
5. I can learn science easily.
6. I frequently play around with science.
7. I have had sufficient opportunities to work with science.
8. I can use a mathematical way of thinking.
9. I can use a literary way of thinking.
10. I can use a scientific way of thinking.
11. I can use a historical way of thinking.
12. I have various ways and strategies of developing my understanding of mathematics.
13. I have various ways and strategies of developing my understanding of language literacy.
14. I have various ways and strategies of developing my understanding of science.
15. I have various ways and strategies of developing my understanding of social studies.
16. I know about various examples of how mathematics applies in the real world.
17. I know about various examples of how language literacy applies in the real world.
18. I know about various examples of how science applies in the real world.
19. I know about various examples of how social studies applies in the real world.
20. I have sufficient knowledge about mathematics.
21. I have sufficient knowledge about language literacy.
22. I have sufficient knowledge about science.
23. I have sufficient knowledge about social studies.
24.
I can use a wide range of teaching approaches in a classroom setting (collaborative learning, direct instruction, inquiry learning, problem/project-based learning, etc.).
25. I can adapt my teaching style to different learners.
26. I know how to assess student performance in a classroom.
27. I am familiar with common student understandings and misconceptions.
28. I can assess student learning in multiple ways.
29. I can adapt my teaching based upon what students currently understand or do not understand.
30. I know how to organize and maintain classroom management.
31. I know that different mathematical concepts do not require different teaching strategies.
32. I know that different language literacy concepts do not require different teaching strategies.
33. I know that different science concepts do not require different teaching strategies.
34. I know that different social studies concepts do not require different teaching strategies.
35. I know how to select effective teaching strategies to guide student thinking and learning in mathematics.

Table 3 (cont'd)

36. I know how to select effective teaching strategies to guide student thinking and learning in language literacy.
37. I know how to select effective teaching strategies to guide student thinking and learning in science.
38. I know how to select effective teaching strategies to guide student thinking and learning in social studies.
39. I know what sciences I can use for understanding and doing mathematics.
40. I know what sciences I can use for understanding and doing language literacy.
41. I know what sciences I can use for understanding and doing science.
42. I know what sciences I can use for understanding and doing social studies.
43. I have the scientific skills I need to use science appropriately in teaching.
44. I can adapt the science I am learning about to different teaching activities.
45. I am thinking critically about how to use science in my classroom.
46.
I have the classroom management skills I need to use science appropriately in teaching.
47. My teacher education programme has caused me to think more deeply about how science could influence the teaching approaches I use in my classroom.
48. I can choose the sciences that enhance the teaching approaches for a lesson.
49. I can choose sciences that enhance students' learning for a lesson.
50. I can teach lessons that appropriately combine mathematics, science and teaching strategies.
51. I can teach lessons that appropriately combine language literacy, science and teaching strategies.
52. I can teach lessons that appropriately combine science, science and teaching strategies.
53. I can teach lessons that appropriately combine social studies, science and teaching strategies.
54. I can select the sciences to use in my classroom that enhance what I teach, how I teach and what students learn.
55. I can use strategies that combine content, the sciences and teaching approaches that I learned about in my coursework in my classroom.
56. I can provide leadership in helping others to coordinate the use of content, the sciences and teaching approaches at my school.
57. I can choose the sciences that enhance the content for a lesson.

Figure 4 presents the questionnaire rating scale. Given that 0 does not mean a zero level of PCK, the scores were transformed to avoid zero values during the statistical analysis.

-5 (Strongly Disagree) = 1
-4 = 2
-3 = 3
-2 = 4
-1 = 5
0 (Neither Agree nor Disagree) = 6
+1 = 7
+2 = 8
+3 = 9
+4 = 10
+5 (Strongly Agree) = 11

Figure 4 Questionnaire Rating Scale

Thus, the values were transformed as shown in Figure 4 prior to the statistical analysis. The first analysis of the data examined whether the teacher candidates as a single unit changed over time. The statistical software SPSS version 19 (IBM, 2010) was used to perform descriptive analysis and a one-way repeated measures ANOVA. It is worth pointing out that different people understand PCK differently.
It thus makes sense to argue that the nature of teacher candidates' PCK depends on what they have previously experienced, including when they were students. This argument is similar to the one Anderson and Nashon (2007) make with regard to metacognition. The authors made a case for the various definitions of metacognition being, in fact, components, perspectives, or dimensions of metacognition. In the same vein, a case can be made for PCK, which seems to be understood and defined in similarly varied ways. On the basis of this understanding, it was prudent to determine possible dominant variations or components of the teacher candidates' PCK. There was a possibility of an unlimited number of components of PCK among the participating teacher candidates. For the purpose of this study, I highlighted the most dominant components at entry into the programme and studied the effect of the science methods course and practicum on these components. To determine these components, I used SPSS version 19 and performed factor analysis on the first questionnaire data set. Guided by the principles of factor determination, the dominant factors were interpreted and described. The data were also described using descriptive statistics, and differences in means across time were examined using the one-way repeated measures ANOVA. The components were characterized by clusters of items, and these clusters were tracked in subsequent questionnaire data sets to determine possible changes which could be attributed to the effect of the interventions. The polarities of the scores for Questions 31 through 34 were reversed because the items were negatively worded. This is consistent with procedures in which some questions in an instrument are framed in ways that require first transforming the items to have the same orientation (DeCoster, 2004).  Rounds One, Two and Three  The term Round here refers to the time the same survey was administered to the same group of teacher candidates.
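The score transformation and reverse coding described above can be sketched as follows; this is an illustrative Python snippet (the study itself used SPSS), and `prepare_item` is a hypothetical helper name:

```python
# Items 31-34 were negatively worded, so their transformed scores are reversed.
NEGATIVE_ITEMS = {31, 32, 33, 34}

def transform_score(raw):
    """Shift a raw -5..+5 rating onto the 1..11 scale used in the analysis
    (raw -5 -> 1, raw 0 -> 6, raw +5 -> 11), avoiding a zero value."""
    return raw + 6

def reverse_code(transformed):
    """Reverse the polarity of a negatively worded item on the 1..11 scale
    (1 <-> 11, 2 <-> 10, ..., 6 stays 6)."""
    return 12 - transformed

def prepare_item(item_number, raw):
    """Transform a raw rating, reversing it if the item is negatively worded."""
    score = transform_score(raw)
    return reverse_code(score) if item_number in NEGATIVE_ITEMS else score
```

For example, a Strongly Agree (+5) on negatively worded item 33 becomes a 1 after transformation and reversal, so that higher scores consistently indicate higher PCK.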
Therefore Rounds One (T1), Two (T2) and Three (T3) refer, respectively, to the first, second, and third time (T) the same survey was administered to the same teacher candidates (N=114) (Appendix B) during their experience in the science teaching methods course and extended practicum within the 12-Month Teacher Education Programme.

Retrospective

Retrospectives were employed in the study to examine hypothetical comparisons for further insight. A retrospective is taken into account when dominant components in one round do not dominate in another round but still play a role in the teacher candidates' PCK. For example, findings from Round Two (T2) would normally be compared only to the findings in Round Three (T3), as the study examined changes in the teacher candidates' PCK. In retrospect, the findings from Round Two could also be compared with the findings in Round One (T1), thereby giving insight into the possibility of change occurring earlier than when the findings from Round Two (T2) are compared with Round Three (T3).

Rounds One (T1), Two (T2) and Three (T3) - Analysis

During the 12-month period, data were collected from the teacher candidates three times: initially upon entry into the programme, at the end of the science methods course, and after the practicum. Table 4 presents the descriptive statistics of these data, which include the Means, Standard Deviations, and Standard Error Means for the same group (N=114) at T1, T2 and T3. The mean increased from T1 to T2 and then decreased at T3 to slightly below its T1 level. However, the students seem to have held more similar views at the end of the practicum, as shown by the smaller standard deviation.

Table 4 Overall Scores: Number, Mean, Standard Deviation and Standard Error Mean

Time   N     Mean     Std. Deviation   Std. Error Mean
T1     114   7.1431   1.07794          .10096
T2     114   8.0719   1.03280          .09673
T3     114   7.0783   .72437           .06784

A one-way repeated measures ANOVA was used to examine whether there were any significant differences in the mean scores on the PCK questionnaire across time (i.e., preconceptions, after the methods course, and after the practicum). Mauchly's test of sphericity indicated that the assumption of sphericity was violated, χ2(2) = 13.4, p = .001 (Table 5), and therefore a Greenhouse-Geisser correction (ε = .89) was used, as shown in Table 6.

Table 5 Overall Scores: Mauchly's Test of Sphericity

Within Subjects Effect   Mauchly's W   Approx. Chi-Square   df   Sig.
Survey Rounds            .887          13.391               2    .001

Table 6 Overall Scores: Tests of Within-Subjects Effects

Source                                Type III Sum of Squares   df        Mean Square   F        Sig.
Survey Rounds   Sphericity Assumed    70.448                    2         35.224        33.612   .000
                Greenhouse-Geisser    70.448                    1.797     39.193        33.612   .000
                Lower-bound           70.448                    1.000     70.448        33.612   .000
Error (Survey)  Sphericity Assumed    236.839                   226       1.048
                Greenhouse-Geisser    236.839                   203.111   1.166
                Lower-bound           236.839                   113.000   2.096

There was a significant effect, and the mean scores were statistically significantly different, F(1.797, 203.111) = 33.612, p < .0005. The partial eta squared in Table 7 reveals the magnitude of the effect. Cohen (1992) suggested effect sizes for various indexes where 0.1 is a small effect, 0.25 is a medium effect and 0.4 is a large effect. In this case the effect is small at 23%.

Table 7 Overall Scores: Partial Eta Squared

Source                               Sig.   Partial Eta Squared
Survey Rounds   Sphericity Assumed   .000   .229
                Greenhouse-Geisser   .000   .229
                Lower-bound          .000   .229

The ANOVA results indicated an overall significant difference in the means. Table 8 provides the results of the Bonferroni post-hoc test which determined the differences.
There was a significant difference in survey responses between T1 and T2 (p = .000) and between T2 and T3 (p = .000), but no significant difference between T3 and T1 (p = 1.000).

Table 8 Overall Scores: Pairwise Comparisons, Bonferroni Post-hoc

(I) Survey Rounds   (J) Survey Rounds   Mean Difference (I-J)   Std. Error   Sig.
T1                  T2                  -.929*                  .154         .000
T2                  T3                  .994*                   .115         .000
T3                  T1                  -.065                   .135         1.000

Finding One

The teacher candidates experienced a significant change from Round One (T1) to Round Two (T2), and from Round Two (T2) to Round Three (T3). The teacher candidates did not experience a significant change from Round One (T1) to Round Three (T3).

Factor Analysis

To investigate the underlying factor structure at the different times (T1, T2 and T3), exploratory factor analysis was performed on the data (IBM, 2010; Kim & Mueller, 1978; Stevens, 1996; Tabachnik & Fidell, 2001). I expected the factors to be correlated with one another; therefore I employed a promax (oblique) rotation (Marcoulides & Hershberger, 1997; Tabachnik & Fidell, 2001). I applied four criteria, as suggested in the literature, to identify the factors: 1) only factors with an eigenvalue higher than 1.0 were accepted as common factors; 2) only factors that appeared above the "elbow" of the scree plot were retained, and the rest were rejected; 3) the proportion of variance accounted for by a given factor had to be greater than 5%; and 4) each factor had to have at least three items with loadings greater than 0.10, and these items had to assess the same construct.

Factor analysis of the data is presented as factors or components of PCK. Naming of the factors involved discerning the items that were representative of all the items loaded on the factor. This was informed by my background as a science teacher, as well as the literature reviewed.
The factor naming was further fine-tuned by insights from interviewing the students about their science teaching and learning experiences.  75  Factor Analysis of Round One (T1) Factor Analysis of the initial data (at T1) identified three interpretable factors of PCK as shown in the scree plot in Figure 5.  Figure 5 Round One(T1): Scree Plot  The items and their factor loadings are shown in Table 9.  76  Table 9 Round One (T1): Factors/Components Round One (T1): Factors/Components  .704 .685 .658 .651 .650 .635 .625 .608 .588 .581 .576 .574 .560 .556 .554 .553 .553 .552 .550 .535 .525 .522 .520 .518 .517 .516 .513 .510 .508 .507 .484 .483 .482 .473 .472 .463 .458 .457 .449 .443 .431 .430 .429 .427 .400  Components 2 -.052 .005 .341 .308 -.171 .040 -.083 .155 -.329 .283 -.175 .033 .069 .304 -.102 -.293 .090 -.397 -.353 -.356 -.233 .317 -.190 -.319 -.094 .329 -.342 .270 .091 -.248 -.300 -.243 -.391 -.092 .428 -.348 -.348 -.125 -.024 -.086 -.052 -.390 .337 .381 .379  Question_9 Question_23 Question_13 Question_22 Question_17 Question_53  .298 .410 .485 .396 .358 .436 .333  .582 .581 .573 .565 .547 .510 -.432  .006 .096 .029 .133 -.250 .253 .155  Question_32 Question_25 Question_31 Question_34 Question_8  .347 .423 .376 .415 .416  -.454 .364 -.447 -.391 .340  .489 .461 .351 .300 -.242  Question # Question_1 Question_28 Question_14 Question_10 Question_37 Question_26 Question_2 Question_5 Question_7 Question_12 Question_41 Question_18 Question_45 Question_30 Question_38 Question_39 Question_27 Question_3 Question_6 Question_51 Question_55 Question_24 Question_43 Question_52 Question_36 Question_20 Question_57 Question_29 Question_46 Question_35 Question_50 Question_42 Question_40 Question_48 Question_15 Question_54 Question_56 Question_4 Question_49 Question_47 Question_44 Question_33 Question_11 Question_19 Question_16 Question_21  1  3 -.388 .206 -.172 -.285 .071 .351 -.384 -.383 -.256 -.165 -.115 -.230 -.107 .281 .324 -.077 .155 -.143 -.277 .120 -.024 .288 -.358 
-.045 .366 -.173 -.058 .211 .021 .205 .067 .053 .047 -.173 .256 .019 .015 -.434 -.223 -.146 -.235 .394 .250 .241 -.112

Factor analysis extracted three factors (see Table 10) that included a robust set of items that were based on the same construct and were easily interpreted. The rotated solution was interpreted by identifying the conceptual meanings of items grouped into the same factors and the conceptual differences between factors. To be retained for further analysis, each factor had to have at least three items with loadings greater than 0.10, and these items had to measure the same construct (Cattell, 1966; Hatcher, 1994; Stevens, 1996; Tabachnik & Fidell, 2001). Thus, the three factors accounted for 42.2% (see Table 10) of the total variance of scores. What this means is that 42.2% of the total variance in how the teacher candidates scored on the items that loaded on the initial factors could be explained in terms of the conceptual interpretation assigned to the first three factors. According to the percentage of variance, the first factor accounted for 25.9% of variance. This factor consisted of 47 items with loadings higher than 0.10. The second factor accounted for 10.5% of variance and consisted of seven items. The third factor accounted for 5.8% of variance and consisted of five items, all of which had loadings higher than 0.10 and were retained.
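The eigenvalue-based retention criteria (eigenvalue above 1.0, proportion of variance at least 5%) can be illustrated with a small sketch. This is not the promax-rotated SPSS analysis used in the study; it is only a numpy illustration of criteria 1 and 3 applied to a hypothetical subjects-by-items score matrix:

```python
import numpy as np

def retained_components(data, min_eigen=1.0, min_prop=0.05):
    """Apply two retention criteria to a (subjects x items) score matrix:
    keep components whose eigenvalue exceeds `min_eigen` (Kaiser criterion)
    AND which account for at least `min_prop` of the total variance."""
    corr = np.corrcoef(data, rowvar=False)           # item correlation matrix
    eigenvalues = np.linalg.eigvalsh(corr)[::-1]     # sorted descending
    proportions = eigenvalues / eigenvalues.sum()    # variance accounted for
    keep = (eigenvalues > min_eigen) & (proportions >= min_prop)
    return eigenvalues, proportions, int(keep.sum())
```

A scree-plot "elbow" check (criterion 2) and inspection of item loadings (criterion 4) would still be applied by the researcher; the sketch only automates the two numeric thresholds.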
78  Table 10 Round One: Extracted Factors and Percentage of Variance Accounted for by the Factors Round One: Extracted Factors and Percentage of Variance Accounted for by the Factors  Initial Eigenvalues  Component 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57  Total 14.784 5.970 3.301 2.682 2.174 1.911 1.658 1.507 1.410 1.322 1.273 1.161 1.095 1.060 1.021 .971 .871 .864 .817 .780 .762 .676 .609 .571 .551 .520 .505 .480 .452 .433 .427 .402 .326 .324 .301 .284 .281 .245 .237 .202 .185 .178 .168 .149 .145 .127 .119 .105 .099 .094 .082 .077 .075 .061 .047 .037 .033  % of Variance 25.937 10.473 5.791 4.706 3.814 3.352 2.910 2.643 2.473 2.320 2.234 2.037 1.920 1.860 1.791 1.703 1.529 1.516 1.433 1.369 1.336 1.186 1.069 1.002 .967 .912 .885 .842 .793 .759 .749 .705 .572 .568 .529 .498 .493 .431 .416 .354 .324 .312 .295 .261 .254 .223 .208 .185 .173 .166 .143 .136 .132 .106 .083 .065 .058  Cumulative % 25.937 36.410 42.202 46.908 50.721 54.073 56.982 59.626 62.099 64.419 66.653 68.690 70.610 72.470 74.261 75.964 77.493 79.009 80.442 81.811 83.147 84.333 85.402 86.404 87.371 88.282 89.168 90.009 90.802 91.562 92.311 93.015 93.587 94.155 94.684 95.183 95.675 96.106 96.522 96.876 97.200 97.512 97.806 98.067 98.321 98.544 98.753 98.937 99.110 99.276 99.419 99.555 99.687 99.794 99.877 99.942 100.000  Extraction Sums of Squared Loadings Total 14.784 5.970 3.301  % of Variance 25.937 10.473 5.791  Cumulative % 25.937 36.410 42.202  Rotation Sums of Squared Loadingsa Total 10.130 11.511 9.443  79  The items that loaded on the first factor as presented in Table 11 were associated with the awareness of the knowledge of the process of science and appear valid as the items appear related through a common theme of one’s knowledge of the process of science. 
Table 11 Round One (T1): Factor One(one): Pedagogical Content Knowledge Items Round One (T1): Factor One(one): Pedagogical Content Knowledge Items 1. I know science. 28. I can assess student learning in multiple ways. 14. I have various ways and strategies of developing my understanding of science. 10. I can use a scientific way of thinking. 37. I know how to select effective teaching strategies to guide student thinking and learning in science. 26. I know how to assess student performance in a classroom. 2. I have the scientific skills I need to use science. 5. I can learn science easily. 7. I have had sufficient opportunities to work with science. 12. I have various ways and strategies of developing my understanding of mathematics. 41. I know what sciences I can use for understanding and doing science. 18. I know about various examples of how science applies in the real world. 45. I am thinking critically about how to use science in my classroom. 30. I know how to organize and maintain classroom management. 38. I know how to select effective teaching strategies to guide student thinking and learning in social studies. 39. I know what sciences I can use for understanding and doing mathematics. 27. I am familiar with common student understandings and misconceptions. 3. I keep up with important new science discoveries. 6. I frequently play around with science. 51. I can teach lessons that appropriately combine language literacy, science and teaching strategies. 55. I can use strategies that combine content, the sciences and teaching approaches that I learned about in my coursework in my classroom. 24. I can use a wide range of teaching approaches in a classroom setting (collaborative learning, direct instruction, inquiry learning, problem/project based learning etc.). 43. I have the scientific skills I need to use science appropriately in teaching. 52. I can teach lessons that appropriately combine science, science and teaching strategies. 36. 
I know how to select effective teaching strategies to guide student thinking and learning in language literacy. 20. I have sufficient knowledge about mathematics. 57. I can choose the sciences that enhance the content for a lesson.  Table 11 (cont'd) 29. I can adapt my teaching based upon what students currently understand or do not understand. 46. I have the classroom management skills I need to use science appropriately in teaching. 35. I know how to select effective teaching strategies to guide student thinking and learning in mathematics. 50. I can teach lessons that appropriately combine mathematics, science and teaching strategies. 42. I know what sciences I can use for understanding and doing social studies. 40. I know what sciences I can use for understanding and doing language literacy. 48. I can choose the sciences that enhance the teaching approaches for a lesson. 15. I have various ways and strategies of developing my understanding of social studies. 54. I can select the sciences to use in my classroom that enhance what I teach, how I teach and what students learn. 56. I can provide leadership in helping others to coordinate the use of content, the sciences and teaching approaches at my school. 4. I know how to solve my own problems using scientific processes. 49. I can choose sciences that enhance students' learning for a lesson. 47. My teacher education programme has caused me to think more deeply about how science could influence the teaching approaches I use in my classroom. 44. I can adapt the science I am learning about to different teaching activities. 33. I know that different science concepts do not require different teaching strategies. 11. I can use a historical way of thinking. 19. I know about various examples of how social studies applies in the real world. 16. I know about various examples of how mathematics applies in the real world.
The items on the second factor, as presented in Table 12, were associated with awareness of self-competence.  Table 12 Round One (T1): Factor Two(one): Pedagogical Content Knowledge Items 48. I can choose the sciences that enhance the teaching approaches for a lesson. 15. I have various ways and strategies of developing my understanding of social studies. 54. I can select the sciences to use in my classroom that enhance what I teach, how I teach and what students learn. 56. I can provide leadership in helping others to coordinate the use of content, the sciences and teaching approaches at my school. 4. I know how to solve my own problems using scientific processes. 49. I can choose sciences that enhance students' learning for a lesson. 47. My teacher education programme has caused me to think more deeply about how science could influence the teaching approaches I use in my classroom. 44. I can adapt the science I am learning about to different teaching activities. 33. I know that different science concepts do not require different teaching strategies. 11. I can use a historical way of thinking. 19. I know about various examples of how social studies applies in the real world. 16. I know about various examples of how mathematics applies in the real world. This seems intuitively true, as the characteristics appear to be related through one's self-competence. The items on the third factor, as presented in Table 13, were associated with the awareness of teaching strategies.  Table 13 Round One (T1): Factor Three(one): Pedagogical Content Knowledge Items 32. I know that different language literacy concepts do not require different teaching strategies. 25. I can adapt my teaching style to different learners. 31. I know that different mathematical concepts do not require different teaching strategies. 34.
I know that different social studies concepts do not require different teaching strategies. 8. I can use a mathematical way of thinking.

The three factors or components of the nature of the teacher candidates' PCK seem to be reasonably well acknowledged in the literature, in terms of teacher candidates expressing a dislike or liking for their teachers based on how they were taught, as well as teacher attitudes. Other influences, both negative and positive, include not liking or liking science or aspects of science owing to a perceived incompetence or competence in skill and related science knowledge, and knowledge of how teaching ought to be conducted (Nashon, 2007; Norman & Spencer, 2005).

Descriptive Statistics for Round One (T1), Factor One(one), Factor Two(one), Factor Three(one): Rounds One (T1), Two (T2) and Three (T3)

The candidates' scores on items on the three factors were tracked and compared in the three rounds. Table 14 shows the descriptive statistics for Rounds One (T1), Two (T2) and Three (T3) for the items that loaded on Factors One(one), Two(one), and Three(one) in the Round One (T1) data.

Table 14 Descriptive Statistics for Round One (T1), Factor One(one), Factor Two(one), Factor Three(one): Rounds One (T1), Two (T2) and Three (T3)

Survey Rounds    Mean     Std. Deviation
Factor1 Round1   7.0822   1.12647
Factor1 Round2   8.1098   1.08204
Factor1 Round3   7.0887   .74292
Factor2 Round1   7.1906   1.06429
Factor2 Round2   8.0032   1.10644
Factor2 Round3   7.0926   .85511
Factor3 Round1   6.9421   1.57114
Factor3 Round2   7.3544   1.39024
Factor3 Round3   7.0404   1.14943

Statistical Analysis of Round One (T1), Factor One(one): Rounds One (T1), Two (T2) and Three (T3)

The means for items that loaded on Factor One(one) in the T1 factor analysis were compared to the means of the same items at T2 and T3 using a one-way repeated measures analysis of variance. Mauchly's test indicated that the assumption of sphericity had been violated, χ2(2) = 13.4, p = .001, as shown in Table 15, and the Greenhouse-Geisser estimate of sphericity (ε = .90) was used to correct the degrees of freedom, as shown in Table 16.

Table 15 Round One (T1): Factor/Component One(one): Mauchly's Test of Sphericity

Within Subjects Effect   Mauchly's W   Approx. Chi-Square   df   Sig.
Factor 1                 .887          13.370               2    .001

Table 16 Round One (T1): Factor/Component One(one): Tests of Within-Subjects Effects

Source                          Type III Sum of Squares   df        Mean Square   F
Factor 1  Sphericity Assumed    79.754                    2         39.877        35.153
          Greenhouse-Geisser    79.754                    1.798     44.364        35.153
          Lower-bound           79.754                    1.000     79.754        35.153
Error     Sphericity Assumed    256.373                   226       1.134
          Greenhouse-Geisser    256.373                   203.141   1.262
          Lower-bound           256.373                   113.000   2.269

As shown in Table 16, there was a significant effect, and the mean scores were statistically significantly different, F(1.798, 203.141) = 35.153, p < .0005. The partial eta squared shown in Table 17 is .237, indicating the magnitude of the effect at about 24%. In this case the effect is small according to Cohen (1992).
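The repeated-measures procedure used throughout this chapter (omnibus F test with a Greenhouse-Geisser correction, partial eta squared, and Bonferroni-adjusted pairwise comparisons) can be sketched as follows; this is an illustrative numpy/scipy reimplementation, not the SPSS routines actually used:

```python
import numpy as np
from scipy import stats

def rm_anova(scores):
    """One-way repeated-measures ANOVA for an (n subjects x k conditions)
    array; returns F, the Greenhouse-Geisser-corrected p-value, the
    epsilon estimate, and partial eta squared."""
    n, k = scores.shape
    grand = scores.mean()
    subj_means = scores.mean(axis=1)
    cond_means = scores.mean(axis=0)

    ss_cond = n * ((cond_means - grand) ** 2).sum()
    ss_subj = k * ((subj_means - grand) ** 2).sum()
    ss_error = ((scores - grand) ** 2).sum() - ss_cond - ss_subj

    df_cond, df_error = k - 1, (n - 1) * (k - 1)
    f_value = (ss_cond / df_cond) / (ss_error / df_error)

    # Greenhouse-Geisser epsilon from the double-centred covariance matrix.
    cov = np.cov(scores, rowvar=False)
    centred = cov - cov.mean(axis=0) - cov.mean(axis=1)[:, None] + cov.mean()
    eps = np.trace(centred) ** 2 / ((k - 1) * (centred ** 2).sum())

    p_value = stats.f.sf(f_value, df_cond * eps, df_error * eps)
    partial_eta_sq = ss_cond / (ss_cond + ss_error)
    return f_value, p_value, eps, partial_eta_sq

def bonferroni_pairwise(scores):
    """Paired t-tests for every pair of conditions, with each p-value
    multiplied by the number of comparisons (capped at 1.0)."""
    n, k = scores.shape
    m = k * (k - 1) // 2
    results = []
    for i in range(k):
        for j in range(i + 1, k):
            t, p = stats.ttest_rel(scores[:, i], scores[:, j])
            results.append((i, j, t, min(p * m, 1.0)))
    return results
```

With k = 3 rounds, epsilon ranges from 0.5 (maximal sphericity violation) to 1.0 (sphericity holds), which is why the corrected degrees of freedom reported in the tables fall between 1 and 2 for the survey effect.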
Table 17
Round One (T1): Factor/Component One(one): Partial Eta Squared

Source                          Sig.   Partial Eta Squared
Factor 1   Sphericity Assumed   .000   .237
           Greenhouse-Geisser   .000   .237
           Huynh-Feldt          .000   .237
           Lower-bound          .000   .237

The ANOVA results indicated there was an overall significant difference in the means. Table 18 provides the results of the Bonferroni post-hoc test used to determine where the differences were. There was a significant difference in survey responses on Factor One(one) between T1 and T2 (p = .000) and between T2 and T3 (p = .000), but no significant difference between T1 and T3 (p = 1.000).

Table 18
Round One (T1): Factor/Component One(one): Bonferroni post-hoc

(I) Survey Rounds   (J) Survey Rounds   Mean Difference (I-J)   Std. Error   Sig.
T1                  T2                  -1.028*                 .161         .000
T2                  T3                   1.028*                 .161         .000
T3                  T1                    .006                  .140         1.000

Finding Two

Based on this statistical analysis of Round One (T1), Factor One(one), across Rounds One (T1), Two (T2) and Three (T3), there was a significant change in the candidates’ levels of awareness of the knowledge of the process of science from the start of the programme through each survey round. However, when scores on the Factor One(one) items in Round Three were compared with scores on the same items in Round One, no significant change was indicated.

Statistical Analysis of Round One (T1), Factor Two(one): Rounds One (T1), Two (T2) and Three (T3)

The means for items that loaded on Factor Two(one) in the T1 factor analysis were compared to the means of the same items in T2 and T3 using a one-way repeated measures analysis of variance. Mauchly’s test indicated that the assumption of sphericity had not been violated (χ2(2) = 4.699, p = .095), as shown in Table 19.
Table 19
Round One (T1): Factor/Component Two(one): Mauchly’s Test of Sphericity

Within Subjects Effect   Mauchly’s W   Approx. Chi-Square   df   Sig.
Factor 2                 .959          4.699                2    .095

The ANOVA results indicated an overall significant difference in the means. Table 20 provides the results of the Bonferroni post-hoc test of differences. There was a significant difference in survey responses on Factor Two(one) between T1 and T2 (p = .000), between T2 and T3 (p = .000), and between T3 and T1 (p = .000).

Table 20
Round One (T1): Factor/Component Two(one): Bonferroni post-hoc

(I) Survey Rounds   (J) Survey Rounds   Mean Difference (I-J)   Std. Error   Sig.
T1                  T2                  -.813*                  .154         .000
T2                  T3                   .911*                  .129         .000
T3                  T1                  -.911*                  .129         .000

Finding Three

The results indicated a significant change in the teacher candidates’ levels of awareness of self-competence throughout the three rounds.

Statistical Analysis of Round One (T1), Factor Three(one): Rounds One (T1), Two (T2) and Three (T3)

The means for items that loaded on Factor Three(one) in the T1 factor analysis were compared to the means of the same items in T2 and T3 using a one-way repeated measures analysis of variance. Mauchly’s test indicated that the assumption of sphericity had been violated, as shown in Table 21, and the Greenhouse-Geisser estimate of sphericity was used to correct the degrees of freedom; for Factor Three(one), χ2(2) = 13.2, p = .001, ε = .98, as shown in Table 22.

Table 21
Round One (T1): Factor/Component Three(one): Mauchly’s Test of Sphericity

Within Subjects Effect   Mauchly’s W   Approx. Chi-Square   df   Sig.
Factor 3                 .889          13.163               2    .001

Table 22
Round One (T1): Factor/Component Three(one): Tests of Within-Subjects Effects

Source                            Type III Sum of Squares   df        Mean Square   F
Factor 3   Sphericity Assumed     10.573                    2         5.287         2.782
           Greenhouse-Geisser     10.573                    1.968     5.372         2.782
           Lower-bound            10.573                    1.000     10.573        2.782
Error      Sphericity Assumed     429.480                   226       1.900
           Greenhouse-Geisser     429.480                   222.419   1.931
           Lower-bound            429.480                   113.000   3.801

As shown in Table 22, the effect for Factor Three(one) did not reach statistical significance, F(1.968, 222.42) = 2.782, p > .05. The partial eta squared reported in Table 23 is .216.

Table 23
Round One (T1): Factor/Component Three(one): Partial Eta Squared

Source                          Sig.   Partial Eta Squared
Factor 3   Sphericity Assumed   .000   .216
           Greenhouse-Geisser   .000   .216
           Huynh-Feldt          .000   .216
           Lower-bound          .000   .216

Table 24 provides the results of the Bonferroni post-hoc test of differences. There was no significant difference in survey responses on Factor Three(one) between T1 and T2 (p = 1.000), between T2 and T3 (p = 1.000), or between T3 and T1 (p = 1.000).

Table 24
Round One (T1): Factor/Component Three(one): Bonferroni post-hoc

(I) Survey Rounds   (J) Survey Rounds   Mean Difference (I-J)   Std. Error   Sig.
T1                  T3                  -.985                   .169         1.000
T2                  T3                   .985                  .169         1.000
T3                  T1                  -.035                  .138         1.000

Finding Four

Throughout all three rounds there was no indication of a significant change in the teacher candidates’ levels of awareness of teaching strategies.

Factor Analysis of Round Two (T2)

Factor analysis of the Round Two (T2) data identified two interpretable factors of PCK, as shown in the scree plot in Figure 6.
Figure 6 Round Two (T2): Scree Plot  The items that loaded on each factor are shown in Table 25.  91  Table 25 Round Two(T2): Factor/Components Round Two(T2): Factor/Components Question # Question_55 Question_53 Question_7 Question_26 Question_50 Question_48 Question_44 Question_29 Question_45 Question_43 Question_57 Question_24 Question_2 Question_56 Question_49 Question_1 Question_51 Question_14 Question_41 Question_21 Question_9 Question_52 Question_8 Question_22 Question_12 Question_10 Question_36 Question_28 Question_20 Question_54 Question_40 Question_18 Question_46 Question_5 Question_13 Question_37 Question_39 Question_25 Question_17 Question_16 Question_27 Question_19 Question_30 Question_23 Question_4 Question_15 Question_6 Question_42 Question_11 Question_3 Question_47 Question_35 Question_31 Question_34 Question_32 Question_33 Question_38  Component 1  2 .692 .658 .636 .632 .618 .614 .610 .610 .606 .603 .598 .596 .596 .593 .592 .591 .586 .578 .573 .572 .568 .566 .558 .554 .550 .548 .544 .540 .540 .538 .537 .533 .530 .512 .510 .508 .504 .503 .493 .490 .484 .484 .483 .483 .466 .465 .454 .449 .447 .432 .416 .381 .157 .178 .287 .254 .384  .082 .062 .111 .054 -.079 .241 .143 -.002 -.221 .008 .040 -.188 .034 .090 .206 -.004 -.018 -.101 -.071 -.404 -.246 -.056 .084 -.341 -.189 -.239 .075 -.071 -.263 .076 .129 .109 -.156 -.139 -.209 .213 .007 .054 -.081 .018 .112 -.129 -.118 -.387 -.093 -.141 .141 .187 -.122 .271 .107 .186 .811 .758 .711 .684 .118  92  Factor analysis extracted two factors (see Table 26) that included a robust set of items that were based on the same construct and were easily interpreted. The rotated solution was interpreted by identifying the conceptual meanings of items grouped into the same factors and the conceptual differences between factors. 
To be retained for further analysis each factor should have had at least three items with loadings greater than 0.10 and these items should measure the same construct (Cattell, 1966; Hatcher, 1994; Stevens, 1996; Tabachnik & Fidell, 2001). Thus, the two factors accounted for 34.2% (see Table 26) of the total variance of scores. What this means is that 34.2% of the total variance of how the teacher candidates scored on the items that loaded on the initial factors could be explained in terms of the conceptual interpretation assigned to the first two factors. According to the percentage of variance, the first factor accounted for 27.9% of variance. This factor consisted of 52 items with loadings higher than 0.10. The second factor accounted for 6.3% of variance and consisted of five items.  93  Table 26 Round Two (T2): Extracted Factors and Percentage of Variance Accounted for by the Factors Round One: Extracted Factors and Percentage of Variance Accounted for by the Factors Initial Eigenvalues  Component 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57  Total 15.89 3.593 2.331 2.132 2.029 1.733 1.611 1.491 1.432 1.351 1.315 1.237 1.211 1.197 1.086 1.038 .974 .957 .943 .836 .818 .754 .717 .676 .661 .632 .572 .560 .533 .511 .474 .440 .409 .390 .377 .339 .325 .316 .300 .284 .276 .252 .240 .220 .198 .187 .174 .159 .138 .125 .112 .100 .094 .070 .067 .061 .053  % of Variance 27.878 6.303 4.089 3.741 3.559 3.040 2.826 2.615 2.512 2.370 2.308 2.169 2.125 2.101 1.906 1.820 1.708 1.679 1.654 1.467 1.435 1.324 1.257 1.185 1.159 1.109 1.004 .983 .935 .897 .832 .772 .718 .684 .661 .595 .570 .555 .527 .497 .484 .443 .422 .386 .347 .328 .305 .278 .242 .220 .196 .176 .164 .123 .118 .106 .093  Cumulative % 27.878 34.181 38.270 42.011 45.570 48.611 51.437 54.052 56.563 58.933 61.241 63.411 65.535 67.636 69.542 71.362 73.070 74.750 76.404 77.871 79.306 80.630 81.887 83.073 
84.232 85.341 86.345 87.328 88.263 89.160 89.991 90.763 91.481 92.165 92.826 93.420 93.991 94.546 95.073 95.570 96.054 96.497 96.919 97.305 97.652 97.980 98.285 98.563 98.805 99.025 99.220 99.396 99.560 99.683 99.801 99.907 100.000  Extraction Sums of Squared Loadings Total 15.89 3.593  % of Variance 27.878 6.303  Cumulative % 27.878 34.181  Rotation Sums of Squared Loadingsa Total 15.061 8.582  94  The first factor as presented in Table 27 was associated with an “awareness of the nature of science.” Table 27 Round Two(T2): Factor/Component One(two): Pedagogical Content Knowledge Item Round Two(T2): Factor/Component One(two): Pedagogical Content Knowledge Items 55. I can use strategies that combine content, the sciences and teaching approaches that I learned about in my coursework in my classroom. 53. I can teach lessons that appropriately combine social studies, science and teaching strategies. 7. I have had sufficient opportunities to work with science. 26. I know how to assess student performance in a classroom. 50. I can teach lessons that appropriately combine mathematics, science and teaching strategies. 48. I can choose the sciences that enhance the teaching approaches for a lesson. 44. I can adapt the science I am learning about to different teaching activities. 29. I can adopt my teaching based-upon what students currently understand or do not understand. 45. I am thinking critically about how to use science in my classroom. 43. I have the scientific skills I need to use science appropriately in teaching. 57. I can choose the sciences that enhance the content for a lesson. 24. I can use a wide range of teaching approaches in a classroom setting (collaborative learning, direct instruction, inquiry learning, problem/project based learning etc.). 2. I have the scientific skills I need to use science. 56. I can provide leadership in helping others to coordinate the use of content, the sciences and teaching approaches at my school. 49. 
I can choose sciences that enhance students' learning for a lesson. 1. I know science. 51. I can teach lessons that appropriately combine language literacy, science and teaching strategies. 14. I have various ways and strategies of developing my understanding of science. 41. I know what sciences I can use for understanding and doing science. 21. I have sufficient knowledge about language literacy. 9. I can use a literary way of thinking. 52. I can teach lessons that appropriately combine science, science and teaching strategies. 8. I can use a mathematical way of thinking. 22. I have sufficient knowledge about science. 12. I have various ways and strategies of developing my understanding of mathematics. 10. I can use a scientific way of thinking. 36. I know how to select effective teaching strategies to guide student thinking and learning in language literacy. 28. I can assess student learning in multiple ways. 20. I have sufficient knowledge about mathematics. 95  Table 27 (cont’d) 54. I can select the sciences to use in my classroom that enhance what I teach, how I teach and what students learn. 40. I know what sciences I can use for understanding and doing language literacy. 18. I know about various examples of how science applies in the real world. 46. I have the classroom management skills I need to use science appropriately in teaching. 5. I can learn science easily. 13. I have various ways and strategies of developing my understanding of language literacy. 37. I know how to select effective teaching strategies to guide student thinking and learning in science. 39. I know what sciences I can use for understanding and doing mathematics. 25. I can adopt my teaching style to different learners. 17. I know about various examples of how language literacy applies in the real world. 16. I know about various examples of how mathematics applies in the real world. 27. I am familiar with common student understandings and misconceptions. 19. 
I know about various examples of how social studies applies in the real world. 30. I know how to organize and maintain classroom management. 23. I have sufficient knowledge about social studies. 4. I know how to solve my own problems using scientific processes. 15. I have various ways and strategies of developing my understanding of social studies. 6. I frequently play around with science. 42. I know what sciences I can use for understanding and doing social studies. 11. I can use a historical way of thinking. 3. I keep up with important new science discoveries. 47. My teacher education programme has caused me to think more deeply about how science could influence the teaching approaches I use in my classroom. 35. I know how to select effective teaching strategies to guide student thinking and learning in mathematics.  The factor appears valid as the items appear to be related through a common theme of one’s awareness of the nature of science. The second factor as presented in Table 28 appears to be related to the awareness of pedagogy.  96  Table 28 Round Two(T2): Factor/Component Two(two): Pedagogical Content Knowledge Items Round Two(T2): Factor/Component Two(two): Pedagogical Content Knowledge Items 31. I know that different mathematical concepts do not require different teaching strategies. 34. I know that different social studies concepts do not require different teaching strategies. 32. I know that different language literacy concepts do not require different teaching strategies. 33. I know that different science concepts do not require different teaching strategies. 38. I know how to select effective teaching strategies to guide student thinking and learning in social studies. The factor appears valid as the items appear to be related to one’s understanding of pedagogy. The items under this factor points to an awareness of one’s awareness of understanding how to teach science.  
Descriptive Statistics for Round Two (T2), Factor One(two), Factor Two(two): Rounds One (T1), Two (T2) and Three (T3)

The candidates’ scores on items on the two factors were tracked and compared in the three rounds. Table 29 shows the descriptive statistics for Rounds T1, T2, and T3 for the items that loaded on Factor One(two) and Factor Two(two) in the Round Two (T2) data.

Table 29
Descriptive Statistics for Round Two (T2), Factor One(two), Factor Two(two): Rounds One (T1), Two (T2) and Three (T3)

Survey Rounds     Mean     Std. Deviation
Factor1 Round1    7.1861   1.07618
Factor1 Round2    8.1001   1.03923
Factor1 Round3    7.0754    .74562
Factor2 Round1    6.7000   1.46275
Factor2 Round2    7.7789   1.26669
Factor2 Round3    7.1123   1.04009

Statistical Analysis of Round Two (T2), Factor One(two): Rounds One (T1), Two (T2) and Three (T3)

The means for items that loaded on Factor One(two) in the T2 factor analysis were compared to the means of the same items in T1 and T3 using a one-way repeated measures analysis of variance. Mauchly’s test indicated that the assumption of sphericity was violated, as shown in Table 30, and the Greenhouse-Geisser estimate of sphericity was used to correct the degrees of freedom; for Factor One(two), χ2(2) = 11.367, p = .003, ε = .91, as shown in Table 31.

Table 30
Round Two (T2): Factor/Component One(two): Mauchly’s Test of Sphericity

Within Subjects Effect   Mauchly’s W   Approx. Chi-Square   df   Sig.
Factor 1                 .903          11.367               2    .003

Table 31
Round Two (T2): Factor/Component One(two): Tests of Within-Subjects Effects

Source                            Type III Sum of Squares   df        Mean Square   F
Factor 1   Sphericity Assumed     72.116                    2         36.058        33.779
           Greenhouse-Geisser     72.116                    1.824     39.538        33.779
           Lower-bound            72.116                    1.000     72.116        33.779
Error      Sphericity Assumed     241.253                   226       1.067
           Greenhouse-Geisser     241.253                   206.108   1.171
           Lower-bound            241.253                   113.000   2.135

There was a significant effect: the mean scores on Factor One(two) differed significantly across rounds, F(1.824, 206.11) = 33.779, p < .0005. The partial eta squared reported in Table 32 is .230; that is, the effect accounts for roughly 23% of the variance, a large effect by Cohen’s (1992) conventions.

Table 32
Round Two (T2): Factor/Component One(two): Partial Eta Squared

Source                          Sig.   Partial Eta Squared
Factor 1   Sphericity Assumed   .000   .230
           Greenhouse-Geisser   .000   .230
           Huynh-Feldt          .000   .230
           Lower-bound          .000   .230

The ANOVA results indicated an overall significant difference in the means. Table 33 provides the results of the Bonferroni post-hoc test of differences. There was a significant difference in survey responses on Factor One(two) between T1 and T2 (p = .000) and between T2 and T3 (p = .000), but no significant difference between T3 and T1 (p = 1.000).

Table 33
Round Two (T2): Factor/Component One(two): Bonferroni post-hoc

(I) Survey Rounds   (J) Survey Rounds   Mean Difference (I-J)   Std. Error   Sig.
T1                  T2                  -.914*                  .154         .000
T2                  T3                   1.025*                 .117         .000
T3                  T1                  -.111                   .136         1.000

Finding Five

As the students experienced the science teaching methods course and extended practicum, there was a significant change in their levels of awareness of the nature of science. There is no
There is no  100  significant change from their beginning mean score to their mean score after the practicum.  Statistical Analysis of Round Two (T2), Factor Two(two): Rounds Two (T2) and Three (T3) The mean for items that loaded on Factor One(one) in T1 factor analysis was compared to the means of the same items in T2 and T3 using a one way repeated measures analysis. Mauchly's test indicated that the assumption of sphericity had been violated as shown in Table 34 and the Greenhouse-Geisser estimate of sphericity was used to correct the degree of freedom; hence, Factor One(one) (χ2(2) = 9.363, p =.009),(ε=0.98) as shown in Table 35.  Table 34 Round Two(T2), Factor/Component Two(two): Mauchly's Test of Sphericity Round Two (T2), Factor/Component Two(two): Mauchly's Test of Sphericity Within Subjects Effect Factor 2  Approx. ChiMauchly's W Square .920  9.363  df  Sig.  2  .009  101  Table 35 Round Two(T2), Factor/Component Two(two): Tests of Within-Subjects Effects Round Two(T2), Factor/Component Two(two): Tests of Within-Subjects Effects Source Factor 2  Type III Sum of Squares Sphericity Assumed 67.585  df 2  Mean Square F 33.792 19.965  Greenhouse-Geisser  67.585  1.852  36.502 19.965  Lower-bound  67.585  1.000  67.585 19.965  Sphericity Assumed  382.522  226  1.693  Greenhouse-Geisser  382.522  209.221  1.828  Lower-bound  382.522  113.000  3.385  There was a significant effect, and the mean score is statistically significantly different, Factor Two(two) F(1.1852, 226) = 19.965, p < 0.0005. The Eta squared revealed in Table 36 is 15%, in the magnitude of the effect. In this case the effect is minimal according to Cohen (1992).  Table 36 Round Two(T2), Factor/Component Two(two): Partial Eta Squared Round Two(T2), Factor/Component Two(two): Partial Eta Squared Source Factor 2  Sphericity Assumed Greenhouse-Geisser Huynh-Feldt Lower-bound  Sig. 
.000 .000 .000 .000  Partial Eta Squared .150 .150 .150 .150  The ANOVA results indicated an overall significant difference in the means. Table 37 provides the results of the Bonferroni post-hoc test of differences. There was no  102  significant difference in survey responses between Factor Two(two) between time T1 and T2 (p = 1.000), T2 and T3 (p = 1.000), and between T1 and T3 (p = .061).  Table 37 Round Two(T2), Factor/Component Two(two): Bonferroni post-hoc Round Two(T2), Factor/Component Two(two): Bonferroni post-hoc (I) Survey Rounds T1  (J) Survey Rounds T3  Mean Difference (I-J) -1.079  Std. Error Sig. .191 1.000  T2  T3  .667  .148  1.000  T3  T1  .412  .175  .061  Finding Six There appears to be no significant change in levels of awareness of pedagogy when Round Two is compared to Round Three. However there appears to be a significant change between the scores in Round Three and Round One.  103  Factor Analysis of Round Three (T3) Factor Analysis of the initial data (at T3) identified three interpretable factors of PCK as shown in scree plot in Figure 7.  Figure 7 Round Three (T3): Scree Plot The items that loaded on each factor are shown in Table 38.  
104  Table 38 Round Three (T3): Factor/Components Round Three (T3): Factor/Components Question # Question_37 Question_36 Question_31 Question_44 Question_24 Question_50 Question_57 Question_39 Question_25 Question_30 Question_53 Question_4 Question_2 Question_21 Question_13 Question_11 Question_47 Question_42 Question_10 Question_20 Question_40 Question_34 Question_18 Question_54 Question_52 Question_27 Question_1 Question_6 Question_33 Question_17 Question_22 Question_48 Question_28 Question_5 Question_38 Question_35 Question_15 Question_32 Question_43 Question_46 Question_23 Question_26 Question_14 Question_51 Question_12 Question_7 Question_49 Question_3 Question_41 Question_8 Question_9 Question_45 Question_55 Question_16 Question_19 Question_56 Question_29  1 .590 .576 .541 .506 .506 .485 .484 .473 .469 .461 .427 .412 .403 .389 .386 .377 .375 .372 .366 .342 .335 .333 .332 .329 .326 .324 .314 .296 .284 .269 .256 .253 .246 .216 .173 .154 .285 .360 .336 .310 .340 .316 .252 .159 .103 .143 .166 .246 .317 .113 .248 .288 .245 .292 .123 .234 .143  Component 2 .009 .196 -.108 -.294 .188 -.161 .075 .259 .299 -.159 -.167 -.049 -.133 .063 .184 .215 -.151 -.114 .126 .079 .253 -.311 .007 .215 -.143 .156 -.099 .120 -.215 .151 .097 -.170 -.113 -.208 .040 -.470 .393 -.387 -.372 -.365 .355 -.339 .302 -.294 .272 .244 -.172 .188 .010 .162 .152 -.343 .181 .144 .156 .257 -.197  3 -.073 -.093 .224 -.236 .135 .316 -.379 -.261 .319 .160 -.105 -.226 .149 -.199 .037 -.118 .035 -.012 -.286 -.009 .107 .077 -.172 -.234 .198 .205 .262 .128 -.115 .097 .148 .131 .195 -.110 -.074 -.128 .133 .053 .058 .082 .017 .012 -.043 .123 -.200 .110 -.035 .523 -.437 -.387 -.355 -.352 -.338 .310 .281 .260 .256  105  Factor analysis extracted three factors (see Table 39) that included a robust set of items that were based on the same construct and were easily interpreted. 
The rotated solution was interpreted by identifying the conceptual meanings of items grouped into the same factors and the conceptual differences between factors. To be retained for the further analysis each factor should have had at least three items with loadings greater than 0.10 and these items should measure the same construct (Cattell, 1966; Hatcher, 1994; Stevens, 1996; Tabachnik & Fidell, 2001). Thus, the three factors accounted for 21.2% (see Table 39) of the total variance of scores. What this means is that 21.2% of the total variance of how the teacher candidates scored on the items that loaded on the initial factors could be explained in terms of the conceptual interpretation assigned to the first three factors. According to the percentage of variance, the first factor accounted for 25.9% of variance. This factor consisted of 41 items with loadings higher than 0.10. The second factor accounted for 5.9% of variance and consisted of 11 items. The third factor accounted for 4.5% of variance and consisted of five items, all of which had loadings higher than 0.10 and were retained.  
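The percentages of variance reported in Table 39 follow directly from the component eigenvalues: with 57 standardized items, the total variance is 57, so each percentage is simply the eigenvalue divided by 57. A small check (eigenvalues transcribed from the reported output; illustrative only):

```python
# Derive the percentage-of-variance figures in Table 39 from the first
# three eigenvalues of the Round Three (T3) factor analysis.

n_items = 57                            # total variance of the standardized items
eigenvalues = [6.715, 2.781, 2.588]     # first three components, from Table 39

# Percentage of variance = 100 * eigenvalue / number of items
pct_variance = [100 * ev / n_items for ev in eigenvalues]
cumulative = sum(pct_variance)          # cumulative % for the three retained factors

print([round(p, 2) for p in pct_variance], round(cumulative, 1))
```

This reproduces approximately 11.78%, 4.88%, and 4.54% for the three components, with a cumulative 21.2%, matching the values in Table 39 to rounding.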
106  Table 39 Round Three (T3): Extracted Factors and Percentage of Variance Accounted for by the Factors Round One: Extracted Factors and Percentage of Variance Accounted for by the Factors Initial Eigenvalues  Component 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 46 47 48 49 50 51 52 53 54 55 56 57  Total 6.715 2.781 2.588 2.281 2.166 2.083 1.971 1.838 1.815 1.777 1.602 1.547 1.505 1.426 1.383 1.320 1.307 1.256 1.167 1.133 1.029 1.006 .963 .940 .871 .840 .803 .756 .732 .674 .644 .621 .603 .579 .532 .498 .446 .441 .399 .391 .365 .358 .317 .295 .289 .261 .247 .213 .207 .193 .174 .147 .140 .115 .099 .086 .068  % of Variance 11.780 4.879 4.541 4.002 3.801 3.655 3.459 3.225 3.183 3.118 2.810 2.714 2.641 2.501 2.426 2.316 2.293 2.203 2.047 1.987 1.806 1.764 1.689 1.648 1.527 1.474 1.408 1.326 1.284 1.183 1.131 1.089 1.057 1.016 .933 .873 .783 .773 .699 .686 .641 .628 .555 .518 .507 .458 .434 .373 .363 .338 .306 .259 .245 .202 .173 .151 .119  Cumulative % 11.780 16.659 21.200 25.202 29.003 32.657 36.116 39.341 42.524 45.642 48.452 51.166 53.808 56.309 58.734 61.050 63.343 65.547 67.594 69.581 71.387 73.151 74.841 76.489 78.017 79.490 80.898 82.224 83.508 84.691 85.821 86.910 87.967 88.984 89.916 90.790 91.573 92.347 93.046 93.732 94.372 95.000 95.556 96.073 96.580 97.038 97.472 97.845 98.208 98.546 98.852 99.111 99.356 99.558 99.731 99.881 100.000  Extraction Sums of Squared Loadings Total 6.715 2.781 2.588  % of Variance 11.780 4.879 4.541  Cumulative % 11.780 16.659 21.200  Rotation Sums of Squared Loadingsa Total 5.099 4.743 4.581  107  The items that loaded on the first factor as presented in Table 40 were associated with “awareness of competence and confidence.”  Table 40 Round Three (T3): Factor One (three) Pedagogical Content Knowledge Items Round Three (T3): Factor One (three) Pedagogical Content Knowledge Items 37. 
I know how to select effective teaching strategies to guide student thinking and learning in science. 36. I know how to select effective teaching strategies to guide student thinking and learning in language literacy. 31. I know that different mathematical concepts do not require different teaching strategies. 44. I can adapt the science I am learning about to different teaching activities. 24. I can use a wide range of teaching approaches in a classroom setting (collaborative learning, direct instruction, inquiry learning, problem/project based learning etc.). 50. I can teach lessons that appropriately combine mathematics, science and teaching strategies. 57. I can choose the sciences that enhance the content for a lesson. 39. I know what sciences I can use for understanding and doing mathematics. 25. I can adopt my teaching style to different learners. 30. I know how to organize and maintain classroom management. 53. I can teach lessons that appropriately combine social studies, science and teaching strategies. 4. I know how to solve my own problems using scientific processes. 2. I have the scientific skills I need to use science. 21. I have sufficient knowledge about language literacy. 13. I have various ways and strategies of developing my understanding of language literacy. 11. I can use a historical way of thinking. 47. My teacher education programme has caused me to think more deeply about how science could influence the teaching approaches I use in my classroom. 42. I know what sciences I can use for understanding and doing social studies. 10. I can use a scientific way of thinking. 20. I have sufficient knowledge about mathematics. 40. I know what sciences I can use for understanding and doing language literacy. 34. I know that different social studies concepts do not require different teaching strategies. 18. I know about various examples of how science applies in the real world. 54. 
I can select the sciences to use in my classroom that enhance what I teach, how I teach and what students learn. 52. I can teach lessons that appropriately combine science, science and teaching strategies.  108  Table 40 (con’t) 27. I am familiar with common student understandings and misconceptions. 1. I know science. 6. I frequently play around with science. 33. I know that different science concepts do not require different teaching strategies. 17. I know about various examples of how language literacy applies in the real world. 22. I have sufficient knowledge about science. 48. I can choose the sciences that enhance the teaching approaches for a lesson. 28. I can assess student learning in multiple ways. 5. I can learn science easily. 38. I know how to select effective teaching strategies to guide student thinking and learning in social studies. This factor appears valid as the items appear related through a common experience of passing the practicum. The items on the second factor as presented in Table 41 were associated with the awareness of skills and knowledge needed. This seems to be intuitively true as the characteristics appear to be related to one’s experience teaching in the classroom. The items on the third factor as presented in Table 42 were associated with the awareness of the relevance of science teaching strategies. It seems that the teacher candidates are aware of the relevance of teaching science in the classroom.  109  Table 41 Round Three (T3): Factor Two (three): Pedagogical Content Knowledge Items Round Three (T3): Factor Two (three): Pedagogical Content Knowledge Items 35. I know how to select effective teaching strategies to guide student thinking and learning in mathematics. 15. I have various ways and strategies of developing my understanding of social studies. 32. I know that different language literacy concepts do not require different teaching strategies. 43. 
I have the scientific skills I need to use science appropriately in teaching. 46. I have the classroom management skills I need to use science appropriately in teaching. 23. I have sufficient knowledge about social studies. 26. I know how to assess student performance in a classroom. 14. I have various ways and strategies of developing my understanding of science. 51. I can teach lessons that appropriately combine language literacy, science and teaching strategies. 12. I have various ways and strategies of developing my understanding of mathematics. 7. I have had sufficient opportunities to work with science. 49. I can choose sciences that enhance students' learning for a lesson.  Table 42 Round Three (T3): Factor Three(three): Pedagogical Content Knowledge Items Round Three (T3): Factor Three(three): Pedagogical Content Knowledge Items 3. I keep up with important new science discoveries. 41. I know what sciences I can use for understanding and doing science. 8. I can use a mathematical way of thinking. 9. I can use a literary way of thinking. 45. I am thinking critically about how to use science in my classroom. 55. I can use strategies that combine content, the sciences and teaching approaches that I learned about in my coursework in my classroom. 16. I know about various examples of how mathematics applies in the real world. 19. I know about various examples of how social studies applies in the real world. 56. I can provide leadership in helping others to coordinate the use of content, the sciences and teaching approaches at my school. 29. I can adopt my teaching based-upon what students currently understand or do not understand.  110  Descriptive Statistics for Round Three (T3), Factor One(three), Factor Two(three), Factor Three(three): Rounds One (T1), Two (T2) and Three (T3) The candidates’ scores on items on the three factors were tracked and compared in the three rounds. 
Table 43 shows the descriptive statistics for Rounds T1, T2, and T3 for the items that loaded on Factor One(three), Factor Two(three), and Factor Three(three) in the Round Three (T3) data.

Table 43
Descriptive Statistics for Round Three (T3), Factor One(three), Factor Two(three), Factor Three(three): Rounds One (T1), Two (T2) and Three (T3)

Survey Rounds     Mean      Std. Deviation
Factor1 Round1    7.1282    1.07615
Factor1 Round2    8.0590    1.03133
Factor1 Round3    7.0818     .83651
Factor2 Round1    7.0851    1.21718
Factor2 Round2    8.0603    1.11566
Factor2 Round3    7.0623     .77828
Factor3 Round1    7.1755    1.16986
Factor3 Round2    8.1610    1.22910
Factor3 Round3    7.1401     .75159

Statistical Analysis of Round Three (T3), Factor One(three), Two(three) and Three(three)

A retrospective analysis was conducted: the three factors identified in Round Three (T3) were examined and compared against the corresponding item scores in Rounds Two (T2) and One (T1).

Statistical Analysis of Round Three (T3), Factor One(three)

The means for items that loaded on Factor One(three) in the T3 factor analysis were compared to the means of the same items in T1 and T2 using a one-way repeated measures analysis of variance. Mauchly's test indicated that the assumption of sphericity had been violated, χ²(2) = 8.698, p = .013, as shown in Table 44; therefore, the Greenhouse-Geisser estimate of sphericity (ε = .93) was used to correct the degrees of freedom, as shown in Table 45.

Table 44
Round Three (T3): Factor/Component One(three): Mauchly's Test of Sphericity

Within Subjects Effect   Mauchly's W   Approx. Chi-Square   df   Sig.
Factor 1                 .925          8.698                2    .013

Table 45
Round Three (T3): Factor/Component One(three): Tests of Within-Subjects Effects

Source                          Type III Sum of Squares   df        Mean Square   F
Factor 1  Sphericity Assumed     69.296                   2          34.648       30.897
          Greenhouse-Geisser     69.296                   1.861      37.237       30.897
          Lower-bound            69.296                   1.000      69.296       30.897
Error     Sphericity Assumed    253.435                   226         1.121
          Greenhouse-Geisser    253.435                   210.288     1.205
          Lower-bound           253.435                   113.000     2.243

There was a significant effect: the mean scores differed significantly across rounds for Factor One(three), F(1.861, 210.29) = 30.897, p < .0005. The partial eta squared reported in Table 46 is .215; the effect accounts for approximately 22% of the variance. In this case the effect is minimal according to Cohen (1992).

Table 46
Round Three (T3): Factor/Component One(three): Partial Eta Squared

Source: Factor 1       Sig.   Partial Eta Squared
Sphericity Assumed     .000   .215
Greenhouse-Geisser     .000   .215
Huynh-Feldt            .000   .215
Lower-bound            .000   .215

The ANOVA results indicated an overall significant difference in the means. Table 47 provides the results of the Bonferroni post-hoc test. There were significant differences in survey responses on Factor One(three) between T1 and T2 (p = .000) and between T2 and T3 (p = .000), but no significant difference between T1 and T3 (p = 1.000).

Table 47
Round Three (T3), Factor/Component One(three): Bonferroni post-hoc

(I) Survey Rounds   (J) Survey Rounds   Mean Difference (I-J)   Std. Error   Sig.
T1                  T2                  -.931*                  .155         .000
T2                  T3                   .977*                  .121         .000
T3                  T1                  -.046                   .143         1.000

Finding Seven
There appears to be a significant change in the teacher candidates' levels of awareness of competence and confidence in pedagogy when Round Three is compared to Round Two, and when Round One is compared to Round Two.
However, there appears to be no significant change between the scores in Round Three and Round One.

Statistical Analysis of Round Three (T3), Factor Two(three)

The means for items that loaded on Factor Two(three) in the T3 factor analysis were compared to the means of the same items in T1 and T2 using a one-way repeated measures analysis of variance. Mauchly's test indicated that the assumption of sphericity had been violated, χ²(2) = 12.080, p = .002, as shown in Table 48; therefore, the Greenhouse-Geisser estimate of sphericity (ε = .91) was used to correct the degrees of freedom, as shown in Table 49.

Table 48
Round Three (T3): Factor/Component Two(three): Mauchly's Test of Sphericity

Within Subjects Effect   Mauchly's W   Approx. Chi-Square   df   Sig.
Factor 2                 .898          12.080               2    .002

Table 49
Round Three (T3): Factor/Component Two(three): Tests of Within-Subjects Effects

Source                          Type III Sum of Squares   df        Mean Square   F
Factor 2  Sphericity Assumed     74.003                   2          37.002       30.150
          Greenhouse-Geisser     74.003                   1.814      40.785       30.150
          Lower-bound            74.003                   1.000      74.003       30.150
Error     Sphericity Assumed    277.355                   226         1.227
          Greenhouse-Geisser    277.355                   205.037     1.353
          Lower-bound           277.355                   113.000     2.454

There was a significant effect: the mean scores differed significantly across rounds for Factor Two(three), F(1.814, 205.04) = 30.150, p < .0005. The partial eta squared reported in Table 50 is .211; the effect accounts for approximately 21% of the variance. In this case the effect is minimal according to Cohen (1992).

Table 50
Round Three (T3): Factor/Component Two(three): Partial Eta Squared

Source: Factor 2       Sig.   Partial Eta Squared
Sphericity Assumed     .000   .211
Greenhouse-Geisser     .000   .211
Huynh-Feldt            .000   .211
Lower-bound            .000   .211

The ANOVA results indicated an overall significant difference in the means. Table 51 provides the results of the Bonferroni post-hoc test. There were significant differences in survey responses on Factor Two(three) between T1 and T2 (p = .000) and between T2 and T3 (p = .000), but no significant difference between T1 and T3 (p = 1.000).

Table 51
Round Three (T3), Factor/Component Two(three): Bonferroni post-hoc

(I) Survey Rounds   (J) Survey Rounds   Mean Difference (I-J)   Std. Error   Sig.
T1                  T2                  -.975*                  .167         .000
T2                  T3                   .998*                  .126         .000
T3                  T1                  -.023                   .144         1.000

Finding Eight
There appears to be a significant change in levels of awareness of skills and knowledge needed when Round Three is compared to Round Two, and when Round One is compared to Round Two. However, there appears to be no significant change between the scores in Round Three and Round One.

Statistical Analysis of Round Three (T3), Factor Three(three)

The means for items that loaded on Factor Three(three) in the T3 factor analysis were compared to the means of the same items in T1 and T2 using a one-way repeated measures analysis of variance. Mauchly's test indicated that the assumption of sphericity had been violated, χ²(2) = 13.163, p = .001, as shown in Table 52; therefore, the Greenhouse-Geisser estimate of sphericity (ε = .90) was used to correct the degrees of freedom, as shown in Table 53.

Table 52
Round Three (T3), Factor/Component Three(three): Mauchly's Test of Sphericity

Within Subjects Effect   Mauchly's W   Approx. Chi-Square   df   Sig.
Factor 3                 .889          13.163               2    .001

Table 53
Round Three (T3), Factor/Component Three(three): Tests of Within-Subjects Effects

Source                          Type III Sum of Squares   df        Mean Square   F
Factor 3  Sphericity Assumed     76.552                   2          38.276       31.171
          Greenhouse-Geisser     76.552                   1.800      42.520       31.171
          Lower-bound            76.552                   1.000      76.552       31.171
Error     Sphericity Assumed    277.512                   226         1.228
          Greenhouse-Geisser    277.512                   203.442     1.364
          Lower-bound           277.512                   113.000     2.456

There was a significant effect: the mean scores differed significantly across rounds for Factor Three(three), F(1.800, 203.44) = 31.171, p < .0005. The partial eta squared reported in Table 54 is .216; the effect accounts for approximately 22% of the variance. In this case the effect is minimal according to Cohen (1992).

Table 54
Round Three (T3), Factor/Component Three(three): Partial Eta Squared

Source: Factor 3       Sig.   Partial Eta Squared
Sphericity Assumed     .000   .216
Greenhouse-Geisser     .000   .216
Huynh-Feldt            .000   .216
Lower-bound            .000   .216

The ANOVA results indicated an overall significant difference in the means. Table 55 provides the results of the Bonferroni post-hoc test. There were significant differences in survey responses on Factor Three(three) between T1 and T2 (p = .000) and between T2 and T3 (p = .000), but no significant difference between T1 and T3 (p = 1.000).

Table 55
Round Three (T3), Factor/Component Three(three): Bonferroni post-hoc

(I) Survey Rounds   (J) Survey Rounds   Mean Difference (I-J)   Std. Error   Sig.
T1                  T2                  -.985*                  .169         .000
T2                  T3                  1.021*                  .130         .000
T3                  T1                  -.035                   .138         1.000

Finding Nine
There appears to be a significant change in levels of awareness of the relevance of science teaching strategies when Round Three is compared to Round Two, and when Round One is compared to Round Two.
However, there appears to be no significant change between the scores in Round Three and Round One.

Summary

The students' overall PCK showed significant change from Round One to Round Two (p=.000) and from Round Two to Round Three (p=.000), but no significant change from Round One to Round Three (p=1.000). Confirmatory factor analysis produced three factors for Round One (Awareness of the knowledge of the process of science; Awareness of self-competence; and Awareness of teaching strategies), two factors for Round Two (Awareness of the nature of science; and Awareness of pedagogy), and three factors for Round Three (Awareness of competence and confidence; Awareness of skills and knowledge needed; and Awareness of the relevance of science teaching strategies). The factors and how each survey question loaded are depicted in Table 56.

Table 56
Questions Loaded per Factor

Round             Factors/Components                                          Question #
Round One (T1)    Awareness of the knowledge of the process of science        1, 2, 3, 4, 5, 6, 7, 10, 11, 12, 14, 15, 16, 18, 19, 20, 24, 26, 27, 28, 29, 30, 33, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 54, 55, 56, 57
                  Awareness of self-competence                                9, 13, 17, 21, 22, 23, 53
                  Awareness of teaching strategies                            8, 25, 31, 32, 34
Round Two (T2)    Awareness of the nature of science                          1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 35, 36, 37, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57
                  Awareness of pedagogy                                       31, 32, 33, 34, 38
Round Three (T3)  Awareness of competence and confidence                      1, 2, 4, 5, 6, 10, 11, 13, 17, 18, 20, 21, 22, 24, 25, 27, 28, 30, 31, 34, 36, 37, 39, 42, 44, 47, 48, 50, 52, 52, 53, 54, 57
                  Awareness of skills and knowledge needed                    12, 14, 15, 23, 26, 32, 35, 43, 46, 49, 51
                  Awareness of the relevance of science teaching strategies   3, 8, 9, 41, 45

Round One (T1)

Level of
Awareness of the knowledge of the process of science (Factor One(one)) was found to be significantly different from the level of the same factor in Round Two (p=.000), and the same observation was made when the level of this factor in Round Two was compared to the level of the same factor in Round Three (p=.000). However, no significant difference was found between the mean score on Factor One(one) items in Round One and the mean score on the same items in Round Three (p=1.000).
The level of Factor Two(one), Awareness of self-competence, was found to be significantly different when Round One was compared to Round Two (p=.000), when Round Two was compared to Round Three (p=.000), and when Round One was compared to Round Three (p=.000).
The level of Factor Three(one), Awareness of teaching strategies, was not found to be significantly different when Round One was compared to Round Two (p=1.000), when Round Two was compared to Round Three (p=1.000), or when Round One was compared to Round Three (p=.061). Table 57 depicts the changes in significance as each factor is compared across the rounds.

Table 57
Round One (T1) Factors/Components Significant Changes in Difference Across Rounds

Factor     T1-T2       T2-T3       T1-T3
Factor 1   Change      Change      No Change
Factor 2   Change      Change      Change
Factor 3   No Change   No Change   No Change

Round Two (T2)

Level of Awareness of the nature of science (Factor One(two)) showed no significant difference when Round Three was compared to Round One (p=1.000). However, a significant difference in mean score was found when Round Two was compared to Round One (p=.000) and when Round Two was compared to Round Three (p=.000).
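The effect sizes and adjusted p-values reported in these round-by-round comparisons can be reproduced directly from the ANOVA output. A minimal sketch (Python), using the sums of squares reported in Table 45 for Factor One(three) and a generic Bonferroni adjustment; the raw p-value passed to the adjustment is a hypothetical placeholder, since SPSS reports only the adjusted values:

```python
# Partial eta squared from the sums of squares reported in Table 45:
# SS_effect / (SS_effect + SS_error).
ss_effect = 69.296
ss_error = 253.435
partial_eta_sq = ss_effect / (ss_effect + ss_error)  # ~.215, as in Table 46

# Bonferroni adjustment for k = 3 pairwise round comparisons: each raw
# p-value is multiplied by k and capped at 1.0, which is why the
# T1-T3 comparisons are reported as p = 1.000.
def bonferroni(p_raw, k=3):
    return min(1.0, p_raw * k)
```

The same two lines of arithmetic reproduce the .211 and .216 effect sizes reported in Tables 50 and 54 from the sums of squares in Tables 49 and 53.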
Level of Awareness of pedagogy (Factor Two(two)) was found to have no significant differences in mean scores when Round Two was compared to Round Three (p=1.000), when Round Two was compared to Round One (p=1.000), and when Round Three was compared to Round One (p=.061). Table 58 depicts the changes in significance as each factor is compared across the rounds.

Table 58
Round Two (T2) Factors/Components Significant Changes in Difference Across Rounds

Factor     T1-T2       T2-T3       T1-T3
Factor 1   Change      Change      No Change
Factor 2   No Change   No Change   No Change

Round Three (T3)

Factor One(three), level of Awareness of competence and confidence, was found to have significant differences when Round One was compared to Round Two (p=.000) and when Round Three was compared to Round Two (p=.000). No significant difference in score occurred when Round Three was compared to Round One (p=1.000).
Factor Two(three), level of Awareness of skills and knowledge needed, was found to have significant differences when Round One was compared to Round Two (p=.000) and when Round Three was compared to Round Two (p=.000). No significant difference in mean scores occurred when Round Three was compared to Round One (p=1.000).
Factor Three(three), level of Awareness of the relevance of science teaching strategies, showed mean scores that were significantly different when Round One was compared to Round Two (p=.000) and when Round Three was compared to Round Two (p=.000). No significant difference in mean score occurred when Round Three was compared to Round One (p=1.000). Table 59 depicts the changes in significance as each factor is compared across the rounds.
Table 59
Round Three (T3) Factors/Components Significant Changes in Difference Across Rounds

Factor     T1-T2    T2-T3    T1-T3
Factor 1   Change   Change   No Change
Factor 2   Change   Change   No Change
Factor 3   Change   Change   No Change

Chapter 5 will examine, through interviews, the students' PCK understandings as they participated in the science teaching methods course and extended practicum during the 12-month Teacher Education Programme.

CHAPTER FIVE: INTERVIEW DATA ANALYSIS, RESULTS AND DISCUSSION

This chapter begins with a presentation of the results of the analysis of the interview data sets. I then explore the correspondence between the factors determined from the quantitative data and the emergent themes from the phenomenographic analysis of the qualitative data.

Interview Data Analysis

These data were elicited through two one-on-one, semi-structured, 45-minute interviews. The interviews were conducted with selected teacher candidates 14 days after the de-activation of Round Two, which followed their methods course, and 13 days after the de-activation of Round Three, which followed the extended practicum. All teacher candidates in the Elementary Teacher Education Programme were invited to participate in the interviews at the beginning of the year, and 14 teacher candidates were randomly selected from those who volunteered. The interviews were recorded and transcribed. A sample of interview questions is included in Appendix C.
These interviews were aimed at probing the teacher candidates' knowledge of Pedagogical Content Knowledge (PCK). The initial interview, which followed the completion of the questionnaire survey of their individual levels of awareness and knowledge of PCK, was aimed at further clarification and interpretation of the components of PCK that were generated through factor analysis of the initial questionnaire data.
The initial interview also provided a deeper understanding of the candidates' baseline awareness and knowledge of PCK. Similarly, the second interviews were aimed at further assessment and clarification of the effect of the interventions (the science methods course and practicum) on the candidates' baseline PCK.
In addition to ensuring triangulation for the questionnaire data analyses, the interviews were conducted in ways that enabled the candidates to fully articulate their experience and allowed for the opportunity to deeply probe emergent insights. Thus, the analysis of the interview data corpus involved searching for themes that related to PCK. Attention was also paid to what the teacher candidates articulated about the effect of the interventions on their awareness of PCK with regard to science, both prior to joining the teacher education programme and after experiencing the science methods course and extended practicum.
The nature of the data collection and analysis described above was well suited to aspects of phenomenographic methods (Marton & Booth, 1997; Pang, 2003; Pang & Marton, 2003). This involved using interview questions that probed and elicited data about the teacher candidates' level and extent of awareness and knowledge of science PCK. The analysis of these data resulted in categories of PCK that were discerned and described in the form of emergent themes. Prior to the discernment and description of the categories, thick description in the form of a transcript (Miles & Huberman, 1994) was generated for each interview data set. The transcripts were sifted through to identify, describe, and interpret the different levels and types of awareness and knowledge of PCK, as well as any effects of the interventions on the teacher candidates' levels of awareness of PCK with regard to science that they expressed or conveyed.
This was necessary since the interviews were conversational in nature, and the conversational data had to be reconstructed in order to organize the teacher candidates' views.
Thus, the interview data sets were carefully examined and categorized using a thematic approach (Merriam, 1998; Miles & Huberman, 1994) to address the study objectives (Miles & Huberman, 1994; Yin, 2003). The process involved several episodes of listening to interview clips and reviewing interview transcripts. Comparing and contrasting candidate statements resulted in the themes that emerged across the following two interview data sets: (1) the interview after completing the science methods course; and (2) the interview after the practicum. The rationale for this approach was to elucidate the correspondence between themes that were evident and manifested in the questionnaire data and those that emerged from the interview data analysis.
The analysis of the interview data sets resulted in four key findings or themes that corresponded (though not in a direct one-to-one manner) to the PCK components determined through factor analyses of the questionnaire data (see Chapter 4). The themes, which are supported by select verbatim quotes and excerpts from the interview transcripts, represent the voices of participants, who are identified by pseudonyms. The four key themes are:
1) Expression of pre-programme awareness of pedagogy.
2) Expression of pre-programme awareness of their content knowledge as regards science.
3) Expression of a richer awareness of PCK following the science teaching methods course.
4) Expression, for some candidates, that a successful practicum validated or invalidated pre-practicum awareness of PCK.
Further analysis revealed that teacher candidates' perceptions of PCK are sometimes expressed or conveyed in their views on what it means to teach and how they have experienced a teaching situation.
Theme One
Expression of Pre-Programme Awareness of Pedagogy

The teacher candidates who participated in the study and volunteered to be interviewed appeared to express a strong awareness and knowledge of teaching. This understanding was prevalent among the teacher candidates whom I interviewed immediately after they completed the questionnaire but prior to their experiencing the methods course, as conveyed in the illustrative interview excerpts below. The responses were prompted by open-ended interview questions that included: What is it about your knowledge of teaching that might have earned you admission into the programme?
Most of the teacher candidates interviewed expressed a belief that they possessed awareness and knowledge of pedagogy through their prior experiences as science students at both the elementary and secondary levels and during volunteer experiences in classrooms or other educational settings. Barb illustrates the value of volunteer experience when she comments:
Yes, look at the admission form; they expect you to have already done some form of teaching like something along the lines of volunteering in a classroom, tutoring, teaching at summer camp. I volunteered both in classrooms as well as kids' camps. These experiences prepared me for teaching for sure. I learned how to manage kids and keep them interested and busy.
Others perceived admission into the programme as recognition of their possession of awareness and knowledge of teaching. This perception may be due in part to one of the admission requirements: classroom experience. Chris's interview response reveals his belief that admission into the programme confirmed that he could teach:
There are all these requirements to be eligible for admission. Someone or a group of some ones look at your application, and based on it, you are either admitted or rejected. They must think you have some teaching ability in order to be accepted.
Chris then explains that his previous classroom experience enabled him to gain entrance into the programme: "I am already an educator, I knew I would be admitted because I had worked in a classroom, and if not, I would just go back to my old job."
Anna focuses on the essay section of the application process, where prospective teacher candidates write a statement detailing why they want to be a teacher:
You have to write an essay on why you want to become a teacher, and I am thinking of one of my teachers and her teaching. She wanted us to learn and become scientists. Whoever read my essay believed that my experience as a student inspired me to become a teacher and my experiences teaching prepared me for the classroom.
Gerri was surprised that he was accepted into the programme and credits the application process as validating his previous teaching experience:
I saw the ad in the paper and thought about applying. I was kind of surprised when I was offered a place. I mean, I didn't think I had enough teaching experience. I guess when they looked at my application they felt I could do it!
Edward expresses his understanding that the application reviewer believed he was capable of teaching: "I thought it was great to get admitted, someone thought I could teach."
The interview responses by the teacher candidates presented above reveal their perspectives on how the application process, including those who review the applications, validates and supports the view that their previous teaching experiences prepared them for the Bachelor of Education programme. The teacher candidates' revelations could be perceived as comparing the selection process to what Lewin (1946) described as "gatekeeping" and validation.
Kurt Lewin (1890-1947), a Prussian-born scholar, was an experimental social psychologist whose focus was field theory and group dynamics. He studied how people's behaviour was influenced by their connection to a group of people (Roberts, 2006).
Lewin's gatekeeping theory evolved from a study that he conducted on why people "eat what they eat" (Lewin, 1951, p. 174). He explained that food reaches the family table through various channels, such as grocery stores or family gardens. Lewin concluded that housewives were the key "gatekeepers" who controlled what food entered the "channels" that ultimately ended up on the dining room table. Lewin believed that his theoretical framework could be applied generally:
This situation holds not only for food channels, but also for the traveling of a news item through certain communication channels in a group, for movement of goods, and the social locomotion of individuals in many organizations. A university, for instance, might be quite strict in its admission policy and might set up strong forces against the passing of weak candidates. (Lewin, 1951, p. 187)
In Lewin's "gatekeeper" metaphor, an individual or group is "in power" (p. 145) to make the decision about who is "in" or "out". Given that the role of the gatekeeper is to determine admission, Chris's, Anna's, Gerri's, and Edward's interview responses indicate that they met the criteria, as they were admitted into the programme. These teacher candidates explained that they had prior teaching awareness and knowledge, but they did not view their previous experiences as valid until they were granted admission into the teacher education programme by an outside source.
Analogous to Lewin's gatekeeper metaphor is the selection team that decides who is or is not admitted into the Teacher Education Programme. The requirement that applicants have classroom-based experience supports the participating teacher candidates' view that they have some awareness and knowledge of pedagogy prior to joining the programme. Indicative of this are expressions such as the following from Edward, who asserts: "Students have been taught for so many years all they know is teaching.
They feel like they know it; they have all the knowledge to become a teacher." I found Edward's words intriguing, as he notes that students are already aware of and knowledgeable about teaching, as well as what it means to be a teacher, given the time they spend in the classroom. By implication, it is clear that the teacher candidates, who were all at one time elementary, secondary, and post-secondary students, developed some appreciation of PCK.
Calderhead (1988) notes that student teachers begin their preservice training with quite specific images in mind of the ideal teacher they would like to be, based on recollections of their own past teachers. Students have been taught for many years by both teachers and parents. They are engaging in what Lortie (1975) refers to as an "apprenticeship of observation." Most students feel they learned effectively from these instructors and identify a variety of features they attribute to this success, such as the traditional "teaching as telling" model. Anderson (1995) describes this as a tendency to overgeneralize on the basis of personal experience. Students offer the following examples of their beliefs: "I learned this way, so this must be the best way to learn", and "my teachers taught this way, and I learned, therefore it must be the best way to teach" (p. 151). Based on the data collected in this study, I suggest that teacher candidates presume a causal relationship between their past teachers' teaching methods and their own learning.
One of the most powerful influences on the knowledge base of pre-service teachers is the apprenticeship of observation (Lortie, 1975). Throughout their time as students, they have observed the actions of teachers and reached conclusions about what it means to be a teacher, how to go about teaching, and "the widespread idea that 'anyone can teach'" (p. 62). Having this experience validated by the admission process reinforces their views on teaching and their own learning.
Theme Two
Expression of Pre-Programme Awareness of Their Content Knowledge as Regards Science

The teacher candidates who participated in the study appeared to express a strong awareness and knowledge of science content. This understanding was prevalent among the candidates I interviewed, as conveyed in the illustrative interview excerpts below. The responses were prompted by open-ended interview questions that included: What is it about you and science that might have earned you admission into the programme?
The teacher candidates acquired this knowledge through prior experiences as science students at the elementary, secondary, and post-secondary levels and during volunteer experiences in classrooms and other educational settings. Teacher candidates who exhibited an awareness of their science content knowledge can be divided into two categories: those who believed they had a high level of science content knowledge, and those who believed they did not. In both cases, the teacher candidates were aware of what they knew and did not know.
Helen's interview response indicates that she possessed a high level of science content knowledge. She cites the important role that her father played in her developing this knowledge: "My dad is a science teacher. He is always telling me about science. I always thought I knew more about science than my school teachers in elementary." Barb also expressed that she had a high level of science content knowledge. She explains that she knew she had sufficient science content knowledge when she was required to meet the admission requirements:
I found out that I was missing a lab for admissions. I looked up the lab courses at UBC. I asked around and chose the easiest one, it's not like I am going to need what we did in the lab to teach elementary school.
The teacher candidates who believed they had a low level of science content knowledge were strongly aware of their weaknesses in this area. Dawn stated that she knew "nothing about science!" before entering the programme. Kevin explained his confusion about being admitted without having a high level of science content knowledge: "When I got into the programme it was expected that I knew all this science. I'm not sure where this was supposed to come from as it did not come from grade school." Anna indicates that her prior educational experiences did not include developing science content knowledge: "I can't remember a single thing from science in K-12. To be honest I don't think I even did science until I got to high school, and then it was just bunch of meaningless labs." Jill's words support Anna's claim that high school experiences did not develop her science content knowledge:
It took until Grade 11 to actually have science class in a science lab. Finally we had beakers and Bunsen burners and all that neat science stuff to work with. Before that, it was kind of a joke to do science in the French or Social Studies room. It just wasn't sciencey. Everything came out of the textbook, Boring!
Teacher candidates like Dawn, Kevin, Anna, and Jill, who perceived their science content knowledge as weak, cited their educational background as one of the reasons for this deficiency. They also expressed their initial apprehension, when they were accepted to the programme, as to how they would develop sufficient content knowledge to teach science.
Some teacher candidates viewed their admission into the teacher education programme as an acknowledgment by the academic institution and others that they held a high level of science content knowledge. Laura summarizes her response to being admitted into the programme and being expected to teach science:
Look, I know nothing about science. I know I have to teach science.
What was the criteria for getting admitted into the programme? Frank commented on how others reacted to his acceptance: When my neighbour found out I was accepted into teacher education, I, all of a sudden, became the science expert. She bombarded me with science questions that I had no clue how to answer. Barb also found that her acceptance gave her academic credibility in science, which was not her area of expertise: Someone found out I was admitted into the programme, and now I am being asked to be a science fair judge. My major is English Literature not science. How can I judge something that I know nothing about? Laura, Frank, and Barb were aware of their low level of science content knowledge and expressed their view that admission into the programme provided acknowledgement of science knowledge they did not possess. The responses from the teacher candidates indicate that they are aware of their past experiences in science and are making projections into the future of their teaching; their views could be considered a measure of self-efficacy (Bandura, 1986, 1997). Bandura defines self-efficacy as a product of one's perceived level of competence rather than an actual level of competence; it is therefore prone to over- and under-estimation. Bandura believes that the concept of self-efficacy is made up of two constructs, outcome expectation and self-efficacy expectation. People are motivated to perform an act if they believe the act will have a favourable result (outcome expectation) and they are confident that they can perform that act successfully (self-efficacy expectation). Bandura (1986, 1997) identifies four sources of efficacy expectations: (1) physiological and emotional states; (2) social persuasion; (3) vicarious experiences; and (4) mastery experiences. Vicarious experiences involve the observation of a role model.
By watching others model positive behaviours and identifying with the role model, a person's self-efficacy is enhanced (Bandura, 1997). However, if the model does poorly, the experience can have a negative effect upon self-efficacy (Bandura, 1997). The fourth and most influential source of efficacy expectations is derived from mastery experiences (Bandura, 1986, 1997). Mastery experiences are those in which the individual successfully participates in activities that inform and reinforce self-confidence. If individuals experience success based upon their own skills or effort, efficacy is improved through mastery experiences. However, if the individual feels that success was due to outside variables such as the intervention of others or luck, then self-efficacy is not necessarily strengthened by mastery experiences (Bandura, 1993; Pintrich & Schunk, 1996). The responses from Laura, Frank, and Barb reveal an important point regarding science in the elementary classroom. The literature indicates that over the past several decades, the declining quality and amount of science instruction in elementary schools has been a point of concern (Gee, 1996; Tilgner, 1990). Stefanich and Kelsey (1989) found that in elementary schools less time is allocated for science instruction than for any other subject. Researchers attribute this result to several factors and point out that elementary teachers often have negative attitudes towards science (Shrigley, 1974) and do not have confidence in their ability to teach science (DeTure & Ramsey, 1990). These factors, coupled with low science interest (Tilgner, 1990) and science anxieties (Czerniak & Chiarelott, 1990), often contribute to elementary teachers' avoidance of teaching science (Westerback, 1982).
Generally, elementary teachers have been found to possess low-level conceptual and factual knowledge and inadequate skills in the content area of science (Bloser & Howe, 1969; Victor, 1962; Wenner, 1993), as well as limited knowledge of the concepts, facts, and skills of science (Stevens & Wenner, 1996; Wenner, 1993). Many teachers, especially those who have only taken introductory science courses or are teaching out of their fields, have "ideas of science content that may not reflect accurate conceptions" (Akerson, 2005, p. 245). Hence, a lack of science content knowledge and a lack of confidence in teaching science are key factors that cause elementary teachers to shy away from teaching science. The teacher candidates' responses indicate that they are aware of having either a high or low level of science content knowledge. They acknowledge that this content knowledge can come from their own educational experiences at the elementary, secondary, and post-secondary levels. Science content knowledge can also be developed in adulthood through volunteer classroom teaching experiences as well as through work in a variety of educational settings. The teacher candidates' awareness of their levels of science content knowledge reflects their self-efficacy (Bandura, 1997); their perceived competency in science does not necessarily align with their actual competency, resulting in either an over-estimation or under-estimation of their science content knowledge. The responses from the teacher candidates with a perceived low level of science content knowledge also reveal the effect of the quality and quantity of science instruction at the elementary and high school levels. In summary, my data indicate that a diverse range of experiences and interpretations contribute to the teacher candidates' pre-programme awareness and knowledge of PCK.
Experience as a student in grades K-12 appears to play a dominant role in the awareness and knowledge of PCK.

Theme Three

Expression of Richer Post-Science Methods Course Awareness of PCK

The teacher candidates who participated in the study appeared to express an enrichment of their knowledge of PCK after their methods course. This understanding was prevalent among the candidates I interviewed, as conveyed in the illustrative interview excerpts below. The responses were prompted by open-ended interview questions that included: How has your knowledge of teaching changed after taking a science methods course? It was clear that experiencing a science methods course enabled the teacher candidates to put into perspective their experiences as students in elementary and secondary schools. Some of the comments revealed the inevitable comparison and contrast of their recent experiences with previous ones, including their experiences with former teachers. Laura focused on her feelings about learning science: "I hated science; taught by stuffy, lab coat wearing, follow the procedure, and write up the results type of teachers." Frank discussed the reaction of teachers when students asked questions: Remember in school when you would ask the teacher a question, and you know they did not know the answer, and they would get all mad at you and send you away. Now I know that I don't need to know everything, and I can admit it. When my students ask me a question, I can tell them that I don't know so let's find out! Jill expressed her pleasure at discovering different strategies for teaching which she learned in her teaching methods courses: It is nice to know that there are other ways to teach science than the terrible examples I suffered. I am excited to try these new ideas, be creative, and get my students engaged! Helen's response indicated her intention to move away from what she experienced in school: No textbook in this course.
Wow, can you believe it? I am not chained to a textbook to teach science. If my high school teachers could see this class they would have a stroke. During the interviews, some of the teacher candidates revealed their appreciation and relief at having someone who was an expert on the topic. Barb stated: "Finally, an expert to teach us." Gerry noted that his instructors could make the seemingly difficult achievable: "They (the instructors) make it so easy and straightforward that I can't wait to try it" (Gerry was referring to inquiry through the use of 'black boxes,' a science methods activity on the Nature of Science). It is important to note that the teacher candidates interviewed for this study described their methods instructors as having a positive effect on their science teaching. Although none of the teacher candidates presented here spoke of their instructors as ineffectual, Ross (1987) notes that previous teachers can also serve as negative role models. Koster, Korthagen, and Schrijnemakers (1995) also found such examples of negative role models. However, Zeichner, Tabachnick, and Densmore (1987) emphasize that this influence of former teachers can also take place on a less conscious level. McEvoy (1986) addressed the issue of role models in a paper with the intriguing title "She is still with me," observing that it was often a shock for teachers to realize that they were describing characteristics now very obvious in their own way of teaching, although they had not consciously been aware of the modeling process. In line with this claim, Britzman (1986) argues that it can be important to have student teachers examine the values embedded in such role models to avoid an unconscious influence on their teaching behaviour. The teacher candidates' responses indicate that their methods instructors drew upon a constructivist view of learning.
According to Jill, exposure to constructivism began in her first class when the instructor asked: "Where did these ideas come from?" and later in that same class, the students were asked, "How could you use this experience outside the classroom, in your daily life?" I believe it is necessary that new science teachers gain applicable knowledge and appreciation of each aspect of science teaching, along with the competencies of a constructivist science teacher. This view was revealed in Laura's comment on electricity (batteries and bulbs): I thought I knew nothing about electricity. The in-class challenge was to get a light bulb to work using a battery, bulb, and two wires. That was easy. Once I was successful, I was given another challenge that was a little bit harder. The cycle of success then challenge resulted in me being able to build both types of circuits. Stalheim-Smith and Scharmann (1996) and Stoddart, Connell, Stofflett, and Peck (1993) found that the use of constructivist teaching methodologies and learning cycles, methods emphasizing concrete learning, can improve science learning by candidates in science education. Without competency in and subscription to constructivist science teaching, new teachers will not successfully teach for understanding and application utilizing a broad vision of science (Nezvalova, 2008). Schoon and Boone (1998) suggest that content courses helped pre-service elementary teachers build confidence only when the courses were specifically designed for elementary education majors. These courses are effective perhaps because more time is spent on how to teach scientific concepts rather than simply on learning them (Alonzo, 2002; Appleton, 2006; Schoon, 1995; Schoon & Boone, 1998).
Moreover, these courses have proven effective because the instructors adopted pedagogical approaches that focus on fostering students' ownership over their learning rather than on the transmission of expert knowledge to the students (Abell & Smith, 1994; Alonzo, 2002; Appleton, 2006; Trundle, Atwood & Christopher, 2007). In addition to student-centered pedagogies, science teacher educators must use assessment strategies that enable students to explain scientific concepts to others, provide opportunities for them to defend their theories, and allow them to learn science with others in a constructive manner (Abell & Smith, 1994; Butts, Kobolla & Elliott, 1997; Settlage & Southerland, 2007). Based on the teacher candidates' interview responses, I discerned that non-teaching experience contributed to their awareness and knowledge of PCK. Teacher candidates revealed that this development in their PCK began when methods instructors encouraged them to draw upon their work experience and incorporate it into their teaching. Edward explains his reaction to the valuing of his previous work experience: "It was kind of neat to know that my job in water conservation had so many skills that were validated by the methods instructor." In some cases, the teacher candidates were not aware of how valuable their work experience would be in a classroom. Chris commented on the ways his work experience could inform his teaching of physics topics: "It was great; I could take my experience as a pilot and apply it to the classroom." Kevin noted that he could incorporate his previous work experience into planning a unit on structures: "There is no way in this world, I thought I could take my experience as a carpenter and apply it to the classroom." Michelle remarked that her work in a plant nursery could be transferred to the elementary classroom setting: "I worked in a nursery. Most of the time I did grafting and seedlings.
I never thought I could apply that knowledge to the classroom." Edward, Chris, Kevin, and Michelle all brought valuable experiences from their previous careers to their teaching. Yet, they did not recognize that this work experience was applicable to and informed their classroom teaching until they were 'told' and given opportunities by their methods instructors to incorporate their previous experiences. One could consider the applicability of the teacher candidates' previous work experience to the classroom setting to be an example of the Transfer of Learning. Mestre (2002) and his colleagues provide the following definition: "Transfer of learning… broadly … mean(s) the ability to apply knowledge or procedures learned in one context to new contexts" (p. 3). Marini and Genereux (1995) elucidate this further by describing transfer of learning as "prior knowledge affecting new learning or performance" (p. 2). It is interesting to note the role of the teacher in the transfer of learning. Alexander and Murphy (1999) suggest that learning environments are often specifically structured against the practice of transfer, with instructors neither modeling, rewarding, nor encouraging transfer, nor giving opportunities to express it. Alternatively, instructors might assume it is the teacher candidate's responsibility to transfer knowledge and leave it entirely up to her/him to make the necessary connections. Therefore, teacher candidates are left on their own to understand, for example, that their citation skills can be used in other courses or that their critical thinking skills will help them in any course. Engle (2006) found that when instructors framed multiple contexts for applying teacher candidate learning, the candidates were able to explain phenomena better in different situations. The teacher candidates' responses indicate a richer post-science methods course awareness and knowledge of PCK.
The teacher candidates compared their own Kindergarten to Grade 12 school experiences to what they learned in their science methods courses. They discovered new ways of teaching and the positive effects of learning with an expert who was also a positive role model. The teacher candidates also became more aware of the value and applicability of their past work experiences and how these may impact their current and future views of PCK.

Theme Four

For Some Candidates, a Successful Practicum Validated or Invalidated Pre-Practicum Awareness of PCK

The teacher candidates who participated in the study and were interviewed appeared to express an awareness and knowledge of PCK post practicum. This is conveyed in the illustrative interview excerpts below. The responses were prompted by open-ended interview questions that included: What is it about your awareness of teaching and science knowledge that might have been affected due to your practicum experience? When the teacher candidates answered the above question, they all indicated the pivotal role of the School Advisor (SA) in their awareness and knowledge of PCK. The SA is the classroom teacher who works with the teacher candidate during the practicum. Many of the students indicated that a good relationship with the SA had positive results in their development of PCK. Anna commented on the importance of the relationship with the school advisor: It's really important to have a great connection with your SA … I had a really good SA who let me try and struggle. The relationship has such a strong impact on your teaching. Laura's response indicated that she viewed her SA as a role model: I felt so lucky with my SA, and she was such a great model for me. I learned by watching her, observing how she interacted with the children. She was a wonderful mentor for me.
Other students commented on the challenges associated with a School Advisor and the impact this kind of relationship can have on their development of PCK. In some cases, the students commented on their inability to practice their craft, be creative, and implement new ideas learnt in the teacher education programme. Barb spoke about the limitations that some SAs placed on their teacher candidates: Some of the SAs were really controlling and did not want to give over control of their class. It is hard to work on your teaching when you are stuck behind a desk marking or photocopying. Some teacher candidates were not given the opportunity to teach certain subjects because they were not part of the SA's current curriculum, as in the case of Anna: "I sort of did not teach science on the practicum as they had already done it last term, and now they were doing Social Studies." Barb spoke about the complete absence of science from a teacher candidate's practicum due to a personal preference: "One person I know taught French because she did not want to teach science." Another teacher candidate, Michelle, "felt in conflict" with her SA. She noted that: "I wanted to teach one way, and my SA wanted me to teach her way." Based on the teacher candidates' interview responses, the relationship with the School Advisor played a key role in the development of their PCK. A positive relationship, where the SA became a role model and mentor, resulted in the teacher candidate gaining valuable teaching knowledge and experiences. A challenging relationship with the School Advisor often limited the teacher candidates' experiences, and they did not have the opportunity to implement the ideas learnt in the teacher education programme courses nor, in some cases, to teach science at all. The teacher candidates indicated that the School Advisor played a pivotal role in the development of their PCK.
A positive relationship can validate and further develop their pre-practicum PCK, whereas a challenging relationship may invalidate their prior PCK. Educational scholars find that cooperating teachers such as School Advisors are often the most influential in the development of novice teachers, as they have the most contact and communication with the student teachers. Norris, Larke, and Briers (1990) stated that "the student teaching center and the supervising teacher are the most important ingredients in the student teaching experience" (p. 58). Other investigators (Cano & Garton, 1994; Deeds, 1993; Deeds, Flowers & Arrington, 1991; Martin & Yoder, 1985) also support this assertion. Martin and Yoder added that a student teacher's success during his or her field experience was based "on the general supervisory climate in the department and on the educational leadership abilities of the cooperating teacher" (p. 21). In most instances, the relationship that a student teacher has with his or her cooperating teacher is unique. Montgomery (2000) stated, "If the perspective of the cooperating teacher conflicts with the perspective learned by the student teacher, this relationship does not permit a smooth transition for the student teacher" (p. 7). As illustrated by the interview comments, the role of the school advisor as a model and mentor is an important factor in a teacher candidate's developing knowledge and awareness of PCK. Berliner (1986) described a mentor as an expert who modeled practice. According to Barab and Hay (2001), a "mentor models, then coaches, then scaffolds, and then gradually fades scaffolding" (p. 90). Moran (1990) expands on the mentor role and states that modeling of practices can aid pre-service teachers towards understanding their own practices.

Summary

I analyzed, interpreted, and described principal components of PCK that were prevalent amongst the participating teacher candidates.
The analysis indicated that the factors, or principal components, or generally the aspects of PCK in which awareness was demonstrated, were affected significantly as the students experienced their science methods course and extended practicum during their 12 months in the Teacher Education Programme. After completing the science teaching methods course, a similar analysis indicated a convergence of the factors into two main ones. I speculated this convergence to be a result of the teacher candidates' exposure to the broad view of PCK inherent within the Teacher Education programme, or the view of PCK that their Teacher Education instructors exhibited or modeled; hence a collapsing of pre-programme PCK components into two. In my opinion, the striking part was the analysis of the snapshot quantitative measurements after practicum, which resulted in three factors or principal components of PCK. I found this surprising, and then again, as I reflect upon the teacher candidates and their school experiences, perhaps it is not. Returning to the elementary classroom for the extended practicum was similar to a reactivation of their lived experience. The teacher candidates were exposed to elementary school teaching for years longer than their time in the Teacher Education programme. However, one might have expected the components of PCK to converge further, as many teacher candidates initially expressed anxiety about teaching science. I had therefore hoped the teacher candidates' new perspective on science teaching would be sustained as they experienced the extended practicum. In this regard, I speculate that the School Advisors' evaluative role, combined with their influential position in guiding the students' practical teaching experience, outweighed any influence the programme's seed courses, such as science methods, might have had.
In addition, I believe that the teacher candidates' awareness of the gatekeeper responsibility of the SA, and the fact that different SAs have or model different perspectives of PCK, inevitably translated into the increased number of components after practicum. These manifested as emergent themes determined through a rigorous phenomenographic analysis of the qualitative interview data. In the chapter that follows, the results of the quantitative and qualitative data analyses will be discussed. Correspondence between the data will be described to provide a simplified insight into the teacher candidates' PCK and how their PCK evolved following experiences in the science methods course, the extended practicum, and during their time in the 12-month Teacher Education Programme.

CHAPTER SIX: DISCUSSION

This chapter presents a discussion of the study's quantitative and qualitative results. The analysis of questionnaire data in Chapter 4 revealed aspects or components of PCK that the teacher candidates exhibited at entry, during, and just after completing their Teacher Education programme. These points, or "Rounds," have been defined as: Round One, questionnaire and related interview data gathered at the point of entry into the Teacher Education Programme; Round Two, questionnaire and related interview data gathered at the end of participating in a science methods course; and Round Three, questionnaire and related interview data gathered at the end of the practicum experience. Points for discussion were derived by examining the components of PCK outlined in Chapter 4 and finding commonalities with the intent of creating themes for discussion. The simplest strategy for ordering components proved to be the most revealing, as themes were created by grouping the components of PCK that either had or had not experienced various degrees of significant change.
The components of PCK emerging from Chapter 4 include the following from Round One: awareness of the knowledge of the process of science; awareness of self-competence; and awareness of teaching strategies. Emerging from Round Two: awareness of the nature of science and awareness of pedagogy. Finally, the components of PCK emerging from Round Three were as follows: awareness of competence and confidence; awareness of skills and knowledge; and awareness of the relevance of science teaching strategies. The analysis of the interview data in Chapter 5 revealed themes that correspond to components of PCK. After the questionnaire was completed, two one-on-one interviews were conducted with volunteer participants. The first interview took place after the completion of the science methods course, and the second occurred after practicum. Thus, the interview data sets were carefully examined and categorized using a thematic approach to address the study objectives. The phenomenographic analysis of interview data sets resulted in four key themes that corresponded (though not directly one-to-one) to the PCK components: 1) Expression of pre-programme awareness of pedagogy. 2) Expression of pre-programme awareness of their content knowledge as regards science. 3) Expression of richer post science teaching methods awareness of PCK. 4) Expression, for some candidates, that a successful practicum validated or invalidated pre-practicum awareness of PCK. Each quantitative theme in Chapter 4 was then compared to the qualitative themes in Chapter 5 and explored for commonalities and differences. Three themes emerged from the comparison of themes discussed in Chapters 4 and 5: 1. Teacher candidates understand teaching as the way they were taught; 2. Campus to the classroom; and 3. Teacher Education.
Teacher Candidates Understand Teaching as the Way They Were Taught

The key components extracted from entry questionnaire data using principal component factor analysis indicate that the teacher candidates' Awareness of Teaching Strategies did not demonstrate statistical significance. That is, there was no significant difference in the degree of change during experiences in the teacher education programme, as indicated by the mean score differences between the points in time at which the questionnaire was administered. Thus, comparing statistical data for the Awareness of Teaching Strategies component between rounds, Round One to Round Two (p = 1.000), Round Two to Round Three (p = 1.000), and Round One to Round Three (p = 1.000), indicates no significant change in this component. In other words, the interventions, including the science teaching methods course, the teaching practicum, and other experiences in the Teacher Education programme, did not affect this aspect of the teacher candidates' PCK. Furthermore, this component mirrors an emergent theme from the interview data analysis that was interpreted and described as Expression of pre-programme awareness and knowledge of pedagogy resulting from K-12 student experience, perception of admission requirements as validation, and successful pre-programme classroom-based volunteer, peer teaching, and/or work experience (see Chapter 5). A close examination of this theme supports the above interpretation of there being no significant change in the statistically determined PCK component of Awareness of Teaching Strategies. For example, the excerpt from Barb in Chapter 5 conveys this eloquently: the teacher candidates believed they possessed awareness and knowledge of pedagogy through their prior experience as science students at both the elementary and secondary levels and during volunteer experiences in classrooms or educational settings.
This could be explained by what has already been mentioned in the literature reviewed in Chapter 5. I attribute the lack of significant statistical change in this aspect of PCK in part to what Blanton (2003) and Nashon (2006) describe as teachers' tendency to teach the way they were taught. Furthermore, as noted in Chapter 5, teacher candidates' prior beliefs about teaching are quite stable and enduring over the course of their school and university education and, as Pajares (1992) notes, well established by the time they attend Teacher Education programmes. Beliefs are also said to be resistant to change and thus unaffected by subsequent education coursework (Weinstein, 1990). Researchers have attributed prior beliefs to candidates' personal history (Holt-Reynolds, 1992) and previous school experiences, including exposure to teacher role models (Calderhead & Robson, 1991; Richardson, 1996; Sugrue, 1997). At the time of entry into a teacher education programme, most students have images of what they would be like as teachers (Kennedy, 1997). These images are often formed in response to early childhood experiences with teachers, and may be modelled after former teachers or in response to a desire to improve on the practice of former teachers (Calderhead, 1991; Calderhead & Robson, 1991; Kagan, 1992). It could be argued that years of exposure to specific processes such as teaching may establish procedures which then become ingrained and difficult to change, to the point where processes activate without awareness or deliberate intent (Aarts & Dijksterhuis, 2000, 2003; Bargh & Ferguson, 2000). Once a skill is ingrained or automatic, it no longer operates in a way immediately or necessarily available to conscious monitoring. Further, the skill is completed without interruption, which limits the ability to modify performance (Wheatley & Wegner, 2001).
At times, the students interviewed in this study expressed frustration over the number of methods courses they were required to take in one term. Chris commented: In the winter term you have to take Math, Science, Social Studies, Physical Education, Art, and another Language Arts course. When do you get to think? Two days a week we went from 8:30 am until 5 pm with 1 hour for lunch. But I remember rarely having the time to eat as there was usually a meeting scheduled. After classes we had to meet to do group based assignments. I mean it, when do you have time to think? In a succinct way, Chris described two outcomes of taking multiple methods courses at the same time: fatigue and cognitive overload. According to Sweller (1989, 1990), cognitive overload occurs when the total processing demands of external stimuli and internal cognition exceed available "attentional" resources. Thus, levels of cognitive load significantly affect both learning and performance in authentic contexts (Goldinger, Kleider, Azuma & Beike, 2003; Sweller, 1988; Sweller, van Merrienboer & Paas, 1998). As overload occurs, mental operations that process information with little or no conscious awareness (i.e., impose little or no cognitive load) tend to dominate; this tendency is known as automaticity. The following are characteristics of automaticity: (a) it occurs without intention (or continues to completion without effort); (b) it is not subject to conscious monitoring; (c) it utilizes few, if any, "attentional" resources; and (d) it takes place rapidly (Moors & De Houwer, 2006; Wheatley & Wegner, 2001). The second key component extracted from entry questionnaire data using principal component factor analysis, the teacher candidates' Awareness of Self-Competence, demonstrated statistical significance in the degree of change during their experience in the teacher education programme, as indicated by the mean differences between the points in time at which the questionnaire was administered.
This component mirrors an emergent theme from the interview data analysis that was interpreted and described as: Expression of pre-programme awareness and knowledge of their science content knowledge gained through prior experience as science students at the primary, secondary, and post-secondary levels, and during volunteer experiences in classrooms or educational settings. Closer attention to this theme supports the above interpretation of the significant change in the statistically determined PCK component of “Awareness of Self-Competence in Teaching Strategies.” I attribute the statistically significant change to what Weinstein (1989) labels “unrealistic optimism.” According to Weinstein (1982), unrealistic optimism, or optimistic bias (Klein & Weinstein, 1997), is explained by Hoy and Weinstein (2006) as “the tendency to believe that the problems experienced by others would not happen to them” (p. 205). Weinstein (1990) found that upon entry into the programme, teacher candidates rated themselves as above average for nearly all teaching skills, leading to the conclusion that “pre-service teachers may indeed have an unrealistic view, often bordering on overconfidence, about their ability to become effective teachers” (Weinstein, as cited in Walker, 1992). As Barb commented, “I worked as a tutor and had volunteered in my old school so I had lots of experience … I was pretty confident I knew how to teach.” This overconfidence could be attributed to teacher candidates entering teacher education programmes with powerful and significant preconceptions about teaching developed from their own personal histories (Carter, 1994; Finnie, 2004). As Chris explained in Chapter 5, “I am already an educator… .” Another dimension of student optimistic bias was revealed by Weinstein (1989) and Poulou (2007), who noted that teacher candidates who possessed a particular attribute tended to rate that attribute as more important for teaching.
These teacher candidates were also more optimistic about their future teaching performance. Frank articulates this ‘optimism’ in his comment about his “attribute,” coaching experience and expertise: “Coaching really prepares you for teaching. It’s all about making sure everyone is listening and doing what they are supposed to. They listened to me because they knew I meant business!” Frank discusses his ability “to get everyone’s attention,” a direct result of his coaching experience, as a key attribute for teaching. Barb states that her key to teaching was her organizational skills: “Well I knew how to be organized because I worked in a daycare. You have to keep the kids busy with lots of activities. Plus your day is super planned out with specific things like snack time, group time, and sometimes a field trip.” Ann’s comments reveal that her relationship with students was her key to teaching: “Oh and I love kids! You have to really enjoy being with them or forget it. You could never be a good teacher if you didn’t like your students!” After their science methods course, the teacher candidates demonstrated a significant increase in their awareness of self-competence. This result may be best explained from a business perspective. Paul Sloan (2007) writes, “When things are going well we can become blinded. Success becomes a prophylactic against disruptive ideas. Why should we change a winning formula? Don’t mess with success!” (p. 149). Bob’s comments support a “don’t mess with success” perspective: “You must understand that to get this far, we, as students must have figured out how to be a good student. Why change? It works!” This idea of not “messing with success” can also be seen in teacher preparation courses, as they often reinforce rather than challenge pre-service teachers’ prior knowledge and beliefs (Britzman, 1991; Feiman-Nemser & Buchmann, 1986; Zeichner & Gore, 1990).
After the practicum, the teacher candidates registered a significant decrease in their awareness of self-competence when compared to both their after-methods and their entry questionnaires. It is thought that candidates enter teacher education not only with powerful personal-history-based theories (Holt-Reynolds, 1992) about what constitutes good teaching practice, but also with a great deal of confidence in their ability to teach successfully, even without any formal training (Book, Byers & Freeman). Perhaps, as Holt-Reynolds (1992) suggests, teacher education students base their perceptions of teaching on personal-history-based lay theories that are typically generalizations drawn from references to themselves in the role of students. Because they have been successful as students, they may optimistically assume that they will also be successful as teachers. Adding to this is an optimistic individualism: a belief in the inevitability of triumph over any obstacle through hard work and individual effort (Ahlquist, 1992; Finney & Orr, 1995; McCall, 1995; Nieto, 1998). The following comments articulate the beliefs that teacher candidates held. Edward remarked, “I have a BSc. I know how to work hard. I always got good marks so no problem for me.” Anna also supported the belief in the efficacy of work and effort: “I don’t worry about it. It’s different from doing a normal undergrad degree, but it’s nothing I haven’t done before. It’s busy work. Lots of assignments, but they’re not hard. It’s just keeping up and being organized. I know how to do that!” Jill explained that, due to her hard work and resultant good mark in a prerequisite course, teaching science would be easy: “I had to take a science course as a prerequisite to get into this programme. I worked hard and did really well so I am pretty confident about teaching it.” I attribute the lack of significant statistical change in this aspect of PCK to the role experience plays in PCK.
Teacher candidates possess awareness and knowledge of PCK through their prior experiences as students and volunteers, during career path employment, or as parents. They are so entrenched in their beliefs and knowledge about teaching that they may not be cognizant of their views. Hence they see no need to change what they perceive as “successful,” supporting in part what Blanton (2003) and Nashon (2006) describe as teachers’ tendency to teach the way they were taught (Geddis & Roberts, 1998; Hollingsworth, 1989; MacKinnon & Erickson, 1992; Tobin, Tippins & Gallard, 1994).

Campus to the Classroom: The Transition

The factors/components Awareness of the Knowledge of the Process of Science; Awareness of the Nature of Science; Awareness of Competence and Confidence; Awareness of Skills and Knowledge; and Awareness of the Relevance of Science Teaching Strategies all shared a common trait. Each demonstrated statistical significance in the degree of change during the candidates’ time in the teacher education programme, as indicated by the mean difference between the points in time at which the questionnaire was administered: upon entry, after the methods course, and after the extended practicum (Chapter 4). These points are referred to as “Rounds.” These factors/components were not, however, statistically significant in the degree of change between administration upon entry and administration after the extended practicum (Chapter 4). In other words, the interventions (science teaching methods course and teaching practicum) may have contributed to an effect on the teacher candidates’ PCK, but there was no overall change in their PCK from starting point to end point.
This component mirrors an emergent theme from the interview data analysis that was interpreted and described as: Expression of richer post science methods course awareness and knowledge of PCK resulting from programme-enhanced perspective on K-12 content experience, experience of interacting with methods instructors, and new content learning strategies (see Chapter 5). Closer attention to this theme supports the above interpretation of the significant change in the statistically determined PCK components (Awareness of the Knowledge of the Process of Science; Awareness of the Nature of Science; Awareness of Competence and Confidence; Awareness of Skills and Knowledge; and Awareness of the Relevance of Science Teaching Strategies) alongside an overall change that was found to be statistically insignificant. According to Daas (2004), a critical component of a pre-service science teacher preparation programme is the science teaching methods course. Methods courses may offer one of the few opportunities for teacher candidates to view new knowledge through a lens not tainted by prior knowledge about teaching and learning. The usual intent of the science methods course is to help teacher candidates develop an understanding of various aspects of science instruction such as pedagogical approaches, management strategies, and assessment techniques. The methods course is typically taken prior to the practicum, and students usually have the opportunity to prepare and practise science lessons, units, and related activities. Mellado (1998) investigated the relation between teaching/learning science concepts and practice when teaching science, and the results established a general correspondence between the two. Mellado explains that the academic knowledge teacher candidates receive during teacher training is not sufficient for knowing how to teach science.
Therefore it is necessary to equip teacher candidates with the professional component that involves personal teaching experience, reflection, and self-knowledge. The timing of the methods course is important to teacher candidate success on the practicum (Zeichner & Teitelbaum, 1981). When there is a lack of connection between the methods course and classroom practice, teacher candidates often adopt the practice of the school advisor and the school culture, causing any prior experiences to be “washed out” and leading teachers to teach the way they were taught. General educational theory is only helpful for teacher candidates when a connection is established linking the theory to their actions in concrete practical situations where real-world problems are encountered (Joyce & Showers, 1988). Critiques of methods courses range from their being too idealistic or not practical enough (e.g., Hermanowicz, 1966; Koehler, 1985; Lortie, 1975) to their being too simplistic and without rigour (e.g., Beyer & Zeichner, 1982; Koerner, 1963; Lyons, 1980). Regardless of these views, there is an assumption in the literature that methods courses play a vital role in teacher education (Adamy, 1999; Beisser, 1999; Handler & Marshall, 1992; Wetzel, 1993). Gustafson and Rowell (1995) found it difficult to predict the degree to which ideas and attitudes developed during the science methods course will be modified as a result of in-service teaching experiences. The extent to which in-service teachers continue to use a constructivist approach may depend on the support and encouragement provided by the school environment. As the teacher candidates move from their methods course into the classroom, they appear to take on the persona of the classroom teacher and become involved in the theory-versus-practice debate.
Many practising teachers tend to view academic knowledge as impractical, decontextualized, not credible, and inaccessible, and they rely more on their own and their colleagues’ experiences (Gore & Gitlin, 2004). Teacher candidates tend to share these attitudes. In their study of teacher candidates, Hacher, Cocard, and Moser (2004) found the following statement to be typical: “Forget about theory; practice is all” (p. 623). Moreover, teacher candidates and practising teachers in the “field” tend to dismiss methods instructors as residing in the cloisters of the university or college and being out of touch with the realities of today’s schools (Oftedahl, 1984). Practising teachers often argue that teaching is learned by experience. Research indicates that the nature of the school influences its teachers; likewise, school advisors influence teacher candidates’ instructional styles. As Russell, Munby, Spafford, and Johnston (1988) state, “Professional knowledge consists of more than that which can be told or written on paper” (p. 67). Educational studies often report that teacher candidates view their field experiences as the central part of their teacher education programme (Eltis & Cairns, 1982; Zeichner, 1983). Many teacher candidates believe that field experiences provide the only “real” learning (Amarel & Feiman-Nemser, 1988). The perception commonly held by teacher candidates seems to be that a simple relationship exists between the quantity of field experience and the amount of learning: the more experience one has in classrooms, the more one will learn about teaching (Johnston, 1994).

Teacher Education

Mean scores of the items on one of the key PCK components, Awareness of Pedagogy, determined through factor analysis, indicated no statistically significant difference between rounds: Round One-Round Two (p=1.000); Round Two-Round Three (p=1.000); and Round One-Round Three (p=.061).
In other words, the interventions (science teaching methods course, teaching practicum, and other experiences in the Teacher Education Programme) may not have had an effect on the teacher candidates’ PCK. Furthermore, this component mirrors an emergent theme from the interview data analysis that was interpreted and described as: For some candidates a successful practicum validated or invalidated pre-practicum awareness and knowledge of PCK. Closer attention to this theme supports the above interpretation of no significant change in the statistically determined PCK component Awareness of Pedagogy. Reflection is a cornerstone of learning and of personal and professional development (Baird, 1993). Educators support a need for reflection (Grimmet, MacKinnon, Erickson & Riecken, 1990; Laboskey, 1994) in teacher education as a means of encouraging teachers to examine their beliefs about teaching and to use their knowledge to construct change in their practice (Calderhead, 1989; Clift, Houston & Pugach, 1990). Dewey (1933) distinguished “reflective thinking” from other forms of thought by two characteristics: a state of doubt or hesitation in which thinking originates in the practice situation, and an act of inquiring to find materials that will resolve the doubt and dispose of the perplexity. The reflective process is valued in professional growth and successful teaching (Dewey, 1933; Kraus & Butler, 2000; Schön, 1987; Valli, 1992). Schön (1987) endorses “reflection in action” (p. 26) as a process which requires practitioners to “remake their practice worlds” (p. 36). Reflection is not just a point of view; rather, it is a process of deliberative examination of the interrelationship of ends, means, and contexts. Shulman (1986) identifies reflection as one stage of pedagogical reasoning, the stage during which teachers look back on their teaching.
The process of “looking back” should be conceptualized as one of re-enactment, rather than simply recalling what occurred in the teaching episodes. The critical reflection of “looking back” requires skills that teacher candidates may lack or may not develop fully. The teacher candidates are required to reflect in their e-portfolios (an online document of their programme experiences and reflections) on the process and achievement of knowledge, skills, or competencies in teaching and learning. Rarely are teacher candidates taught the purposes, processes, and philosophies of reflecting in the multiple contexts of education, which are essential for analyzing and understanding their own growth and development (Gallavan & Webster-Smith, 2010). Too often, teacher educators ask candidates to reflect on their practice without providing them with the necessary tools and refined techniques to ensure that reflection is principled, productive, and positive (Gallavan & Webster-Smith, 2010). The question of what reflective teaching is and what it implies for teacher education is not widely agreed upon in the literature or in the field. Terms such as “reflective practice,” “inquiry-oriented teacher education,” “reflection-in-action,” “teacher as researcher,” “teacher as decision-maker,” “teacher as professional,” and “teacher as problem-solver” all encompass some notion of the role of reflection in the process of professional development. At the same time, these varied terms disguise a vast number of conceptual variations. Therefore, teacher education courses must be organized and designed to provide a range of opportunities for analyzing teaching through reflective practice (Calderhead, 1989). Research suggests that student teachers’ reflections generally remain at a fairly superficial level.
This is often the outcome even in teacher education courses which purport to encourage reflective teaching (Borko, Livingstone, Borko et al., 1988; Calderhead, 1989; Feiman-Nemser & Buchmann, 1987). To some extent, it may be attributed to the teacher candidates’ inability to confront and reflect upon their own practice. In teaching, their attention is so focused on delivering a lesson that they have little time to consider the ‘big picture’ of how the lesson is actually going. In addition, teacher candidates often have a high level of “ego-involvement” in their conceptions of themselves as teachers. They are reluctant to be self-critical and to dwell upon their weaknesses at a time when their confidence may well be under threat. Teacher candidates also appear to lack the analytical skills to examine their own practice; they lack a language for talking about teaching and sometimes fail to understand the comments supervisors make on their performance. They may also lack knowledge of a repertoire of alternative teaching approaches (and clear ideas about the criteria one might use in judging alternatives) that they might draw upon to evaluate their own teaching. Teacher candidates often have difficulty cueing in to classroom processes (Calderhead, 1988; Copeland, 1981), as they lack the concepts with which to perceive what is going on. They lack knowledge of how to plan a lesson on a particular topic for Grade 5s and how to take into account students’ skills and attitudes, learners with special needs, and how students are grouped (by ability, age, or otherwise). With practising teachers, such knowledge developed after considerable classroom experience (see Berliner, 1987). Similarly, the task of evaluating their own lessons depends on knowledge that teacher candidates do not yet possess.
To be able to answer such questions as “How well did I teach?”, “What were the effects of my teaching?”, “What assumptions have I made in teaching this way?”, “How else might I have taught the lesson?”, or “How will I do it next time?” requires the teacher candidates to draw upon knowledge of alternative teaching approaches, of children’s typical performance and achievement, and of criteria for judging teaching. Teacher candidates may well lack the knowledge that enables the comparisons and evaluations these questions require. It is thought that teacher candidates enter teacher education not only with powerful personal-history-based theories (Holt-Reynolds, 1992) about what constitutes good teaching practice, but also with a great deal of belief in their ability to teach successfully, even without any formal training. Weinstein (1990) found that entering teacher candidates rated themselves as above average for nearly all teaching skills. The combination of a high level of confidence in their own ability to cope with teaching and a belief in personal success may account for the reality shock experienced by students during their practicum. Teacher candidates have extensive experience as students but little experience thinking about teaching and learning. They lack the knowledge, skills, and ability to self-reflect effectively and so become the teachers they want to become. The result is that, in most cases, teacher candidates return to entry levels of PCK.

Summary, Future Research and Conclusions

My research investigated elementary teacher candidates’ Pedagogical Content Knowledge in science.
I investigated their PCK upon entry into the Teacher Education Programme, after participating in first-semester courses (with a focus on their science teaching methods course), and after completing the long practicum. My analysis revealed a series of themes. The teacher candidates believed they possessed PCK upon entry into the teacher education programme, and this belief increased after their science methods class. The teacher candidates experienced a slightly lower level of awareness of PCK after their extended practicum. In the final sections, I summarize my research findings by revisiting the questions that guided my study.

1. What are elementary science teacher candidates’ levels of PCK: (a) at entry into the teacher education programme, (b) immediately after completing their science methods course, and (c) after completing their school practicum?

2. How does the collective elementary teacher candidates’ pedagogical and content knowledge evolve as they participate in the science teaching methods course and extended practicum?

Summary

To investigate the elementary teacher candidates’ PCK, I conducted a study of one-year Elementary Teacher Candidates enrolled in the Teacher Education Programme at a university in western Canada. For this study I adopted a mixed methods research strategy. I employed quantitative methods to measure the entry level of the teacher candidates’ PCK and to determine the effect of the science teaching methods course and extended practicum experiences on the participants’ PCK. I employed an adapted instrument to assess the candidates’ level of PCK at various times during their 12 months in the teacher education programme. I also examined student views on their experiences using qualitative methods that included semi-structured interviews. The data were analyzed using statistical analysis and qualitative procedures. Results were obtained through multiple methods and then compared. My results are presented below.
Research Question 1: What are elementary science teacher candidates’ levels of PCK: (a) at entry into the teacher education programme, (b) immediately after completing their science methods course, and (c) after completing their school practicum?

The students entered the programme with a mean PCK level of 7 out of a maximum of 11 (M=7.1431, SD=1.07794). Most of the teacher candidates interviewed expressed a belief that they possessed awareness and knowledge of pedagogy gained through prior experiences, including being science students at both the elementary and secondary levels and during volunteer experiences in classrooms and/or educational settings. Others perceived admission into the programme as recognition that they possessed an awareness and knowledge of teaching. This perception may be due in part to one of the admission requirements: classroom experience. At the end of the science teaching methods course (T2), the teacher candidates had a mean of 8 out of a maximum of 11 (M=8.0719, SD=1.03280). The teacher candidates who participated in the study appeared to express enrichment of their PCK after their methods course. They attributed the change to working with their instructor, whom they regarded as an expert on the topic. Through a constructivist view of learning based on their experiences as students in elementary and secondary school, the teacher candidates came to value non-teaching experiences. At the end of the extended practicum, which coincided with the end of their 12-month Teacher Education Programme, the teacher candidates’ mean was 7 out of a maximum of 11 (M=7.0783, SD=0.72437). The teacher candidates who participated in the study and were interviewed expressed an awareness and knowledge of PCK post-practicum. Their change in PCK to near entry levels was closely linked to their relationship with the school advisor and to the real-world teaching context of the practicum.
In some cases, the teacher candidates’ relationship with their school advisor affected their ability to practise their craft, be creative, and implement new ideas learnt in the teacher education programme.

Research Question 2: How does the collective elementary teacher candidates’ pedagogical content knowledge evolve as they participate in the science teaching methods course and extended practicum?

The statistical data reveal how the teacher candidates’ PCK evolved through components of awareness. The following components experienced significant change between Rounds One (T1) and Two (T2) and between Rounds Two (T2) and Three (T3), but did not experience significant change between Rounds One (T1) and Three (T3):

•  Awareness of the knowledge of the process of science was found to have significant differences when Round One was compared to Round Two (p=.000) and when Round Two was compared to Round Three (p=.000).

•  Awareness of the nature of science was found to have no significant difference when Round Three was compared to Round One (p=1.000). A significant difference occurred when Round Two was compared to Round One (p=.000) and when Round Two was compared to Round Three (p=.000).

•  Awareness of competence and confidence was found to be significantly different when Round One was compared to Round Two (p=.000) and when Round Three was compared to Round Two (p=.000). No significant difference in scores occurred when Round Three was compared to Round One (p=1.000).

•  Awareness of skills and knowledge needed was found to be significantly different when Round One was compared to Round Two (p=.000) and when Round Three was compared to Round Two (p=.000). No significant difference in mean scores occurred when Round Three was compared to Round One (p=1.000).

The dominance of the practicum and the reality of the associated real-world teaching context clearly reveal the impact on how PCK evolves.
Interviews revealed a range of possible explanations, including: “Teaching my classmates at UBC was so much easier than teaching my practicum students;” “I wanted to teach one way and my SA wanted me to teach her way;” and “Some of the SAs were really controlling and did not want to give over control of their class. It is hard to work on your teaching when you are stuck behind a desk marking or photocopying.” The following component experienced significant change between Rounds One (T1) and Two (T2) and between Rounds Two (T2) and Three (T3), but did not experience significant change between Rounds One (T1) and Three (T3):

•  Awareness of the relevance of science teaching strategies mean scores were significantly different when Round One was compared to Round Two (p=.000) and when Round Three was compared to Round Two (p=.000). No significant difference in mean score occurred when Round Three was compared to Round One (p=1.000).

The practicum experience had a statistically significant impact on the teacher candidates, as some reported opportunities for intensive science teaching while others reported an absence of science teaching on their practicum. The following component experienced significant change between all three Rounds (T1, T2, T3):

•  Awareness of self-competence was found to be significantly different when Round One was compared to Round Two (p=.000), when Round Two was compared to Round Three (p=.000), and when Round One was compared to Round Three (p=.000).

A simple explanation for this evolution was the teacher candidates’ success in their courses and practicum: in essence, they passed.
The following components did not show significant change between any of the three Rounds (T1, T2, T3):

•  Awareness of teaching strategies was found to have no significant differences when Round One was compared to Round Two (p=1.000), when Round Two was compared to Round Three (p=1.000), and when Round One was compared to Round Three (p=.061).

•  Awareness of pedagogy was found to have no significant differences in mean scores when Round Two was compared to Round Three (p=.190) and when Round Two was compared to Round One (p=.190). No significant difference occurred when Round Three was compared to Round One (p=.210).

During the interviews, many of the teacher candidates expressed that they were already competent teachers, and this belief was reinforced through their success in the practicum.

Recommendations for Further Research

This study of teacher candidates’ PCK and programme experiences generates many questions and raises many issues for discussion. Some questions may find answers within the study; others may require further research. One possible question for further study focuses on how the collective elementary teacher candidates’ pedagogical and science content knowledge evolves as they participate in the science teaching methods course and extended practicum. Although this study touched upon this subject, further studies are required for a more extensive exploration. Another possible area for further research involves identifying individual teacher candidates and tracking their experiences as they progress through the programme and longitudinally through post-BEd professional practice. This could be one way to document and assess the effect of the teacher education programme at an individual level. In the discussion presented earlier, I investigated the level of PCK in teacher candidates upon entry into the teacher education programme, immediately after completing their science methods course, and after completing their extended school practicum.
This study did not investigate how additional BEd courses may affect the teacher candidates’ PCK. Hence, one possible area for further research would be to measure the impact of other BEd courses on the teacher candidates’ PCK. Researching the teacher candidates’ experiences in all of their courses may also provide additional information regarding their levels of PCK and its evolution. This study is not considered conclusive regarding how teacher candidates’ PCK is affected as they participate in the science teaching methods course and extended practicum. It was conducted with one cohort, in one year, at one university; a study extending to multiple years and multiple Teacher Education Programmes at other universities would provide additional insight into how teacher candidates’ PCK is affected as they participate in the science teaching methods course and practicum. A study of teacher candidates’ PCK in the Secondary (science) Teacher Education Programme, as they participate in their science teaching methods course and practicum, also warrants investigation. This may provide insights into how secondary science teacher candidates’ PCK is affected by the methods course and practicum. An exploration of the role that content knowledge may play in PCK for students enrolled in the Secondary Science Teacher Education Programme who hold a Bachelor of Science degree (BSc) would also warrant investigation. Also, studying teachers in their first year of full-time employment would provide another perspective on their evolving PCK and the real-world context of the classroom. A more nuanced study of subject-specific science methods could be completed to determine effects on PCK. For instance, would an Environmental Science methods course be more likely to affect PCK than a Physics methods course? Which is more influential on elementary teachers’ PCK: a biology or a chemistry methods course?
At the same time, can one science course for elementary teacher candidates, in Jill’s words, be “sciencey” enough to make a difference in how these teachers will teach? Similarly, studies of technological pedagogical content knowledge (TPCK or TPACK) (Koehler & Mishra, 2005; Mishra & Koehler, 2006) exist, but researchers have yet to address how TPACK develops in concert with science knowledge in elementary teacher education. Archambault and Crippen (2009) offer an instrument that can be adapted and integrated with the Elementary Science PCK instrument I developed and used in this study. Finally, it is crucial that researchers attend to performance measures of PCK or TPACK to address the limitations of self-efficacy studies. Instead of measures of awareness, attitudes, or interest in science knowledge, research is needed on performance measures of teacher candidates’ scientific, digital, or technological literacies. It is hoped that this research into how PCK evolved as teacher candidates participated in their science teaching methods course and practicum will contribute to an understanding of the ways teacher candidates develop as teachers. This study may also assist administrators and instructors in teacher education programmes in developing and implementing courses and practicum experiences that enhance teacher candidates’ teaching and that support, inform, and inspire them to be effective teachers.

Conclusions

Upon entry into the Teacher Education Programme at a university in Western Canada, the teacher candidates possessed an understanding of teaching developed from their experiences as students in a classroom and also based on their ability to meet the programme’s admission requirements. The experience of working with experts and the mix of theory and practice in the science methods course brought about a broadening of their PCK.
It is clear that the science methods course influenced the teacher candidates’ knowledge of two PCK components: awareness of the nature of science and awareness of pedagogy. The extended practicum brought about many significant changes in the teacher candidates’ PCK and affected their awareness of the relevance of science teaching strategies. Through interaction with classroom teachers in the real-world teaching context of the classroom, the teacher candidates developed an awareness of their competence. Changes occurred in the teacher candidates’ levels of confidence and in their awareness of teaching skills and strategies.

References

Aarts, H., & Dijksterhuis, A. (2000). Habits as knowledge structures: Automaticity in goal-directed behavior. Journal of Personality and Social Psychology, 78, 53-63.

Aarts, H., & Dijksterhuis, A. (2002). Category activation effects in judgment and behavior: The moderating role of perceived comparability. British Journal of Social Psychology, 41, 123-138.

Abell, S. K. (2008). Twenty years later: Does pedagogical content knowledge remain a useful idea? International Journal of Science Education, 30(10), 1405-1416.

Abell, S. K., & Smith, D. (1994). What is science? Preservice elementary teachers’ conceptions of the nature of science. International Journal of Science Education, 16(4), 475-487.

Adamy, P. (1999). An analysis of factors that influence technology integration by math teacher educators. Unpublished doctoral dissertation, University of Virginia, Charlottesville, VA.

Ahlquist, R. (1992). Manifestations of inequality: Overcoming resistance in a multicultural foundations course. In C. A. Grant (Ed.), Research and multicultural education: From margins to the mainstream (pp. 89-105). Washington, DC: Falmer.

Åkerlind, G. S. (2005a). Variation and commonality in phenomenographic research methods. Higher Education Research and Development, 24(4), 321–334.

Åkerlind, G.
S. (2005b). Learning about phenomenography: Interviewing, data analysis and the qualitative research paradigm. In J. Bowden & P. Green (Eds.), Doing developmental phenomenography (Qualitative Research Methods Series, pp. 56–62). Melbourne, Victoria: RMIT University Press.

Akerson, V. (2005). How do elementary teachers compensate for incomplete science content knowledge? Research in Science Education, 35, 245-268.

Alexander, P. A., & Murphy, P. K. (1999). Learner profiles: Valuing individual differences within classroom communities. In P. L. Ackerman, P. C. Kyllonen, & R. D. Roberts (Eds.), Learning and individual differences: Processes, traits, and content determinants (pp. 413-431). Washington, DC: American Psychological Association.

Alonzo, A. C. (2002). Evaluation of a model for supporting the development of elementary school teachers’ science content knowledge. Proceedings of the Annual International Conference of the Association for the Education of Teachers in Science, Charlotte, NC.

Amarel, M., & Feiman-Nemser, S. (1988, April). Prospective teachers’ views of teaching and learning to teach. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA.

Anastasi, A., & Urbina, S. (1997). Psychological testing. Upper Saddle River, NJ: Prentice-Hall.

Anderson, J. R. (1995). Cognitive psychology and its implications. New York, NY: Freeman and Company.

Anderson, R. D., & Mitchener, C. P. (1994). Research on science teacher education. In D. L. Gabel (Ed.), Handbook of research on science teaching and learning (pp. 3–44). New York, NY: Macmillan.

Applegate, J. H., & Lasley, T. J. (1982). Cooperating teachers’ problems with preservice field experience students. Journal of Teacher Education, 33(2), 15-18.

Appleton, K. (2003). How do beginning primary school teachers cope with science? Toward an understanding of science teaching practice. Research in Science Education, 33, 1–25.

Appleton, K. (2006).
Science pedagogical content knowledge and elementary school teachers. In K. Appleton (Ed.), Elementary science teacher education: International perspectives on contemporary issues and practice (pp. 31–54). Mahwah, NJ: Lawrence Erlbaum in association with the Association for Science Teacher Education.

Ashworth, P., & Lucas, U. (1998). What is the “world” of phenomenography? Scandinavian Journal of Educational Research, 42, 415–431.

Attewell, P., & Rule, J. B. (1991). Survey and other methodologies applied to IT impact research: Experiences from a comparative study of business computing. In The information systems research challenge: Survey research methods, 3, 299-315. Boston, MA: Harvard Business School Press.

Baird, J. R. (1993). Collaborative reflection, systematic inquiry, better teaching. In T. Russell & H. Munby (Eds.), Teachers and teaching: From classroom to reflection (pp. 33-48). New York, NY: Falmer Press.

Baker, R. (1994). Teaching science in primary schools: What knowledge do teachers need? Research in Science Education, 24, 31-40.

Ball, D. L. (1990). The mathematical understandings that preservice teachers bring to teacher education. Elementary School Journal, 90(4), 449–466.

Ball, D., & Bass, H. (2000). Interweaving content and pedagogy in teaching and learning to teach: Knowing and using mathematics. In J. Boaler (Ed.), Multiple perspectives on the teaching and learning of mathematics (pp. 83-104). Westport, CT: Ablex.

Ball, D. L., & McDiarmid, G. W. (1990). The subject-matter preparation of teachers. In W. R. Houston (Ed.), Handbook of research on teacher education (pp. 437–465). New York, NY: Macmillan.

Bandura, A. (1973). Aggression: A social learning analysis. Englewood Cliffs, NJ: Prentice-Hall.

Bandura, A. (1997). Self-efficacy: The exercise of control.
New York, NY: W. H. Freeman.

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice-Hall.

Barab, S. A., MaKinster, J. G., Moore, J. A., Cunningham, D. J., & The ILF Design Team. (2001). Designing and building an on-line community: The struggle to support sociability in the Inquiry Learning Forum. Educational Technology Research and Development, 49(4), 71–96.

Bargh, J. A., & Ferguson, M. J. (2000). Beyond behaviorism: The automaticity of higher mental processes. Psychological Bulletin, 126, 925-945.

Barnes, L. B., Christensen, C. R., & Hansen, A. J. (Eds.). (1994). Teaching and the case method. Boston, MA: Harvard Business School Press.

Bassey, M. (1999). Case study research in educational settings. Buckingham, England: Open University Press.

Baxter, J. A., & Lederman, N. G. (1999). Assessment and measurement of pedagogical content knowledge. In J. Gess-Newsome & N. G. Lederman (Eds.), Examining pedagogical content knowledge: PCK and science education (pp. 147-161). Dordrecht, The Netherlands: Kluwer Academic Publishers.

Beisser, S. R. (1999, March). Infusing technology in elementary social studies methods. Paper presented at the annual meeting of the Society for Information Technology and Teacher Education, San Antonio, TX. (ERIC Document Reproduction Service No. ED 432 294)

Bell, J., Veal, W. R., & Tippins, D. J. (1998, April). The evolution of pedagogical content knowledge in prospective secondary physics teachers. Paper presented at the annual meeting of the National Association for Research in Science Teaching, San Diego, CA.

Beri, G. C. (2007). Marketing research. New York, NY: McGraw-Hill Education.

Berliner, D. (1989). Implications of studies of expertise in pedagogy for teacher education and evaluation. In New directions for teacher assessment: Proceedings of the 1988 Educational Testing Service Invitational Conference (pp. 39-68). Princeton, NJ: Educational Testing Service.

Beyer, L., & Zeichner, K.
(1982). Teacher training and educational foundations: A plea for discontent. Journal of Teacher Education, 33, 18-23.

Biggs, J. (1979). Individual differences in study processes and the quality of learning outcomes. Higher Education, 8(4), 381-394.

Blanton, P. (2003). Constructing knowledge. The Physics Teacher, 41(2), 125-126.

Blosser, P. E., & Howe, R. W. (1969). An analysis of research on elementary teacher education related to teaching of science. Science and Children, 6(5), 50-60.

Bodner, G. M. (1986). Constructivism: A theory of knowledge. Journal of Chemical Education, 63(10), 873–878.

Book, C., Byers, J., & Freeman, D. (1983). Student expectations and teacher education traditions with which we can and cannot live. Journal of Teacher Education, 34(1), 9-13.

Booth, S. A. (1997). On phenomenography, learning and teaching. Higher Education Research and Development, 16(2), 135-159.

Borko, H., Livingston, C., McCaleb, J., & Mauro, L. (1988). Student teachers’ planning and post-lesson reflections: Patterns and implications for teacher education. In J. Calderhead (Ed.), Teachers’ professional learning (pp. 65-83). New York, NY: Falmer.

Bowden, J. (2000). Experience of phenomenographic research: A personal account. In J. A. Bowden & E. Walsh (Eds.), Phenomenography (pp. 47-61). Melbourne, Australia: RMIT Publishing.

British Columbia Ministry of Education. (1995a). Integrated resource package for elementary science: Learning resources K-7. Victoria, British Columbia: Author.

Britzman, D. (1986). Cultural myths in the making of a teacher: Biography and social structure in teacher education. Harvard Educational Review, 56(4), 442-456.

Brooks, S. D. (1907). Preparation of high school teachers. National Education Association Journal of Proceedings and Addresses (pp. 547–551). Washington, DC: NEA.

Bruce, C. (1992). The distinctiveness of a phenomenographic interview: Notes for the phenomenography interest group. Unpublished manuscript.

Bruner, J. (1960).
The process of education. Cambridge, MA: Harvard University Press.

Bruner, J. (1996). The culture of education. Cambridge, MA: Harvard University Press.

Bryan, L., & Abell, S. K. (1999). Development of professional knowledge in learning to teach elementary science. Journal of Research in Science Teaching, 36(2), 121-139.

Bullough, R. V., Jr. (2001). Pedagogical content knowledge circa 1907 and 1987: A study in the history of an idea. Teaching and Teacher Education, 17(6), 655-666.

Butts, D. P., Koballa, T. R., & Elliot, T. D. (1997). Does participating in an undergraduate elementary science methods course make a difference? Journal of Elementary Science Education, 9(2), 1-17.

Calderhead, J. (1988). Learning from introductory school experience. Journal of Education for Teaching, 14(1), 75-83.

Calderhead, J. (1989). Reflective teaching and teacher education. Teaching and Teacher Education, 5(1), 43-51.

Calderhead, J., & Robson, M. (1991). Images of teaching: Student teachers’ early conceptions of classroom practice. Teaching and Teacher Education, 7, 1-8.

Callero, P. L. (1992). The meaning of self-in-role: A modified measure of role identity. Social Forces, 71(2), 485-501.

Cano, J., & Garton, B. L. (1994). The relationship between agriculture preservice teachers’ learning styles and performance in a methods of teaching agriculture course. Journal of Agricultural Education, 35(2), 6-10.

Caracelli, V., & Greene, J. (1997). Crafting mixed-method evaluation designs. In J. C. Greene & V. J. Caracelli (Eds.), Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms (pp. 19-32). San Francisco, CA: Jossey-Bass.

Carlsen, W. S. (1992). Closing down the conversation: Discouraging student talk on unfamiliar science content. Journal of Classroom Interaction, 27(2), 15-21.

Carter, K. (1990). Teachers’ knowledge and learning to teach.
In W. R. Houston & M. H. J. Sikula (Eds.), Handbook of research on teacher education (pp. 291–310). New York, NY: Macmillan.

Cattell, R. B. (1966). The scree test for the number of factors. Multivariate Behavioral Research, 1(2), 245-276.

Chai, C. S., Koh, J. H. L., & Tsai, C.-C. (2010). Facilitating preservice teachers’ development of technological, pedagogical, and content knowledge (TPACK). Educational Technology & Society, 13(4), 63–73.

Chan, K. (1998, April). A case study of physicists’ conceptions about the theory of evolution. Paper presented at the annual meeting of the National Association for Research in Science Teaching, San Diego, CA.

Clift, R., Houston, R., & Pugach, M. (Eds.). (1990). Encouraging reflective practice in education: An analysis of issues and programs. New York, NY: Teachers College Press.

Cochran, K. F., DeRuiter, J. A., & King, R. A. (1993). Pedagogical content knowledge: An integrative model for teacher preparation. Journal of Teacher Education, 44, 263-272.

Collins, C. (2004). Envisaging a new education studies major: What are the core educational knowledges to be addressed in pre-service teacher education? Asia-Pacific Journal of Teacher Education, 32(3), 227-240.

Creswell, J. W. (1994). Research design: Qualitative and quantitative approaches. Thousand Oaks, CA: Sage Publications.

Creswell, J. W. (2002). Educational research: Planning, conducting, and evaluating quantitative and qualitative research. Upper Saddle River, NJ: Pearson Education.

Creswell, J. W. (2003). Research design: Qualitative, quantitative, and mixed methods approaches (2nd ed.). Thousand Oaks, CA: Sage.

Creswell, J. W., & Plano Clark, V. L. (2007). Designing and conducting mixed methods research. Thousand Oaks, CA: Sage Publications.

Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioural research (pp. 209-240).
Thousand Oaks, CA: Sage.

Crisp, R. D. (1958). Research Study, 35. New York, NY: American Management Association.

Czerniak, C., & Chiarelott, L. (1990). Teacher education for effective science instruction: A social cognitive perspective. Journal of Teacher Education, 41, 49-58.

Dahlgren, L. (1997). Learning conceptions and outcomes. In F. Marton, D. Hounsell, & N. Entwistle (Eds.), The experience of learning: Implications for teaching and studying in higher education (2nd ed., pp. 23-38). Edinburgh, Scotland: Scottish Academic Press.

Darling-Hammond, L. (2000). How teacher education matters. Journal of Teacher Education, 51(3), 166–173.

Dass, P. M. (1998). Professional development of science teachers: Results of using the Iowa Chautauqua model in Collier County, Florida. Paper presented at the annual conference of the National Association for Research in Science Teaching, San Diego, CA.

Davis, E. A. (2004). Knowledge integration in science teaching: Analyzing teachers’ knowledge development. Research in Science Education, 34, 21–53.

Davis, E., & Petish, D. (2005). Real-world applications and instructional representations among prospective elementary science teachers. Journal of Science Teacher Education, 16(4), 263-286.

DeCoster, J. (2004). Meta-analysis. In K. Kempf-Leonard (Ed.), The encyclopedia of social measurement. San Diego, CA: Academic Press.

Deeds, J. P. (1993). A national study of student teaching requirements in agricultural education. Proceedings of the 20th National Agricultural Education Research Meeting, 20, 219-225.

Deeds, J. P., Flowers, J., & Arrington, L. A. (1991). Cooperating teacher attitudes and opinions regarding agricultural education student teaching expectations and policies. Journal of Agricultural Education, 32(2), 2-9.

Denzin, N. K., & Lincoln, Y. S. (Eds.). (1998). Collecting and interpreting qualitative materials. Thousand Oaks, CA: Sage.

DeTure, L. R., Gregory, E., & Ramsey, B. G. (1990).
The science preparation of elementary teachers. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Atlanta, GA.

Dewey, J. (1902/1983). The child and the curriculum. In J. A. Boydston (Ed.), John Dewey: The middle works, 1899–1924, Vol. 2: 1902–1903. Carbondale, IL: Southern Illinois University Press.

Driver, R. (1983). The pupil as scientist. Milton Keynes, UK: Open University Press.

Edwards, A. L., & Kilpatrick, F. P. (1948). A technique for the construction of attitude scales. Journal of Applied Psychology, 32, 374-384.

Eltis, K., & Cairns, L. (1982). Perceived effectiveness of supervisors of the practicum. Australian Journal of Teaching Practice, 2(2), 101-109.

Entwistle, N. (1997). Introduction: Phenomenography in higher education. Higher Education Research and Development, 16(2), 127-134.

Erickson, F. (1986). Qualitative research methods on teaching. In M. C. Wittrock (Ed.), Handbook of research on teaching (pp. 119–161). New York, NY: Macmillan.

Feiman-Nemser, S., & Buchmann, M. (1986). The first year of teacher preparation: Transition to pedagogical thinking? Journal of Curriculum Studies, 18(3), 239-256.

Finney, S., & Orr, J. (1995). ‘I’ve really learned a lot’: Cross-cultural understanding and teacher education in a racist society. Journal of Teacher Education, 46, 327-339.

Fuller, F. (1969). Concerns of teachers: A developmental conceptualization. American Educational Research Journal, 6, 207-226.

Gallavan, N. P., & Webster-Smith, A. (2010). Navigating teachers’ reactions, responses, and reflections in self-assessment and decision-making. In E. G. Pultorak (Ed.), The purposes, practices, and professionalism of teacher reflectivity: Insights for twenty-first-century teachers and students (pp. 191-209). Lanham, MD: Rowman & Littlefield.

Geddis, A. N., Onslow, B., Beynon, C., & Oesch, J. (1993). Transforming content knowledge: Learning to teach about isotopes. Science Education, 77(6), 575–591.

Geddis, A. N.
& Roberts, D. A. (1998). As science students become science teachers: A perspective on learning orientation. Journal of Science Teacher Education, 9(4), 271–292.

Gee, J. P. (1996). Social linguistics and literacies: Ideology in discourses (2nd ed.). London, UK: Taylor & Francis.

Gess-Newsome, J. (1999). Introduction and orientation to examining pedagogical content knowledge. In J. Gess-Newsome & N. G. Lederman (Eds.), Examining pedagogical content knowledge (pp. 3–20). Dordrecht, The Netherlands: Kluwer Academic Publishers.

Glaser, B. (1992). Basics of grounded theory analysis: Emergence vs. forcing. Mill Valley, CA: Sociology Press.

Glesne, C., & Peshkin, A. (1992). Becoming qualitative researchers: An introduction. New York, NY: Longman.

Goldinger, S. D., Kleider, H. M., Azuma, T., & Beike, D. (2003). “Blaming the victim” under memory load. Psychological Science, 14(1), 81-85.

Goodlad, J. I. (1990). Teachers for our nation’s schools. San Francisco, CA: Jossey-Bass Publishers.

Goodrum, D., Hackling, M., & Rennie, L. (2001). The status and quality of teaching and learning of science in Australian schools. Canberra, Australia.

Greene, J., & Caracelli, V. J. (Eds.). (1997). Advances in mixed-method evaluation: The challenges and benefits of integrating diverse paradigms (New Directions for Evaluation, No. 74). San Francisco, CA: Jossey-Bass.

Grimmett, P. P., MacKinnon, A. M., Erickson, G. L., & Riecken, T. J. (1990). Reflective practice in teacher education. In R. T. Clift, W. R. Houston, & M. C. Pugach (Eds.), Encouraging reflective practice in education: An analysis of issues and programs (pp. 20–38). New York, NY: Teachers College Press.

Grossman, P. L. (1990). The making of a teacher: Teacher knowledge and teacher education. New York, NY: Teachers College Press.

Grossman, P. L., Wilson, S. M., & Shulman, L. (1989). Teachers of substance: Subject matter knowledge for teaching. In M. C. Reynolds (Ed.), Knowledge base for the beginning teacher (pp. 23-36).
Oxford, UK: Pergamon Press.

Guba, E., & Lincoln, Y. (1989). Fourth generation evaluation. Newbury Park, CA: Sage.

Gustafson, B. J., & Rowell, P. M. (1995). Elementary preservice teachers: Constructing conceptions about learning science, teaching science and the nature of science. International Journal of Science Education, 17(5), 589-605.

Hascher, T., Cocard, Y., & Moser, P. (2004). Forget about theory – practice is all? Student teachers’ learning in practicum. Teachers and Teaching: Theory and Practice, 10(6), 623-637.

Hamer, R. N., & van Rossum, E. J. (2010). The meaning of learning and knowing. Doctoral thesis, Utrecht University.

Hammersley, M. (1989). The dilemma of qualitative method: Herbert Blumer and the Chicago tradition. London, UK: Routledge.

Hand, B., & Peterson, R. (1995). The development, trial and evaluation of a constructivist teaching and learning approach in a preservice science teaching education program. Research in Science Education, 25, 75-88.

Handler, M., & Marshall, D. (1992). Preparing new teachers to use technology: One set of perceptions. In R. Carey, D. Carey, J. Willis, & D. Willis (Eds.), Technology and teacher education annual, 1992 (pp. 386-388). Charlottesville, VA: Association for the Advancement of Computing in Education.

Hansen, J., & Stephens, J. (2000). The ethics of learner-centered education: Dynamics that impede the process. Change, 32(5), 41–47.

Hashweh, M. Z. (1987). Effects of subject matter knowledge in the teaching of biology and physics. Teaching and Teacher Education, 3(2), 109-120.

Hashweh, M. Z. (2005). Teacher pedagogical constructions: A reconfiguration of pedagogical content knowledge. Teachers and Teaching: Theory and Practice, 11(3), 273–292.

Hasselgren, B., & Beach, D. (1997). Phenomenography: A “good for nothing brother” of phenomenology? Outline of an analysis. Higher Education Research and Development, 16(2), 191-202.

Hatton, N., & Smith, D. (1995).
Reflection in teacher education: Towards definition and implementation. Teaching and Teacher Education, 11(1), 33-49.

Hermanowicz, H. J. (1966). The pluralistic world of beginning teachers. In The real world of the beginning teacher: Report of the nineteenth national TEPS conference. Washington, DC: National Education Association.

Hinkle, D. E., Wiersma, W., & Jurs, S. G. (2003). Applied statistics for the behavioral sciences (5th ed.). New York, NY: Houghton Mifflin Company.

Hoepfl, M. C. (1997). Choosing qualitative research: A primer for technology education researchers. Journal of Technology Education, 9(1), 47-63.

Hollingsworth, S. (1989). Prior beliefs and cognitive change in learning to teach. American Educational Research Journal, 26, 160-189.

Holt-Reynolds, D. (1992). Personal history-based beliefs as relevant prior knowledge in course work. American Educational Research Journal, 29, 325-349.

Howell, D. C. (2002). Statistical methods for psychology (5th ed.). Pacific Grove, CA: Duxbury.

Howitt, C. (2007). Pre-service elementary teachers’ perceptions of factors in an holistic methods course influencing their confidence in teaching science. Research in Science Education, 37(1), 41-58.

Hoy, A. W., & Weinstein, C. S. (2006). Students’ and teachers’ knowledge and beliefs about classroom management. In C. Evertson & C. Weinstein (Eds.), Handbook of classroom management: Research, practice, and contemporary issues. Mahwah, NJ: Erlbaum.

Kelly, G. A. (1955). The psychology of personal constructs: A theory of personality (Vol. I). New York, NY: Norton.

Kennedy, H. (1997). Learning works: Widening participation in further education. Coventry, UK: The Further Education Funding Council.

Kettle, B., & Sellars, N. (1996). The development of student teachers’ practical theory of teaching. Teaching and Teacher Education, 12(1), 1-24.

Kraus, S., & Butler, K. (2000). Reflection is not description: Cultivating reflection with pre-service teachers.
Paper presented at the 52nd annual meeting of the American Association of Colleges for Teacher Education, Chicago, IL, February 26-29, 2000. (ERIC Document Reproduction Service No. ED 440 079)

Janesick, V. (1998). Stretching exercises for qualitative researchers. London, UK: Sage Publications.

Johnson, B., & Christensen, L. (2000). Educational research: Quantitative, qualitative and mixed approaches. Boston, MA: Allyn & Bacon.

Johnson, B., & Christensen, L. (2008). Educational research: Quantitative, qualitative and mixed approaches (3rd ed.). Los Angeles, CA: Sage.

Johnson, R. B., & Onwuegbuzie, A. J. (2004). Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33(7), 14-26.

Johnston, J. (1994). Education of Channel One. Ann Arbor, MI: Institute for Social Research, University of Michigan.

Jones, M. (2008). Collaborative partnerships: A model for science teacher education and professional development. Australian Journal of Teacher Education, 33(3), 61-76.

Joppe, M. (2000). The research process. http://www.ryerson.ca/~mjoppe/rp.htm

Joyce, B., & Showers, B. (1988). Student achievement through staff development: Fundamentals of school renewal. London, UK: Longman Publishing Group.

Kagan, D. M. (1992). Professional growth among pre-service and beginning teachers. Review of Educational Research, 62(2), 129-169.

Kirk, J., & Miller, M. L. (1986). Reliability and validity in qualitative research. Beverly Hills, CA: Sage Publications.

Klein, W. M., & Weinstein, N. D. (1997). Social comparison and unrealistic optimism about personal risk. In B. P. Buunk & F. X. Gibbons (Eds.), Health, coping, and well-being: Perspectives from social comparison theory (pp. 25-61). Hillsdale, NJ: Lawrence Erlbaum.

Koehler, V. (1985). Research on preservice teacher education. Journal of Teacher Education, 34(1), 23-30.

Koerner, J. D. (1963). The miseducation of American teachers.
Baltimore, MD: Penguin.

Koster, B., Korthagen, F. A. J., & Schrijnemakers, H. G. M. (1995). Between entry and exit: How student teachers change their educational values under the influence of teacher education. In F. Buffet & J. A. Tschoumy (Eds.), Choc démocratique et formation des enseignants en Europe [Democratic shock and the education of students in Europe] (pp. 56-168). Lyon, France: Presses Universitaires de Lyon.

LaBoskey, V. K. (1994). Development of reflective practice: A study of preservice teachers. New York, NY: Teachers College Press.

Lankford, S. V. (1994). Attitudes and perceptions toward tourism and rural regional development. Journal of Travel Research, 31(3), 35-43.

Learned, W. S., & Bagley, W. C. (1920). The professional preparation of teachers for American public schools: A study based upon an examination of tax-supported normal schools in the state of Missouri. New York, NY: The Carnegie Foundation for the Advancement of Teaching.

Lee, E., Brown, M., Luft, J., & Roehrig, G. (2007). Assessing beginning secondary science teachers’ PCK: Pilot year results. School Science and Mathematics, 107(2).

Lee, E., & Luft, J. (2005, April). Conceptualizing PCK from the perspective of mentor science teachers. Paper presented at the annual meeting of the National Association for Research in Science Teaching, Dallas, TX.

Lemlech, J. K., & Kaplan, S. N. (1990). Learning to talk about teaching: Collegiality in clinical teacher education. Action in Teacher Education, 12, 13-19.

Lewin, K. (1949). Action research and minority problems. Journal of Social Issues, 2, 34-46.

Lewin, K. (1951). Field theory in social science. New York, NY: Harper and Row.

Likert, R. (1932). A technique for the measurement of attitudes. Archives of Psychology, 22(140), 1-55.

Likert, R. (1967). The human organization. New York, NY: McGraw-Hill.

Lincoln, Y. S., & Guba, E. (1985). Naturalistic inquiry. Beverly Hills, CA: Sage.

Lortie, D. (1975). Schoolteacher: A sociological study.
Chicago, IL: University of Chicago Press.

Loughran, J., Berry, A., & Mulhall, P. (2006). Understanding and developing science teachers’ pedagogical content knowledge. Rotterdam, The Netherlands: Sense Publishers.

Loughran, J., Korthagen, F., & Russell, T. (2008). Teacher education that makes a difference: Developing foundational principles of practice. In C. Craig & L. Deretchin (Eds.), Imagining a renaissance in teacher education (pp. 405–423). Lanham, MD: Rowman & Littlefield.

Luckey, G. W. A. (1907). Preparation of high school teachers. National Education Association Journal of Proceedings and Addresses (pp. 587–592). Washington, DC: NEA.

Lyons, G. (1980). Why teachers can’t teach. Phi Delta Kappan, 62(2), 108-112.

MacKinnon, A. M., & Erickson, G. L. (1992). The roles of reflective practice and foundational disciplines in teacher education. In T. L. Russell & H. Munby (Eds.), Teachers and teaching: From classroom to reflection (pp. 192-210). London, UK: Falmer Press.

Magnusson, S., Krajcik, L., & Borko, H. (1999). Nature, sources and development of pedagogical content knowledge. In J. Gess-Newsome & N. G. Lederman (Eds.), Examining pedagogical content knowledge (pp. 95–132). Dordrecht, The Netherlands: Kluwer.

Marini, A., & Genereux, R. (1995). The challenge of teaching for transfer. In A. McKeough, J. Lupart, & A. Marini (Eds.), Teaching for transfer: Fostering generalization in learning (pp. 1–20). Mahwah, NJ: Lawrence Erlbaum Associates.

Marks, R. (1990). Pedagogical content knowledge: From a mathematical case to a modified conception. Journal of Teacher Education, 41, 3–11.

Martin, R. A., & Yoder, E. P. (1985). Clinical teaching analysis: A procedure for supervising teachers. Journal of the American Association of Teacher Educators in Agriculture, 26(4), 16-21, 33.

Marton, F. (1981). Phenomenography: Describing conceptions of the world around us. Instructional Science, 10, 177-200.

Marton, F. (1986).
Phenomenography: A research approach to investigating different understandings of reality. Journal of Thought, 21(3), 28-49.

Marton, F. (1994). On the structure of awareness. In J. A. Bowden & E. Walsh (Eds.), Phenomenographic research: Variations in method (pp. 176-205). Melbourne, Australia: Office of the Director EQARD, Royal Melbourne Institute of Technology.

Marton, F. (2000). The structure of awareness. In J. A. Bowden & E. Walsh (Eds.), Phenomenography. Melbourne, Australia: RMIT Publishing.

Marton, F., & Säljö, R. (1997). Approaches to learning. In F. Marton, D. Hounsell, & N. J. Entwistle (Eds.), The experience of learning: Implications for teaching and studying in higher education (pp. 39-58). Edinburgh, Scotland: Scottish Academic Press.

Mathison, S. (1988). Why triangulate? Educational Researcher, 17(2), 13-17.

McCall, A. L. (1995). Constructing conceptions of multicultural teaching: Preservice teachers’ life experiences and teacher education. Journal of Teacher Education, 46, 340-350.

McDiarmid, G. W., Ball, D. L., & Anderson, C. (1989). Why staying one chapter ahead doesn’t really work: Subject-specific pedagogy. In M. C. Reynolds (Ed.), Knowledge base for the beginning teacher (pp. 193-205). Elmsford, NY: Pergamon Press.

McEvoy, B. (1986). She is still with me: Influences of former teachers on teacher practice. Paper presented at the annual meeting of the American Educational Research Association, San Francisco, CA.

McIntyre, D., Byrd, D., & Foxx, S. (1996). Field and laboratory experiences. In J. Sikula, T. Buttery, & E. Guyton (Eds.), Handbook of research on teacher education (pp. 171–193). New York, NY: Macmillan.

Mellado, V. (1998). Preservice teachers’ classroom practice and their conceptions of the nature of science. In B. J. Fraser & K. Tobin (Eds.), International handbook of science education. Dordrecht, The Netherlands: Kluwer.

Merriam, S. B. (1998). Qualitative research and case study applications in education.
San Francisco, CA: Jossey-Bass.

Mestre, J. P. (2002). Probing adults’ conceptual understanding and transfer of learning via problem posing. Journal of Applied Developmental Psychology, 23, 9-50.

Meyer, H. (2004). Novice and expert teachers’ conceptions of learners’ prior knowledge. Science Education, 88(6), 970-983.

Miles, M., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook (2nd ed.). Thousand Oaks, CA: Sage Publications.

Monroe, W. S. (1952). Teaching-learning theory and teacher education: 1890–1950. Champaign, IL: University of Illinois Press.

Montgomery, B. (2000). The student and cooperating teacher relationship. Journal of Family and Consumer Sciences Education, 18(2), 7-15.

Moors, A., & De Houwer, J. (2006). Automaticity: A theoretical and conceptual analysis. Psychological Bulletin, 132, 297-326.

Moran, S. (1990). Schools and the beginning teacher. Phi Delta Kappan, 72, 210–213.

Morse, J. M. (1991). Approaches to qualitative-quantitative methodological triangulation. Nursing Research, 40, 120-123.

Nashon, S. (2005). Reflections from pre-service science teachers on the status of Physics 12 in British Columbia. Journal of Physics Teacher Education Online, 3(1), 25-32.

Nashon, S. M. (2006). A proposed model for planning and implementing high school physics instruction. Journal of Physics Teacher Education Online, 4(1), 6-9.

Nashon, S., & Anderson, D. (2004). Obsession with ‘g’: A metacognitive reflection of a laboratory episode. Alberta Journal of Science Education, 36(2), 39-44.

Newman, I., Ridenour, C. S., Newman, C., & DeMarco, G. M. P. (2003). A typology of research purposes and its relationship to mixed methods. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioural research (pp. 167-188). Thousand Oaks, CA: Sage.

Nezvalova, D. (2008). The quality in science education. Olomouc, Czech Republic: Palacky University.

Nieto, S. (1998).
From claiming hegemony to sharing space: Creating community in a multicultural education course. In R. Chavez Chavez & J. O'Donnell (Eds.), Speaking the unpleasant: The politics of (non)engagement in the multicultural education terrain (pp. 16-31). Albany, NY: State University of New York Press.
Noor, K. (2008). Case study: A strategic research methodology. American Journal of Applied Sciences, 5(11), 1602-1604.
Norris, R. J., Jr., Larke, A., & Briers, G. E. (1990). Selection of student teaching centers and cooperating teachers in agriculture and expectations of teacher educators regarding these components of a teacher education program: A national study. Journal of Agricultural Education, 31(1), 58-63.
Novak, L. L. (2007). Making sense of clinical practice: Order set design strategies in CPOE. AMIA Annual Symposium Proceedings, 568-572.
O'Donoghue, T., & Punch, K. (2003). Qualitative educational research in action: Doing and reflecting. London, UK: Routledge Falmer.
Oftedahl, J. (1984). Secondary English methods courses in the Midwest as viewed by methods professors and secondary English teachers. Unpublished doctoral dissertation.
Orgill, M. K. (2002). Phenomenography. Retrieved 30 October, 2011 from http://www.minds.may.ie/~dez/phenom.html
Ornek, F. (2008). An overview of a theoretical framework of phenomenography in qualitative education research: An example from physics education research. Asia-Pacific Forum on Science Learning and Teaching, 9(2), 11.
Osborne, J., & Simon, S. (1996). Primary science: Past and future directions. Studies in Science Education, 27, 99-147.
Pagano, R. R. (2004). Understanding statistics in the behavioral sciences (7th ed.). Belmont, CA: Thomson/Wadsworth.
Pajares, M. F. (1992). Teachers' beliefs and educational research: Cleaning up a messy construct. Review of Educational Research, 62(3), 307-333.
Palmer, D. H. (2006). Sources of self-efficacy in a science methods course for primary teacher education students.
Research in Science Education, 36(4), 337-353.
Pang, M. F. (2003). Two faces of variation: On continuity in the phenomenographic movement. Scandinavian Journal of Educational Research, 47(2), 145-156.
Pang, M. F., & Marton, F. (2003). Beyond "lesson study": Comparing two ways of facilitating the grasp of economic concepts. Instructional Science, 31(3), 175-194.
Parr, S. S. (1988). ?. National Education Association Journal of Proceedings and Addresses. Washington, DC: NEA.
Patton, M. (1990). Qualitative evaluation and research methods. Thousand Oaks, CA: Sage.
Payne, S. (1951). The art of asking questions. Princeton, NJ: Princeton University Press.
Pelton, T., & Pelton, F. L. (2005). Helping students learn with classroom response systems. In J. Lang (Ed.), Society for Information Technology and Teacher Education International Conference (pp. 1554-1559). Chesapeake, VA: Association for the Advancement of Computing in Education.
Phillips, D. K., & Carr, K. (2006). Becoming a teacher through action research: Context, process, and self-analysis. New York, NY: Routledge.
Piaget, J. (1972). The psychology of the child. New York, NY: Basic Books.
Piaget, J. (1990). The child's conception of the world. New York, NY: Littlefield Adams.
Pintrich, P. R., & Schunk, D. H. (1996). Motivation in education: Theory, research, and applications. Englewood Cliffs, NJ: Merrill/Prentice Hall.
Poulou, M. (2007). Student-teachers' concerns about teaching practice. European Journal of Teacher Education, 30(1), 91-110.
Ramsden, P. (1992). Learning to teach in higher education. London, UK: Routledge.
Richardson, V. (1996). The role of attitudes and beliefs in learning to teach. In J. Sikula (Ed.), Handbook of research on teacher education (2nd ed., pp. 102-119). New York, NY: Macmillan.
Roberts, J. (2006). Limits to communities of practice. Journal of Management Studies, 43(3), 623-639.
Richmond, G., Howes, E., Kurth, L., & Hazelwood, C. (1998).
Connections and critique: Feminist pedagogy and science teacher education. Journal of Research in Science Teaching, 35(8), 897-918.
Ross, D. (1987). Action research for preservice teachers: A description of why and how. Journal of Education, 6(4), 131-130.
Roth, K., Smith, E., & Anderson, C. (1983, April). Students' conceptions of photosynthesis and food for plants. Paper presented at the annual meeting of the American Educational Research Association.
Roth, K., Anderson, C., & Smith, E. (1986). Curriculum materials, teacher talk and student learning: Case studies in fifth grade science teaching (Research Series 171). Michigan State University, MI: The Institute for Research in Teaching.
Rovegno, I. (1991). A participant observation study of knowledge restructuring in a field-based elementary physical education methods course. Research Quarterly for Exercise and Sport, 62, 205-212.
Rovegno, I. (1993a). The development of curricular knowledge: A case of problematic pedagogical content knowledge during advanced knowledge acquisition. Research Quarterly for Exercise and Sport, 64, 56-68.
Rovegno, I. (1993b). Content knowledge acquisition during undergraduate teacher education: Overcoming cultural templates and learning through practice. American Educational Research Journal, 30, 611-642.
Rovegno, I. (1994). Teaching within a curricular zone of safety: School culture and the situated nature of student teachers' pedagogical content knowledge. Research Quarterly for Exercise and Sport, 65, 269-279.
Rovegno, I. (1995). Theoretical perspectives on knowledge and learning and a student teacher's pedagogical content knowledge of dividing and sequencing subject matter. Journal of Teaching in Physical Education, 14, 284-304.
Rovegno, I., & Bandbauer, D. (1994). Child-designed games: Experience changes teachers' conceptions. Journal of Physical Education, Recreation and Dance, 65(6), 60-63.
Russell, T., Munby, H., Spafford, C., & Johnston, P. (1988).
Learning the professional knowledge of teaching: Metaphors, puzzles, and the theory-practice relationship. In P. Grimmett & G. L. Erickson (Eds.), Reflection in teacher education (pp. 67-90). New York, NY: Teachers College Press.
Sandberg, J. (1997). Are phenomenographic results reliable? Higher Education Research and Development, 16(2), 203-212.
Sandberg, J. (2005). How do we justify knowledge produced within interpretive approaches? Organizational Research Methods, 8(1), 41-68.
Sandelowski, M. (1986). The problem of rigor in qualitative research. Advances in Nursing Science, 8(3), 27-37.
Sanders, L. R., Borko, H., & Lockard, J. D. (1993). Secondary science teachers' knowledge base when teaching science courses in and out of their area of certification. Journal of Research in Science Teaching, 30, 723-736.
Schmidt, D., Baran, E., Thompson, A., Koehler, M. J., Shin, T., & Mishra, P. (2009, April). Technological pedagogical content knowledge (TPACK): The development and validation of an assessment instrument for preservice teachers. Paper presented at the 2009 Annual Meeting of the American Educational Research Association, April 13-17, San Diego, CA.
Schön, D. (1987). Educating the reflective practitioner. San Francisco, CA: Jossey-Bass.
Schoon, K. J. (1995). The origin and extent of alternative conceptions in the earth and space sciences: A survey of pre-service elementary teachers. Journal of Elementary Science Education, 7(2), 27-46.
Schoon, K. J., & Boone, W. J. (1998). Self-efficacy and alternative conceptions of science of preservice elementary teachers. Science Education, 82(5), 553-568.
Shymanski, J. A., & Green, D. W. (1982). Valuing science content: Science is a basic we all can do. In B. W. Benson (Ed.), Teaching children science: Changing adversity into advocacy. Columbus, OH: Environmental Education.
Sechrest, L., & Sidani, S. (1995). Quantitative and qualitative methods: Is there an alternative? Evaluation and Program Planning, 18, 77-87.
Settlage, J., & Southerland, S. A. (2007). Teaching science to every child: Using culture as a starting point. New York, NY: Routledge.
Sherin, M. (2002). When teaching becomes learning. Cognition and Instruction, 20, 119-150.
Shrigley, R. (1974). Correlation of science attitudes and science knowledge of preservice elementary teachers. Science Education, 58, 142-151.
Shulman, L. S. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14.
Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1-22.
Shulman, L. (1992, September-October). Ways of seeing, ways of knowing, ways of teaching, ways of learning about teaching. Journal of Curriculum Studies, 28, 393-396.
Siniscalco, M. T., & Auriat, N. (2005). Quantitative research methods in educational planning: Questionnaire design [Online]. Available at: http://www.iiep.unesco.org/fileadmin/user_upload/Cap_Dev_Training/Training_Materials/Quality/Qu_Mod8.pdf [Accessed: 11 November 2011].
Sjostrom, B., & Dahlgren, L. O. (2002). Nursing theory and concept development or analysis: Applying phenomenography in nursing research. Journal of Advanced Nursing, 40(3), 339-345.
Skamp, K. (1989). General science knowledge and attitudes towards science teaching of pre-service primary teachers: Implications for pre-service units. Research in Science Education, 19, 257-267.
Sloane, P. (2007). The innovative leader: How to inspire your team and drive creativity. New York, NY: Kogan Page Publishers.
Smith, D. C. (1999). Changing our teaching: The role of pedagogical content knowledge in elementary science. In J. Gess-Newsome & N. G. Lederman (Eds.), Examining pedagogical content knowledge: The construct and its implications for science education (pp. 163-197). Dordrecht, The Netherlands: Kluwer Academic Publishers.
Smith, D., & Neale, D. (1989). The construction of subject matter knowledge in primary science teaching.
Teaching and Teacher Education, 5(1), 1-20.
Sonnemann, U. (1954). Existence and therapy: An introduction to phenomenological psychology and existential analysis. New York, NY: Grune and Stratton.
Stake, R. E. (1995). The art of case study research. Thousand Oaks, CA: Sage.
Stalheim-Smith, A., & Scharmann, L. C. (1996). General biology: Creating a positive learning environment for elementary education majors. Journal of Science Teacher Education, 7(3), 169-178.
Starko, A. J. (2002). [Online]. Available at: http://www.educationreport.org/pubs/mer/article.aspx?id=4376
Statistics Canada. (2010). Survey methods and practices (Statistics Canada Catalogue no. 12-587-X). Ottawa, Canada.
Staver, J. R. (2007). Teaching science. In H. J. Walberg (Ed.), Educational practices series 17 (pp. 1-28). Brussels, Belgium: International Academy of Education (IAE).
Steele-Johnson, D., Beauregard, R. S., Hoover, P. B., & Schmidt, A. M. (2000). Goal orientation and task demand effects on motivation, affect, and performance. Journal of Applied Psychology, 85, 724-738.
Stefanich, G. P., & Kelsey, K. W. (1989). Improving science attitudes of preservice elementary teachers. Science Education, 73(2), 187-194.
Stephanou, A. (1999). The measurement of conceptual understanding in physics. Conference proceedings, 8th European Conference for Learning and Teaching, August 24-28, 1999, Goteborg, Sweden.
Stevens, C., & Wenner, G. (1996). Elementary preservice teachers' knowledge and beliefs regarding science and mathematics. School Science and Mathematics, 96(1), 2-9.
Stoddart, T., Connell, M., Stofflett, R., & Peck, D. (1993). Reconstructing elementary teacher candidates' understanding of mathematics and science content. Teaching and Teacher Education, 9(3), 229-241.
Strauss, A. L., & Corbin, J. (1998). Basics of qualitative research: Techniques and procedures for developing grounded theory (2nd ed.). Thousand Oaks, CA: Sage.
Sugrue, C. (1997). Complexities of teaching: Child-centred perspectives. London, UK: Falmer Press.
Sweller, J.
(1989). Cognitive technology: Some procedures for facilitating learning and problem solving in mathematics and science. Journal of Educational Psychology, 81, 457-466.
Sweller, J. (1990). On the limited evidence for the effectiveness of teaching general problem-solving strategies. Journal for Research in Mathematics Education, 21(5), 411-415.
Sweller, J., van Merrienboer, J., & Paas, F. (1998). Cognitive architecture and instructional design. Educational Psychology Review, 10, 251-296.
Tashakkori, A., & Teddlie, C. (Eds.). (2003). Handbook of mixed methods in social and behavioural research. Thousand Oaks, CA: Sage.
Thomas, G., & Hult, M. (2010). Market-focused sustainability: Market orientation plus! Journal of the Academy of Marketing Science, 39(1), 1-6.
Tilgner, P. (1990). Avoiding science in the elementary school. Science Education, 74, 421-431.
Tobin, K., & Garnett, P. (1988). Exemplary practice in science classrooms. Science Education, 72, 197-208.
Tobin, K., Tippins, D. J., & Gallard, A. J. (1994). Research on instructional strategies for teaching science. In D. L. Gabel (Ed.), Handbook of research on science teaching and learning (pp. 45-93). New York, NY: Macmillan.
Trumbull, D. (1999). The new science teacher. New York, NY: Teachers College Press.
Trundle, K. C., Atwood, R. K., & Christopher, J. E. (2007). A longitudinal study of conceptual change: Preservice elementary teachers' conceptions of moon phases. Journal of Research in Science Teaching, 44(2), 303-326.
Ulmer, J. T., & Wilson, M. S. (2003). The potential contributions of quantitative research to symbolic interactionism. Symbolic Interaction, 26(4), 531-552.
Valli, L. (1992). Reflective teacher education: Cases and critiques. Albany, NY: State University of New York Press.
van Driel, J., Verloop, N., & de Vos, W. (1998). Developing science teachers' pedagogical content knowledge. Journal of Research in Science Teaching, 35, 673-695.
van Driel, J. H., Beijaard, D., & Verloop, N. (2001).
Professional development and reform in science education: The role of teachers' practical knowledge. Journal of Research in Science Teaching, 38(2), 137-158.
van Driel, J., de Jong, O., & Verloop, N. (2002). The development of preservice chemistry teachers' pedagogical content knowledge. Science Education, 86(4), 572-590.
Veal, W. R. (1998). The evolution of pedagogical content knowledge in prospective secondary chemistry teachers. Paper presented at the Annual Meeting of the National Association of Research in Science Teaching, San Diego, CA.
Veal, W. R., & MaKinster, J. G. (1999). Pedagogical content knowledge taxonomies. Electronic Journal of Science Education, 3(4) [Online]. Available at: http://wolfweb.unr.edu/homepage/crowther/ejse/ejsev3n4.html [Accessed: 11 November 2011].
Veenman, S. (1984). Perceived problems of beginning teachers. Review of Educational Research, 54(2), 143-178.
Victor, E. (1962). Why are our elementary school teachers reluctant to teach science? Science Education, 46, 185-192.
Vygotsky, L. S. (1978). Mind in society. Cambridge, MA: Harvard University Press.
Vygotsky, L. (1986). Thought and language. Boston, MA: MIT Press.
Walker, M. B. (1992). The psychology of gambling. New York, NY: Pergamon.
Walsh, E. (2000). Phenomenographic analysis of interview transcripts. In J. A. Bowden & E. Walsh (Eds.), Phenomenography (pp. 19-33). Melbourne, Australia: RMIT Publishing.
Wenner, G. J. (1993). Relationship between science knowledge levels and beliefs toward science instruction held by preservice elementary teachers. Journal of Science Education and Technology, 2, 461-468.
Westerback, M. (1982). Studies on anxiety about teaching science in preservice elementary teachers. Journal of Research in Science Teaching, 21, 937-950.
Weinstein, C. (1988). Preservice teachers' expectations about the first year of teaching. Teaching and Teacher Education, 4, 31-40.
Weinstein, N. D. (1982). Unrealistic optimism about susceptibility to health problems.
Journal of Behavioral Medicine, 5, 441-460.
Wetzel, K. (1993). Teacher educators' uses of computers in teaching. Journal of Technology in Teacher Education, 1(4), 335-352.
Wheatley, T. P., & Wegner, D. M. (2001). Automaticity in action. In N. J. Smelser & P. B. Baltes (Eds.), International encyclopaedia of the social and behavioral sciences (pp. 991-993). London, UK: Pergamon.
Whitehead, A. N. (1961). The aims of education. New York, NY: American Library/Mentor Books.
Wilson, S. M., & Wineburg, S. S. (1988). Peering at history through different lenses: The role of the disciplinary perspectives in teaching history. Harvard Educational Review, 89(4), 527-539.
Winter, G. (2000). A comparative discussion of the notion of validity in qualitative and quantitative research. The Qualitative Report, 4.
Yin, R. K. (2003b). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.
Zeichner, K., & Gore, J. (1990). Teacher socialization. In W. R. Houston (Ed.), Handbook of research on teacher education (pp. 329-348). New York, NY: Macmillan.
Zeichner, K., Tabachnick, B., & Densmore, K. (1987). Individual, institutional and cultural influences on the development of teachers' craft knowledge. In J. Calderhead (Ed.), Exploring teachers' thinking (pp. 21-59). London, UK: Cassell.
Zeichner, K., & Tabachnick, B. R. (1981). Are the effects of university teacher education washed out by school experience? Journal of Teacher Education, 32, 7-11. Reprinted in the Education Digest, 40-43.
Zembal-Saul, C., Starr, M., & Krajcik, J. S. (1999). Constructing a framework for elementary science teaching using pedagogical content knowledge. In J. Gess-Newsome & N. G. Lederman (Eds.), Examining pedagogical content knowledge (pp. 237-256). Dordrecht, The Netherlands: Kluwer Academic Publishers.
Zembal-Saul, C., Krajcik, J., & Blumenfeld, P. (2002). Elementary student teachers' science content representations. Journal of Research in Science Teaching, 39, 443-463.
Appendices

Appendix A: Email Letter Consent to Participate

Faculty of Education
2125 Main Mall
Vancouver, B.C. Canada V6T 1Z4

Consent to Participate
One-Year Elementary Teacher Candidates' Pedagogical Content Knowledge

Principal Investigator: Dr. Samson Madera Nashon, Faculty of Education, Department of Curriculum and Pedagogy, University of British Columbia.
Co-Investigator: Douglas Adler, Graduate Student, Department of Curriculum and Pedagogy, University of British Columbia.

You have been invited to participate in a questionnaire because you are an Elementary Teacher Candidate in the One-Year Teacher Education Programme. This questionnaire is part of a study aimed at examining Elementary Teacher Candidates' pedagogical content knowledge (PCK). If you have any questions please just e-mail the Principal Investigator.

If, for ANY reason, you do not want to take part in this study, that's fine; you don't have to. It is up to you if you want to take part or not. You are also free to withdraw at any time without having to give any reason. If you drop out you will not experience ANY negative consequences at all.

This questionnaire does not ask for personal identifiers or any information that may be used to identify you. The Internet server in the Department of Curriculum and Pedagogy (EDCP) in the Faculty of Education hosts the online survey. The EDCP server does record incoming Internet Protocol (IP) addresses of the computer that you use to access the survey. The IP addresses are analyzed only in aggregate; no connection is made between your data and your computer's IP address. Your answers will be kept confidential, so no one will know how you answered the questions except you. All completed surveys will be kept on a secure server at UBC. Your questionnaire will not be made available to anyone other than the researchers involved in this research. The data collected from this questionnaire will be published in a graduate thesis.
There will be no names and identifiers linking the data to individuals and IP addresses. There are no known risks associated with participation in this study.

If you have any questions or concerns about what is involved please contact Dr. Samson Nashon or Douglas Adler by email or phone (see contact details at the top of this page). Alternatively, if you have any concerns about your rights or treatment as a research subject please contact the 'Research Subject Information Line' in the UBC Office of Research Services at (604) 822-8598 or if long distance email to RSIL@ors.ubc.ca.

Please read the instructions carefully; it should take you about 10 minutes to complete this questionnaire. The questionnaire address is: http://www.xxxxxx.ca (disguised) Password: PCK

By completing and submitting a questionnaire you are consenting to the conditions outlined in this consent to participate. Please keep or print a copy of this consent for your records. Thank you for your help.

Appendix B: Questionnaire

Appendix C: Interview Questions

After Science Methods Course

What is it about your knowledge of teaching that might have earned you admission into the programme?
What is it about you and science that might have earned you admission into the programme?
Tell me about your experiences you might have had in teaching.
What do you think was the most important asset in your application to the teacher education programme?
Tell me about your experience in elementary and high school science.
What activities, including job related or extra-curricular, do you believe have contributed to your knowledge of science?

After Practicum

How has your knowledge of teaching and science content changed after taking a science methods course?
Tell me about your science methods course.
In what ways did the methods course experience affect the way you view teaching science?
Describe how your approach to teaching science has changed after the science methods course.
What is it about knowledge of teaching and science knowledge that might have been affected due to your practicum experience?
Tell me about your practicum experience.
In what ways did the practicum experience affect the way you now view teaching science?
Describe how your approach to teaching science has changed after the practicum experience.
