
UBC Theses and Dissertations



Investigating teachers’ perceptions of the usefulness of Portfolio-Based Language Assessment (PBLA) in Language Instruction for Newcomers to Canada (LINC) programme. Mohammadian Haghighi, Fatemeh, 2016


Full Text

Investigating Teachers’ Perceptions of the Usefulness of Portfolio-Based Language Assessment (PBLA) in Language Instruction for Newcomers to Canada (LINC) Programme

by

Fatemeh Mohammadian Haghighi

M.Ed., The University of British Columbia, 2013

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF

MASTER OF ARTS

in

THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES
(Teaching English as a Second Language)

THE UNIVERSITY OF BRITISH COLUMBIA
(Vancouver)

December 2016

© Fatemeh Mohammadian Haghighi, 2016

Abstract

The growing tendency towards a student-centred, communicative approach to language pedagogy has given rise to alternative modes of assessment such as Portfolio-Based Language Assessment (PBLA), which is compatible with student-centred SLA theories and pedagogical practices in ESL classrooms (e.g., Fox, 2014; Ripley, 2012). In Canada, since 2013, Citizenship and Immigration Canada (CIC) has allocated funding to projects across the country to implement PBLA as an assessment tool in the Language Instruction for Newcomers to Canada (LINC) programme. Because language assessment tools play a significant role in determining learners’ progress through the LINC programme, and because assessment tools have a significant impact on newcomers’ professional and educational possibilities in Canada, this study investigated instructors’ perceptions of the usefulness, challenges, and benefits of PBLA in the LINC programme. Data were collected through semi-structured interviews with ten LINC instructors who had been using PBLA for at least three months, and were analyzed drawing on Bachman and Palmer’s (1996) concept of test usefulness (validity, reliability, authenticity, interactiveness, impact, and practicality) as the theoretical framework for the study. The findings showed that although PBLA has numerous benefits for language learners, its implementation presents many challenges for LINC teachers.
These challenges include developing real-life, CLB-aligned instructional and assessment tasks, as well as difficulties in assessing portfolios and providing feedback. According to the participants, PBLA is better suited to formative assessment than to summative assessment. In terms of the test usefulness framework, the findings suggest that the validity and reliability of PBLA as a summative assessment tool are questionable. Regarding authenticity and interactiveness, except for those teaching lower CLB levels, teachers found PBLA an authentic and interactive method of assessment. As for the students, the findings indicate that the positive impacts of PBLA on learners’ lives outweigh the potential negative impacts. Finally, concerning the practicality of PBLA, a number of issues regarding its implementation were raised. Given the small number of research participants, the findings of this study may not generalize to other LINC instructors and LINC programmes across Canada.

Preface

This thesis is the original work of the author, Fatemeh Mohammadian Haghighi. Ethics approval was required for this research and was granted by the UBC Behavioural Research Ethics Board (BREB) on April 28, 2016. The BREB number is H16-00627.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
List of Abbreviations
Acknowledgements
Chapter 1: Introduction
    1.1 Statement of the problem and research questions
    1.2 Background to the study
    1.3 Significance of the study
Chapter 2: Literature Review
    2.1 Portfolios and their key features
        2.1.1 Portfolio types
        2.1.2 Strengths and weaknesses of using portfolio assessment
    2.2 Research on portfolio language assessment
    2.3 PBLA in the LINC programme
        2.3.1 Portfolio tasks
    2.4 Theoretical framework: Test usefulness framework
        2.4.1 Construct validity
        2.4.2 Reliability
        2.4.3 Authenticity
        2.4.4 Interactiveness
        2.4.5 Impact
        2.4.6 Practicality
    2.5 Research on PBLA in the LINC programme
Chapter 3: Methodology
    3.1 Introduction
    3.2 Research participants
        3.2.1 Participant profiles
    3.3 Data collection
    3.4 Data analysis
Chapter 4: Findings and Discussion
    4.1 Introduction
    4.2 Teachers’ perceptions of PBLA compared with other traditional assessment methods
        4.2.1 Conclusion
    4.3 Teachers’ perceptions of the usefulness of PBLA as a method of assessment
        4.3.1 Validity
        4.3.2 Reliability
        4.3.3 Authenticity
        4.3.4 Interactiveness
        4.3.5 Impact
            Positive impacts of PBLA on learners’ lives
            Negative impacts of PBLA on learners’ lives
            Positive impacts of PBLA on Canadian society
            Impacts of PBLA on teachers
        4.3.6 Practicality
        4.3.7 Conclusion
    4.4 Logistical challenges in implementing PBLA as an assessment tool
        Challenges in conducting needs assessment and introducing PBLA concepts
        Challenges in training new LINC instructors
        Conclusion
    4.5 Teachers’ perceptions of PBLA training
        4.5.1 Teachers’ general views on PBLA training
        4.5.2 Teachers’ views on PBLA training regarding developing assessment tools
        4.5.3 Teachers’ suggestions for improving PBLA training and implementation
        4.5.4 Conclusion
Chapter 5: Conclusions
    5.1 Summary
    5.2 Implications of this study
    5.3 Limitations of the study
    5.4 Recommendations for future research
    5.5 Concluding remarks
References

List of Tables

Table 3.1 Participants’ Demographic Information

List of Abbreviations

CCLB: Centre for Canadian Language Benchmarks
CEFR: Common European Framework of Reference
CIC: Citizenship and Immigration Canada
CLB: Canadian Language Benchmarks
ELSA: English Language Services for Adults
ESL: English as a Second Language
IRCC: Immigration, Refugees and Citizenship Canada
LINC: Language Instruction for Newcomers to Canada
PBLA: Portfolio-Based Language Assessment

Acknowledgements

First and foremost, my utmost gratitude goes to my supervisor, Dr. Monique Bournot-Trites, for her continuous support and insightful guidance throughout every aspect of this project. My special thanks also go to the other members of my committee, Dr. Lee Gunderson and Dr. Kadriye Ercikan, for their invaluable support and feedback.

I am deeply grateful for the support of the LINC instructors who participated in this study and willingly shared their precious time during the interviews; without their help and support, data collection would have been impossible.

I owe my deepest gratitude to my better half, Ismaeil Fazel, for his dedicated help, support, and understanding of my goals and aspirations. His unfailing love and support have always been my strength.

I am thankful to my parents, whose encouragement and love have always given me strength. Last but not least, I am sincerely thankful to God, who made all things possible.

Chapter 1: Introduction

Language proficiency and the ability to communicate effectively in English or French are arguably integral to the success and smooth integration of newcomers into Canadian society. Findings from a number of empirical studies (e.g., Boyd, 1992; Burnaby, 1992; Derwing & Waugh, 2012; Duff, Wong, & Early, 2000; Picot & Sweetman, 2012) indicate a strong link between immigrants’ language proficiency level and both their socialization into the Canadian context and their socioeconomic success.
In response to the language needs of newcomers to Canada, and to facilitate the sociocultural and economic integration of newcomers and refugees into Canadian society, the LINC programme was established in 1992 with a mandate to provide free, basic English language instruction for adult immigrants and refugees. The main purpose of the LINC programme is thus to ease the process of newcomers’ settlement in Canada. To achieve this goal, English language pedagogy in LINC revolves around themes that pertain to various aspects of life in Canada, such as Canadian values, laws, rules, rights and responsibilities, and, more importantly, the language required for day-to-day life. Under the Government of Canada’s second language training policy, the Canadian Language Benchmarks (CLB) are used to inform and guide instruction and assessment practices in the LINC programme. Although the CLB has been introduced as “a descriptive scale of communicative proficiency” and “a framework of reference for learning, teaching, programming and assessing adults’ English as a Second Language in Canada” (the Canadian Language Benchmarks, 2012, p. V), there seems to be a gap between the immigrant language training policy and the more specific “curricular” level for LINC teachers. Consequently, teachers appear to have difficulty translating policy goals and mandates into classroom and assessment practices. For instance, although teachers are required to adhere to the CLB guidelines (for both instruction and assessment purposes), they “have complained that the benchmarks are difficult to interpret and realize” (Haque & Cray, 2010, p. 635). Furthermore, the findings from the 2010 LINC evaluation highlighted the lack of appropriate tools to measure the impact of LINC on the language learning of learners in this programme.
Along similar lines, some focal studies on language training in Canada also revealed the inconsistency of assessment in LINC programmes (e.g., Makosky, 2008; Nagy & Stewart, 2009). To address this issue, Citizenship and Immigration Canada (CIC) allocated funding to different projects to field-test the efficacy of implementing Portfolio-Based Language Assessment (PBLA) in LINC programmes in eastern Canada. Given the positive outcomes of the PBLA pilot projects, as well as the positive reception from partners and stakeholders, “the development of PBLA was undertaken as a CIC priority” (Pettis, 2012, p. 18). As a result, in March 2013, a CIC Operational Bulletin introduced PBLA as a “standard” feature of the LINC programme and announced that PBLA would be phased in across all LINC programmes in Canada over a three-year period in order to “address the need for a standardized in-class language assessment protocol in LINC”. Having provided a brief overview of the LINC programme, I will now turn to a discussion of the research questions and provide a rationale for my proposed study.

1.1 Statement of the problem and research questions

This exploratory study investigated the perceptions of English teachers about the usefulness (Bachman & Palmer, 1996), challenges, and benefits of the newly implemented PBLA in the LINC programme. It examines how teachers at different basic CLB levels (literacy to CLB 4) perceive the use of PBLA in their classes. The study seeks answers to the following research questions:

1. How do teachers perceive PBLA compared to other traditional assessment methods?
2. From the perspective of LINC instructors, what is the usefulness of PBLA?
3. From the perspective of LINC instructors, what are the challenges and benefits of using PBLA?
4. What are LINC instructors’ views on PBLA training?
Thus, drawing on Bachman and Palmer’s (1996) concept of test usefulness (validity, reliability, authenticity, interactiveness, impact, and practicality), this study explored the usefulness, challenges, and benefits of PBLA from the perspectives of English teachers in the context of the LINC programme. In the following, I will first introduce Portfolio-Based Language Assessment (PBLA) and then elaborate on the history of PBLA in the LINC programme in Canada.

1.2 Background to the study

In the past decade, the field of second language education has shown a growing tendency towards a student-centred, communicative approach to language pedagogy. Because of these developments in theories of second language education, the field of language assessment has witnessed ongoing change in recent years. Traditional assessment techniques such as multiple-choice, true-false, and fill-in-the-blank tests and essays do not appear to be congruent with current ESL classroom practices. Standardized testing has also been criticized (Haney & Madaus, 1989; Moya & O’Malley, 1994; Schifter & Carey, 2014) for many reasons, including its incompatibility with current SLA theories and pedagogical practices in ESL classrooms. Therefore, portfolio language assessment has gained momentum as an alternative way of assessment (Lynch & Shaw, 2005) and a viable substitute for traditional assessment techniques (Fox, 2008, 2009, 2014; Hamp-Lyons & Condon, 2000; Hirvela, 1997; Ripley, 2012).

According to Johnson, Mims-Cox, and Doyle-Nichols (2010), portfolios are collections of students’ work put together over time that “can be organized to assess competencies in a given standard, goal, or objective” (p. 5).
Moya and O’Malley (1994) distinguish between a portfolio and portfolio assessment, explaining that “a portfolio is a collection of a student's work, experiences, exhibitions, self-ratings (i.e., data), whereas portfolio assessment is the procedure used to plan, collect, and analyze the multiple sources of data maintained in the portfolio” (p. 2). As far as the LINC programme is concerned, the purpose of using portfolios is twofold: language learners prepare portfolios as a tool both for self-assessment and for summative assessment (Pettis, 2014).

Portfolio language assessment in the LINC programme was initially implemented in Manitoba. In 2003, a group of experienced LINC teachers from a variety of CLB levels, including ESL Literacy, developed a guide containing suggestions and materials for using PBLA with learners at different CLB levels. This guide was called “Collaborative Language Portfolio Assessment (CLPA): Manitoba Best Practices”. In 2004, CLPA was introduced in three key Winnipeg programmes and was phased in across all LINC programmes in Manitoba, including regional programmes, from 2005 to 2008. The effectiveness of CLPA implementation in different programmes was monitored and reviewed by Manitoba’s Adult Language Training (ALT) Branch project officers in 2008. Finally, in 2009, the CLPA protocol and expectations were standardized, the CLPA Binder Divider was introduced, and the CLPA: Manitoba Best Practices Guide was revised. In 2010, a survey was conducted by Winnipeg Technical College to collect feedback from teachers and students on the use of CLPA. The survey results indicated that the majority of students and teachers supported the use of CLPA and found it helpful (Pettis, 2014). After the successful implementation of portfolio assessment in Manitoba, and based on Manitoba’s CLPA model, “CIC designed a new CLB-based language assessment approach known as Portfolio-Based Language Assessment (PBLA)” (Singh & Blakely, 2012, p.
9). Subsequently, from September 2010 to January 2012, CIC conducted a comprehensive and successful field test of PBLA in Ottawa LINC classes. Since then, the pilot has been expanded to Edmonton, Moncton, Saint John, and Fredericton. Given the success of the PBLA pilot projects, CIC has decided to make PBLA a standard feature of all LINC classes across the country (Singh & Blakely, 2012). Despite the extensive funding allocated to this national project, only a limited number of studies on PBLA in LINC programmes are available. In response to these gaps, this study investigates teachers’ perceptions of this method of assessment in LINC classes.

1.3 Significance of the study

The findings of this study will have potential implications for LINC service providers, LINC administrators and teachers, as well as other policy makers who are directly or indirectly involved in this national project (e.g., the Centre for Canadian Language Benchmarks (CCLB), Citizenship and Immigration Canada (CIC), and the Ministry of Citizenship & Immigration (MCI)). More importantly, the results of this study will directly and indirectly affect language learners in the LINC programme, particularly those who need to reach CLB level 4 in order to meet the language requirement for the Canadian citizenship application. For instance, through this study, LINC teachers will have a chance to become familiar with Bachman and Palmer’s (1996) framework of test usefulness, which could potentially influence how they develop assessment tasks and assess portfolios.
In addition, by casting light on LINC teachers’ perceptions of PBLA and investigating its usefulness as a method of assessment, this research addresses the gap in the literature on the usefulness of portfolios as a means of assessment, which can potentially lead to the formation of support and scaffolding programmes aimed at a more effective implementation of PBLA in LINC programmes across Canada.

Chapter 2: Literature Review

In the past few decades, portfolio assessment has received increasing attention from researchers and practitioners in different educational contexts (Pitts, Coles, & Thomas, 2001; Ripley, 2012). As mentioned in the introduction chapter, the paradigm shift in the field of education in general, and in second language pedagogy in particular, towards communicative competence demands alternative ways of assessment that are more in line with the student-centred pedagogy of current classrooms. Therefore, portfolio assessment as an alternative assessment technique (Lynch & Shaw, 2005) has gained increasing popularity among educators, mostly because portfolios have the potential to elicit the competency of most students, not just those motivated to do well on decontextualized, on-demand, one-shot tests (Herman & Winters, 1994). One of the key features of portfolios is that they are compiled over a period of time, which provides students with multiple opportunities for revision and reflection (Pitts et al., 2001). As such, students are not only actively engaged in their own learning process but are also involved in assessment procedures. Thus, portfolios have the potential to link instruction to assessment, allow learners to reflect on and take responsibility for their own learning, enable learners to see gaps in their learning, and encourage them to take risks (Ekbatani, 2000).
Perhaps it is because of these positive features that the use of portfolios has been promoted in the LINC programme in Canada. Since PBLA will soon replace other assessment tools in the LINC programme, such as end-of-term exit tests, it is important to investigate its usefulness as an assessment tool. In what follows, I will provide an overview of portfolios and their core features, as well as the existing literature on portfolio language assessment.

2.1 Portfolios and their key features

Portfolios, in broad terms, are a selection of students’ work over a period of time, collected and organized according to a specific goal and standard in order to assess students’ progress. Portfolios have the potential to be used for both formative and summative assessment purposes (Johnson, Mims-Cox, & Doyle-Nichols, 2010). Portfolio assessment, as argued by Brown and Hudson (1998), draws on “purposeful collections of any aspects of students’ work that tell the story of their achievements, skills, efforts, abilities, and contributions to a particular class” (p. 664).

2.1.1 Portfolio types

There are generally two main types of portfolios: working portfolios and showcase portfolios. Working portfolios are typically used to show work in progress and thus serve as evidence of learners’ progress over time; they are therefore primarily used for formative assessment purposes (Black & Wiliam, 1998, 2001). In contrast, showcase portfolios, as the name suggests, provide “a site for showcasing samples of end-of-unit or course achievement” (Fox, 2012, p. 69). Hence, showcase portfolios, in which “the best evidence of that student’s qualifications is presented” (Johnson, Mims-Cox, & Doyle-Nichols, 2010, p. 34), are utilized for summative (achievement) purposes (Fox, 2008). In the context of language assessment, portfolios have been used by language instructors for both formative and summative purposes. Research on portfolio assessment has shown its considerable benefits for formative assessment (Fox, 2009; Fox &
Research on portfolio assessment has shown its considerable benefits for formative assessment (Fox, 2009; Fox &       9 Hartwick, 2011; Little, 2005, 2007, 2013). However, when it comes to summative assessment – specifically in cases where external review of portfolios is required – portfolios may present challenges for both the teachers and learners (Fox, 2014). For instance, in some contexts, teachers may not have the freedom to select students’ best evidence of progress, because they are required to showcase work predetermined by curricular specifications (Callahan, 2001; Fox, 2014; Ripley, 2012; Spalding & Cummins, 1998). 2.1.2 Strengths and weaknesses of using portfolio assessment There are many advantages to using portfolios. One of the positive features of portfolios is the fact that they promote and increase learner autonomy and engagement by encouraging students to take an active role in their learning, which can potentially boost learners’ motivation (Little, 2005, 2007, 2013, 2015). In addition, as an assessment tool, portfolios enable teachers to tailor their assessment toward the learning objectives and assess students on the skills developed throughout the course (Pettis, 2012). Portfolio assessment is a classroom-based language assessment procedure which utilizes assessment tasks that reflect authentic activities related to learners’ lives and classroom instruction, therefore it is more authentic than other forms of assessment. In addition, because portfolio items serve as evidence of student progress over time, they can be used formatively, as opposed to the summative role of standardized tests (Moya & O’Malley, 1994).  Despite all its strengths, as with other forms of assessments, portfolio assessment has its own drawbacks. For instance, one of the key issues that has been raised by many studies is that assessing portfolios is labour-intensive and costly (Ripley, 2012; Mathews, 2004; Song & August, 2001; Spalding & Cummins, 1998). 
Another major challenge is designing and developing rubrics for assessing portfolios, which requires extensive training and demands a considerable amount of time (Brown & Hudson, 1998; Ripley, 2012). Moreover, the validity and reliability of portfolio assessment have been a matter of ongoing debate in the testing literature (Bachman, 2002; Brown & Hudson, 1998); this will be further explained in the following sections. Another issue is that portfolio assessment can be prone to the subjective judgments of teachers and assessors. For instance, in the LINC programme, as I will explain later, only a limited number of assessment tasks must be included in portfolios, and it is the teachers who decide which ones should be collected. Therefore, as Fox (2014) argues, procedures used in portfolio assessment “require subjective judgment not only in evaluating performance, but also in selecting what is to be assessed” (p. 69). This may be discouraging for some students, because they may not view the teacher’s selection of their work as convincing evidence of their performance or language capability. As a result, they may continue to place more stock in traditional tests and grades as evidence of their achievement than in their scores from portfolio assessment (Lam & Lee, 2010). A further concern is that students must take care not to lose their portfolios, so that they retain sufficient evidence of their language proficiency development. In the context of the LINC programme, other issues that adversely affect the use of portfolios include, but are not limited to, the continuous intake of learners in some programmes, learners’ recurring absences, and variations among programmes (Ripley, 2012). According to its guidelines, PBLA does not prescribe a uniform curriculum.
Consequently, in different LINC programmes, and even in classes within each programme, teachers need to develop “specific classroom curricula based on the students’ needs” (Pettis, 2014, p. 8). The lack of an established curriculum, despite its potential merits, can at times be problematic, particularly when students who move to a different class have already been taught the same topics and lessons. In the following section, I will provide a synopsis of studies on portfolio language assessment.

2.2 Research on portfolio language assessment

In the past few decades, portfolio assessment has regained popularity as a viable and valuable assessment tool (Biggs & Tang, 1997; Hamp-Lyons & Condon, 2000; Huot & Williamson, 1997). There is an extensive body of scholarly literature on portfolio assessment in education in general. However, since the focus of my study is portfolio-based language assessment (PBLA), I will provide a selective review of studies that have primarily focused on portfolio language assessment.

Research on portfolio language assessment shows that writing portfolios have been commonly used as an assessment tool in the L1 writing classroom since the mid-1980s (Belanoff & Dickson, 1991). In fact, among the various language skill areas, assessing writing through portfolios has received increasing attention from researchers and educators. This is perhaps because portfolios provide a more comprehensive measure of what students can do, and because they have the potential to replace timed writing tests, which tend to be “particularly discriminatory against non-native writers” (Hamp-Lyons & Condon, 2000, p. 61). Quite conceivably, that is why Huot and Williamson (1997) considered portfolios “the most popular form of writing assessment ever” (p. 44).
Furthermore, the growing popularity of portfolio assessment could be related to the fact that it has the potential to facilitate productive learning by aligning teaching and assessment (Huot, 2002; Klenowski, 2002). The available body of research on portfolio writing assessment shows that much attention has been paid to summative assessment, while the formative aspect and its potential to effectively align teaching with assessment – especially in the EFL context – is still under-explored (Lam & Lee, 2010). To address this gap in the literature, Lam and Lee (2010) investigated students’ and teachers’ perceptions of the impact of portfolio assessment on student writing in a post-secondary EFL classroom in Hong Kong. The research participants included 31 students studying in an academic writing course, as well as four instructors. All the students who participated in this study were non-English majors and Cantonese speakers. As for the four instructors, three of them had 6–10 years of experience in English teaching and the fourth had about one year of teaching experience. None of the research participants – neither the students nor the instructors – had used a portfolio approach before. Data were collected through questionnaires and semi-structured interviews (in Cantonese) with students and instructors. Students were expected to write multiple drafts (500 words) for three different essays, and to select two of their best final drafts for summative assessment. The students’ final scores on portfolios were based on the summation of the scores of the two best pieces selected by the students. The interim drafts were not marked and were used merely for formative assessment. Students received regular teacher and peer feedback on the content, language, and organization of their writing to improve their drafts.
At the end of the portfolio programme, students were also required to write a reflective journal to evaluate their strengths and weaknesses in writing. The findings from the student and instructor interview data indicated that both students and their instructors were appreciative of the formative potential of portfolio assessment, because portfolios could empower students by showing them how to improve their writing. The results thus illustrated that the dual summative–formative function of portfolios in the assessment of writing can enhance students’ writing skills (Lam & Lee, 2010). In the same line of inquiry, Taki and Heidari (2010) explored the effectiveness of portfolio-based writing assessment in the EFL context of Iran. Pre-intermediate young Iranian English learners (N = 40) participated in this study. The participants were randomly divided into experimental and control groups, each with 20 students. Those in the experimental group wrote five essays, received feedback on content, organization, voice, word choice, coherence and cohesion, and were given the opportunity to revise and resubmit their writing. In contrast, the students in the control group wrote only one draft, and their writing was corrected only once by their teacher. In addition to their essays, similar to Lam and Lee’s (2010) study, the participants were also asked to complete a questionnaire to share their reflections on using portfolios. The findings of this study echoed those of Lam and Lee (2010) and other studies indicating that portfolio-based writing assessment has a positive impact on language learning, on improving learners’ writing ability, and on their self-assessment. The results also showed that almost all the students were in favour of this method of assessment.
In another strand of research, in an ESL context, Ruetten (1994) looked into the success rates of ESL students (N = 27) in comparison with their native English-speaking (NES) peers (N = 27) on an institutional exit proficiency writing exam, exploring the sources of their success. In order to pass the course, students were required to pass the writing proficiency exam, and those who failed the exam were given a second chance by presenting an appeal folder. The appeal folder was a portfolio of the student’s written work over the course of the semester. The findings showed that only 33% of ESL students and 70% of NES students were able to pass the proficiency writing exam. Those who failed the exam were therefore asked to submit their writing portfolios. Among those who submitted writing portfolios, 63% of NES and 89% of ESL students were able to pass the course. In other words, the results revealed that while ESL students were twice as likely as NES writers to fail the proficiency exam, “they made up for their failure in the [portfolio] appeals process, ending up with a comparable pass rate” (Ruetten, 1994, p. 90). (If virtually all of the failing students submitted portfolios, these figures imply cumulative pass rates of roughly 93% for ESL students and 89% for NES students.) The findings support earlier studies (Johns, 1991; Thompson, 1990) which suggest that “holistically scored competency exams are particularly (and perhaps unfairly) difficult” (p. 85) for ESL writers, and that, as a result, portfolio assessment may be a more viable option for assessing their writing. In a similar vein, Song and August (2001) conducted a quantitative study to compare the performance of two groups of advanced ESL students in a second-semester composition course. At the end of the semester, one group was assessed based on portfolio assessment along with the City University of New York (CUNY) Writing Assessment Test (WAT), while the other group was assessed using the WAT alone.
The students from the portfolio group with passing portfolio scores were allowed to move up to the next level regardless of their WAT scores; however, students in the non-portfolio group were required to pass the WAT to be eligible to move ahead. The findings indicated that students were twice as likely to pass into the next level when they were evaluated by portfolio than when they were required to pass the WAT, because portfolio assessment made it possible to identify more than twice the number of ESL students who proved to be successful in the next English course. Thus, it was concluded that portfolio assessment seemed to be a more appropriate assessment alternative for the ESL population. Similarly, Little (2005, 2007, 2009) conducted a study in the ESL context of Ireland. The research participants were ESL students – mostly refugees with little or no knowledge of English – in the public schools of Ireland. The main aim of the study was to explore whether using tools such as portfolios could impact learners’ self-assessment and their autonomy in language learning. In order to help the ESL students, the CEFR (Common European Framework of Reference) was first adapted to match the particular language needs and the age of ESL learners in Irish mainstream schools. This was carried out by identifying 14 major themes in the official curriculum of Irish primary schools. As a result, the CEFR descriptors were modified and rewritten based on the 14 recurring themes in the curriculum of Irish schools. This led to the creation of the English Language Proficiency Benchmarks, which offered a global scale of communicative proficiency and 14 scales related to the curriculum themes. The scales and their benchmarks were substantially revised over the course of three years with the help of the ESL support teachers, drawing on their experiences. ESL support teachers were required to use English Language Portfolios (ELP) to scaffold their pedagogy.
According to Little (2005), the ELP, like any portfolio based on the CEFR, has three main sections: a language passport, a language biography and a dossier. The passport section is used to briefly record the learner’s language profile and background, and substitutes a simplified version of the global benchmarks of English language proficiency for the self-assessment grid from the CEFR. The learner’s progress across each of the three levels is recorded at regular intervals using three criteria, namely “with a lot of help, with some help, without help – which takes account of the gradual nature of language learning” (Little, 2005, p. 331). The language biography section has a checklist for each of the theme-related units of work that is used for goal-setting and self-assessment. Each checklist includes the descriptors, which are grouped according to level. In other words, “cumulatively the checklists restate the ESL curriculum content as elaborated in the benchmarks” (p. 331). The checklists function as a self-assessment tool. Students use them once every two weeks to identify and record the themes that have been covered and the new tasks they can perform, as well as the learning objectives for the next learning phase. After completing the checklists, students add them to the language biography section of the ELP, and every two months they put their summarized cumulative self-assessment in the language passport section. The dossier section mainly contains generic pages that are used for recording different aspects of learning, such as lists of vocabulary, summaries of books read, worksheets and supplementary materials for learning. The findings of this study showed that this approach helps learners to exercise their agency in the language learning process and provides them with an opportunity to gradually develop an awareness of their own progress and of their identity as a language learner and user.
In what follows, I will explain how portfolios are used in the LINC programme.

2.3 PBLA in the LINC programme

As noted in the preceding sections, the LINC programme is funded by Citizenship and Immigration Canada (CIC) to provide free basic language training for adult newcomers who are permanent residents or refugees. Prior to entering the LINC programme, eligible applicants are required to take the Canadian Language Benchmarks Placement Test (CLBPT), which measures the applicants’ abilities in listening, speaking, reading and writing, and determines their level of language proficiency based on the Canadian Language Benchmarks (CLB). The CLBPT, based on the National Placement Guidelines (2014), “places a learner on the CLB scale of communicative competence” (p. 2), meaning that the CLB level assigned to a learner at the time of placement assessment indicates that the learner has demonstrated “the level of communicative ability associated with most or all (traditionally, 70 to 100%) of the descriptors for the benchmarks assigned in each of the four skills” (National Placement Guidelines, 2014, p. 3). As Pettis (2014) puts it, the “CLBPT is reliable within a benchmark level, which means the student has demonstrated proficiency at the level assessed but may be higher” (p. 49). As specified by the National Placement Guidelines (2014), in the CLBPT the difficulty level of the speaking, listening and reading assessment gradually increases, and in the writing tasks “the assessor evaluates each task based on a decision tree, rubric, or holistic band aligned with CLB descriptors” (p. 2). According to the CLB (2012) document, there are 12 CLB levels, spanning basic, intermediate and advanced stages.
However, there are no advanced English classes for those who are at CLB levels 8–12, since CIC mandates the LINC programme to provide “basic language training”. Accordingly, LINC levels range from the Literacy level – a foundation level prior to CLB 1 for those who have difficulty reading and writing in any language, such as refugees who may not have any literacy skills, or people whose first language does not have a written form – to CLB 8 (high intermediate). LINC classes are offered through an array of service providers, including school boards, community and settlement organizations, community colleges, schools, community centres, libraries and local churches. For instance, some of the main LINC service providers in Vancouver include Mosaic, ISS of BC, SUCCESS, VCC (Vancouver Community College), the Richmond School Board and the Burnaby School Board, to name a few. The majority of LINC service providers only offer classes from the Literacy level to CLB 4, due to the large number of applicants at these levels. Other LINC programmes that offer CLB levels 5, 6, 7 and 8 often provide English-for-Workplace classes in addition to general English classes; these classes are geared towards helping learners at higher levels of English proficiency to learn the language they need for employment in the Canadian context. Previously, when teachers came to the conclusion that a student was ready to move up to the next level, they would typically give him or her a standardized CCLB exit test to determine whether the student had the required language skills to move up. After completing each level, language learners would receive a certificate demonstrating their language proficiency skills at that level. Upon completion of CLB level 4 or higher in speaking and listening, they were eligible to receive a certificate as proof of having met the Canadian citizenship language requirement.
As explained earlier, PBLA will soon replace other forms of assessment in all LINC programmes across Canada. Many LINC programmes in Ontario, Alberta and Manitoba have already been involved in PBLA pilot projects and have phased in PBLA. However, LINC programmes in other parts of Canada are still in the preliminary stages of PBLA implementation. According to the PBLA guide (Pettis, 2014), PBLA implementation in a given LINC programme occurs in three phases. In Phase 1, CIC and the administrators in each LINC programme develop a plan for PBLA implementation. Subsequently, depending on the number of classes in the programme, some teachers are recruited by programme administrators as Lead Teachers. Following that, Lead Teachers take a PBLA Lead Teacher Foundation Course. In Phase 2, Lead Teachers start implementing PBLA in their own classes and begin to introduce PBLA to their colleagues. Afterwards, Lead Teachers work closely with their Regional Coach to share their experiences and review the materials developed for professional development sessions for classroom teachers. Then, after receiving some PBLA training, classroom teachers start using some features of PBLA in their classes. Finally, in Phase 3, full implementation of PBLA takes place in all LINC classes in the programme.

2.3.1 Portfolio tasks

When students register in the LINC programme, they receive a Language Companion binder, which aims to support language learning, PBLA, and the student’s settlement in Canada. It should be noted that there are three versions of the Language Companion binder: ESL Literacy, CLB 1–4, and CLB 5–8.
According to the PBLA guide (Pettis, 2014), the binders have the following sections:

• Canadian Language Benchmarks – Information about the CLB levels based on the “Can Do Statements”
• My Canada – Basic reference information about Canada that is important for settlement, including opportunities for students to personalize the information
• Where I Live – Information related to provincial, regional, and municipal features and services, including opportunities for students to add resources and personalize the information
• Helpful English – Useful English language reference information, with the option of adding additional information provided by the teacher
• My Notes – An empty section for students to keep day-to-day worksheets, rough drafts, and handouts. This section can be culled when items are no longer needed or when a student moves to a new class.
• My Portfolio – A section in which students organize artefacts for assessment purposes. Teachers collect this section to review it for evaluation at the end of the term, and then return it. “My Portfolio” is divided into six subsections: About Me, Listening, Speaking, Reading, Writing, and Other. (p. 19)

Of particular relevance to this study is the “My Portfolio” section, which comprises two types of assigned tasks: language-assessment tasks and skill-using activities/tasks. Skill-using activities, as the name suggests, provide an opportunity for language learners to use and practise their skills and what they have learned in situations similar to language-assessment tasks and real-life contexts. Learners usually do these activities in pairs or in small groups. Skill-using tasks are designed to offer students the chance to reflect on their own learning and to do a variety of self-assessment and peer-assessment activities based on the teacher’s instructions.
The language-assessment tasks, on the other hand, are used mainly for language assessment purposes. The major difference between these two types of tasks is the amount of support and scaffolding students receive from their teachers for task completion. Prior to each task, the teacher provides instructions about how the task should be completed, and explains what is expected of students based on the CLB competencies that will be addressed in the task. More importantly, the teacher provides criteria and rubrics for assessing the tasks. In skill-using activities, teachers provide feedback, scaffold the activities and offer suggestions to support students. Language-assessment tasks, by contrast – which are completed in the classroom – are treated more formally and are used to determine what students are able to do with the language independently. Based on PBLA protocol, students cannot self-select the artefacts to put in their portfolios. Teachers decide what should be put in the listening, speaking, reading and writing sections of the portfolios, to ensure that portfolios contain the assessment tasks that provide explicit evidence of the students’ having achieved CLB outcomes. It should also be noted that the number of language-assessment tasks for portfolios depends on whether the programme is full-time or part-time. According to the PBLA guide (Pettis, 2014), in full-time programmes – where classes meet five days a week – students are required to collect at least eight to ten artefacts per skill area (two or three entries per week) to represent sufficient evidence of their language proficiency and CLB outcomes in each skill. In part-time programmes, however, teachers may aim for two to four artefacts every two weeks. Throughout the course, teachers provide regular feedback on portfolio entries and evaluate what students can do based on the CLB descriptors for each skill area at a given CLB level.
At the end of the term or semester, after conducting an end-of-term portfolio review, the teacher holds a brief conference (10 to 15 minutes) with each student to discuss the student’s progress, needs and goals with reference to the evidence in the portfolio. In addition, during this progress conference, each student receives a standardized progress report, which has been completed by the teacher and approved by the programme administrator. Having provided the background information regarding the deployment of portfolios in the LINC programme, I will now turn to a discussion of PBLA through the lens of Bachman and Palmer’s (1996) framework of “test usefulness”, the theoretical framework for this study. I will end this chapter by presenting an overview of the literature on studies that have investigated PBLA in the context of the LINC programme.

2.4 Theoretical framework: Test usefulness

This study employed Bachman and Palmer’s (1996) framework of “test usefulness” as its theoretical framework. Bachman and Palmer (1996) conceptualize test usefulness as a “function of several different qualities all of which contribute in unique but interrelated ways to the overall usefulness of a given test” (p. 18). These qualities are validity, reliability, authenticity, interactiveness, impact, and practicality (p. 18). In order to operationalize the test usefulness framework, Bachman and Palmer (1996) introduce the following three principles:

• Principle 1: Increasing the overall usefulness of a test should take precedence over the individual qualities that impact usefulness.
• Principle 2: The individual qualities must be considered with regard to their combined impact on the overall test usefulness rather than being evaluated independently.
• Principle 3: There is no one-size-fits-all approach to test usefulness and the appropriate balance among the different qualities; rather, these must be determined for each particular testing situation.

These principles, according to Bachman and Palmer (1996), are founded on the underlying premise that “in order to be useful, any given language test must be developed with a specific purpose, a particular group of test takers and a specific language use domain” – also referred to as the “target language use” or “TLU” domain – which they define as “the situation or context in which the test taker will be using the language outside of the test itself” (p. 18). In the following sections, PBLA in the LINC programme will be discussed in light of the six above-mentioned test qualities – i.e., validity, reliability, authenticity, interactiveness, impact, and practicality – in the test usefulness framework.

2.4.1 Construct validity

Construct validity, for Bachman and Palmer (1996), is the extent to which a test measures what it claims to measure and the degree to which a given test score can be interpreted as “an indicator of the ability(ies), or construct(s)” (p. 21) the test purports to measure. Construct validity also concerns the “domain of generalization to which our score interpretations generalize” (p. 21), which means that language test tasks should be developed in a way that corresponds to the target language use domain, so that a test score can be generalized to the test taker’s language ability beyond the testing situation. As far as portfolio assessment is concerned, Moya and O’Malley (1994) argue that since portfolio assessment is essentially a qualitative approach to student assessment, establishing its validity and reliability may be more challenging than for traditional forms of assessment, which tend to be more quantitative.
The validity of portfolio assessment may be challenged in three areas: construct validity, criterion-related validity, and content validity. One of the major issues concerning the construct validity of portfolio assessment is the authorship of portfolios and the meaningfulness of student scores (Condon & Hamp-Lyons, 1991; Gearhart, Herman, Baker, & Whitaker, 1992). In other words, since students receive several rounds of feedback from their instructors and peers on their portfolios, to what extent can portfolio scores represent students’ true competencies? In order to address this key question, Gearhart, Herman, Baker, and Whitaker (1993) investigated the meaningfulness of portfolio scores. Nine elementary school teachers participated in this study; each was asked to select two students at each of three writing competency levels (low, medium and high) and to collect portfolios of their writing assignments. Teachers recorded the instructional support provided for each of their six target students during the writing and editing phases. Students’ writings were rated on writing proficiency dimensions including content, organization, style and mechanics; within each rating, teachers also documented the amount of support they provided to help students in each dimension, as well as the amount of time students spent on writing. The findings showed that, generally, students at the low and medium proficiency levels needed more support than those at the high level. There was also considerable variability in the amount of support teachers provided across the different writing competency dimensions for students at different levels. In addition, the results indicated that the patterns of teachers’ ratings differed between those who had prior experience with portfolio assessment and those who were new to this method of assessment.
Moreover, teachers who had used portfolio assessment for over a year provided more support for their students than those who had no experience in this regard. Gearhart et al. (1993) argue that in cases where portfolio assessment is used to make critical decisions about students’ lives, portfolio scores (ratings) need to be adjusted to reflect differences in support and assignment difficulty, so that comparability of results and validity become more feasible. In a similar study, Gearhart and Herman (1998) examined the integration of classroom and large-scale portfolio assessment. They disputed the accountability of portfolio scores, particularly in cases where external examiners are required to rate student portfolios prepared with the support of teachers, peers and parents. They questioned whether an external rater – lacking familiarity with the classroom context and the individual students – is qualified to rate portfolios that are the product of students’ collaborative work with their peers and teachers. Gearhart and Herman refer to Webb’s (1993, 1995) studies, which suggest that a student’s performance in collaborative work does not necessarily reflect his/her capabilities. For instance, Webb’s work showed that students with lower levels of proficiency sometimes achieved higher scores on the basis of group work. Webb (1993, 1995) concludes that an external rater’s portfolio scores may overestimate a student’s performance, particularly because the rater may not be aware of the amount of support the student has received in completing the portfolio. On the other hand, when teachers rate their own students’ portfolios, they may tend to adjust the score downward depending on how much help the student needed to complete the portfolio tasks.
Consequently, these issues “may threaten the validity of inferences that can be drawn about individual performance from portfolios constructed in the social complexities of classroom life” (Gearhart & Herman, 1998, p. 41). Lack of criterion-related validity, which can be caused by variations in portfolio assessment procedures, has been considered one of the shortcomings of portfolio assessment. Koretz, Stecher, Klein, McCaffrey, and Deibert’s (1994) study of portfolio assessment in Vermont indicated that discrepancies in portfolio implementation across different programmes raised serious questions regarding the validity of comparisons among schools. Furthermore, the study showed that there was no correlation between the scores from mathematics portfolios and the uniform mathematics test. This was similar to the findings of Herman, Gearhart, and Baker’s (1993) study, which revealed no correlation between the scores from a standard writing assessment and portfolios. These findings demonstrate that the scores resulting from portfolio assessments and standardized tests did not support criterion-related validity. Therefore, although portfolio assessment may have some advantages over standardized and traditional forms of assessment, summative portfolio assessment tends to lack criterion-related validity. Another key area of concern regarding portfolio assessment (as a tool for summative assessment) has been its tenuous content validity. Wiliam (2010) maintains that “a test can be valid for some purposes but not others” (p. 256); similarly, while a test can be valid for some students, it may not necessarily be valid for others. For instance, as Wiliam (2010) argues, a mathematics test with a high reading demand may yield valid inferences about the mathematical capabilities of students who are proficient in language and reading, whereas the same test may present challenges for students who have difficulty with reading.
Therefore, when students do poorly on such a mathematics test, it is hard to determine whether the low marks are due to poor reading (language) skills or to weakness in mathematics. In the case of portfolio assessment, since portfolio tasks are usually designed to be authentic and similar to the target language use domain, it is probable that some aspects of portfolio assessment tasks may involve variables unrelated and irrelevant to the construct being assessed (Nolet, 1992). Regarding the validity of PBLA, Pettis (2014) argues that PBLA “improves validity through assessing factors that cannot be included in public exam settings” (p. 12). However, Brown and Hudson (1998) have questioned the validity of portfolio assessment. They describe different forms of assessment and discuss their advantages and disadvantages in relation to language assessment. In discussing the validity of alternative assessments such as PBLA, Brown and Hudson (1998) caution against being too optimistic about the consistency of alternative assessments on the basis of strategies that merely shape the assessment procedures. They criticize Huerta-Macías’ (1995) argument that “alternative assessments are in and of themselves valid, due to the direct nature of the assessment” (p. 10), and consider such statements “too general and shortsighted” (p. 655).

2.4.2 Reliability

Reliability, according to Bachman and Palmer (1996), refers to “the consistency of measurement”, and a reliable test score will be “consistent across different characteristics of the testing situation” (p. 19). In other words, a reliable test should yield consistent results for a test taker if the same test is administered on two different occasions or in two different settings.
In discussions of portfolio assessment as an alternative or performance assessment approach (Brown & Hudson, 1998; Fox, 2008, 2009, 2014; Hamp-Lyons & Condon, 2000; Hirvela, 1997; Lynch & Shaw, 2005; Ripley, 2012), reliability has long been a matter of dispute among scholars. For instance, Huerta-Macías (1995) argues that “alternative assessment consists of valid and reliable procedures that avoid many of the problems inherent in traditional testing including norming, linguistic, and cultural biases” (p. 10). According to Huerta-Macías, the consistency of alternative assessment can be guaranteed by utilizing strategies such as the “auditability” of the evaluation procedures. In her view, auditability includes providing evidence of the decision-making process, using multiple assessment tasks, training assessors to use explicit and clear criteria and, finally, triangulating the decision-making process with students’ parents and teachers. However, this view is contested by Brown and Hudson (1998), who believe such strategies are not robust enough to ensure the validity and reliability of alternative assessments such as portfolio language assessment. As a case in point, the use of portfolios in Vermont – regarded by some researchers (e.g., Koretz, Stecher, Klein, McCaffrey, & Deibert, 1994) as a turning point in portfolio assessment (Mathews, 2004) – shows how attaining reliability in portfolio assessment can be challenging, especially in large-scale assessments. In 1988, Vermont initiated an innovative statewide performance assessment programme. This programme was unique in many ways, particularly because the statewide assessment was substantially based on “unstandardized tasks — specifically, portfolios of student work rather than performances generated in response to standardized tasks and administrative conditions” (Koretz et al., 1994, p.
5). As a result of this project, Koretz et al. (1994) found that it was almost impossible to interpret and compare the meanings of grades or scores on portfolios – in different subject matters – across different schools, because there was little agreement among teachers about what sorts of elements should be included in the portfolios. According to Koretz et al. (1994), other factors that contributed to the unreliability of scores include “the lack of standardization of tasks” and challenges in designing rubrics appropriate for scoring a wide variety of tasks, as well as “the difficulty of training a large number of raters to a sufficient level of accuracy” (p. 12). Therefore, although supporters of portfolio assessment regarded portfolios as more valid measures of learning, “the fact that the same portfolio would get different scores according to who did the scoring made their use for summative purposes difficult to sustain in the U.S. context” (Wiliam, 2010, p. 272). Similarly, Haque and Cray (2010) raise the issue of rater consistency in portfolio assessment in comparison with traditional approaches, noting that “even with on-going training, rater consistency in traditional approaches to language testing is an on-going issue” (p. 69). In traditional standardized tests, there is a strong emphasis on controlling the test setting and the scoring procedures, so that factors affecting the testing situation remain as constant as possible and raters rate performances under conditions that are as similar as possible. In portfolio assessment, however, there is more flexibility in the nature of the assessment tasks as well as the performance situations, because portfolio tasks are completed in unique classroom settings and are therefore subject to various instructional contexts and constraints (Haque & Cray, 2010). Another area of dispute over the validity and reliability of portfolios concerns divergent views on psychometric standards and standardization.
Some researchers advocate portfolio assessment mainly because it opposes standardization (Huot & Williamson, 1997; Moss, 1994). Such researchers are of the opinion that "reliability and validity in the narrow psychometric sense are undesirable factors in evaluation" (Song & August, 2001, p. 52). For instance, Huot and Williamson (1997) compare marking writing under exam conditions with portfolio assessment of writing, explaining how these two differing assessment approaches can lead to agreement or disagreement among different raters (in other words, interrater reliability). According to their argument, when students are required to write on the same topic in the same genre under exam conditions, it is more likely that different scorers will assign consistent scores to the same piece of writing. However, when it comes to scoring a mixture of writing pieces in a portfolio, it is possible that some dimensions of one piece (e.g., coherence, cohesion, syntax or mechanics) are stronger or weaker than those of another piece in the same portfolio. Therefore, even a single reader of a portfolio may struggle to decide on a single numerical score appropriate for such a mixed combination of strengths and weaknesses. This can be even more problematic when different raters with different values are asked to rate the same portfolio, which may lead to low interrater reliability. By contrast, others (e.g., Herman, Gearhart, & Baker, 1993; LeMahieu, Gitomer, & Eresh, 1995) argue that psychometric integrity and reliability are feasible for portfolio assessment if the aim is to arrive at a single score of overall quality. In other words, if a common rating scale is used to score various classroom assignments of a similar genre, it is possible to score portfolios consistently. As Herman et al.
(1993, p. 219) put it, "it is particularly heartening that classroom assignments can be reliably scored on analytic components that can be used to focus instruction and to provide feedback to teachers and students."

In the context of the LINC programme, PBLA relies heavily on the CLB. Therefore, according to PBLA protocol, the language-assessment tasks and the teacher-made rubrics designed to assess portfolios should be aligned with CLB descriptors and exemplars. Since the CLB is a standardized framework, the reliability and validity of its descriptors and exemplars have been confirmed (CLB, 2012). According to Pettis (2014), "PBLA protocols, resources, supports and monitoring procedures are intended to ensure these critical conditions are in place to support the reliability and validity of PBLA in the classroom" (p. 13). Based on this claim, PBLA as used in LINC has the potential to be reliable only if the assessment tasks are aligned with CLB standards.

2.4.3 Authenticity

Bachman and Palmer (1996) define authenticity as the degree to which the characteristics of a given (language test) task correspond to the features of a target language use (TLU) task. In other words, authenticity can be conceived of as the extent to which the scores from a language assessment task can be generalized and interpreted beyond the testing situation. Based on this definition, there is a strong link between authenticity and validity, since both refer to the generalizability of test scores to the target language use domain. In the context of the LINC programme, authentic language use is highly advocated, which reflects a task-based, communicative competence approach to language pedagogy (Singh & Blakely, 2012). According to the PBLA guide, the assessment tasks in PBLA are developed based on real-world issues and utilize authentic texts.
Pettis (2014) considers PBLA authentic because it is a form of teacher-based assessment, which allows the teacher to assess what is done in the classroom. She argues that because tasks are based on real-world issues and events and use authentic texts, assessments tend to be more realistic and authentic. This is largely because teachers can tailor the language-assessment tasks to what they have actually taught, rather than using pre-made standardized tests that have little to do with what the students have learned in their programmes. Thus, portfolios are often viewed as an authentic assessment—assessment that looks at performance and the practical application of theory (Newmann, 1990; Pitts & Thomas, 2001; Snadden & Thomas, 1998; Weigle, 2002). According to Dieleman (2012), since PBLA makes use of a collection of different tasks – which correspond to the tasks in the Target Language Use (TLU) domain – to illustrate learners' language proficiency progress over a period of time, it "is considered to be an authentic, accurate, and reliable approach" (p. 13). As Weigle (2002) argues, authenticity is one of the most positive features of portfolio assessment; this could be generalized to PBLA as well.

2.4.4 Interactiveness

The interactiveness of a given language test task is conceptualized as "the ways in which the test taker's areas of language knowledge, metacognitive strategies, topical knowledge, and affective schemata are engaged by the test task" (Bachman & Palmer, 1996, p. 25). Since PBLA uses the CLB as its framework of reference, and since authentic language use is highly promoted in PBLA, language assessment tasks are intended to simulate real-life situations in which learners need to use their topical knowledge, affective schemata and metacognitive strategies to fulfill the tasks. Therefore, it can be assumed that interactiveness is embedded in PBLA.
However, the available studies on PBLA do not seem to address this aspect, and there is a need for more research in this area.

2.4.5 Impact

Impact refers to the effect of a language test on society and educational systems, as well as on the individuals within those systems. As Bachman and Palmer (1996) argue, "the impact of test use operates at two levels: a micro level, in terms of the individuals who are affected by the particular test use, and a macro level, in terms of the educational system or society" (pp. 29-30). One of the major impacts PBLA potentially has on learners' lives and on society is its role as a gatekeeper for Canadian citizenship. As mentioned earlier, language learners who aim to apply for Canadian citizenship have to reach CLB level four in listening and speaking skills. Thus, for this group of learners PBLA is a high-stakes test, because it affects their future and their citizenship status in Canada. According to the PBLA guide, "teachers are accountable for the levels they record on student reports" (Pettis, 2014, p. 50). Therefore, the students' portfolios must present solid evidence of what learners can do with their language proficiency based on the CLB descriptors for their particular level. Since the level of English language proficiency has direct and indirect impacts on learners' employment opportunities, education and citizenship status in Canada, PBLA outcomes can have a significant impact on learners' lives in many ways.

2.4.6 Practicality

Practicality is "the relationship between the resources that will be required in the design, development, and use of the test and the resources that will be available for these activities" (Bachman & Palmer, 1996, p. 36). With respect to portfolio assessment, one of the major issues raised by different researchers (e.g., Ripley, 2012; Weigle, 2002) is its practicality.
Weigle (2002) identifies practicality as a weakness of portfolio assessment, given the considerable amount of time required for its successful implementation. Many researchers and educators have criticized portfolio assessment because of the high costs of training teachers to assess portfolios and develop scoring rubrics, as well as the extensive amount of time that assessing portfolios demands (e.g., Brown & Hudson, 1998; Mathews, 2004; Ripley, 2012; Song & August, 2001; Spalding & Cummins, 1998). Having explained how PBLA stands in relation to Bachman and Palmer's (1996) concept of test usefulness, I will now turn to the studies conducted so far on PBLA in the LINC programme.

2.5 Research on PBLA in the LINC programme

As explained at the beginning of this chapter, there is a large body of literature on the use of portfolio assessment in educational settings. However, only a limited number of studies have examined the deployment of PBLA in the LINC programme. Ripley (2012) conducted a small-scale qualitative study on the benefits and challenges of implementing PBLA in a LINC programme in a large Canadian city. The key participants in this study were four LINC teachers who were involved in a PBLA pilot project, a representative of Citizenship and Immigration Canada, and a developer of the PBLA model. All participants took part in semi-structured interviews and shared their views on the benefits and challenges of using PBLA. The findings of this study highlighted that utilizing PBLA helped the teachers gain a better understanding of the students' weaknesses and progress over time. The participants in Ripley's study indicated that, in contrast to traditional assessment approaches such as tests and final exams, portfolio tasks provided better evidence of students' language development.
In addition, the teachers believed that the use of rubrics and CLB benchmarks increased consistency in the assessment of learners' proficiency. Besides the potential benefits associated with PBLA implementation, some challenges were also identified, including "conflicts with established curricula, an increase in instructors' workloads, a need for ongoing training, and difficulties integrating PBLA into programmes that allow continuous student intake" (Ripley, 2012, p. 83). According to the research participants in this study, prior to PBLA implementation some programmes had a set curriculum in place. This conflicted with PBLA training, which does not favour "prescribed curricula" and instead "gives students choices about what they would like to study" (p. 79). Consequently, while the lack of a unified curriculum increases the interactiveness of PBLA, teachers in Ripley's study regarded this discrepancy as one of the challenges of PBLA implementation. Given the limitations of his study, Ripley strongly recommends a large-scale survey of LINC learners that focuses on their perceptions of PBLA. Along similar lines, Fox and Fraser (2012) conducted a longitudinal quantitative study from 2010 to 2012 as part of a PBLA field test in a different province. This study investigated the impact of PBLA on LINC teachers (N = 119) who taught Literacy level to CLB level 4 classes and their students (N = 774) at these levels in 17 programmes in a large Canadian city. As reported by Fox (2014), the findings of this study showed "statistically significant changes in LINC teachers' accounts of the CLB, task-based instruction and assessment, and portfolio-based assessment as an outcome of initial PBLA implementation" (p. 72).
Furthermore, students who participated in this study reported the positive impact of PBLA on their engagement, reflection and autonomy, which echoes the findings of Little (2005, 2007, 2009, 2013), who investigated learners' involvement in the assessment process using the Common European Framework of Reference (CEFR) and the European Language Portfolio (ELP). In the same vein, Fox (2014) conducted a longitudinal qualitative case study from 2012 to 2014 in order to investigate issues emerging from using PBLA over a period of two years in four LINC programmes in a large Canadian city. The study described the experience of a LINC teacher in relation to the responses of four other teachers: three who participated in semi-structured interviews periodically during the same period, and one who acted as a key informant. Thematic analysis of the interviews with the key participants revealed four recurring themes regarding the use of PBLA in the LINC programme. Salient themes included "(a) increased planning and accountability; (b) the clearer articulation of goals for activity; (c) increased awareness of assessment; and (d) increased emphasis on summative assessment" (p. 81). The study identified a critical need to support the formative use of PBLA as an essential teaching and learning resource that can potentially have a positive impact on improving LINC students' self-assessment, goal-setting, and autonomy. It seems that none of the studies reviewed above has looked into PBLA through the systematic lens of Bachman and Palmer's (1996) framework of test usefulness, nor has any of these studies investigated teachers' perceptions of the usefulness of PBLA or its implementation across different levels. This is particularly the case for the levels from Literacy to CLB level 4, the minimum language requirement for the Canadian citizenship application.
In addition, the way portfolios are actually assessed, and the potential challenges teachers face in assessing portfolios and designing rubrics for different levels, have not yet been adequately explored. This study aims to address these gaps in the literature on PBLA implementation in the LINC programme.

Chapter 3: Methodology

3.1 Introduction

In this qualitative study, semi-structured interviews were employed as the main method of data collection, which is consistent with the overarching purpose of the study—that is, to gain insight into the experiences of LINC instructors in utilizing PBLA in the LINC programme and, in particular, into their perceptions of the usefulness of PBLA. I aimed to delve into the teachers' perceptions of PBLA by investigating the challenges they face in using and assessing the portfolios. According to Corbin and Strauss (2014), qualitative research is used to "explore the inner experiences of the participants" (p. 5). Therefore, a qualitative approach appeared to best suit the purposes of this study. Interviews provided a unique opportunity for my research participants to share their views about PBLA in a dialogic manner and helped me gain a deeper understanding of their stance on this method of assessment. I conducted semi-structured interviews mainly because they provided me with an adequate level of flexibility to refine and modify the interview protocol in a justified manner (Kvale & Brinkmann, 2009). The following research questions were addressed in this study:

1. How do teachers perceive PBLA compared to other traditional assessment methods?
2. From the perspective of LINC instructors, what is the usefulness of PBLA?
3. From the perspective of LINC instructors, what are the challenges and benefits of using PBLA?
4. What are teachers' views on PBLA training?
3.2 Research participants

The target participants of the study were 10 instructors who (a) had been using PBLA in the LINC programme for at least three months, and (b) were teaching from Literacy to CLB 4 level at the time of the study. Instructors who had been using PBLA for less than three months were excluded, because I wanted to ensure that participating teachers had sufficient familiarity with this method of assessment. In what follows, I explain why I selected these particular CLB levels for my study. LINC students at the Literacy level may not have received any formal education, either in their first language or in any other language. Therefore, it is quite conceivable that many of them lack basic literacy skills such as reading or writing in any language. Depending on their background, some learners at the Literacy level may be familiar with the English alphabet and numbers, while others may not have this knowledge even in their first language, particularly if their first language has no written form. Teaching Literacy levels can be quite challenging, which is why I was interested in the potential benefits and challenges of utilizing PBLA for this group of learners, as seen through teachers' experiences and perceptions of using PBLA and the challenges they face in assessing portfolios. In addition, based on the Canadian Language Benchmarks (CLB), learners at CLB levels 1 to 4 have basic levels of English proficiency, and according to CIC documents, the minimum language requirement for a citizenship application is CLB 4 in listening and speaking. Due to the significance of achieving CLB 4 in the lives of many immigrants who aim to become Canadian citizens, in this study I investigated teachers' perceptions of using PBLA only up to CLB 4.
Therefore, I conducted interviews with two teachers from each level, so in total the number of participants came to 10 LINC instructors from the Literacy level to CLB 4 (see Table 3.1).

Table 3.1  Participants' Demographic Information

   Name (Pseudonym) | Gender | Age group   | Education                      | CLB Level                   | Teaching Experience in LINC | Experience with PBLA | Origin of the Majority of Students
1  Rose             | F      | 25-34       | BA in TESL                     | Literacy, CLB 1             | 3 years                     | 1 year               | China
2  Linda            | F      | 45 or above | Post Baccalaureate Diploma     | Literacy, CLB 1, CLB 2      | 10 years                    | 3 years              | Syria, Iraq, China
3  Olivia           | F      | 45 or above | BA in Theatre & English        | Literacy, CLB 1, CLB 2      | 13 years                    | 2 years              | Syria, Iraq, Afghanistan
4  Violet           | F      | 45 or above | MA                             | CLB 2                       | 15 years                    | 7 months             | Diverse groups
5  Carol            | F      | 45 or above | BA                             | CLB 3                       | 10 years                    | 2-3 years            | Diverse groups
6  Maya             | F      | 45 or above | 2 BAs in English & Linguistics | CLB 3                       | 2 years                     | 2 years              | China
7  Kate             | F      | 25-34       | MEd                            | CLB 3                       | 2.5 years                   | 1 year               | China
8  David            | M      | 35-44       | MA in TESOL                    | CLB 2, CLB 3, CLB 4         | 1 year                      | Less than 1 year     | China
9  Nina             | F      | 35-44       | BA in English Literature       | CLB 4, CLB 5                | 2 years                     | 2 years              | Diverse groups
10 Sarah            | F      | 45 or above | BA in TESOL                    | Former CLB 4 & Lead Teacher | 21 years                    | 2 years              | China

3.2.1 Participant profiles

In order to provide a clearer picture of the participants' backgrounds, brief biographies of each follow.

Rose

Rose, who is in her early thirties, has a Bachelor's degree in Teaching English as a Second Language (TESL). She has been teaching in the LINC programme for three years and is currently teaching a mixed class of Literacy and CLB 1 students. She is a PBLA Lead Teacher and had been using PBLA for a year at the time of the interview. The majority of her students are elderly immigrants from China, along with a few Syrian refugees.

Linda

Linda, who is about 45 years old, has a Post Baccalaureate Diploma in Special Education. She has over 10 years of experience teaching Special Education, as well as Early Childhood Education.
She has been teaching in the LINC programme for the past 10 years and has been using PBLA for three years. She is currently teaching a multilevel Literacy, CLB 1 and CLB 2 class. Her students are mainly refugees from Syria, Iraq and Afghanistan, most of whom do not have literacy skills even in their first language. In addition, she has some students from China and a few students from other countries.

Olivia

Olivia, who is also about 45 years old, has a Bachelor's degree in Theatre and English. She has over 13 years of teaching experience in the LINC programme and has been using PBLA as a Lead Teacher for two years. Like Linda, she is currently teaching a multilevel Literacy, CLB 1 and CLB 2 class. Her students are mainly refugees from Syria, Iraq and Afghanistan. She calls her students "multi-barriered" because not only do most of them lack education in their first language, but they may also suffer from war trauma and other related issues.

Violet

Violet, who is in her mid-forties, has a Master's degree in Curriculum and Pedagogy. She has been teaching in the LINC programme for about 15 years and has been using PBLA as a Lead Teacher for seven months. At the time of this research, she was teaching a CLB 2 class with a diverse group of students from different countries.

Carol

Carol, who is in her late forties, has a Bachelor's degree. She has been a LINC instructor for 10 years and has been using PBLA for approximately three years. She is now teaching a diverse group of CLB 3 learners.

Maya

Maya, who is about 45 years old, has two Bachelor's degrees, one in English and another in Linguistics. She has been teaching in the LINC programme and using PBLA for the past two years. At the time of this research, she was teaching a CLB 3 class, but she also has experience teaching lower CLB levels. Her students are predominantly from China.
Kate

Kate, who is in her early thirties, has a Master of Education degree in a Professional Development Programme. She has been teaching in LINC for almost three years and has been using PBLA for a year. She is co-teaching two CLB 3 classes with another teacher. Her morning students are mainly housewives or elderly people originally from China, while her evening class students are mostly young people from diverse cultural backgrounds who work during the day and attend LINC classes in the evenings.

David

David, who is in his mid-thirties, has a Master's degree in Teaching English to Speakers of Other Languages (TESOL). He has been working as a substitute and part-time LINC instructor at four different LINC programmes for a year. As a result, he teaches different CLB levels, including CLB 2, 3 and 4. He has been using PBLA for less than a year. The majority of his students are from China.

Nina

Nina, who is in her mid-thirties, has a Bachelor's degree in English Literature. She is a Lead Teacher who has been teaching and using PBLA in the LINC programme for two years. She teaches a multilevel CLB 3, 4 and 5 class. Her students come from a variety of cultural backgrounds.

Sarah

Sarah, who is in her mid-forties, has a Bachelor's degree in TESOL. She has over 21 years of teaching experience and has been using PBLA as a Lead Teacher for the past two years. She used to be a CLB 4 teacher, but at the time of the interview she was working mainly as a PBLA Lead Teacher.

3.3 Data collection

After the research protocol was approved by the UBC Behavioural Research Ethics Board (BREB), I sent invitation emails to different LINC programmes across Vancouver. Those who responded to my initial emails and expressed interest in participating in this study were sent another email in which they were briefly introduced to the study and its goals.
I should note that, since LINC classes are offered through a variety of service providers, I used a snowball sampling approach, asking LINC programme administrators to pass on my invitation to their colleagues in other programmes. The LINC administrators who received the invitation by email distributed it to the teachers teaching from Literacy to CLB level 4 in their own programmes and to their colleagues in other LINC programmes. Fortunately, all 10 instructors who responded to the invitation email and showed interest in participating met the criteria for participation in the study. Therefore, they were all contacted and interview times were arranged. Prior to the interviews, the research participants were required to sign a consent form. Each participant was interviewed once. Two of the interviews were conducted online (via Skype), while the others were conducted face-to-face, based on the interviewees' preferences. Each semi-structured interview was about 60 minutes long. As a token of appreciation for participating in the study, the interviewees received a $10 Starbucks gift card. The interviews were audio-recorded with the participants' consent, and pseudonyms were used to protect the identity and privacy of the research participants. In the following section, I describe the methods used to analyze the qualitative data from the interviews.

3.4 Data Analysis

Once the interviews were concluded, the audio files were transcribed. The transcribed data were coded based on their relevance to the research questions. I organized the codes according to their correspondence to the facets of the concept of usefulness, as well as the recurring themes in relation to each research question. In order to present the voices of the participants, and to highlight and support the most salient themes, some quotations from the interviews were selected.
Excel was used to tabulate the results of the analysis. On another level of analysis, demographic information was used to provide an overall picture of the research participants and to investigate differences in experiences among teachers teaching at different levels in the LINC programme. The following information was included in the descriptive statistics:

- Teacher's educational background
- Teacher's years of experience and levels taught in the LINC programme
- Teacher's length of experience using PBLA in the LINC programme
- Teacher's level of teaching (e.g., "Which CLB level do you teach?")
- Teacher's age range
- Teacher's years of experience in teaching ESL/EFL in Canada or abroad
- General class information, such as students' cultural and educational backgrounds

The interview data were cross-referenced with the above demographic information to shed further light on the findings.

Chapter 4: Findings and Discussion

4.1 Introduction

In this chapter, the relevant findings in light of my four research questions are presented, along with a discussion of how the findings relate to the Literature Review in Chapter 2. This chapter is divided into sections, each of which addresses the most salient themes related to the research questions. The following research questions were investigated in this study:

1. How do teachers perceive PBLA compared to other traditional assessment methods?
2. From the perspective of LINC instructors, what is the usefulness of PBLA?
3. From the perspective of LINC instructors, what are the challenges and benefits of using PBLA?
4. What are LINC instructors' views on PBLA training?

Since there were some overlaps among the salient themes in different interview questions, I decided to present the findings according to the main themes that emerged from my interview questions.
Thus, the results related to the following themes will be presented:

a) Teachers' perceptions of PBLA compared with other traditional assessment methods
b) Teachers' perceptions of the usefulness of PBLA as a method of assessment
c) Logistical challenges in implementing PBLA as an assessment tool
d) Teachers' perceptions of PBLA training

4.2 Teachers' perceptions of PBLA compared with other traditional assessment methods

The first research question sought to investigate LINC teachers' perceptions of PBLA in comparison with other traditional assessment methods. I was also interested to know which method of assessment teachers would prefer if they were to choose between PBLA and other traditional assessment methods. In response to research question 1, most participants showed a generally positive attitude towards PBLA and commented that PBLA is, overall, advantageous for learners for various reasons. First, according to Maya, PBLA is "more relatable to the students," since it reflects a variety of skills, which is what students really need in life. Therefore, in her view, PBLA is better than other traditional assessment types because it is more practical: the aim of instruction is not merely preparing students to pass a test, but building skills that are useful for their lives in the Canadian context. Researchers in the past two decades (Davis, 1994; Ginther, 2002; Long & Macián, 1994; Omaggio Hadley, 1993) have promoted an approach utilizing tasks that simulate real-life situations for assessing both receptive and productive skills. Since PBLA follows this trend by using students' linguistic performances in a variety of social contexts (Cummins & Davesne, 2009), it is potentially more beneficial than standardized and traditional tests.
In addition, Linda perceived PBLA to be more beneficial than standardized tests when it comes to assessing the progress of lower-level learners, because it provides a picture of the learners' strengths and weaknesses and illustrates what they need to improve on, as indicated in her comment below:

If you teach lower levels like Literacy and one, you are thinking about how they think and how did they perform it. It's not just whether or not they got the correct number or word — but for example if it's a form filling task, did they form the letters right? Are the words on the line? Are the letters legible? Are there spaces between the words? Did they put the words in the right parts related to the apartment number address? — you are looking at all of that. You are looking at what they've learnt and what are the skills that they are using within the assessment task.

However, while the majority of the interviewed teachers were positive about the underlying rationales and theories behind PBLA and its numerous benefits for learners, they had mixed feelings about using it as a replacement for traditional assessment tools. In fact, the majority of the participants expressed concerns about using PBLA as the only measure of assessment for summative purposes. In the following sections, I provide more details in this regard.

4.2.1 Conclusion

To recapitulate this section, in investigating teachers' perceptions of PBLA compared to traditional assessment methods, the findings showed that while PBLA has numerous benefits for students, as with any other form of assessment it has some drawbacks, which can potentially pose challenges for both teachers and learners.
The merits of PBLA mentioned by the participants can be best encapsulated in the following quote:

Portfolios provide a portrait of what students know and what they can do, offer a multidimensional perspective of student progress over time, encourage student self-reflection and participation, and link instruction and assessment. (Delett, Barnhardt, & Kevorkian, 2001, p. 559)

In addition, based on the findings, the teachers in my study unanimously agreed that, compared to other traditional assessment methods, PBLA is more advantageous for learners. However, the majority of the participants believed that PBLA should be used only for formative assessment, and that for summative assessment purposes it should be used in combination with other assessment methods.

4.3 Teachers' perceptions of the usefulness of PBLA as a method of assessment

In this section, I discuss the findings related to the theoretical framework of the study in order to explore the usefulness of PBLA as a method of assessment from the perspective of LINC instructors. As explained in previous chapters, Bachman and Palmer's (1996) concept of test usefulness comprises the following major test qualities: validity, reliability, authenticity, interactiveness, impact and practicality, each of which is explored in what follows.

4.3.1 Validity

In order to investigate teachers' perceptions of the concept of validity in PBLA, three different questions were asked. The first question concerned whether teachers clearly specify which language abilities are being assessed when developing each assessment task. In addressing this question, all participants noted that they tend to think about the language abilities they want to assess prior to, or while, developing the assessment tasks.
Some teachers, including David, Kate, Olivia and Violet, reported using backward planning; that is, devising the assessment prior to planning their lesson, or, in David’s words, “reverse engineering” the assessments. In the first step, they develop the assessment task by thinking about the target language ability intended to be assessed. Subsequently, they plan their lesson and design the skill-building and skill-using activities by cross-referencing them to the CLB document to meet the needs of the planned assessment. For example, Kate and David commented:

    Again, I think it goes back the CLB document and again just cross referencing, making sure I'm covering all the areas and working backwards from that as we're practicing and building the skills, making sure that there's a focus on the areas which I want to see the students perform on. … but, I worry that we're just teaching to these tasks and we're teaching to the assessments. (Kate)

    I don’t know if it’s the way everyone else does it, but I reverse engineer it. So, usually I start with the basic idea of what I expect or want the learners to be able to do. So, I think of the task, the thing that they are going to do and then I kind of roughly design some way of testing or teaching that in the class. Then, I go to the CLB descriptions to see how it matches up and if it doesn’t, how I can change it to match up. My class is split CLB two and three. So, that poses a few issues as well because, I have to think about if I gave the same assessment tool to everyone how fair it would be. That’s my basic approach. (David)

Linda, among others, said she uses the “Can Do Statements” document and looks at “what it is that learners have to be able to do”, and then matches the criteria with the intended assessment task in order to specify which language ability she wants to assess.
As illustrated in the interview excerpts above, all teachers mentioned using the CLB document in designing their assessment tasks, mainly because the CLB document lists what students should be able to do at a certain level. For example, Maya explained that, in order to make sure she is assessing the language abilities related to the particular level of her students, she always looks at the criteria for assessing that particular language ability (one level below and one level above the target level) to check whether her expectations are too low or too high. However, it is worth noting that creating assessment tasks that align with the CLB document is not always easy. Some teachers, including Kate, noted that it is really challenging to develop teaching materials and assessment tasks that precisely conform to a specific CLB level, as shown in Kate’s quote below:

    … understanding the language that's in the CLB document is very challenging, and I find there's a lot of cross referencing between the skills and the competencies and the areas of ability and I'm scared to leave one term out because and it creates a loophole and as you're making the assessment you realize, well, the assessments don't seem accurate if you miss out one component. So I'd say that's the greatest challenge. Just making sure everything is covered …

Bachman and Palmer (1996) define construct validity as “the extent to which the results of an assessment can be interpreted as an indicator of the ability we intend to measure, with respect to a specific domain of generalization” (p. 21). The interview data presented here demonstrate teachers’ tacit awareness of the concept of validity in developing assessment tasks. The participants’ comments illustrate how teachers endeavour to carefully design the assessment tasks to tap into the language ability they intend to assess.
However, developing assessment tasks and selecting the appropriate language abilities to be assessed is not always straightforward. As David pointed out, it is quite challenging to devise assessment tasks that target different language abilities in split-level classes.

Another point worth discussing here is the issue of teaching to the test, as raised by Kate, who said, “I think the main thing is that, I worry that we're just teaching to these tasks and we're teaching to the assessments”. Many participants noted that they plan their lessons by “having the end in mind” (Violet). In other words, the whole purpose of using skill-building and skill-using activities is to prepare learners for the assessment tasks. Therefore, based on the teachers’ accounts, it can be fairly concluded that PBLA potentially leads to “measurement-driven instruction” (Popham, 1987), which occurs when testing influences and directs teaching and learning. Thus, one may view this feature of PBLA as a washback effect, which refers to tailoring classroom activities to the demands of the test (Buck, 1988). However, according to Bailey (1996), washback can be beneficial when it incorporates “1) language learning goals; 2) authenticity; 3) learner autonomy and self-assessment; and 4) detailed score reporting” (p. 268). Interestingly, PBLA takes all of these factors into account. For instance, according to the PBLA Guide (2014), goal setting and self-assessment are two important elements of PBLA, which can potentially lead to learner autonomy. Also, using real-life resources and assessment tasks can serve to promote the authenticity of PBLA. Moreover, according to the PBLA Guide (2014), LINC instructors are required to provide action-oriented feedback that helps students move forward, rather than merely using “ego-oriented” feedback such as “Good job” that does not clarify what the students need to improve.
In other words, action-oriented feedback should indicate what “the students should continue to do, what they should start to do or do more of, what they might think about as their next challenge, and what they should stop doing” (p. 16). Hence, action-oriented feedback is similar to the “detailed score reporting” mentioned earlier. Thus, it is fair to assume PBLA can have positive washback effects. Moreover, PBLA potentially results in “systemic validity”, which refers to the effects of assessment results on learning and teaching: “Systemic validity is achieved when desired educational outcomes are attained by virtue of assessing those same outcomes” (Hickey, Taasoobshirazi, Schafer, & Michael, p. 189). As a case in point, in different instances the participants referred to the fact that PBLA can inform instruction. For instance, remarking on learners’ portfolios, David described some benefits concerning both teachers and learners:

    The end product, I guess will allow the teacher to improve the actual teaching because the whole process is informative. So, the teacher can observe what’s going on with the learners, give them accurate and specific feedback to help them improve their learning and be involved in and take ownership of their own learning.

According to David, PBLA can potentially help teachers improve their teaching and helps learners be involved in their own learning. David’s comments support the view that “perhaps the greatest theoretical and practical strength of a portfolio, used as an assessment instrument, is the way it reveals and informs teaching and learning” (Hamp-Lyons & Condon, 2000, p. 4). It was mentioned in the interviews that by observing the students’ performance on the assessment tasks, teachers can find out whether they have achieved their instructional goals, and they can get a better understanding of what the students need more help with.
Hence, the “systemic validity” of PBLA is enhanced when teachers fine-tune their assessment tools and make instructional changes that create better opportunities for learners to develop the language skills intended to be measured in assessment tasks. Similarly, another distinct benefit of PBLA for learners is that “it enables the matching of the means of assessment with the goals of a curriculum, especially as opposed to external or one-shot exams that do not match what teachers have taught” (Yin, 2014, p. 5). As the participants reported, the assessment tasks in PBLA are tailored toward what has actually been taught in the classroom; thus, the assessments are specific to each individual class, and according to Sarah, “there are no surprises for the students”:

    There are no surprises for students. You know what is taught and what is assessed. The students are aware of the criteria of what exactly goes into an assessment task and how they will be assessed. There is lots of repetitive exposure to the criteria, that is self-assessment and peer assessment and teacher assessment. So it really solidifies learning, consolidates it all and maybe reinforces the learning within the work.

As Sarah pointed out, through self-assessment and peer assessment activities, PBLA allows students to have plenty of opportunities to practice the target language skills they are going to be assessed on, and in the meantime, as they prepare for the assessment task, they are also familiarized with the assessment criteria. Therefore, they know what is expected of them in the assessment task. Furthermore, Violet referred to the theoretical basis of PBLA and explained that PBLA incorporates “strategic competence and also social competencies which really reflects what the language is all about”, unlike standardized tests, which tend to target language skills and elements discretely.
The second question concerning validity pertained to how teachers make sure the assessment tasks are relevant to the language ability they intend to assess. Interestingly, although all participants use the CLB document and the “Can Do Statements”, some teachers were somewhat uncertain about the degree to which the assessment tasks they design are relevant to the language ability they aim to assess. As a case in point, in the interview excerpt below, Linda touched upon the challenges in making decisions about the relevance of the assessment task to the language ability that is being assessed:

    I think ideally PBLA is a really good thing, but trying to implement it is quite complex. It looks simple on paper so you do this way and then you do this and then you set up your criteria, it looks simple but when you try to do it there is so many other complexities that come up especially in deciding what criteria to use and how to weight it. I just had the thing today like: should I put it under numerals as to reading prices or under developing reading skills reading numerals. I had to decide which one and because we have to stagger them, I put it as the reading one because we did a theme on numbers last month. It’s like you have to make all these decisions that take time and it takes a teacher’s time to think about should I do it this way or should I do it this way? Does this really fulfill the task?

Linda’s comments illustrate the complexities in deciding which target language abilities are relevant to a given assessment task. Her concerns are in line with Bachman (2002), who believes that “the process of demonstrating the validity of intended inferences about specific areas of language ability (e.g., grammar, vocabulary, cohesion, rhetorical organization, and sociolinguistic appropriateness) becomes more difficult” (p. 5) in performance assessment tasks such as portfolio assessment.
The final question regarding validity investigated how teachers make sure the rubrics they develop and their evaluation procedures reflect the target language ability they want to measure in assessment tasks (“When you design rubrics for assessing the tasks, how do you make sure the scoring procedures reflect the language ability you want to measure?”). Overall, the participants’ responses to this question indicated a consensus among teachers that the CLB document and the “Can Do Statements” constitute the two essential documents that play a pivotal role in guiding teachers through developing assessment tools and tasks. They explained that the “Profile of Ability” in the CLB document adequately describes what to expect from learners at each level and provides the relevant criteria for designing the assessment tools.

In terms of the assessment tools, since teachers have to share the evaluation criteria with their students, most participants reported using very simple rubrics, or just creating a short and straightforward checklist or rating scale. The findings showed that teachers tend to shy away from creating descriptive rubrics because they are very time-consuming to make and hard to explain to students. Teachers who were teaching lower levels noted that they make students aware of the assessment criteria through skill-using activities. They model what constitutes success through a variety of activities, and through extensive practice they make sure students understand what is expected of them in the assessment task. On the other hand, instructors who were teaching CLB 3 and 4 reported that they usually show the assessment criteria on the board and go over the details with the students to illustrate what they should be able to do on the assessment tasks. As an example, in the quote below, Maya explains how she clarifies to her (CLB 1 and 2) students what language abilities they would need to demonstrate in assessment tasks.
Interestingly, her approach seems to have raised her students’ awareness of what they are expected to do in the assessments:

    When we skill build, I show them what they need to be doing. Especially like LINC one and two, things like eye contact or saying “pardon me” when they need something repeated, we constantly do it in class and if people forget to do that, often times the other students will say hey, you forgot to…, you didn’t look at me. So they actually correct each other, and they know what the criteria for greeting people is, especially some of the students have been in the class a little bit longer, they know.

According to Bachman (2002), in the design and use of performance assessment tasks, three elements are required to enhance validity:

(a) a cognitively-based definition of the constructs-abilities we want to assess
(b) a clearly identified and defined domain of target use situations and tasks
(c) a set of distinguishing characteristics for describing both the assessment task and the target use tasks. (p. 9)

Based on the findings presented above, all teachers in my study utilize the CLB document and the “Can Do Statements” as reference points to make sure the assessment procedures accurately reflect the language ability they aim to assess. In my view, it can be reasonably concluded that teachers’ use of the CLB document and “Can Do Statements” reflects the three elements mentioned in Bachman’s quote above. Moreover, by deploying skill-building activities and sharing the assessment criteria with the language learners, teachers strive to make learners aware of the assessment procedures as well as the language abilities they are expected to demonstrate in order to successfully complete the assessment tasks. Therefore, I think the key aspects of validity are observed in designing assessment tasks in PBLA.
However, the extent to which PBLA is a valid measure of assessment would need to be further investigated by looking at the actual student portfolios and the assessment outcomes.

4.3.2 Reliability

In order to inquire into teachers’ views on the reliability of PBLA, I asked three questions. The first question (which investigated inter-rater reliability) looked into how teachers make sure a given student would receive the same evaluation from different teachers on one portfolio assessment task, and whether teachers have received any training in that respect. The second question looked into how teachers ensure that students at the same level of proficiency receive similar scores on their portfolios. Finally, the third question was related to the characteristics of the assessment setting. Before presenting the interview findings for this section, I should note that since the majority of teachers in this study – eight out of ten – were still in the training phase of PBLA, they were not yet fully implementing it, and as such could not provide thorough responses to the questions regarding reliability. To be more specific, only Carol and Nina reported having fully implemented PBLA, including the final portfolio assessment. The remaining eight participants were still being trained and had not yet proceeded to the stage where they could evaluate student portfolios. Therefore, when I asked them about inter-rater reliability, and whether they had received any form of calibration training to ensure their PBLA assessment is consistent with that of other teachers, they unanimously said that they had not, but that they hoped to receive some training in the future. However, it should also be noted that, quite surprisingly, even Carol and Nina, who had completed all stages of the PBLA training, said they had not received any calibration training either.
Agreement and consistency of evaluation (scores) among teachers has always been viewed as a significant aspect of reliability in assessment (e.g., Herman & Winters, 1994; Johnston, 2004). However, the data in this study suggest that the reliability of portfolios as a means of assessment needs more attention from policy makers and researchers.

My second question concerning reliability was how teachers ensure that students who have the same level of proficiency receive similar evaluations on their portfolios. According to Herman and Winters (1994), it is essential that those who judge student portfolios agree on “what scores should be assigned to students’ work within the limits of what experts call measurement error”. In their view, if raters do not agree on how a portfolio should be scored, or if a particular student’s work does not receive the same or nearly similar scores from different raters, “then student scores are a measure of who does the scoring rather than the quality of the work” (p. 49).

As mentioned earlier, since most of the participants had not yet done the final portfolio assessment, in response to this question they referred to their experiences of evaluating the weekly (or biweekly) assessment tasks. Salient among their responses was teachers’ uncertainty as to whether or not they were performing the evaluations properly. According to Violet, “I’m doing what I can … but there is that unsettling feeling that you're not sure whether what you are doing is right”. Teachers also showed a great degree of concern about the consistency of assessments among different teachers and described calibration as one of their biggest struggles in assessing portfolio tasks. As Olivia noted, “we’re still all learning to use the CLB document and how effectively or correctly we are using it is a question”.
Olivia and Violet’s concerns indicate that, while teachers are making considerable efforts to follow the PBLA guidelines and the CLB document in evaluating the assessment tasks, there is a prevailing sense of uncertainty as to whether or not they are doing it “right”. Echoing the same concerns regarding calibration among teachers, Maya raised the point that since teachers have a central role in making decisions about developing the assessment tasks in PBLA, it is possible that their assessments may not be “uniform” with other teachers’ assessments. As Koretz et al. (1994) argue, this lack of task standardization may impede reliable scoring in portfolio assessment. Maya elaborated on her personal experience regarding this issue: when she took over her current CLB 3 class, the former teacher had planned to promote a student to the next level, but when that student did a speaking test with the Regional Lead person – who usually does the speaking assessments – he did not do well, and thus could not pass the speaking test for CLB 3. On that note, Maya went on to say:

    … then, once I got to know the student, I thought to myself why did this other teacher think he was ready? Because in my mind, I don’t think he’s ready. I wonder why she thought he was ready! Then, when you look through their PBLA assessment tasks, well technically, she thought he had passed all these assessments. But the assessments from one teacher to another might not be consistent. Like the level of task difficulty, the assessment rubrics, how they mark things, like what they think is achieved versus what skills they need to build more before they achieve whatever task or whatever level they are. So, in that way sometimes PBLA is not … it can be improved I guess.
    Maybe if more teachers shared their tools …

Maya’s account supports the view that portfolio scores have low reliability and should perhaps be used mainly for formative assessment purposes (e.g., Reckase, 1995). Similarly, as Johnson, McDaniel and Willeke (2000) argue, the reliability of open-ended assessments such as portfolio assessment has been frequently challenged, and “rater judgment in the scoring of alternative assessments is a source of measurement error” (p. 68). As such, depending on how the assessment tool is developed, PBLA assessment results may vary. As indicated in the interviews, there may be inconsistencies in how teachers weigh different components of the assessment tasks, which can make the assessment tools quite subjective. That is one of the reasons why the reliability of portfolio assessments can be called into question. Kate’s comments below exemplify the possibility of inconsistency of assessments among different teachers:

    I do think PBLA is good to get a more well-rounded portrait of the student as a learner and to see what their abilities are. That said, it also depends on the quality of the assessments and as much as we all sort of try to standardize and keep to the same level, every teacher has got a different understanding, or different feeling of what's important and what should be tested and what should be weighed. I work with my colleague who's also CLB 3, we often collaborate and so our assessments are quite similar, but what he thinks is important and what I think is important sometimes is very different, so sometimes it depends on what we’ve been focusing on. It's very subjective!

Teachers’ concerns regarding the issue of assessing portfolio tasks highlight the dire need for more training in this area. According to Johnston (2004), “‘Training’ is a vital component in the ability of graders to reach agreement” (p. 402).
In addition, several studies show that calibration sessions, along with collaboration among teachers, can have positive effects on achieving rater agreement (e.g., Baume & Yorke, 2002; Heller et al., 1998; LeMahieu et al., 1995; Nystrand et al., 1993; Supovitz et al., 1997). In line with this view, Maya strongly supported the idea of collaboration among teachers who teach the same levels, and argued:

    If teachers don’t collaborate, I think we’re losing the good ideas that everybody has done things that didn’t work. There is so much we could learn from stuff that didn’t work, and also there’s so much good stuff you get from looking at other people’s assessments and how they create their assessments. I think that’s the key to building success with PBLA and with assessment tools.

Finally, the last question relevant to reliability probed the extent to which the characteristics of the test setting vary from one administration to another. In response to this question, all participants provided more or less similar answers. The findings showed that PBLA assessment tasks are generally conducted in the classroom context. However, some teachers take different approaches to conducting the speaking assessments. For example, Rose noted that she sometimes conducts the speaking and listening assessments on field trips and in real-life contexts, such as in a supermarket. She also added that she provides her students with a variety of options to choose from. For instance, her students can choose whether they prefer to be assessed in front of the whole class or outside the classroom in a quiet area. They can also choose to be assessed individually (e.g., a role-play with the teacher) or with other students. Another teacher, Linda, reported that she takes her students out of the classroom for the speaking assessment and has them switch roles in role-play tasks to make sure students can play both roles of the role-play.
In addition, she sometimes video records the students’ performance for students’ self-assessment, for providing more detailed feedback to the students, and for making sure her assessments are accurate enough, as noted below:

    when I record it, then I can go back again and again and don’t have to trust my memory; and then they can look at it or they can listen to it and then they can decide for themselves whether or not they have met the criteria and they re-record if they want to and when they feel happy with it they can give it to me or send it to me, email it to me. Then I can look at it or listen to it again and assess it.

In the example above, Linda elaborates on how she uses audio or video recordings of students’ speaking performance. In my view, although recording students’ performance can potentially increase the validity and reliability of the assessment, it may make students uncomfortable and therefore impact the authenticity of the assessment. Having said that, I also think that the way Linda utilizes the recordings can potentially benefit the students, because it allows them to do self-assessment and gives them a chance to redo their speaking assessment in case they are not satisfied with their own performance.

Another point worth noting in the data concerns students’ placements and their progression to the next level under PBLA. The interview data revealed that since PBLA has not been fully implemented in the majority of the LINC programmes in B.C. – and given that most programmes are still in the training phase – there is not much consistency among programmes in their approach to progressing learners to the next level. Hence, based on what the participants reported, the programmes are taking different measures in this regard.
For instance, Violet, Sarah, Kate, Maya, Olivia and David reported that since their PBLA training was in progress, they were still using summative assessments – the former CLB standardized tests used prior to PBLA implementation – alongside PBLA. However, Carol, Nina and Rose said that they are fully implementing PBLA in their classes.

In terms of the challenges in progressing learners, according to the participants, one of the typical issues in LINC classes arises when students are misplaced – due to various reasons, such as test anxiety or other factors – and their actual language proficiency is higher than their placement test results indicate. Another common scenario occurs when learners have multiple levels of proficiency in different skill areas. Such cases can potentially pose challenges for LINC instructors, as indicated in the following comment by Rose:

    The difficult part in PBLA is that I cannot evaluate the student’s performance without seeing enough artefacts. There needs to be eight to ten artefacts from each skill in order to evaluate that student. For example, I have a student who is a higher level than my class, I know that she is not sitting at the appropriate level, but there’s nothing I can do! So these are some aspects of PBLA that are not so clear. There are still questions for myself as a Lead Teacher. I’m concerned about other instructors that how will they be able to deal with those students that are way higher or even lower than the rest and are sitting in the same class with other classmates but they cannot move up or down without doing enough number of assessments.

I asked Rose if it would be possible to use the CLB standardized tests (like some other LINC programmes) to resolve such issues, but she responded:

    We tried that actually last year. When all the instructors were actually in the middle of their trainings, but that left our administrators with bunch of things that weren’t working in the system.
    Because we work for federal government and they are asking for lots of reports about every single student who comes and registers. We had exit progress tests for those students to see which levels they were at. But we are not doing that anymore, and the reason is that it left our administrators with such a mess in the school and also it spoiled other students. You know how the message spreads out in the school and they tell each other and they came to us asking, “Can I do a progress test? I want to move up or down.”

Rose was really concerned about this issue, and believed that the current PBLA system is not fair to some students. Nevertheless, she explained how she tries to address it:

    I know that is not fair, I know that is not fair for some of the students. I have four students in my class that I would definitely say they are CLB 2 or 3, but they will have to be in my class. It was so difficult for me to explain why I can’t move them up, and two of them are refugees, and they are looking for a job. They have needs, but you know, I can’t do anything about it, I can’t ask them to go and do a progress test. That’s why I’m doing assessments every week in order to have enough evidence to progress them faster and pass into the next level.

Referring to the same challenges, Carol and Olivia elaborated on how they deal with similar situations. For instance, Olivia described three types of challenges in this regard. The first challenge is when students “have the expectation to move up, but they are not ready, and there’s no test to say that they’re not ready exactly … and that’s kind of a negotiation thing”. According to her, this mostly happens when students are preparing to apply for their Canadian citizenship and the results of the PBLA assessment tasks are not compelling enough for them, as a result of which they expect their teachers to take some measures to move them up faster.
The second challenge arises when some students are ready to move up – since they are learning at a faster rate than their classmates – but do not yet have the required number of artefacts. In the following example, Olivia talked about a student who was really motivated and hardworking and was ahead of the class in many ways, but did not yet have enough portfolio artefacts:

    I’m sure she would be so bored just waiting for eight assessments. I mean, she’s skilled. It’s hard to keep somebody there until everybody does another assessment. For somebody like that, maybe I would have to set her aside and give her a bunch of assessments to do, so that she has them to then move into the next level. I don’t know, I think that’s how I would do that but I don’t think anyone knows the answer to that. So I would give her assessments on her own so she could move … but that’s a lot of work for me.

The third challenge, according to Olivia, is the case where a learner completes her PBLA assessment tasks successfully, but at a very low speed. In the following quote, Olivia expressed her uncertainty as to whether such students are eligible to move up:

    Then there’s this other person who does the assessment, but is unbelievably slow at it. Does she still qualify to go into level three even though it took her forty-five minutes to do the assessment? She did it, she did it really well and she passed, but she wouldn’t be able to keep up with the class because she’s a very, very slow processor. So, those kinds of things, in a test, in a standardized test, stops, you won’t have time to answer the questions. You would fail if you didn’t answer the questions. … so, if you’re going to use PBLA, this is just the kinds of things that you’re going to have to actually deal with, but I think PBLA is better though, overall.
Carol highlighted another similar challenge in using PBLA, and explained that in her programme, they do the portfolio review only twice a year. Nevertheless, sometimes in order to address the needs of her students, she does an extra (mini) portfolio review on her own:

I have students who are too high now, then they’re bored and unhappy and they need to go up … so I’m doing a mini one myself, on my own time. I’m not having the interviews in 15 minutes, it’s much shorter, but I still have to do all those things, review the binder, write the report card and I still have to talk to them.

On a different note, Carol mentioned another challenge in using PBLA. Reflecting back on her many years of teaching experience, she said, “I’ve taught LINC for 10 years, it is possible for a student to get completing on a whole bunch of assessments and for me to know they can't go to the next level”. As suggested by her comment, she deems PBLA an appropriate, yet not ideal, method of assessment, as it is quite conceivable for a given student to complete all the tasks required for moving up to the next level and yet not be adequately ready for the higher level.

To sum up, despite the fact that many studies have recognized the merits of portfolio assessment for formative purposes (e.g., Fox, 2008; Little, 2005), based on the participants’ accounts presented in this section, it seems that using PBLA for summative purposes – and as the only method of assessment – can potentially be problematic (e.g., Fox, 2008; Hargreaves, Earl, & Schmidt, 2002). This is particularly evident in cases where students are demanding to move up, but do not have the required number of portfolio artefacts. In terms of the assessment settings in PBLA, the findings show that the test setting is usually the same: most of the assessments are conducted in the classroom context.
However, for assessing speaking, teachers usually conduct assessments outside the classroom in a quiet area, or in some cases in the classroom in front of the whole class.

To conclude the discussion of reliability, the findings showed that reliability is potentially a major challenge in PBLA. As mentioned earlier, the majority of the participants in my study had not conducted a full portfolio review and therefore had not experienced assessing completed student portfolios for summative assessment purposes. However, all the participants reported that they regularly assess the individual assessment tasks that go into student portfolios. When they commented on their experiences in assessing portfolio assessment tasks, they showed a great degree of uncertainty as to whether their assessments are consistent with those of other LINC teachers who teach the same levels. In fact, a couple of participants provided examples of cases where there was disagreement among teachers in decision-making regarding how a given portfolio assessment task should be evaluated and what should be weighed more heavily. Therefore, the findings of this study appear to be in line with Brown and Hudson’s (1998) point of view that in portfolio assessment, “Reliability may be problematic because of rater inconsistencies, limited numbers of observations, subjectivity in the scoring process, and so on” (p. 662).

4.3.3 Authenticity

In order to explore the authenticity of tasks deployed in PBLA, I asked the teachers how they make sure the assessment tasks correspond to the real-life language needs of learners. All participants reported that at the beginning of every month they conduct a needs analysis to find out what learners would prefer to learn in order to address their real-life language needs. The results of the needs assessment are then utilized as a basis for selecting the (real-life) themes that are to be taught and assessed in each individual level and class.
In addition, all teachers reported using the “Can Do Statements” and the CLB document as their framework of reference. It should be noted that the “Can Do Statements”, as the name suggests, describe what students are expected to be able to do at different levels. Most of the teachers pointed out that they often use the “Can Do Statements” as a guide for designing the assessment tasks.

Interestingly enough, almost all the teachers were of the opinion that designing assessment tasks that resemble real-life tasks is a real challenge, which demands a great deal of preparation and teacher time. As a case in point, developing assessment tasks for the listening skill was deemed a major area of difficulty for the teachers across all the CLB levels. All the participants affirmed that finding listening materials which are in line with the CLB descriptors is a formidable task, and that as a consequence they often have to create their own listening materials both for teaching and assessment purposes, by recording their own voices and sometimes the voices of their colleagues. As far as other skill areas are concerned, instructors who were teaching lower CLB levels stated that they often have to heavily adapt the existing teaching materials or create everything from scratch. The CLB 3 and 4 teachers also claimed that they try their best to simplify and adapt existing real-life texts such as library membership forms or job applications for teaching and assessment use. It is worth mentioning that despite all the efforts teachers make to develop real-life teaching resources – to help students learn the language skills they need in the Canadian context – some students may never use what they have learned in their real lives. For example, Maya pointed out that her students are predominantly immigrants from China who live in neighbourhoods where almost all the services are available in Chinese.
Therefore, as Maya put it, “it’s hard for us to make assessments in English when we know for some students, it’s not a real-world task because they can get help in their L1”. In addition, in spite of the considerable amount of “unpaid” time teachers have to devote to developing real-life teaching materials and assessment tasks, all the participants expressed a degree of uncertainty as to whether the tasks they have created indeed resemble real-life tasks, given that they are all simplified versions of what actually exists and happens in real-life situations. In fact, “Texts produced in the real world differ (inter alia) in complexity depending on their intended audience and the amount of shared information between the parties involved in the discourse” (Lewkowicz, 2000, p. 46). As Maya pointed out, it is a challenge to make real-world assessment tasks, especially for lower levels, because teachers have “to trim so many of the edges off of a real-world task for LINC one or two”. Therefore, every assessment task that is used in the classroom is “an artificial version of the real-world task” and “only an approximation of what it would really look like in the world”; in fact, there may be very few occasions on which lower-level learners have to do a real-world task without the help of a family member or friend.

To sum up, it should be noted that over the last few years, there has been a burgeoning interest in assessing the skills of students using tasks that are similar to classroom instructional activities and to those used in real life (e.g., Gifford & O’Connor, 1992). Utilizing assessment tasks which simulate real-life language use can potentially have a positive effect on test takers’ performance. As Bachman and Palmer (1996) put it, the relevance of the assessment tasks to the learners’ target-language use “helps promote a positive affective response to the test task and can thus help test takers perform at their best” (p. 24).
When students can relate to the assessment tasks, there is a high chance they can use their prior knowledge or experiences to respond to the task, and they may therefore perform better on it. However, as the findings of Lewkowicz’s (2000) study suggest, “A high degree of correspondence between test and target-language use tasks may be a necessary but insufficient condition for authenticity to be discerned” (p. 61). According to Lewkowicz, the authenticity of test tasks also depends on a range of conditions that interact with the test input and therefore impact test takers’ perceptions of the assessment task. As far as PBLA is concerned, according to my research participants’ views, creating real-life, authentic assessment tasks aligned with the CLB descriptors is highly challenging, and even far-fetched in the case of lower CLB levels. In addition, in terms of assessing the listening skills, the quality of the listening materials created by teachers is questionable, since it may (at least potentially) negatively impact learners’ performance. Therefore, it can be reasonably concluded that although PBLA advocates the use of authentic materials for instruction and assessment purposes, the quality of such materials is a matter of debate. In addition, the extent to which teacher-made authentic materials impact the reliability and validity of PBLA assessments – which is beyond the scope of this study – should be further investigated.

4.3.4 Interactiveness

The interactiveness aspect of PBLA entailed three main interview questions. The first asked teachers what measures they take to ensure students have the topical knowledge required for task completion. The second question investigated how teachers make sure the assessment tasks are suitable for the particular language learners in their class.
Finally, the last question asked if teachers consider the fact that an assessment task may evoke an affective response that would make it relatively easy or difficult for some learners to perform at their best.

In response to the first question, all the participants affirmed that they engage students with skill-building and skill-using activities, and do “lots of practicing” and “scaffolding” with them to make sure their students have the required topical knowledge for completing the assessment tasks. As a case in point, David maintained that the assessment tasks are “usually almost identical to the skill-building and skill-using activities except the content has being modified or changed in some way so students are doing essentially the same skills or otherwise it’s not really a fair assessment probably”. In a similar way, Linda explained that through skill-using activities, they “try to elicit from learners what do they already know and then we build or expand on the knowledge that they already have”. Additionally, Nina explained that she uses needs assessment to learn about what topics interest students and to make decisions about what should be taught. After that, through warm-up activities and skill-using tasks, she tries to find out how much students already know about a particular topic, and then builds on that to develop new lessons. Therefore, according to the participants’ accounts, through skill-using activities, learners are provided with many opportunities to obtain the topical knowledge they need for completing the assessment tasks.

In terms of the second question, the participants reported that they gauge the suitability of the assessment tasks for their learners in two very different ways: first, by doing the needs assessment, and second, by looking at students’ performance on assessment tasks.
For instance, Rose, Carol, Violet, Sarah, David, Nina and Maya were of the opinion that by doing needs assessment, students can indicate what they want to learn. Hence, when teachers plan their lessons based on the needs assessment, since all the skill-using and assessment tasks are developed on that basis, they can make sure that what they do is suitable for their learners. As an example, Nina pointed out that they work on a variety of topics that an immigrant would need on a daily basis. Some of these topics include banking, employment, health, getting a driver's license, buying or renting a house or apartment, taking a bus, finding their way around the city, etc. She also noted that depending on learners’ level of involvement in Canadian society, their language needs may change over time, and that is why teachers do a needs assessment at the beginning of every month.

However, other teachers, including Kate, Olivia, Linda and Maya, responded to this question differently, and focused more on the assessment aspect of PBLA. To elaborate on this, Maya said that by observing students’ performance during skill-using activities, she can understand whether the assessment tasks and tools she has created are suitable for her learners. As she noted:

I think a lot of times I kind of know what they should know during the time we’re doing skill-using activities. I know what’s challenging or what’s doable, what they can do on their own, what they can do with their classmates’ help, what they needed help from me, how much help from their notes did they need. So I spend a lot of times, watching them as they’re skill-building and then I know whether the assessment that I’ve made, the tools that I’ve made is going to be suitable or fair for the majority of the students.

Along similar lines, Linda believed that in order to find out whether an assessment task is suitable or not, teachers can refer to the “Can Do Statements” to see what they should expect of the learners.
She also concurred with Maya that observing learners in contexts where they can use the language is helpful, as indicated in the following example by Linda:

Well, I have to say sometimes it’s trial and error, but sometimes we do things that are a little bit too hard or a little bit too easy and that’s how we learn if something is suitable or not. The other thing is the “Can Do Statements” and your observations, because you have been doing skill-building, if they haven’t been getting that then you go back and do more skill-building. Then you set up situations where they can use the language. It’s also good to have a field trip and take them out so they really have to use it, and then observe them. Do they get stage fright or are they okay walking up to a store clerk and saying “Excuse me where is your spaghetti sauce?”

Quite similarly, Olivia and Kate reported that they pay attention to the assessment outcomes to gauge the suitability of the assessment tasks for their learners. As Olivia put it, “if they can do it, most probably, it’s suitable. If all of them fail the assessment, then we know that we either haven’t done a lot of scaffolding or the assessment is too hard.” Furthermore, Kate explained that she has two CLB 3 classes, and noted that although they are both at the same level, she cannot use the same assessment tasks (that she uses in the morning) in her afternoon class. She added,

So how do I know if they're appropriate, I'm not sure! Sometimes, I don't, until I'm marking them and then I realize that I've set the bar way too high for one group and then at an appropriate level or perhaps too low for the other group. For example, for my evening students, if I use the same assessment I have to modify it because I know they're not that high or they have strengths in other areas, but then it becomes a different assessment and maybe perhaps a different level, so I'm still trying to figure that out.
Finally, participants provided a variety of answers in response to the last question, which sought to investigate whether or not – in developing assessments – teachers consider the fact that an assessment task may evoke an affective response that would potentially impact learners’ performance. For example, Linda indicated her awareness of the sociocultural aspects of designing assessment tasks, and elaborated on the fact that the majority of her students are refugees who come from Syria, Iraq and Afghanistan. She pointed out that in developing assessment tasks, she is careful to use images that are culturally appropriate for her learners, as stated below:

… because we work with refugees we have to think about things. If we are trying to get reading resources or listening resources from other sources, we have to kind of screen them and say, okay is there anything that might indirectly hint at any kind of racism or anything that makes cultural assumptions or allude to cultural things that people either may not understand or may misinterpret. … we use photographs that would not be disturbing, preferably. Although I do think of a couple of situations where it wasn’t in an assessment, but I was trying to pull up Google images of like motorcycles and one of the images, in the middle one of the bigger images, was this really sexy lady in a bikini on a motorcycle and that wasn’t what I wanted to show, but that’s what came up. That’s taken into consideration, I mean you try to be as aware as you can be, so I think a lot of that stuff is kind of fleshed out in the skill-building part before it gets to the actual assessment.

On a different note, Sarah and Maya stated that in developing assessment tasks they try to take learners’ gender, as well as their personal and professional backgrounds, into consideration.
For instance, Maya pays attention to where her students come from, or their schooling experience in their home countries, to make sure the assessments she makes are suitable and fair to her students.

Similarly, Rose and Violet mentioned taking interpersonal and intrapersonal factors into account, particularly in assessing the speaking skills. For example, Violet explained that she looks at the circumstances in which the assessment is conducted. In making decisions about a student’s performance, she considers whether or not they had missed any classes during skill-using activities, or she pays attention to the test taker’s partner during the speaking assessment. As she put it, “sometimes it depends on the partner, for example, when somebody is very quiet, I know how it affects the other student. I’m aware of that and I take it into consideration”. She added that she also pays attention to the relationships in the classroom, the emotional aspects and other dynamics that come into play. She concluded that although these aspects are not included in the assessment tool, she sometimes makes notes and considers the whole situation surrounding the assessment.

Additionally, Nina, who teaches a multilevel class, talked about taking learners’ proficiency levels into consideration in developing the assessment tasks. She elaborated on the fact that although she teaches CLB 4 and 5, some of her students are at CLB 3 or 4 in some skills and CLB 5 or 6 in other skills. Therefore, making assessment tasks for such a group of multilevel students is extremely challenging. In order to address this issue, she has to create different types of questions – on the same assessment task – for different students.
For example, for a student who is at CLB 5 or 6 in the reading skill, she adds more detailed questions to the reading assessment, and for a student who is at CLB 3 or 4 in reading and CLB 5 in speaking, she would give simpler and fewer questions on the same reading task. These questions are designed so that, although students are at different levels, they can finish the assessment in about the same amount of time.

To sum up this section, the findings suggest that in developing PBLA assessment tasks, teachers strive to take account of different aspects of learners’ individual characteristics in order to make sure their assessment tasks are linguistically and culturally appropriate for their particular group of learners. Also, as David asserted, in portfolio assessment students are “actively involved” in the assessment process, “which is very different from the standardized high-stake assessments or traditional summative assessments”. Indeed, Delett, Barnhardt and Kevorkian (2001) argue that “well-designed portfolios offer students the opportunity to become actively involved in the learning process by contributing to instructional planning and assessment” (p. 560). Therefore, based on the examples the majority of participants provided, it is fair to conclude that PBLA has the positive features of interactiveness. As defined by Bachman and Palmer (1996), interactiveness refers to “the ways in which the test taker's areas of language knowledge, metacognitive strategies, topical knowledge, and affective schemata are engaged by the test task” (p. 25), and the teachers’ accounts attest that all of these aspects are taken into consideration in PBLA.
4.3.5 Impact

In order to probe into the impact of PBLA as a method of assessment, I divided the questions into the following categories:

• Positive impacts of PBLA on learners’ lives
• Negative impacts of PBLA on learners’ lives
• Positive and negative impacts of PBLA on Canadian society
• Impacts of PBLA on teachers

In what follows, the participants’ remarks relevant to each category will be discussed.

Positive impacts of PBLA on learners’ lives

As far as the positive impacts of PBLA on learners’ lives are concerned, it is important to note that the participants unanimously agreed that PBLA is student-centred and, as a result, encourages learner autonomy and independence. As David put it, “it potentially makes learners feel more positive about their own role in their learning”, and according to Kate, it makes learners more “accountable” and “motivated”. In addition, it allows learners to be more involved in the assessment process. These comments support the findings of previous studies that portfolio assessment increases learner autonomy (Little, 2005, 2009), motivation and involvement in the learning process (e.g., Hamp-Lyons & Condon, 2000; Romova & Andrew, 2011; Yin, 2014).

Similarly, teachers believed that PBLA encourages learners to become more responsible for their own learning (Genesee & Upshur, 1996). For instance, since learners are required to compile their portfolio tasks in the Language Companion binders, they are expected to feel responsible for making sure not to lose their binders. According to Nina, “PBLA allows learners to be more responsible. They know they have to have a certain number of artefacts in their portfolios, and therefore they think it's important and they take it seriously”, and as Sarah put it,

It [PBLA] does teach learner independence, and often makes them feel more responsible because they have to keep track of their assessments that goes in their portfolio, the teacher doesn’t keep the assessments.
And then they can also get a sense of how they are doing and what their journey looks like in the language training, so that’s all good.

In terms of using Language Companion binders, despite the challenges which will be explained in the practicality section, participants were in agreement that having the Language Companion binders encourages learners to be more involved in keeping a record of their assessments and progress. As Carol noted,

I do think that, the notion of the portfolio, like having a binder, and putting your stuff in it, as a student, is a really good tool. To just go “look, look at me, here is my 10 assessments that I’ve done and, this is what I could read 3 months ago and now I can read this”. I think that is probably the best part about it.

Kate, who teaches CLB 3, said that she has noticed her students “take pride in filling out the forms and being able to look back at their assessments and see their growth.” She went on to say that,

They’re able to go home and I'm sure they show their families what they did at school today and say, "Oh look! I got this... I passed this assessment, this is what I was assessed on, and I can do this now!"

Nina, a CLB 4 teacher, explained that learners “really enjoy” reading through all the extra materials about Canada, employment, and other topics that are placed at the front of their binders. “They read them and they probably show it to their families as well, so I think they feel part of a bigger learning community.” She said that she has told her students that everybody across Canada, at some point, will have this binder, so if they go to live in any other part of Canada, they should still keep their binder, and they can go into class and continue learning English, because they are all part of this community. Similarly, Violet, a CLB 2 teacher, said that when learners receive their binders, which are provided by the government, “They feel very important.
They feel it’s serious”, and David, who teaches a variety of CLB levels, added,

They certainly feel the weight of what they have learned, because they have to carry those binders around everywhere. So, they can feel it building and over time they can see their binders get thicker and thicker, they can see their work and that’s a good feeling, to see the product of your learning and fruits of your labour!

Commenting on the positive impacts of PBLA on learners’ personal lives, teachers across all CLB levels acknowledged that since they are required to design their lessons based on the results of the needs assessment – which they conduct every month – learners can potentially have a significant role in deciding what they want to learn. Therefore, as Sarah explained, the idea is that they will use the skills they have learned in their real lives. In addition, as mentioned earlier, based on the PBLA guidelines, all the teaching materials and assessment tasks should resemble real-life tasks. Hence, according to participants, the goal of PBLA is to prepare learners for settling in Canada. As Linda – who teaches Literacy and CLB 1 – put it, “because it’s about real-life tasks so if learners can recognize where they need to improve, that’s going to reflect on their life skills and their ability to settle in Canada, which is the whole point of the LINC programme”. As Yin (2014) suggests, portfolio assessment can be effective in improving language skills. Maya, a CLB 3 teacher – who previously used to teach CLB 1 – noted that she has seen CLB 1 and 2 learners put some of the assessments they use in their real lives at the front of their binders. For instance, she explained:

… with LINC one and two where they’re still trying to learn to remember their addresses and things like that, sometimes they actually use their old assessments to help them later on down the road like how was their address supposed to look on a piece of paper.
So sometimes they actually reuse their old assessments to use when they actually go to the library and have to give somebody their address again.

Maya’s example shows how portfolio tasks can be helpful for learners’ language use in real-life contexts. This is similar to the findings of Yayli (2011), which showed that the participants in her study were able to transfer learning from one genre to another and improve their “cross-genre awareness” by collecting their writing samples in different genres such as e-mails, recipes, complaint letters, and essays.

Furthermore, some teachers reported that PBLA helps learners improve their soft skills, especially in peer feedback activities. For example, according to Violet:

PBLA teaches learners to really think about the language and what they are supposed to do when they are listening. They really have to listen. Not just pretend they’re listening and not know what the other person said or think about what they are saying.

Thus, it can be reasonably concluded that PBLA affords learners an opportunity to give and receive feedback, which is one of the social skills learners may potentially require in Canadian society and in educational or work contexts.

From a sociocultural perspective, another interesting point raised by Olivia – a Literacy and CLB 1 teacher – was that PBLA creates a “sense of personal agency” (Little, 2009; Little & Erickson, 2015). She elaborated on the fact that many of the learners in her class have never experienced schooling even in their first language. Consequently, they have never received the kind of “individual attention” which PBLA promotes. She reasoned that PBLA allows teachers to really take the time to get to know their students in a way that perhaps they have not always had to. As Yin (2014) argues, portfolio assessment increases communication between teachers and students, which is in line with sociocultural theories of learning.
This is reflected in the following quote from Olivia:

PBLA just really helps to focus on sort of an individualized learning and the expectations for students to take on certain responsibilities. I think it’s really good; I don’t think that many of the students that I have, have had that kind of experience or have thought of themselves in that way in terms of being like this unique individual with a learning style. Things like reflections help them to understand how they learn. So, there’s a lot of that kind of metacognitive sort of theory in PBLA. I think that it’s really good for that.

In addition, Olivia provided an example of how the feedback she provides on portfolio tasks can positively impact the lives of female learners who come from less privileged sociocultural backgrounds, as encapsulated in the comment below:

The feedback I give is always really positive and encouraging. It’s really helpful for some of them. As I’ve said, an Iraqi woman, who’s never been schooled before, who’s got eight kids at home and she’s married when she was thirteen. For her, it’s really exciting to be in school and just to have this person of authority, which is me, telling her how she’s doing, how great she’s doing at this and to pay some attention to her, just to her! … I know it makes a big difference for them. They’re learning something really more about themselves often than [they are about language]. Also about language, hopefully they’re learning that too, but there’s something deeper going on for them. So, I think PBLA is great in that way.

Moreover, Violet also commented on the beneficial effects of feedback on assessment tasks in PBLA, in comparison with exam marks in traditional testing:

… because it's competency-based and there are certain criteria, so when the students get their feedback, they know better where to focus on, instead of just giving them a number which doesn’t mean much to anybody except to place them on a scale.
I think it also brings students’ awareness to what they’re learning. And they also get more involved in their learning.

As Hamp-Lyons and Condon (2000) argue, portfolio assessment provides students with the opportunity to revisit and refine their earlier work, and gives teachers the opportunity to focus on formative feedback rather than merely summative grades or scores that may not be as informative for students. Similar to this view, Sarah also believed that rather than giving pass or fail scores as in traditional testing, PBLA provides more opportunities for learners to improve their areas of weakness. As she put it, in traditional testing “you might not even get to look at your test, after you've written it and received your result, but PBLA is different, a lot of learning can happen”.

Last but not least, some teachers considered the high frequency of assessment tasks a drawback of PBLA, and a potential source of stress for many learners. However, other teachers believed that PBLA is in fact not as stress-inducing as standardized tests, and according to Violet and Maya, it is actually more beneficial for learners who get nervous and cannot perform well on exams due to test anxiety. In addition, Olivia and Rose were of the opinion that PBLA provides more chances of improvement for Literacy learners. Referring to her learners, Olivia stated:

PBLA actually works pretty well with Literacy learners because it’s less stressful and these ongoing assessments are a lot easier to take than doing a test of some cognitive measure, language abilities, because you can do it multiple times. The assessment tasks can be done over and over again until they get it right and then they move to the next level. So, I think it’s better because it’s more individualized. I think it’s more authentic and for my students it really works a lot better.
In her comment below, Rose also agreed with Olivia that PBLA is “less stressful” for her learners: … and with ongoing assessment, it takes the tension of doing assessment away, because with PBLA you have to do assessments every single week. Because it's assessment for learning, it’s not assessment on learning. We have to keep them motivated in different ways. We challenge them from different aspects. It’s not just teacher giving them assessment, it’s also self-assessment, learning reflection, teacher assessment. This confirms the findings of previous studies showing that portfolio assessment can help educators and students overcome many assessment difficulties (Cooper, 1999; Cooper & Love, 2000). In conclusion, based on the interview excerpts provided here, some of the positive impacts of PBLA on learners’ lives can be listed as follows:

- Increasing learner’s autonomy, motivation and responsibility in language learning
- Enhancing learner’s real-life language skills
- Maximizing learner’s engagement and involvement in the assessment process
- Potentially decreasing learner’s stress level during the assessment process

Negative impacts of PBLA on learners’ lives

To follow up on the impact of PBLA, I also asked the teachers whether they could think of any potential negative impacts of PBLA on learners’ lives. In response to this question, one of the major issues pointed out by the majority of the participants was the time-consuming nature of portfolio assessment, as also indicated by other studies (Brown & Hudson, 1998; Ripley, 2012). For instance, some teachers believed that PBLA “eats up precious class time”, as remarked by Kate. A potential source of concern in this regard, as pointed out by Rose, Linda and Violet, was the fact that teachers (especially in lower-level classes) have to spend a considerable amount of this “precious class time” orienting students to PBLA concepts and requirements.
For example, students often wonder why traditional tests are not used anymore, why teachers are not giving them solid marks, how to do self-assessment, or how to give peer feedback to their classmates, and thus teachers have to explain this new information to students. They also have to ensure students fully understand the different required components of the Language Companion binder, and how to keep it organized, among other things. This is particularly problematic where classes have intake on a continuous basis, as teachers then have to repeat the instructions over and over again for the new students. To be more specific, teachers who were teaching evening classes believed that, prior to PBLA, they could use the class time more efficiently. They contended that most students who attend evening classes work during the day and have limited time for learning English. Thus, spending the class time on PBLA concepts seems to be “unfair” when it could be spent on teaching “what learners actually need”, as Linda argued. Time constraint was also perceived to be an issue in higher-level classes, where teachers often have to administer a series of assessment tasks, which does not allow for in-depth class discussions. This concern is indicated in the following remarks by Kate, who is a CLB 3 teacher: I don't like the idea that everything else such as talking about culture and learning from each other is being pushed to the side, when some of the discussions can be very very rich, but then I look at the clock and think “Okay, we’ve got to stop because we have to assess tomorrow and if we continue this discussion, we won't be able to practice filling out a particular form or whatever.”
Kate maintained, “I think learners lose out on the broader cultural aspect that we had before PBLA where you could talk about those soft skills or things like body language.” She went on to say that “I can't really assess body language very easily, and it's such a subtle and contextual thing and there isn't time to really delve into that, and I would love to be able to do that and they [students] would too”. As she put it: … there are some other discrete elements they can't get the whole picture of, because it's a cultural element too, but it's so hard to get into that and I love getting into that with them because they've got questions but you have to focus on practicing filling out application forms, or something like that. So I do like the real-world aspect of it, they are very useful skills but I feel there's a lot being lost as well. … if a student has a question and I want to answer because I know that will benefit perhaps the whole class, I really have to keep a balance or weigh it, is it worth going off topic for 15 minutes and now I won't be able to do the full assessment, I’d only be able to do part of it and I hate having to make those decisions because you're always looking at the time, looking at the calendar and thinking “Okay, when do I have to have all of this done by?” So it's not all bad but it's certainly a shift, a paradigm shift. Therefore, she argued that with PBLA teachers can attend to cultural matters only minimally, and in general “cultural aspects are being pushed aside to get more assessments done”. She believed that sometimes it is more important to simply share such “subtle language and cultural aspects”, and provide opportunities for students to learn with and from one another rather than just “teaching to the test and assessment”.
Carol also remarked: PBLA requires a lot of paperwork and it is very administratively heavy, lots of forms to be filled out constantly and not much of the interesting discussions can happen anymore because we're so stuck to completing so many assessments and so many hours of class time. Students are continually being tested and it is more than ever teaching to the test! In addition, Literacy and CLB 1 teachers believed that, despite the tremendous efforts on the part of teachers to simplify and clarify PBLA concepts for their students, “some of them never really understand what PBLA is and how it functions, and what the feedback really means”, as Rose stated. An example of this is an incident that happened in Linda’s Literacy class. She explained that once, for peer assessment, she told her students to exchange papers and put a small “x” next to the wrong answers. Despite her clear instructions, one student put a large capital X next to all the wrong answers of another classmate. According to Linda, the student who received this feedback “totally freaked and interpreted that as possible prosecution, because in some schools in other countries they beat the kids if they get them wrong”. Linda went on to say that the student was so scared that she had to call for an interpreter to come and explain to the student that it was just feedback, and that it did not at all mean that the student would be punished in any way. Reminiscing about this incident, Linda commented that in lower CLB levels, particularly in Literacy classes, it is very helpful to have an interpreter and explain why we are doing what we are doing, and what PBLA is, and that it’s not like the test that you took in school which tell you whether or not you’ll stay in the grade or you go home and your parents beat you up because you didn’t do it well enough.
Teachers have to constantly remind the students the purpose of PBLA and if you don’t do that you will end up in situations like what ended up happening to my student today; they freaked out! Echoing Linda, Rose expressed concern about the likely confusion (among some students) caused by switching to PBLA, which is a totally different assessment system from what students have experienced, as shown in her remarks below: Some students are still very confused about what’s going on! They may think, ‘I was a student yesterday, doing well, and today I’m not doing well and I’m confused about all these assessments, different types of assessments and new terminologies that came into school system.’ In addition to the issues mentioned above, the time required for compiling the 8–10 pieces of portfolio artefacts was considered a constraining factor that can give rise to challenges in progressing students to the next level. In fact, according to the participants, this could be considered one of the major weaknesses of PBLA as a summative assessment tool. As noted earlier (in Chapter 2), according to PBLA guidelines, in order for students to progress to the next level, they are required to have eight to ten artefacts in their portfolio in each skill area, as evidence of their progress. Once a given student has compiled the required number of artefacts, the teacher needs to have a one-on-one conference with the student so as to (a) provide holistic feedback on the portfolio items, and (b) write a progress report on whether or not the student is ready to move up to the next level. According to participants, portfolio review generally occurs twice a year; however, this may differ from programme to programme. Based on the interview data, two major issues arise here. First, as the participants reported, many LINC programmes have continuous intake, which means “there is not a fixed start and end date for classes”, as Olivia stated.
Therefore, depending on when an individual learner joins a programme, he or she may already be behind his or her classmates in terms of the number of assessment tasks that have already been conducted. Also worth mentioning is that all the teachers indicated that – prior to each assessment task – students have to engage in a number of skill-building and skill-using activities in order to prepare for the assessment. Nevertheless, the interview excerpts showed that (in most programmes) there is not a fixed time frame between one assessment and the next, and that in most cases the assessment interval depends upon when the class is ready. Consequently, students who have a higher level of proficiency than their peers have to wait for their peers to catch up. As a result, it may take anywhere between six months and a year for a given student to have eight to ten artefacts and be able to move up to the subsequent level, whereas the previous system was more flexible in this regard. Indeed, students who were more proficient could simply take a CLB standardized test and move up, at the teacher’s discretion. Another significant issue worth noting is that if a student is misplaced at the outset of the programme, he or she has to go through the same process, regardless of how proficient he or she might be. On a similar note, in discussing the challenges of deploying PBLA, another recurring difficulty mentioned by most participants was continuous intake. Teachers were all discontented with the continuous intake in their programmes, since it inevitably adds to the difficulties they already face in implementing PBLA. As mentioned earlier, every time new students join the LINC programme, teachers have to reiterate the PBLA concepts to them, which takes away either class time or the teacher’s break time. In addition, irregular attendance makes it difficult to keep track of learners’ progress.
According to Linda, I have to say that PBLA is a little bit different at the Literacy level, because they don’t really understand about learning. It’s hard for them to track their learning unless it’s really, really concrete. Also, because their attendance is erratic, because a lot of them are still going through settlement issues and health problems,… they don’t see continuity and one day it can seem like they get it, and then the next day it’s like it’s all brand new. It’s hard for them to see and track how they are doing. It really takes someone sitting down with them and saying this is what you did last year, this is what you are doing this year because these people tend to be there for quite some time. In a similar fashion, Olivia and Violet explained how such issues can exert additional pressure on teachers, as illustrated in the following comments: In my school, we have continuous intake. I’m in a multilevel class, and with people with all sorts of crazy situations. So, it’s not a linear process. I think it’s an enormous amount of extra pressure, so sometimes you have to say to yourself, this is as much as I can do, if you don’t have the confidence to be willing to modify some things, I think you often feel like you’re never good enough in doing it. Because there’s always more to do, so you never feel like you’re really good at anything as a teacher. It’s weird, but I do think... because there’s so much to do, you’re never finishing anything. And then, if you have a continuous intake, it’s like you got to have that one student who just joined on a Monday morning and with a Learning Companion and set him up with the autobiography and while everybody else is trying to do something different. So, those kinds of pressures are always on the back of your mind. (Olivia) … but again, we always come to this point when we say, yes, ideally PBLA works very nice. However, what happens in practice is different.
We constantly have to deal with learners’ irregular attendance, and continuous intake is a big challenge for LINC teachers, there is a lot of movement. Students move from one LINC programme to another, students stop classes and come back, life happens, ... I think if we were to teach one class for one year and we knew that the students would stay with us, this would be perfect. (Violet) Although the interview excerpts provided in this section do not provide a full picture of the challenges teachers are dealing with in terms of student attendance and continuous intake, they clearly reflect the findings from Ripley’s (2012) study, which considered continuous intake one of the major challenges in PBLA implementation. These findings are also supported by Weigle (2002), who disputes the practicality of deploying portfolio assessment in adult education programmes that allow continuous intake. Weigle reasons that portfolio assessment requires students to remain in a programme long enough to compile a portfolio, whereas in reality this may not be feasible in such programmes. Another salient point raised in the interviews was the case of senior learners (particularly Chinese immigrants) in the LINC programme. Some of these learners attend English classes simply to keep busy and socialize with others. As Rose stated, “they just don’t want to be left at home alone, or some of them their family is very busy to take care of them. They just come to school to feel safe and somewhat feel like they are valued”. Therefore, as she concluded, “they don’t care how well they do on their assessment. They don’t take it as seriously as teachers do”, and as David added, They may feel like they are forced to take part in a process that they are not interested in. We have presented a learning opportunity for them to be engaged and involved and some of them may feel like we just want to assess them.
Additionally, Maya believed that if students constantly get “not successful” on their assessments, due to irregular attendance and missed preparation activities, they may think it is a reflection of them and their lack of abilities. Consequently, if they keep compiling a portfolio of assessments they did not pass, rather than trying harder, they may lose their motivation for learning. To recapitulate the above-mentioned issues and challenges, one of the major issues that emerged in the interviews was students’ sporadic attendance. It should be noted that students in LINC classes are adult immigrants, the majority of whom have family obligations and need to work to support their families. As such, it is quite conceivable that some students may occasionally have to miss classes or even take long breaks from class and go back to their home countries for personal reasons. As a consequence, they might miss the assessment tasks. Such students may never have enough artefacts in their portfolios, and even if their language proficiency improves simply by living in Canada, they may have to stay at the same level indefinitely because they do not have enough artefacts to evidence their progress. All of these factors may hinder their progress in the LINC programme, which can adversely affect their Canadian citizenship status because they may not easily reach CLB 4, the minimum language requirement for a Canadian citizenship application. Finally, it is interesting to note that the majority of the issues raised here, such as concerns about sporadic student attendance, issues surrounding continuous intake, and time constraints, are all recurring difficulties that were also recounted in Ripley (2012) as the major challenges in PBLA implementation.

Positive impacts of PBLA on Canadian society

Here, I asked whether participants could think of any positive or negative impacts PBLA may have on Canadian society or the education system.
Most of the participants believed it is still too early to witness the impacts of PBLA on Canadian society and the education system; accordingly, my data did not reveal any negative impacts on society. It remains possible that PBLA has negative impacts on society, but these were not uncovered in this research. On the other hand, the participants had different perspectives regarding the positive impacts of PBLA on Canadian society. For instance, according to Violet, through PBLA students are learning more about “life skills” and “some soft skills that are necessary in a workplace”, or other Canadian contexts. As Violet added, “They’re becoming more aware of some subtleties, unwritten rules and conventions that otherwise they would miss”. Similarly, Maya believed that PBLA gives learners confidence and provides them with a chance to practice the language skills they need to do real-world tasks, so they can actually use those skills outside the classroom context, and even if they do not use them, at least they get a little bit of a taste of what it is like. Furthermore, Rose added that, since PBLA employs a task-based teaching approach, it is not just about teaching the language. According to her, “most part of it is a cultural thing”, the assessment tasks are “more learner-friendly”, and the “skills learners learn through PBLA, facilitate their settlement in Canada”. Additionally, Carol referred to the fact that PBLA “promotes self-reflection” (Little, 2005, 2007, 2013), which is one of the things people are encouraged to do in a Canadian workplace. Therefore, according to her, “if you are an employee or a newcomer and you want to get a job, it is useful to know things such as self-reflection”.

Impacts of PBLA on teachers

In this part, I asked whether PBLA might have any impacts on teachers.
First and foremost, similar to the findings of other studies on portfolio assessment (e.g., Fox, 2014; Ripley, 2012), all teachers expressed their dissatisfaction with the tremendous workload that PBLA has created for them. For instance, David remarked that with PBLA implementation, teachers are “thrust into roles of material developers, assessment designers, assessors and assessment tools developers”, which makes PBLA very demanding in terms of the skills teachers need to acquire to fulfill the additional roles that are expected of them. Similarly, Carol pointed to the fact that in the past “there used to be people who got paid to make the assessments as a separate job”, and therefore she believed that “it’s unrealistic and unfair to have teachers making so many assessments”. In fact, she admitted that, due to the overwhelming PBLA workload, she is thinking about going back to work for international schools, because she has family obligations and other responsibilities, and cannot devote all her time to PBLA. Similarly, Olivia said, “I think it’s just a lot of work for teachers and I think that is not being considered. I don’t know how other organizations are doing it, but we don’t have extra time to do assessments”. The fact that portfolio assessment creates extra roles and responsibilities for teachers supports the research findings of Delett, Barnhardt, and Kevorkian (2001). In fulfilling the multiple tasks expected of them, all teachers directly or indirectly acknowledged that PBLA implementation has induced undue stress for them on different levels. As mentioned earlier, developing real-life teaching and assessment materials is onerous, and designing assessment rubrics, scales and checklists is “very time-consuming”, as participants argued. Most teachers expressed a level of uncertainty as to whether they are doing the assessments “right” or in a proper manner.
Moreover, in the past, when summative testing and standardized assessment were used, there was less pressure on the teachers. As Violet stated, “Now, I feel more responsibility for my students in deciding about somebody’s future. In a way, if the assessment tasks I create are problematic, it is a big deal”. However, she also believed that teachers should not merely rely on PBLA for making judgments about their students’ progress and “they should refer to their teaching experience and gut feeling as well”. In this section, a brief summary of the potential impacts of PBLA on teachers was presented. However, in response to research question 3, which revolves around challenges in PBLA implementation, a more in-depth discussion of teachers’ perceived challenges in utilizing PBLA will be provided in a later section.

4.3.6 Practicality

In order to investigate teachers’ perceptions of the practicality of PBLA implementation, I asked about the resources at their disposal for developing teaching materials and assessment tasks and tools. In general, one of the recurring themes in all the interviews was the insufficiency of resources (Fox, 2014). As explained earlier, almost all research participants claimed that they design their own teaching and assessment materials. They reasoned that the existing materials do not necessarily comply with the CLB document; therefore, all teachers reported that they have to develop everything “from scratch” or “adapt” the available resources to match the criteria in the CLB document, as Maya reported: I might get the seed of an idea from a book, but I can’t use any of the pictures, I can’t use any of the dialogues, I can’t use any of the words from the books because they’re outdated! For example, the book will talk about something like a camcorder or a VCR, who has a VCR now? Nobody does! Or it won’t be relevant to Vancouver, the street names or the addresses on a map, I always make my own maps.
She explained that she uses maps with Vancouver streets, addresses and phone numbers, and tries to modify the existing materials to make them relevant to the Canadian context, to the area she is teaching in, and to the neighborhoods where her students live. As for online resources, some participants tended to use Tutela – a repository website for LINC resources – to look for teaching and assessment materials. However, teachers noted that this website is not quite user-friendly and the materials it contains are not well organized, and some may lack quality. According to Linda, “Tutela is a repository, so that’s been very helpful, but I don’t think it’s enough”. She also said that she had tried using the PBLA webinars from Ontario to learn “how they do things”, but unfortunately, she had not found them very helpful, as shown in her comment below: … it takes a while to get your head around it and in Ontario they don’t teach exactly the way we do here and they don’t teach the same themes and they may have more skill building involved in their courses. Sarah, who is a Lead Teacher, talked about the online support forum for Lead Teachers. However, she reported that it is not easy to keep track of the discussions on the forum, and described them as “bits and pieces that you have to keep control of”. In her view, even though it is helpful that the regional coaches participate in the forum discussions and respond to the teachers’ questions, reading through all the discussions (and making sure not to miss any responses) is very challenging. On a different note, Linda suggested there should also be more support and resources for learners. She explained that the majority of learners in her Literacy class are refugees and people with very limited financial resources.
Therefore, they require more financial support to be able to participate in LINC classes. According to Linda: If the government wants to track their training they need to provide resources; they need money to get to school and if they can’t get to school, they keep missing classes because they don’t have settlement support. They don’t have transportation. They don’t have counseling for what’s going on in their heads so that they can pay attention! … it’s hard to go to school and walk through the pouring freezing rain when you come from a country that’s 45 degrees and you have got three kids and you don’t understand what the teacher is saying and you don’t know how to talk to the doctor and you get lost and take the wrong bus, there is just so many stresses. Money stress especially, the schools are demanding this that and the other and the kids’ friends are demanding this that and the other. Particularly, where we are it’s so expensive to live and to give everybody in the country the same amount of money, for here it doesn’t go very far. It’s very belittling actually … In terms of using Language Companion binders, similar to the findings of Ripley (2012), it turned out that none of the teachers in the study were pleased with the extensive amount of time they have to spend on teaching and monitoring learners – for example, showing them how to keep their binders organized by putting the assessments and other handouts in the right section of the Language Companion. According to David, “one of the biggest challenges has been just having the students bring their binders”. David and other teachers also talked about the challenges of organizing the binders when they first arrive in a class of 15 or 16 students, and the linguistic difficulties of explaining the purpose of the Language Companion and the PBLA approach to lower-level learners.
Similarly, Violet – a CLB 2 teacher – remarked that keeping the binders in order wastes a lot of precious class time: Every time I give them a piece of paper, I have to tell them where to put it. Every time I give them assessment back, I have to tell them where to put it. And I have to go around to make sure that they put it where I said, where I told them to put it. And you would be surprised. Sometimes I forget or I get distracted. And I don’t go around. And the next time [I] say, “Where is your assessment now?” And they’re like, “I don’t know.” And you find it somewhere else. So, I think it is really frustrating that we have to spend such a huge amount of time on keeping the binders in order while this time could be used on teaching something more useful. Another issue pertains to carrying the binders and bringing them to class. Some teachers reported that they had received complaints from their students that the binders are heavy and hard to carry around. In particular, according to participants, the majority of the complaints come from senior learners who use walking aids and take public transit, as well as learners who come to classes after work and have to carry the binders with them all day long. For such learners, as the teachers reported, carrying binders “creates stress” since they have to “make sure not to lose them”, and sometimes the heavy weight of the binders “impacts the mobility of the elderly learners”. Therefore, these learners tend not to bring their binders to class. The reported difficulties regarding carrying and using the Language Companion binders in this study concur with the findings of Aydin (2010), in which students reported that “portfolio keeping is boring, tiring, and takes too much time” (p. 199). Another area of concern was the time-consuming nature of the speaking assessments.
According to Rose – a Literacy and CLB 1 teacher – the speaking assessment tasks often take 15 minutes per student, and (depending on the class size) it sometimes takes up to two days to assess all the students. Rose was concerned that this wastes a lot of class time and takes time away from teaching. Some teachers, like Olivia and Linda – who also teach Literacy and CLB 1 – said that they address this issue by asking volunteer teachers to come to their classes and do activities with the rest of the class while they are conducting the assessments. However, other teachers, who do not have access to volunteers, believed that it is quite a challenge to keep the other students engaged on speaking assessment days. Therefore, teachers expressed the need for more volunteers, as teacher assistants, especially during the speaking assessments. They emphasized that lower-level students in particular, due to their special needs and unique situations, require more individualized attention from teachers. Hence, having volunteers in Literacy classes helps teachers make sure the students are on task. In addition, they stated that having interpreters on hand could be helpful for clarifying some PBLA concepts for such learners in a more efficient manner. Finally, another significant challenge regarding the practicality of PBLA concerned teachers’ difficulties in assessing portfolios and providing feedback on PBLA assessment tasks. The interview data revealed two major issues in this regard. The first was providing “action-oriented feedback” and communicating it to the learners. The second was the time-consuming nature of writing feedback and assessing portfolios. In order to discuss these issues further, I will start by discussing the challenges Literacy to CLB 2 teachers face, and then I will recount the experiences of CLB 3 and 4 teachers in providing feedback.
According to the participants, one of the most challenging aspects of providing feedback to Literacy, CLB 1 and 2 learners is familiarizing them with the concept of feedback. According to Rose, in lower-level classes, it is really challenging to explain to students what feedback means and why they are not given marks. Rose explained that, according to the PBLA Guidelines (2014), teachers are required to “provide action-oriented feedback that gives students a way forward” (p. 16). However, she was not sure how practical this would be for her Literacy learners. She said, “What I don’t like about PBLA for CLB1 and Literacy is feedback. How do I give feedback to a Literacy or CLB1 student that can move them forward?” Rose went on to say: The feedback that I give to them is that I circle that they didn’t make a good eye contact. That would be the only feedback that I can give them, because their level of ability to comprehend a long feedback that you didn’t do this or getting into details would make them very confused about the assessment. Referring to the same point, Olivia reported that she sometimes enlists the help of interpreters to provide more in-depth feedback for her Literacy and CLB 1 students, and said: … but I don’t think that I can say much. I can’t have a long conversation with them about what they need to improve, unless I have interpreters there. Sometimes I do, and sometimes I can give them more information because I have some coworkers that I work with. Otherwise, on the comments, I’ll put only two or three words such as “practice spelling or practice these words” and I’ll list the words that they should practice... very, very small, it’s very specific. Maya also believed that “It’s really hard to give feedback to CLB 1 and 2, it’s very, very difficult”.
Therefore, the only feedback she provides to her students is limited to a few short phrases that students can understand, as she stated:

For CLB one and two, it's usually the same types of comments, like I'll say, "I can do it, I need help, I can't do it yet". So those are like the only three types of comments they understand. I never say "passed, failed, whatever…", I just say "I can do it, I need help, I can't do it yet". So my CLB one and two learners know what that means. But it's really challenging to give them action-oriented feedback!

Maya added that instead of using the word "failed", she uses the phrase "do it again", and her students know that if they receive this comment, they have not completed the task successfully and have to redo the assessment. Linda also reported similar challenges in providing feedback to lower-level learners; for instance, she said:

It's difficult to give them feedback or track their progress, and also PBLA requires that they do reflection, that they remember what activities they've done, that they can name those activities, but even remembering what those activities are and then how to communicate what those activities are and then how to communicate how one feels and what one has learned. I mean it just doesn't happen. If they circle a smiley face, that could mean anything. It could be how you feel, it could be how they did on paper, it could be it's more fun to do that, …

Linda explained that in order to provide detailed feedback, she sometimes has to "call in a settlement worker" to translate her feedback for the Literacy and (sometimes) even for CLB 3 learners. However, she mentioned that having interpreters who speak the languages of all students is yet another challenge.
Therefore, as a last resort in cases where she cannot communicate detailed feedback to the learners, she makes notes on the assessment sheet, recognizing that "although they can't read it, the next teacher will be able to understand it". This way, she can make sure she has recorded the students' progress status. Commenting on the same issues, Violet added:

Giving feedback to them is a challenge, because I don't have a lot of options when I'm giving them feedback on their speaking or writing. It has to be really demonstrated. So sometimes I write some comments and tell them to ask somebody at home to read this to you.

As far as the CLB 3 and 4 instructors are concerned, the participants did not express any major concerns about orienting learners to the essentials of feedback in PBLA. However, one of the recurring themes was the issue of time. Many teachers were displeased with the considerable amount of "unpaid" time they have to spend on providing feedback. It is important to note that the majority of teachers who participated in this study had not yet reached the point where they needed to conduct the final portfolio assessment, and were still in the training phase. However, they reported that they were providing feedback on each individual assessment task that goes into learners' portfolios. In the comment below, Nina, one of the few teachers who had conducted a final portfolio assessment, talked about the time-consuming nature of portfolio assessment:

When I review my students' portfolios, time constraint is I guess my biggest challenge. For example, if I have 12 students I have to go through 12 portfolios, so I have to make arrangements to have conferences with each of them, so one by one I take their portfolios to review, and then I have to write a progress report for them. I make notes so I can provide some feedback to them. I think of some action plans that they can take in the future. Again, it is a great thing for the students.
It benefits them really a lot, however for me it takes a lot of my time that I put into. Like I said, looking into everybody's portfolio and going through it and trying to understand where the student is and providing feedback in the conference is a huge challenge for teachers and takes a lot of time. But on the other hand, it benefits the students in many many ways.

Carol, who has over ten years of teaching experience in the LINC programme and has been using PBLA for three years, stated:

I think that assessing the portfolios is difficult. I have my own method of how I go through them, but it's also very time consuming if you trying to assess 17 portfolios in an afternoon. I think that I still ultimately rely largely on my experience as a teacher and most of the times the portfolio reflects that, sometimes the portfolio doesn't, and I go with myself.

In addition to this, some teachers expressed challenges in providing "action-oriented" feedback. For instance, Carol said:

I use my assessment rubrics for giving feedback, again, it's time consuming and unpaid. And if you consider, not only are we making the assessment, but now we are marking it, and giving action-oriented feedback, it's really nice on paper, but some days, you know, I sit and I watch my son's soccer practice, marking papers, right! You know, the notion of action-oriented feedback is really great, but it's really hard to do.

Carol also provided some examples of the challenges in giving "action-oriented" feedback:

… sometimes you can go, okay I see that my student Johnny always misses the gist questions, he doesn't understand this. So I can do something with Johnny, I can talk to him and explain about gist, or if 5 or 6 students are also having a problem with that, we could do something in class, which is a good way to do action-oriented feedback, when you can find a common thread.
But sometimes like, what do you say, read better, read better next time, you know, like sometimes there is not a lot to say!

Another example she mentioned was when students make spelling mistakes. She said that instead of just writing the correct spelling, it is a good idea to write a question such as "can you spell this correctly?", and then return the assessment to them so they can find the correct spelling and spell the word correctly. She added,

… But again that's really tough – if you are going to start doing that for 17 students and all the different things that, I mean you ask, maybe you pick out one thing like, may be pick out something that was relevant to the assessment, but it's very challenging and time consuming.

Likewise, with reference to action-oriented feedback, Violet commented,

The goal is to give them action-oriented feedback and we all repeat that like parrots, but what that means and how we do that, I think it takes time to give really action-oriented feedback. And I am still learning to do that, and I don't think I always have a good answer to that question. What I find interesting is that by having those criteria and having that assessment tool, it really also helps me find out what I was missing in my teaching, like what parts I didn't really pay attention to, and clearly those parts are the ones where the students didn't do well.

To sum up the issues surrounding feedback on portfolio tasks, although it is believed that action-oriented feedback is highly beneficial for learners (Little, 2009; Yin, 2014), the interview data revealed that providing action-oriented feedback is a time-consuming and laborious task for teachers across all levels. For instance, teachers who teach lower-level classes reported that familiarizing students with the concept of feedback and conveying action-oriented feedback to low-level learners is highly challenging. Therefore, teachers use different strategies to address this issue.
For instance, some teachers get help from interpreters, and others try to use very simple words and short phrases to give feedback. In higher-level classes, teachers were mostly frustrated with the extensive amount of "unpaid" time they have to spend on giving feedback to their large groups of students.

In addition to the points mentioned above, Carol raised the point that "PBLA is hugely a Western concept" and trying to force adult and elderly learners – who mostly come from teacher-centred educational backgrounds – to accept learning this new system is sometimes "a waste of time and resources". She argued that although young learners and those who are trying to enter the Canadian workforce may find PBLA beneficial for their settlement in Canada, other learners who do not intend to work in the Canadian context might assume PBLA is wasting their time. Along similar lines, another issue raised here was the cost of LINC classes for the government. According to Violet, in cases where students stay in the same level for a long time and do not show any progress to move up to the next level, it may seem like "the resources and the tax-payers' money are being wasted". This issue recurs most conspicuously in classes where the majority of learners are seniors or those who frequently travel back and forth to their country of origin.

To summarize this section, based on the findings, the lack of instructional and assessment resources is a major issue regarding the practicality of PBLA. The findings also indicated that the challenges in organizing, keeping and carrying Language Companion binders are another potential source of frustration for both LINC teachers and students. Finally, all teachers in my study believed that utilizing PBLA is time-consuming, which leads to varying kinds of challenges in both lower-level and higher-level classes.
For instance, the time required for conducting speaking assessments in large classes and the difficulties in providing action-oriented feedback are further areas of concern regarding the practicality of PBLA. These perceptions echo the findings of previous research (e.g., Brown & Hudson, 1998; Lambdin & Walker, 1994; Ripley, 2012).

4.3.7 Conclusion

In this section, the findings related to LINC teachers' perceptions of the usefulness of PBLA in light of Bachman and Palmer's concepts of test usefulness (validity, reliability, authenticity, interactiveness, impact and practicality) were discussed. The findings showed that although all of my research participants believed that PBLA is a useful method of assessment, particularly in terms of interactiveness and impact on learners' lives, other aspects of PBLA such as validity, reliability, authenticity and practicality deserve further attention and reconsideration, which I will discuss in more detail in the concluding chapter.

In conclusion, while the findings were consistent with earlier studies indicating that PBLA is an effective assessment tool for formative assessment purposes (Delett, Barnhardt, & Kevorkian, 2001; Fox, 2008; Hamp-Lyons & Condon, 2000; Little, 2005), the results of this study support the idea that when it comes to summative assessment purposes, PBLA may not be the best option (Fox, 2008; Hargreaves, Earl, & Schmidt, 2002), and it should be used in combination with other forms of assessment. In the following section, I present a discussion of the findings concerning the logistical challenges in PBLA implementation.

4.4 Logistical challenges in implementing PBLA as an assessment tool

Implementing PBLA, as with any other form of assessment, inevitably involves certain pitfalls and challenges (Brown & Hudson, 1998; Fox, 2014; Ripley, 2012).
In investigating the teachers’ perceptions of the challenges of using PBLA, teachers talked about a variety of issues, some of which concerning the usefulness of PBLA as an assessment tool – which I discussed in the previous section – and the other ones regarding the logistical issues.  In what follows, I will mainly focus on participants’ perceived challenges in utilizing PBLA in relation to the following salient themes from the interviews:   Challenges in conducting needs assessment and introducing PBLA concepts  Challenges in training new teachers Challenges in conducting needs assessment and introducing PBLA concepts Another major issue reported by all the teachers was conducting monthly needs assessment and communicating PBLA concepts to learners. Prior to discussing needs assessment, it is important to note that most participants said that their programmes provide a curriculum for the teachers. However, all the teachers reported that they are required to conduct a needs assessment every month, based on which they create a module plan and develop teaching materials. Despite the considerable amount of time spent on creating and conducting a needs assessment, since the outcomes of needs assessment generally vary from one class to another, teachers stated that they often have to redesign their module plan and teaching materials for each individual class and thereby, they seldom can reuse what they have created for another class. According to Carol,       102 If you go to the Best Practices Guide, you are supposed to base your lessons entirely on the needs assessment, so if last year my class wanted, writing complaint letters, but this year, if I'm lucky enough to have same level, they don’t want complaint letters, I'm not supposed to teach that, but I think it’s not reasonable.   Teachers like Rose, Olivia, Linda and Violet who teach from Literacy to CLB 2 expressed the challenges they face in conducting needs assessment in their classes. 
They explained that, no matter how hard they try to explain what needs assessment is, many students still have trouble understanding it. For instance, according to Olivia,

We do needs assessment every month, and so we hope that eventually, maybe by the time they get to a CLB 3 or 4, they keep doing these needs assessment, they'll have some idea of what they're supposed to do, and they'll become familiar with it. But obviously, my students are so low, they're not really sure what they're doing. I know that they're not sure, and I know that sometimes they're just checking up the boxes because they're just looking around – because it's what someone else is doing, so I'm doing it too! I see my job is to build the first layer of maybe, possibly, years of being in a LINC programme, so that I'll be giving them a needs assessment, and when they move into the next level another teacher would do the same. By then, at some point, they should really understand that it's them that has to choose what they want to learn rather than me telling them what they need to learn.

In a similar fashion, Rose mentioned that lower-level learners have countless language needs and many things to learn, but that, often due to their limited language skills, they cannot articulate what they need to learn, which puts a heavy burden of responsibility on the teacher. As she put it,

… sometimes they don't know what they need to learn. They say, "Okay, teacher, you choose what we need to learn." I feel so much more responsibility compared to other instructors in higher levels to think of my students and their needs.

In terms of challenges in familiarizing learners with PBLA concepts, David added,

I think one of the challenges is involving learners in the process.
I think it is more challenging to do that, and then it kind of puts more pressure on the teacher too, to explain or unpack things, and this is difficult to engage them at the same level, especially for lower CLB levels I guess it's much more challenging.

In addition to the issues mentioned above, Violet raised the point that the majority of learners in LINC classes come from teacher-centred educational backgrounds, hence "sometimes, they're not used to being asked what they want to learn. So, I don't know how much thought they put into deciding what they want to study if they really get it." Similar to Rose and Olivia, Violet was also uncertain as to whether her students understand what needs assessment is; nevertheless, she was hopeful that they would eventually understand it as they progress to the next level, as shown in the remarks below:

I'm taking this approach. Well, once they're ready, they will get it, because they will continue. When they go to the next level and the next level. The teachers will ask them again to make choices, and I think when they're ready, they will get it. And then, they say, "Oh! So if I say I want to study this, the teacher would teach me that. Oh, okay". And then, they will pay more attention. It's a process for them.

Quite surprisingly, the challenges in administering needs assessment are not limited to lower-level classes. Even Carol, who teaches CLB 3, reported that her students are still not clear about the needs assessment:

I don't think the students understand needs assessment, especially when I try to tell them, "I am assessing your needs". First of all, they get confused, because the word assessment makes them think it's a test, and then I tell them, and I've told them every month, "this is where you tell me what you want to learn" and then they say, "oh, whatever you say teacher".
Another major challenge identified by the majority of teachers is explaining PBLA concepts to language learners across all levels. According to Olivia,

PBLA is created with this idea that everybody has some kind of level of education, that they're going to be able to understand, goal setting, needs assessments, reflections and that kind of thing. They're going to be able to organize the Language Companion binder. They're going to be able to make a list of their assessments that they've passed and all that kind of stuff. That is incredibly hard for one teacher to do with 15 students who really have no experience of thinking in that way, right?

Olivia believed that PBLA takes a certain cultural perspective, which may be "culturally biased", especially as far as Literacy and "multi-barrier learners" are concerned. She went on to say:

I think that a lot of the students that I have really have not experienced that kind of individuality, sense of identity and setting up goals for themselves and things like that. They just don't… Those concepts of thinking of themselves as this sort of person on their own with these specific goals in mind that they have, they just haven't thought of their lives in that way. So, it's making those assumptions that I think is really problematic actually.

In a similar vein, Carol viewed PBLA as a "culturally western notion", and believed that explaining PBLA concepts such as "self-assessment, peer assessment and self-reflection" to adult learners who come from teacher-centred educational backgrounds, and asking them to think and act this way, wastes a lot of class time. Additionally, Rose elaborated on the difficulty of introducing PBLA concepts to Literacy and CLB1 learners:

Our programme has continuous intake and every week I have a new student in class.
Trying to get out of that traditional ways of testing and trying to explain PBLA as an assessment tool to a CLB1 or I could say pre-CLB1 (because I also have Literacy students and my class is multilevel) has been a really challenging issue for my class and also for my students. I would say it works at some points, and it doesn't at some other points.

Rose also added, "trying to explain the PBLA terminology is a huge challenge for Literacy and CLB 1 teachers".

To conclude this section, teachers across various CLB levels were all in agreement that conducting needs assessment and introducing PBLA concepts to learners were not only time-consuming, but also labor-intensive and challenging, as also found in Ripley's (2012) study.

Challenges in training new LINC instructors

Besides the aforementioned challenges, a further issue raised by four participants was the difficulty of training new teachers in the LINC programme. Based on the participants' accounts, familiarity with the CLB document and the "Can Do Statements" is essential in utilizing PBLA. More importantly, PBLA implementation requires special training. Therefore, LINC instructors are required to attend a certain number of training sessions and workshops in their programmes to learn how to utilize PBLA in their classes. Consequently, when programmes hire new teachers, or even substitute teachers, they have to train them. In reality, however, this poses a great challenge for the programmes, as some participants noted in the comments below:

It's a huge issue, because our teachers have already received a course of 30 hours in training as well as detailed feedback on their assessments. So, when a new teacher comes in, with none of that training, what do you do? I guess the CCLB is developing some kind of Moodle course to address that, but right now it's a huge challenge for us!
(Sarah)

You have to have in mind that every teacher's level of acceptance and their level of experience with the CLB document is different. So, everybody is at a different stage. So, whenever they're getting to the PBLA training, whether it's the Lead Teachers or the rest of the teachers, there will be different levels of readiness. In my organization, I have some teachers who have taught for example very high levels in another department, and because of the changes that have happened in the last two years, they're now teaching in LINC. So, they're seeing the CLB document for the first time, and they will have a harder time than the teachers who have been in LINC for a longer time. So they require more time and more support in training. (Violet)

Well, the other problem is that I have been at my school for 3 years and I've been getting professional development for 3 years or something like that. But if you are a new teacher, you can't go back, and then get all that Pro-D and they can't do it over again. You know, there needs to be some sort of way to resolve that issue, but I don't think they have a method for managing that, I don't think any programme has that yet. And personally I'm kind of sick of it, like I would like to get professional development about something else, you know, 3 years of PBLA training is okay, but if you are a new teacher then you need all that. (Carol)

Additionally, Olivia raised the following critical questions regarding PBLA training and financial support:

Well, I don't know how other schools train new teachers and, you just can't make some people do it over and over and over again. So then, do you start splitting your professional development, may be new teachers over here, old teachers over there? Is there enough budget? The federal government cut professional development money and we have a lot less Pro-D money than we used to have with the provincial government. So, what do you do? I don't know!
Linda also elaborated on the issue of budget cuts in her programme, explaining that those who trained her cohort now have to volunteer their time, unpaid, if they want to train the new teachers. As she stated:

If new teachers come in and they don't know how to do PBLA – like we just got a whole bunch of new classes because of this extra part of money for Syrians – but the person who used to train teachers had to volunteer her time to meet with them and show them how to do things. She is not going to be able to do that very often, so the new teachers are not going to get as much support as they need, and we are going on asking each other, hoping that somebody will remember or have a better answer or had dealt with it before and had asked someone else. I think it's like putting teachers through an awful lot that really shouldn't be necessary. I mean there is not enough funding, there is not enough support and we shouldn't have to go through this kind of emotional anxiety because we don't know what to do! We are trying very hard to do what's right … but the government needs to support everybody down the line so that this could happen.

Conclusion

To recapitulate this section, in order to implement PBLA more efficiently, it is essential to consider the logistical issues that have been posing challenges in its implementation. Among all the challenges discussed earlier, teacher training and finding more effective ways to communicate PBLA concepts to LINC students (particularly lower-level learners) constitute two areas that merit more attention from researchers and policy makers.

4.5 Teachers' perceptions of PBLA training

In this section, I will provide an overview of the participants' perceptions of PBLA training. First, I will discuss the teachers' overall views on PBLA training. I will also discuss whether they believe their training adequately prepared them for designing assessment tools and tasks.
Second, I will share the participants' suggestions regarding how the quality of PBLA training and implementation could be improved.

Before elaborating on participants' responses, it is important to note that Olivia, Rose, Sarah, Violet and Nina (besides being LINC instructors) were also working as PBLA Lead Teachers in their programmes. Therefore, compared to other participants, they had received more in-depth training directly from PBLA regional coaches. The Lead Teachers in the study explained that, as part of their training, they were required to develop assessment tasks, assessment tools and rubrics as assignments, based on which they would receive rigorous feedback from their regional coaches. I should also note that since my research participants were all from various LINC service provider organizations, they were in different phases of PBLA implementation.

4.5.1 Teachers' general views on PBLA training

In order to probe into the teachers' views on PBLA training, I asked the participants to comment on the quality of PBLA training in their programmes. Interestingly, the data analysis revealed that teachers had very different experiences of PBLA training in their respective LINC programmes, and therefore had divergent views on the quality of their training. I will start this section by sharing the experiences of Lead Teachers with PBLA training, and then I will present the other LINC instructors' attitudes toward their training.

Olivia, a Lead Teacher – who has over 13 years of teaching experience in the LINC programme, and was teaching Literacy and CLB 1 at the time of the study – described her PBLA training as similar to "doing a Master's degree". She was very pleased with her training, described it as "extremely thorough", and believed that "it's too bad that everyone can't do that training".
Rose – who, besides being a Lead Teacher, taught Literacy and CLB 1 – started her PBLA Lead Teacher training two years ago. She explained that she was the only Literacy and CLB 1 teacher in her cohort, which made it hard to receive the support she needed during her training. She said,

During my training, I felt so lonely and isolated because there wasn't much collaboration and communication. I didn't know who to talk to, to make sure that I'm doing my assignments in the right way or not. I was getting lots of support from my cohort, from my regional coach, but her specialty was in Stage 2 and she wasn't much clear of… she didn't know much about CLB1 and how it would be like. So I didn't receive any specific training for CLB1 and how to write criteria or how to write an assessment task or any of them. I think our training was mostly generalized for our cohort. There were so many questions that weren't answered for me, and I had to figure out so many things by myself. It was very challenging and hard. I'm still figuring so many things out by myself.

She explained that not only during her training period, but even now in her programme, she is the only Literacy and CLB 1 teacher, which makes it hard to exchange ideas, collaborate and share teaching and assessment resources with other teachers. Therefore, she was very displeased with her PBLA training, because she had to learn everything on her own, given that the training materials in her cohort were tailored toward higher CLB levels.
Consequently, she was left with many unanswered questions and did not get a chance to learn how to develop assessment criteria and tasks for Literacy and CLB 1.

Sarah, another Lead Teacher – who has been teaching in the LINC programme for 21 years and has been using PBLA for the past two years – was not quite happy with her PBLA training. She elaborated on the fact that the exemplars they were given (in their cohort for training purposes) were developed by PBLA trainees in the previous cohort (Cohort 1). According to her,

We were given the previous cohort's assessments and these are trainees, as exemplars for our own work and then when we developed assessments based on those, we received lots of feedback telling us that's wrong, wrong, wrong, wrong, wrong! … so we didn't have PBLA coaches' own assessments or the developers did not make those assessments, they were assessments that trainees had come up with.

Thus, Sarah described the exemplars as "a big source of frustration" for their cohort and believed that PBLA "training could have been better." Nina, who was teaching CLB 4, received all of her training as a Lead Teacher online last year. She stated that she learned many things on her own by trying to find online resources and teaching herself, because, according to her, "there wasn't really something where they told us how to create for example rubrics, … so I had put a lot of time into it." On the other hand, Violet, who teaches CLB 2, was satisfied with the PBLA training, and (similar to Olivia) believed that the training was very thorough. She was very pleased with the feedback she received on PBLA training assignments, and believed it was very helpful and informative, as shown below:

I think like I'm happy with the training I have gotten as a Lead Teacher. I think when it comes to passing down this knowledge to our colleagues.
I don't know how that will look like, I think that's a big difference, because our coaches are really experts, but I'm not, I'm still learning. So I think in that area, the teachers really need to feel that their Lead Teachers are strong, and that they know the stuff to help them.

As far as the other LINC teachers are concerned, there were differing views on the quality of PBLA training. For instance, Linda – a Literacy and CLB1 teacher, who has been using PBLA for the past three years and has been teaching in LINC for 10 years – was very discontented with the way they received training in their programme. She explained that although they had been using PBLA since 2013, due to budget cuts, they did not receive proper PBLA training. According to her, they were using PBLA on their own until last November, when their programme received new funding for PBLA training. As she reported,

They did hand pick a few people and trained them to help others but for a year they were going through the training and then they cut the funding. I mean they paid them a little bit and then they cut it so now they can help if they want to but it's volunteer.

Therefore, she went on to say,

Our whole introduction to PBLA was kind of backwards, because for like a year we just did it we had no idea if we were doing it right; we didn't get any feedback. There was no funding to have anybody who was trained to come and look at them and it turns out we were doing it really wrong.

When I asked her how and when they realized they were doing it wrong, she responded:

… Well, about the third year and the government just put up this training, who knows. I mean, we live in B.C, we got it last, and they have been doing it for five years before we started it in Ontario. When our supervisors were calling Manitoba, because I think they had it done last before us, and asking them all sorts of questions.
Then now they are coming down with a specific way of how everything should be written. …

Along similar lines, Kate – a CLB 4 teacher who has been using PBLA and teaching LINC for two years – appreciated the efforts and hard work of the Lead Teachers in her programme. However, she believed that sometimes PBLA training becomes “too overwhelming” because the Lead Teachers tend to complicate things by focusing on too much detail and theory, rather than spending more time on providing clear and practical examples. She also commented on the inconsistency in PBLA training in her programme. She reported:

… we have three Lead Teachers, so sometimes the information conflicts depending on who's teaching and then who follows up. One question from somebody may receive three very different answers.

Maya – who has been teaching LINC 3, and using PBLA for the past two years – was relatively satisfied with the PBLA training in her school. However, since she felt the training had not been quite enough, she said she has been taking different measures to address her needs. She maintained,

I spend a lot of time seeking out my own… like talking to other teachers and learning from them. I have spent a lot of time comparing and discussing with other teachers at my level. I attend workshops as much as I can, like going to TEAL workshops or other online webinars and things like that, but I think I spend a lot of time seeking that out for myself. I think if somebody needs more training, if they seek it out themselves, it’s out there, it’s available. But I know there’s other teachers that I work with that don’t seek out any information and I think they don’t feel confident in making assessments.
Another LINC teacher, David – who worked part-time and as a substitute teacher in four different LINC programmes and taught multiple CLB levels – explained that he has been receiving PBLA training from all the different schools he has been working for, and expected that he may receive some “redundant” training in the future. However, he was positive that he “may get more from it”.

Contrary to the other participants’ accounts regarding the PBLA training, Carol – who has 10 years of teaching experience and is currently teaching CLB 3 – was very pleased with the PBLA training offered in her programme. She remarked:

We’ve received way more training than it was given to us by Immigration Canada [CIC] in the last year. Some of the training stuff we receive from CIC, are just for credentials, and almost like review for us, and I doubt it would be enough, if that’s all we had. In our programme, we have been receiving training on PBLA for the past 3 years. So little bit by little bit, starting with let's look at the CLB book, you know, three years ago, when some people are just starting now! So our programme has been really good at training, and I think we are exceptional in that way, and I have learned a lot.

The aforementioned findings in this section indicated that the quality of the PBLA training differs from programme to programme, and consequently teachers have divergent perspectives on the PBLA training they have received thus far. As indicated in the interviews, this could probably stem from the differing amounts of funding, and the resultant support, that different programmes receive for PBLA training. In addition, based on the interviews, it can be inferred that teachers’ satisfaction with the PBLA training really depends upon the relevance of the training to the teachers’ needs.
Teachers who have received more practical and hands-on training in designing assessment tasks and tools related to their levels seemed to be more pleased with their training than those who have received more theory-driven training. Furthermore, as one can expect, Lead Teachers who have been directly trained by the regional coaches generally showed more satisfaction with their training in comparison with the regular instructors who have been trained by their Lead Teachers. As the Lead Teachers indicated in the interviews, during their training, they are required to complete a number of assignments that include developing assessment tasks and tools, followed by rigorous feedback from regional coaches who are PBLA experts. However, the regular instructors do not necessarily have such assignments, and are being trained by Lead Teachers who are themselves still learning about PBLA.

4.5.2 Teachers’ views on PBLA training regarding developing assessment tools

In order to gain a better understanding of participants’ views on PBLA training, I also asked the teachers to share their experiences about the training they have received on different aspects of PBLA, such as designing assessment tools, developing assessment tasks and other areas of PBLA assessment. I also asked whether they felt confident that the PBLA training had prepared them to design and develop assessment tasks and assessment tools.

Kate explained that in her programme, teachers are still using the old CLB tests for assessment purposes. At this point, teachers are merely learning how to use PBLA and, as a result, they are not using the PBLA assessment results for making decisions about student levels. Commenting on the PBLA training on developing assessment tasks and assessment tools, she reported:

I think it was good, and then we practice making them and we'd look at examples.
It gave us a good foundation, but I think you can then only build on that by actually doing it and practicing. The approach that we're taking right now is that for the first few months, the assessments are more about the teachers, like “Was it a good rubric? Where do you need to change for next time?”, and after a few months – I'm not sure when – it'll be more about the students, how are the students doing with this. So that's good, at this point the pressure isn’t on the students, it's more on the teachers to refine their skills or to develop their skills.

Interestingly, Sarah, who is a Lead Teacher in one of the participants’ programmes, did not believe she had received enough training in developing assessment rubrics, and she said:

I don’t think I have, but I also shy away from rubrics because there are a lot of work. I use a rating scale so it’s like, “yes, almost, not yet”, I don’t give a description of what yes, looks like or what almost looks like or what not yet looks like. It’s just have you met this particular criteria? Do you have an appropriate greeting and closing? yes, almost or not yet.

Although, as Sarah noted, creating a descriptive rubric is very time-consuming and “a lot of work” for teachers, it is probably more beneficial for learners. Descriptive rubrics can clearly show the students the criteria for success. By reading the rubric descriptors, students can see what is expected of them and why they have received the evaluation (score) they have received. On the other hand, there is a big chance that some students may never read through those descriptors and may just pay attention to the final scores or evaluation.

According to Olivia, the PBLA training is just a starting point, but she believed that improving skills such as developing assessment tasks and creating assessment tools requires a lot of practice, as illustrated in her comment below:

Yes, it started to prepare me, but I don’t think it has prepared me enough.
I think it’s just by doing them that we get better at it. It might just be like keep practicing. I feel like I’m still practicing, I’m not... I think, that might be one of the problems with that, you always feel like you’re not quite good at it.

Generally, the common thread among all participants’ experiences with PBLA training was that there is always room for more training, and all teachers were of the opinion that they can hone their assessment development skills with more practice coupled with continuous support. In my view, lack of proper PBLA training can have negative impacts on students, especially if (or when) PBLA is used for summative assessment purposes. For instance, if teachers are not properly trained, they may not be able to make assessment tasks and tools that fairly assess learners’ proficiency levels. This could lead to students’ frustration and lack of motivation in learning. Instructors’ poor assessment skills can also bear negative consequences for those students who intend to become Canadian citizens, or those who aim to pursue better academic and professional opportunities in Canada. Therefore, if PBLA is to be used for summative assessment, it is essential that teachers receive thorough and in-depth training so that they feel confident about the assessment decisions they make.

4.5.3 Teachers’ suggestions for improving PBLA training and implementation

As for suggestions to improve the quality of PBLA, most participants suggested having (a) workshops that offer clear and simplified tips and strategies on designing assessment tasks and tools, and (b) a vetted resource bank containing updated assessment worksheets, samples of assessment tools in different skill areas, PBLA instructional materials, and so forth. Below, I provide some interview excerpts to demonstrate the participants’ recommendations for refining the PBLA training as well as its implementation.
To start with, emphasizing the need for more support in PBLA training, Linda remarked:

I believe how well PBLA works has a lot to do with how much support the teachers have. I think PBLA is a good idea, but you have to support people, you have to support the teachers, you have to support them with their time, with training, with formats, from the get go. So that they don’t have to spend so much time saying, “is this right? or is this wrong? or maybe I should do this”. I’m wasting my time going through all this anxiety when if it was just laid out – like first you do this, and then you do this, and this is the skeleton of the template, and this is how you do it – it would have made our lives much easier. In our programme we didn’t get that kind of training until last November, while we have been using PBLA for two years since 2013.

In the same way, Sarah expressed the necessity of having more resources as well as providing more financial support for the teachers: “I think to be perfectly honest I feel sometimes that they put the cart before the horse you know; I wish that there had been a bank of materials and resources for teachers”.

Moreover, Sarah talked about the pressure of increased workload on teachers, as also noted by Fox (2014) and Ripley (2012). Regarding the issue of the budget cut in the LINC programmes in B.C., Sarah added:

I think pretty much all the LINC programmes in B.C. are going to lose some funding and with that comes again a reduction in yearly wage, so it can be a hard sell when somebody's asking you to do so much more work but lower wage. It doesn't seem fair. So I wish, that had been considered more!

Sarah said that the PBLA Guide indicates that the maximum number of teaching hours should be 12 hours per week. If a teacher works 12 hours per week, she or he has more time to develop assessments and other materials. However, according to Sarah, in reality a full-time class is 25 hours per week.
Thus, preparing for class, designing skill-using activities, developing assessment tasks and tools, evaluating the assessments, making self- and peer-assessments, and creating needs assessments and templates for learning reflections all require time and energy, and it is not really “sustainable”. She said that when she was being trained as a Lead Teacher, she had to work 55 hours a week and still that was not enough.

Along similar lines, regarding suggestions for facilitating PBLA implementation, Violet suggested that there should be more collaboration among teachers. She also pointed out that although sharing resources is very beneficial and helpful, it is essential for an expert to check the quality and accuracy of those resources. Additionally, she highlighted the dire need for calibration sessions in order to make sure teachers have the same understanding of the CLB document in their evaluations, because without proper calibration training, the teachers may have different interpretations of the CLB document, which may consequently compromise the reliability of PBLA assessment outcomes. As she commented:

I guess having time to develop either modules or assessments in collaboration. So, collaboration I think is what I would like to see more. And maybe in the future, there will be some more resources developed. So, sharing those resources, organizing and making sure that what is out there is accurate. Because we can have all these samples and we don’t know how good they are. So, I’m not saying like somebody who will check them, approve them. But some kind of standard that we need to kind of adhere to, and I think calibration sessions are a must. That’s very crucial because without that, I can interpret the CLB document and my students' results anyway I want to. And giving teachers more money, like giving them time that is paid. I’m sure that will motivate everybody.
In addition, Violet stressed the need for more training and Lead Teachers’ support:

It depends on each organization, and again, how much budgets a programme has. Also, it depends on the teachers and their motivation basically. How much time they want to put into it. It’s not easy. I think maybe having more Lead Teachers, or having a kind of mechanism for ongoing support. Having, drop in sessions for the teachers. That they can talk with someone. …. So, that kind of help I think is really great that I don’t have to worry too much if I don’t know the answer, … like having a 24-hour PBLA helpline, because sometimes you need help now. …Having somebody like a coordinator or somebody who you can go to with a question, not just online by email. Because I am sometimes feeling out in the dark, like I don’t know if I’m getting it right.

Similarly, Nina referred to her personal experience as a Lead Teacher in training other teachers in her programme, and expressed the need for more training on creating assessment tools:

Most importantly, if they had training on different themes, and I guess not necessarily themes, activities, and more workshops on how to create tools, I think that's the biggest part we need help with. Having some kind of text, brochure, audio or videos on creating assessment tools to assess students’ abilities. Right now, I'm training other teachers, and what I see is that they are having struggles creating assessment tasks and tools. And if whoever is up there, if they provided more trainings on creating assessment tools, giving us a chance maybe work in groups to use different skills, different activities, different tasks to learn to make assessment tools. So, mostly on choosing real-world tasks, and then creating assessment tools. So every teacher knows how to create an assessment tool in order to measure a student's performance.
Carol, who was quite satisfied with the rigorous training in her programme, also stressed the need for more support, particularly in terms of more resources such as teaching and assessment materials. She said, “I think we need physical support in terms of teaching and assessment materials, like not teach me how to do it, but make it, so that I don’t have to do it all the time”.

Reflecting upon her experience as a Lead Teacher, Rose talked about time constraints and the challenges in training her colleagues:

As Lead Teachers, we have very limited time frame in our training sessions that we have to go through each activity back to back, not giving teachers enough time to figure things out, … and you have to move forward and you have to finish the outline before… it’s a three-hour outline, and it’s pretty packed, and that leaves our instructors with many questions. So when we went through training our colleagues, we got a lot of complaints from them, they were saying “we are confused, we don’t know where we are”, and we had a pretty tight timeline to go through each workshop. We got very negative feedback from our teachers, they were so overwhelmed and stressed, and there wasn’t enough time for Q&A. There wasn’t enough time to write a module plan for a topic. I believe, it divides us, it creates a distance between lead instructors and teachers. Now we are trying to collaborate more with more Pro-D days and dialogue days that we can sit and communicate well.

Based on her experience, Rose suggested that in order to enhance the quality of PBLA training, it is crucial to allocate more time for training sessions to make sure teachers have grasped the ideas thoroughly. On the other hand, David, who works for four different LINC programmes and is being trained by all of them, suggested that online asynchronous training could be more practical for teachers who are in a similar situation.
He believed that through online training, teachers can be more in control of their training, and can focus on the areas they need to learn and avoid repetitious training.

On a different note, Olivia suggested that PBLA training could be incorporated into the TESOL programme. Thus, those who want to work as LINC instructors can have an extra month of training for PBLA at the end of their TESOL training. Moreover, like other teachers, she also added that the quality of the LINC programme and PBLA implementation would be improved by having more funding, having regular, homogeneous classes with learners of one level in each class, and “having classes with a fixed start and end date for each term rather than continuous intake”.

More importantly, all teachers who participated in this study believed that a bank of resources would save them time and could facilitate PBLA implementation. According to Sarah, “What I see as the challenge is the lack of resources. For example, an assessment bank would be very helpful for teachers, because right now the workload for teachers is heavy and that’s a big challenge.” Similarly, Violet pointed out:

I realize that PBLA is something new and there can’t be enough resources at this point. There is some stuff to start us off, but I don’t think it’s enough. I hope somebody is thinking for all of us, ahead of us, and planning to offer more support. And there are a lot of classes in which you teach multiple levels like CLB two and three, or CLB three and four together, doing PBLA in those split levels is very challenging, especially when you are teaching full time and having two different levels. It’s a lot of work for the teachers. So I think a resource bank would be very beneficial for all of us.

Furthermore, regarding assessment and material development, Sarah argued that “the language needs of immigrants and refugees are essentially the same”, and as such, a bank of assessment tasks would be very beneficial.
Many teachers, including David, Maya and Kate, believed that they are constantly “reinventing the wheel”. Even though most teachers had a preference for developing their own materials, the majority of them expressed an urgent need for “a bank of resources”. As Rose remarked, “a bank of resources would really facilitate our job, not feeling desperate by looking for resources”. However, teachers emphasized that the resource bank would be highly beneficial only if the quality of the materials is controlled. As David and Sarah commented, a vetted and filtered bank of resources can allow teachers to see what their peers have done; therefore, it can inspire instructors and give them a sense of what they should be striving for and what the assessments should look like.

On a different note, Kate was concerned that cultural aspects of language learning are being ignored, since with PBLA there is not enough time to have engaging class discussions, as she argued:

We're trying to give them real-world skills and I think they are gaining a lot of detail on very distinct skills. But they're not getting sort of a global or holistic picture of Canada or of English or of Canadian culture or even just communicating in the language.

Kate believed that although PBLA aims to teach real-world skills, the students are only “becoming better test takers”, because the assessments are identical to the skill-using activities and the students already know how to complete the assessments. As she put it, “because they know the format of these assessments which takes away from the real-world aspect, it's not spontaneous anymore, it's not organic or natural”.
Kate was discontented with the fact that most of the class time has to be spent either on preparing students for the assessments or on doing the assessments, “to the exclusion of talking about culture, talking about Canada issues, talking about issues that they have with their children who are becoming very Canadian but the parents don't like that, [etc.]”.

Kate’s concerns about spending too much class time on some unnecessary aspects of PBLA were shared by other teachers too. For instance, even Linda, who teaches lower levels, was notably displeased with the fact that sometimes, rather than focusing on teaching what learners really need, teachers have to focus on what is mandated by the school curriculum, as she remarked:

The fact that there is very little focus on workplace now I think it’s sad. People want to feel like an adult, not like a baby and if they can do what they did in their country if they can show their skills and develop new skills and feel good about themselves, they are going to learn faster and they are going to feel better because now they are providing for their families, well trying to anyway because they are struggling. You have got to look at the self-esteem of the person not just the marks on the paper. You have got to look at the whole scenario and not just ‘did they write this thank you letter properly; did they show that they can do some symbolic reference to what the real-word task is’. For me it’s got to be more than that, you have got to focus on the type of negotiating skills that are necessary for newcomers.
Linda explained that they have a set curriculum and have to cover 16 language competency areas within 16 weeks, but she said:

I have trouble with that because there is times when in certain themes you really need to do a whole lot more than the social aspects such as writing a get well card for a co-worker, well, if you don’t give them job skills so that they can get a job they are not going to be writing letters to a co-worker.

Linda believed that the two most important competency areas that are highly useful for lower-level learners are “following and giving instructions” and “sharing information and getting things done”, since these are what the learners need most in their daily lives and at work. Therefore, in Linda’s view, in order to help learners improve their language skills in such competency areas, it is important to spend more time on activities that provide more chances for learners to practice these language functions. For instance, she explained that instead of spending class time on teaching learners to write a “get well letter to a co-worker”, it is more beneficial to focus on activities that are more useful for learners. In Linda’s opinion, at the Literacy and CLB 1 level, teachers should not be obliged to strictly follow the set curriculum, which may not effectively address the learners’ language needs; instead, teachers should have more freedom of choice in selecting the themes that are more practical and beneficial for their particular group of learners. Linda’s and Kate’s comments demonstrate how PBLA can sometimes impose restrictions on teachers in terms of time constraints and choice of topics to be taught.

4.5.4 Conclusion

To summarize this section, the findings regarding LINC teachers’ perceptions of PBLA training indicate that teachers in this study had very disparate experiences in terms of the quality of PBLA training in different LINC programmes.
The majority of the teachers noted that PBLA training has not been adequate to prepare them for designing PBLA assessment tools. The participants suggested a variety of options that can potentially lead to improving PBLA implementation. Among all the recommendations, those that stood out most were: 1) providing more support in teacher training, 2) augmenting financial support for teachers, 3) creating a bank of assessment tools and resources, and 4) creating a more flexible time frame for teacher training.

In this chapter, the findings of the study in relation to the research questions and the most salient themes from the interviews were presented. In the concluding chapter, I will provide a summary of the findings followed by a discussion of the limitations and implications of the study. Lastly, I will offer suggestions for future research and present some concluding remarks.

Chapter 5: Conclusions

5.1 Summary

The findings of this study are consistent with the results of previous research in this vein that demonstrated the usefulness of portfolio assessment for students (e.g., Delett, Barnhardt, & Kevorkian, 2001; Little, 2005, 2009; Little & Erickson, 2015, among others). Overall, all the LINC instructors in my study reported that PBLA provides more opportunities for learners to be involved in the assessment process. In addition, through PBLA, language learners are given the chance to be further engaged in instructional and assessment tasks that simulate real-life tasks, which, in turn, can serve to improve the language skills requisite for and relevant to their lives in the Canadian context.
Furthermore, while teachers who were teaching Literacy, CLB 1 and CLB 2 believed that providing action-oriented feedback for lower-level students is not practical, teachers who were teaching higher CLB levels (CLB 3 and CLB 4) reported that action-oriented feedback helps learners develop a better understanding of their areas of strength and weakness and what they need to work on in order to improve their language skills; as a result, it helps learners take more responsibility in their language learning. Moreover, all teachers generally agreed that students like the Language Companion binders, despite the fact that the binders are heavy and very challenging and time-consuming to keep organized.

However, in spite of all its benefits for learners, since the reliability aspect of PBLA is questionable, it may lead to some test fairness issues when it is used for summative assessment. Due to the impact of PBLA on students’ Canadian citizenship applications, and professional or academic opportunities, PBLA can be considered a high-stakes test, and its reliability issues may negatively impact the students in the LINC programme. In addition, PBLA implementation poses a number of challenges for LINC teachers. First and foremost, all teachers who participated in this study reported that with PBLA they have to take on additional roles without receiving extra compensation for their new responsibilities; as David pointed out, besides being teachers, they are now “material developers, assessment designers, assessors and assessment tools developers”. The findings showed that designing real-life instructional materials and assessment tasks that align with the CLB document is very time-consuming and labour-intensive, and some teachers were uncertain as to whether their assessment materials were accurate enough.
Therefore, teachers expressed an urgent need for a vetted bank of assessment resources for different skill areas and levels so that they do not have to “reinvent the wheel” all the time, as they put it. In addition, all the teachers in the study requested calibration workshops to make sure their evaluations are in line with those of other teachers who teach the same levels. The findings indicated that the quality of PBLA training has differed across LINC programmes. Although a few teachers were quite happy with the PBLA training, most LINC teachers in this study believed that the training they have received has not adequately prepared them for designing assessment tools and assessing portfolios.

Furthermore, teachers across all the CLB levels were discontented with the extensive amount of time they have to spend (or in other words, waste) on organizing Language Companion binders and conducting needs assessment. Not only the instructors who teach lower levels but also those teaching CLB 3 and 4 commented that their students do not have a clear understanding of needs assessment. Therefore, some teachers were of the opinion that prior to PBLA they could use the class time more efficiently and had more time for enriching class discussions. Another reported area of concern was providing action-oriented feedback that can move the learners forward. First of all, all teachers explained that providing action-oriented feedback to large classes is very time-consuming. Second, teachers who were teaching lower CLB levels believed that communicating the feedback to lower-level students is quite challenging.

Finally, as noted earlier, the majority – eight out of ten – of the teachers who participated in my study were still in the PBLA training phase and had not conducted a full portfolio review and assessment yet.
However, the two teachers who were fully implementing it believed that PBLA is only beneficial for formative assessment purposes (Fox, 2008; Little, 2005), because it informs teaching and it provides ample opportunities for learners to become aware of their strengths and weaknesses in order to gradually hone their language skills. These two teachers shared their experiences about assessing portfolios and recounted various examples of difficulties and logistical issues they have encountered in utilizing PBLA as a summative assessment tool, and explained that PBLA should be accompanied by other assessment tools for summative assessment purposes. This is in line with the findings of previous studies on portfolio assessment, which suggest that using portfolios for summative assessment is often problematic (Fox, 2008, 2014; Hargreaves, Earl, & Schmidt, 2002).

5.2 Implications of this study

The findings of this study shed light on a small number of LINC teachers’ perceptions of the usefulness of PBLA in the LINC programme. Besides elaborating on the benefits of PBLA for the language learners, the teachers in my study shared their insightful views about the challenges they face in PBLA implementation. Based on my research findings, in the following, I make four key recommendations which can possibly facilitate PBLA implementation.

The first recommendation, based on what the participants in my study suggested, concerns PBLA training. The teachers requested more training in developing assessment tasks and tools in different skill areas, particularly for multilevel classes. They also requested calibration sessions with a PBLA expert and other teachers who teach the same levels. Although the teachers did not specify the need for training regarding action-oriented feedback, in my view, this needs to receive more attention from PBLA coaches.
I think providing some clear and straightforward examples of action-oriented feedback geared toward different language skills for different CLB levels would be highly beneficial for teachers. In addition, some teachers asked for online and asynchronous training. Fortunately, this has already been addressed quite recently. In fact, the Centre for Canadian Language Benchmarks (CCLB) has recently created an E-learning portal on the CCLB website, where LINC instructors can register for free online PBLA training courses. These courses involve PBLA training, PBLA implementation support forums related to different PBLA cohorts, and PBLA training for Lead Teachers. In addition, other training materials on this website include a CLB Bootcamp, CLB training courses, as well as CLB Literacy courses. Although these training materials are now available to all LINC teachers across Canada, it would be interesting to find out how teachers rate them in terms of their quality.

The second recommendation, based on the results, is creating a bank of resources. There are already some PBLA resources available on the Tutela website, and I assume there are some materials on the CCLB E-learning Portal. Besides, some schools are already compiling resources in their individual programmes. However, all the LINC teachers in my study were skeptical about the quality of such resources. It would be really helpful if creating resources were treated as a profession and if specialists were paid to create assessment materials in different skill areas and for different levels. That would save teachers’ time and energy tremendously. It is also worth noting that as I was conducting the interviews, I noticed that teachers were very eager to collaborate with other teachers to share their experiences and PBLA resources.
However, a couple of teachers were not sure whether they were allowed to share their instructional and assessment materials with peers who work for different LINC programmes. As one of the participants noted, in many programmes it is against school policy to share resources with colleagues from other programmes. Therefore, I think it would be very advantageous if measures could be taken so that all same-level teachers from different programmes could collaborate and share their experiences and resources with one another without being constrained by their programmes.

My third recommendation relates to needs assessment. The findings of my study showed that conducting needs assessment is not very useful or efficient because, as all the participants reported, the majority of students – especially lower-level learners – prefer to rely on their teacher's decisions about what needs to be taught. I think that rather than spending (or wasting) class time asking students what they want to learn, it would be much more efficient for teachers to use their own discretion and rely on their own professional judgment about the needs of their students. After all, as one of the participants noted, the language needs of newcomers are essentially the same.

Last but not least, it is essential to provide more financial support for LINC teachers. PBLA is a unique and useful method of assessment for learners. As far as teachers are concerned, however, PBLA means nothing but added workload and constant stress. In order to implement PBLA more successfully, policy makers should take into account the tremendous amount of pressure that has been placed on teachers. Teachers need to receive more financial support in return for the extensive time they put into creating instructional materials, developing assessment tasks, designing assessment tools, and giving action-oriented feedback.
With these recommendations in mind, I hope that this study serves as a communication bridge connecting the voices of LINC teachers to the policy makers who are directly or indirectly involved in this national project – e.g., the Centre for Canadian Language Benchmarks (CCLB), Immigration, Refugees and Citizenship Canada (IRCC), and the Ministry of Citizenship & Immigration (MCI). The findings of my study thus have implications for LINC service providers, LINC administrators and teachers, and other policy makers. More importantly, the results of this study will affect language learners in the LINC programme, particularly those who are trying to meet the language requirement for the Canadian citizenship application. I hope my research can lead to the development of support and scaffolding training programmes aimed at a more effective and efficient implementation of PBLA in LINC programmes across Canada.

5.3 Limitations of the study

In terms of the limitations of the study, I should first emphasize that it was very difficult to find participants. The LINC teachers who agreed to participate said that some of their colleagues had a lot to say about their experiences with PBLA, but their heavy workload did not allow them the time to participate. That said, there are only two main limitations that should be noted here.

First, since PBLA has not been fully implemented in British Columbia, the majority of my participants – eight out of ten – were still in the PBLA training phase. As a result, most of the LINC teachers in my study were still learning how to implement PBLA in their classes. Only two teachers had experience evaluating the final portfolios; the other eight were only evaluating the assessment tasks that go into the portfolios.
Therefore, I had to modify the interview questions that related to the reliability and validity aspects of portfolio assessment. The second issue worth mentioning is that half of the participants – five out of ten – were PBLA Lead Teachers who had received their training directly from PBLA regional coaches, who are experts in PBLA implementation. It is therefore fair to infer that they generally had a more positive attitude toward their PBLA training and its implementation. The other five teachers had received their training from the Lead Teachers in their programmes and thus had a different training experience. Although I was privileged to hear the insights of both PBLA Lead Teachers and regular LINC instructors, some of the views about PBLA expressed by the Lead Teachers in my study may not fully represent the voices of other LINC instructors regarding their experience with PBLA implementation.

5.4 Recommendations for future research

This qualitative study was exploratory in nature and thus involved only a small number of participants (N = 10) to investigate LINC teachers' perceptions of the usefulness of PBLA. Future studies in this vein could use a mixed-methods design and involve a larger number of LINC teachers to explore the challenges facing teachers across different CLB levels, particularly with respect to the length of time they have used PBLA. Another gap in the existing literature concerns language learners' perceptions of and experiences with PBLA as a method of assessment. Investigating language learners' perceptions of PBLA would not only enrich the literature and add to the existing body of knowledge about PBLA, but could also have implications for both educators and policy makers involved in the implementation of PBLA in the LINC programme.
Furthermore, I think there is a pressing need for research on teachers' evaluations of student portfolios in order to investigate whether, and to what extent, PBLA as a tool for summative assessment can fairly assess the language proficiency of learners. Finally, since the vast majority of studies on portfolio-based language assessment have tended to focus on writing proficiency, future studies are needed on other language skills such as reading, listening and speaking. In addition, despite the importance of validity and reliability in any form of assessment, including PBLA, there is a paucity of research investigating the validity and reliability of teachers' actual assessments of student portfolios.

5.5 Concluding remarks

By and large, the findings of my study concur with those of previous studies conducted on PBLA in the Canadian context (e.g., Fox, 2014; Fox & Fraser, 2012; Ripley, 2012). Quite surprisingly, though, while PBLA has been implemented on the East Coast of Canada since 2013, the challenges facing LINC teachers in the initial stages of PBLA implementation three years ago seem to have remained unaddressed and are still being reported now on the West Coast. In my view, PBLA, as a national project, merits more attention not only from researchers but, perhaps more importantly, from policy makers. When developing and formulating new policies and initiatives, as in the case of PBLA, which is gradually replacing traditional assessment in LINC programmes all over Canada, it is extremely important to take into account, and to the extent possible act upon, the findings and implications of relevant research. By making research-informed policies and decisions, the recurring issues, challenges, and pitfalls identified in the past can be duly addressed and judiciously avoided.
Hopefully, doing so would in turn not only pave the way for the effective and smooth implementation of PBLA, but also contribute to and enhance its sustainability in LINC programmes across Canada.

References

Aydin, S. (2010). EFL writers' perceptions of portfolio keeping. Assessing Writing, 15(3), 194-203.

Bachman, L. F. (2002). Alternative interpretations of alternative assessments: Some validity issues in educational performance assessments. Educational Measurement: Issues and Practice, 21(3), 5-18.

Bachman, L. F., & Palmer, A. (1996). Language testing in practice: Designing and developing useful language tests. New York, NY: Oxford University Press.

Bailey, K. M. (1996). Working for washback: A review of the washback concept in language testing. Language Testing, 13(3), 257-279.

Baume, D., & Yorke, M. (2002). The reliability of assessment by portfolio on a course to develop and accredit teachers in higher education. Studies in Higher Education, 27(1), 7-25.
Belanoff, P., & Dickson, M. (1991). Portfolios: Process and product. Portsmouth, NH: Boynton/Cook Publishers.

Biggs, J., & Tang, C. (1997). Assessment by portfolio: Constructing learning and designing teaching. Research and Development in Higher Education, 79-87.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education, 5(1), 7-74.

Black, P., & Wiliam, D. (2001). Inside the black box: Raising standards through classroom assessment. Retrieved from

Bowie, C., Taylor, P., Zimitat, C., & Young, B. (2000). Electronic course portfolio in a new on-line graduate certificate in flexible learning. HERDSA News.

Boyd, M. (1992). Immigrant women: Language, socio-economic inequalities, and policy issues. In B. Burnaby & A. Cumming (Eds.), Socio-political aspects of ESL (pp. 141-159). Toronto, ON: OISE Press.

Brown, J. D., & Hudson, T. (1998). The alternatives in language assessment. TESOL Quarterly, 32(4), 653-675.

Buck, G. (1988). Testing listening comprehension in Japanese university entrance examinations. JALT Journal, 10(1), 15-42.

Burnaby, B. (1992). Official language training for adult immigrants in Canada: Features and issues. In B. Burnaby & A. Cumming (Eds.), Socio-political aspects of ESL (pp. 3-34). Toronto, ON: OISE Press.

Callahan, S. (2001). When portfolios become a site of ethical conflict: Using student portfolios for teacher accountability. Educational Assessment, 7(3), 177-200.

Centre for Canadian Language Benchmarks. (2012). Canadian Language Benchmarks: English as a second language for adults. Ottawa, ON: Centre for Canadian Language Benchmarks.

Citizenship and Immigration Canada. (2010). Evaluation of the Language Instruction for Newcomers to Canada (LINC) Program. Retrieved from

Citizenship and Immigration Canada. (2013). Operational bulletin 510, March 15, 2013. Retrieved from

Condon, W., & Hamp-Lyons, L. (1991). Introducing a portfolio-based writing assessment: Progress through problems.
In P. Belanoff & M. Dickson (Eds.), Portfolios: Process and product (pp. 231-247). Portsmouth, NH: Boynton/Cook.

Cooper, T. (1999). Portfolio assessment: A guide for lecturers, teachers and course designers. Perth: Praxis.

Cooper, T., & Love, T. (2000). Portfolios in university-based design education. In C. Swann & E. Young (Eds.), Re-inventing design education in the university (pp. 159-166). Perth: School of Design, Curtin University.

Corbin, J., & Strauss, A. (2014). Basics of qualitative research: Techniques and procedures for developing grounded theory. Thousand Oaks, CA: Sage Publications.

Cummins, P. W., & Davesne, C. (2009). Using electronic portfolios for second language assessment. The Modern Language Journal, 93(s1), 848-867.

Davis, J. (1994). Authentic assessment: Reading and writing. In C. Hancock (Ed.), Teaching, testing, and assessment: Making the connection (pp. 139-162). Lincolnwood, IL: National Textbook Company.

Delett, J. S., Barnhardt, S., & Kevorkian, J. A. (2001). A framework for portfolio assessment in the foreign language classroom. Foreign Language Annals, 34(6), 559-568.

Derwing, T. M., & Waugh, E. (2012). Language skills and the social integration of Canada's adult immigrants (IRPP Study, 31). Montreal, QC: Institute for Research on Public Policy.

Dieleman, C. (2012). Provincial ESL programming. International Settlement Canada (INSCAN Special Issue), 7-11. Retrieved from

Duff, P., Wong, P., & Early, M. (2000). Learning language for work and life: The linguistic socialization of immigrant Canadians seeking careers in healthcare. Canadian Modern Language Review, 57(1), 9-57.

Ekbatani, G. (2000). Moving toward learner-directed assessment. In G. Ekbatani & H. Pierson (Eds.), Learner-directed assessment in ESL (pp. 1-11). Mahwah, NJ: Erlbaum.

Fox, J. (2008). Alternative methods of assessment. In E. Shohamy & N.
Hornberger (Eds.), The encyclopedia of language and education: Vol. 7. Language testing and assessment (pp. 97-109). New York, NY: Springer.

Fox, J. (2009). Moderating top-down policy impact and supporting EAP curricular renewal: Exploring the potential of diagnostic assessment. Journal of English for Academic Purposes, 8, 26-42.

Fox, J. (2014). Portfolio based language assessment (PBLA) in Canadian immigrant language training: Have we got it wrong? Contact, 40(2), 68-83. Retrieved from

Fox, J., & Fraser, W. (2012). Report on the impact of the implementation of PBLA on LINC teachers. Internal technical report submitted to Citizenship and Immigration Canada, the Government of Canada, January 2012.

Fox, J., & Hartwick, P. (2011). Taking a diagnostic turn: Reinventing the portfolio in EAP classrooms. In D. Tsagari & I. Csepes (Eds.), Classroom-based language assessment (pp. 47-62). Frankfurt: Peter Lang.

Frederiksen, J. R., & Collins, A. (1989). A systems approach to educational testing. Educational Researcher, 18(9), 27-32.

Gearhart, M., Herman, J. L., Baker, E. L., & Whittaker, A. K. (1992). Writing portfolios at the elementary level: A study of methods for writing assessment (CSE Tech. Rep. No. 337). Los Angeles, CA: UCLA Graduate School of Education, National Center for Research on Evaluation, Standards and Student Testing (CRESST).

Gearhart, M., Herman, J. L., Baker, E. L., & Whittaker, A. K. (1993). Whose work is it? The validity of large-scale portfolio assessment (CSE Tech. Rep. No. 363). Los Angeles, CA: University of California, Center for Research on Evaluation, Standards, and Student Testing.

Gearhart, M., & Herman, J. L. (1998). Portfolio assessment: Whose work is it? Issues in the use of classroom assignments for accountability. Educational Assessment, 5(1), 41-55.

Genesee, F., & Upshur, J. A. (1996). Classroom-based evaluation in second language education. Cambridge: Cambridge University Press.

Hamp-Lyons, L.
(1997). Washback, impact and validity: Ethical concerns. Language Testing, 14(3), 295-303.

Hamp-Lyons, L., & Condon, W. (2000). Assessing the portfolio: Principles for practice, theory, and research. Cresskill, NJ: Hampton Press.

Haney, W., & Madaus, G. (1989). Searching for alternatives to standardized tests: Whys, whats, and whithers. Phi Delta Kappan, 70(9), 683-687.

Haque, E., & Cray, E. (2010). Between policy and practice: Teachers in the LINC classroom. Contact, 36(2), 68-83. Retrieved from

Hargreaves, A., Earl, L., & Schmidt, M. (2002). Perspectives on alternative assessment reform. American Educational Research Journal, 39(1), 69-95.

Heller, J. I., Sheingold, K., & Myford, C. M. (1998). Reasoning about evidence in portfolios: Cognitive foundations for valid and reliable assessment. Educational Assessment, 5(1), 5-40.

Herman, J. L., Gearhart, M., & Baker, E. L. (1993). Assessing writing portfolios: Issues in the validity and meaning of scores. Educational Assessment, 1(3), 201-224.

Herman, J. L., & Winters, L. (1994). Portfolio research: A slim collection. Educational Leadership, 52(2), 48-55.

Hickey, D. T., Zuiker, S. J., Taasoobshirazi, G., Schafer, N. J., & Michael, M. A. (2006). Balancing varied assessment functions to attain systemic validity: Three is the magic number. Studies in Educational Evaluation, 32(3), 180-201.

Hirvela, A. (1997). Disciplinary portfolios and EAP writing instruction. English for Specific Purposes, 16(2), 83-100.

Holmes, T., Kingwell, G., Pettis, J., & Pidlaski, M. (2001). Canadian Language Benchmarks 2000: A guide to implementation. Ottawa, ON: Centre for Canadian Language Benchmarks.

Huerta-Macias, A. (1995). Alternative assessment: Responses to commonly asked questions. TESOL Journal, 5(1), 8-11.

Huot, B. (2002). (Re)Articulating writing assessment for teaching and learning. Logan, UT: Utah State University Press.

Huot, B., & Williamson, M. (1997).
Rethinking portfolios for evaluating writing: Issues of assessment and power. In K. Yancey & I. Weiser (Eds.), Situating portfolios: Four perspectives (pp. 43-56). Logan, UT: Utah State University Press.

Johns, A. M. (1991). Interpreting an English competency examination: The frustrations of an ESL science student. Written Communication, 8(3), 379-401.

Johnson, R. S., Mims-Cox, J. S., & Doyle-Nichols, A. (2010). Developing portfolios in education: A guide to reflection, inquiry, and assessment (2nd ed.). Thousand Oaks, CA: Sage.

Johnson, R. L., McDaniel, F., & Willeke, M. J. (2000). Using portfolios in program evaluation: An investigation of interrater reliability. American Journal of Evaluation, 21(1), 65-80.

Johnston, B. (2004). Summative assessment of portfolios: An examination of different approaches to agreement over outcomes. Studies in Higher Education, 29(3), 395-412.

Klenowski, V. (2002). Developing portfolios for learning and assessment: Process and principles. London: Routledge.
Koretz, D. M., Stecher, B. M., Klein, S. P., McCaffrey, D., & Deibert, E. (1994). Can portfolios assess student performance and influence instruction? The 1991-92 Vermont experience (Vol. RP-259). Santa Monica, CA: RAND Corporation.

Kvale, S., & Brinkmann, S. (2009). InterViews: An introduction to qualitative research interviewing. Thousand Oaks, CA: Sage.

Lam, R., & Lee, I. (2010). Balancing the dual functions of portfolio assessment. ELT Journal, 64(1), 54-64.

Lambdin, D. V., & Walker, V. L. (1994). Planning for classroom portfolio assessment. The Arithmetic Teacher, 41(6), 318-324.

LeMahieu, P. G., Gitomer, D. H., & Eresh, J. A. T. (1995). Portfolios in large-scale assessment: Difficult but not impossible. Educational Measurement: Issues and Practice, 14(3), 11-28.

Lewkowicz, J. A. (2000). Authenticity in language testing: Some outstanding questions. Language Testing, 17(1), 43-64.

Little, D. (2005). The Common European Framework and the European Language Portfolio: Involving learners and their judgments in the assessment process. Language Testing, 22, 321-336.

Little, D. (2007). Language learner autonomy: Some fundamental considerations revisited. Innovation in Language Learning and Teaching, 1(1), 14-29.

Little, D. (2013). Learner autonomy: Drawing together the threads of self-assessment, goal setting and reflection. Retrieved from

Little, D., & Erickson, G. (2015). Learner identity, learner agency, and the assessment of language proficiency: Some reflections prompted by the Common European Framework of Reference for Languages. Annual Review of Applied Linguistics, 35, 120-139.

Lynch, B., & Shaw, P. (2005). Portfolios, power, and ethics. TESOL Quarterly, 39, 263-297.

Makosky, L. (2008). The feasibility and way forward for a standardized exit assessment and test for newcomers in LINC training. Ottawa, ON: Internal report prepared for Citizenship and Immigration Canada.

Mathews, J. (2004). Whatever happened to portfolio assessment?
Education Next, 4(3), 73-75.

Moss, P. (1994). Validity in high stakes writing assessment: Problems and possibilities. Assessing Writing, 1, 109-128.

Moya, S. S., & O'Malley, J. M. (1994). A portfolio assessment model for ESL. Journal of Educational Issues of Language Minority Students, 13, 13-36.

Nagy, P., & Stewart, G. (2009). Research study on potential approaches to second language assessment. Ottawa, ON: Internal report prepared for Citizenship and Immigration Canada.

National Language Placement and Progression Guidelines. (2013). Retrieved from

Nolet, V. (1992). Classroom-based measurement and portfolio assessment. Assessment for Effective Intervention, 18(1), 5-26.

Nystrand, M., Cohen, A. S., & Dowling, N. M. (1993). Addressing reliability problems in the portfolio assessment of college writing. Educational Assessment, 1(1), 53-70.

Omaggio Hadley, A. (1993). Research in language learning: Principles, processes, and prospects. Lincolnwood, IL: National Textbook Company.

Pettis, J. (2012). Portfolio based language assessment. International Settlement Canada (INSCAN Special Issue), 18-20. Retrieved from

Pettis, J. (2014). Portfolio-Based Language Assessment (PBLA): Guide for teachers and programs. Ottawa, ON: Centre for Canadian Language Benchmarks.

Picot, G., & Sweetman, A. (2012). Making it in Canada: Immigration outcomes and policies (IRPP Study, 29). Montreal, QC: Institute for Research on Public Policy. Retrieved from

Pitts, J., Coles, C., & Thomas, P. (2001). Enhancing reliability in portfolio assessment: Shaping the portfolio. Medical Teacher, 23(4), 351-356.

Popham, W. J. (1987). The merits of measurement-driven instruction. The Phi Delta Kappan, 68(9), 679-682.

Reckase, M. D. (1995). Portfolio assessment: A theoretical estimate of score reliability. Educational Measurement: Issues and Practice, 14(1), 12-14.

Ripley, D. (2012). Implementing portfolio-based language assessment in LINC programs: Benefits and challenges. TESL Canada Journal,
30(1), 69-86.

Ruetten, M. K. (1994). Evaluating ESL students' performance on proficiency exams. Journal of Second Language Writing, 3(2), 85-96.

Schifter, C. C., & Carey, M. Addressing standardized testing through a novel assessment model. CELDA, 207.

Singh, D., & Blakely, G. (2012). Federal and provincial policy initiatives LINC and CLIC: Looking back, looking forward. International Settlement Canada (INSCAN Special Issue), 7-11. Retrieved from

Song, B., & August, B. (2001). Using portfolios to assess the writing of ESL students: A powerful alternative? Journal of Second Language Writing, 11(1), 49-72.

Supovitz, J., Macgowan, A., & Slattery, V. (1997). Assessing agreement: An examination of the interrater reliability of portfolio assessment in Rochester, New York. Educational Assessment, 4(3), 237-259.
Spalding, E., & Cummins, G. (1998). It was the best of times. It was a waste of time: University of Kentucky students' views of writing under KERA. Assessing Writing, 5(2), 167-199.

Taki, S., & Heidari, M. (2011). The effect of using portfolio-based writing assessment on language learning: The case of young Iranian EFL learners. English Language Teaching, 4(3), 192.

Thompson, R. M. (1990). Writing-proficiency tests and remediation: Some cultural differences. TESOL Quarterly, 24(1), 99-102.

Webb, N. M. (1993). Collaborative group versus individual assessment in mathematics: Processes and outcomes. Educational Assessment, 1(2), 131-152.

Webb, N. M. (1995). Group collaboration in assessment: Multiple objectives, processes, and outcomes. Educational Evaluation and Policy Analysis, 17(2), 239-261.

Weigle, S. C. (2002). Assessing writing. Cambridge, UK: Cambridge University Press.

Wiliam, D. (2010). What counts as evidence of educational achievement? The role of constructs in the pursuit of equity in assessment. Review of Research in Education, 34(1), 254-284.

Yayli, D. (2011). From genre awareness to cross-genre awareness: A study in an EFL context. Journal of English for Academic Purposes, 10(3), 121-129.

Yin, M. (2014). Portfolio assessment in the classroom. In A. J. Kunnan (Ed.), The companion to language assessment (1st ed., Vol. 2, pp. 659-676). Oxford: John Wiley & Sons.

