UBC Theses and Dissertations


The effects of a reading fluency and multisyllabic decoding strategy on reading skills Saqui, Sonja 2017



Full Text

THE EFFECTS OF A READING FLUENCY AND MULTISYLLABIC DECODING STRATEGY ON READING SKILLS

by

SONJA SAQUI

B.Ed., University of Victoria, 2004

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES (School Psychology)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

August 2017

© Sonja Saqui, 2017

Abstract

As students enter the upper-elementary grades, the instructional focus shifts from learning to read to reading to learn (Chall, 1983). Much of the meaning in upper-elementary science texts is carried by previously unseen multisyllabic words. Students are expected to demonstrate proficient reading and decoding skills so they are able to access the curriculum and extract meaning from print. However, this is problematic for a cohort of students who demonstrate proficiency with the alphabetic principle but lack flexible strategies and processes to employ when encountering a multisyllabic word. A delayed multiple baseline design was employed to determine if a multicomponent intervention combining three flexibly applied 'think aloud' multisyllabic word-decoding strategies with evidence-based fluency strategies was effective in improving the expository text reading skills of upper-elementary struggling readers. Gains in generalized reading fluency were observed on both expository and standardized reading passages. Minimal gains in multisyllabic word reading accuracy were observed on researcher-created word lists and on within-passage measures. Gains in broad reading skills were not consistently observed. Students viewed the intervention favourably and perceived gains in their reading skills. Implications and future directions for research are discussed.

Lay Summary

Some students struggle to learn to read words that are made up of many syllables. It can be hard for students to understand what they are reading if they cannot read long words. This study looked at how to teach students to read longer words. This study combined proven reading fluency strategies with a strategy in which the instructor modelled how to read longer words while reading science passages. We found that students' reading fluency (how quickly and accurately they read) improved when reading science passages. There was some evidence that their reading was improving on other passage types. There was some evidence that the students' ability to read longer words improved on science passages, but this was not detected on other types of passages. Overall, this study showed that, following the intervention, students could better access curricular science passages similar to those read in class.

Preface

Data collection for this project was approved by the University of British Columbia's Research Ethics Board (certificate no. H15-03051) and the Vancouver School Board's Research Committee. The study design and instructional procedures were collaboratively designed by the lead author and the research supervisor, Dr. Sterett Mercer, with input from Dr. Joanna Cannon. All written content of this thesis represents my original, unpublished work.
Table of Contents

Abstract
Lay Summary
Preface
Table of Contents
List of Tables
List of Figures
Acknowledgements
Chapter 1: Introduction
Chapter 2: Review of the Literature
  Development of Decoding
  Early Reading Intervention
  Intervention for Later Readers
  Characteristics of Later Readers
  Characteristics of Upper-Elementary Text
  Supporting Older Readers: Fluency
  Supporting Older Readers: Word Reading Strategies
  Skill Generalization
  Instructional Efficiency
  The Current Study
Chapter 3: Methodology
  Participants
  Materials
  Measures
  Procedure
  Data Analysis
Chapter 4: Results
  Screening Assessments
  Reading Fluency
  Multisyllabic Word Decoding
  Broad Reading Skills
  Social Validity
Chapter 5: Discussion
  Major Findings
  Limitations
  Implications
  Recommendations for Future Research
  Conclusions
References
Appendices
  Appendix A: Introduction of the Cue Card
  Appendix B: Multisyllabic Word Decoding
  Appendix C: Instructional Sequence and Script

List of Tables

Table 1. Student Demographics and Screening Results
Table 2. Means, Standard Deviations, and Slopes of WCPM on Initial Readings of the Instructional Passages across Baseline and Intervention
Table 3. Means, Standard Deviations, and Slopes of WCPM on the DIBELS Passages across Baseline and Intervention
Table 4. Mean Percentage of Multisyllabic Words Read Correctly within Passages across Baseline and Intervention Phases and Associated Slopes
Table 5. Means, Standard Deviations, and Slopes of Multisyllabic Word Decoding Accuracy Performance across Baseline and Intervention Phases
Table 6. Students' Performance on the Computer Adaptive Measure of Broad Reading Skills
Table 7. Students' Perceptions of the Intervention and Their Reading

List of Figures

Figure 1. Students' oral reading performance on first and third reads of the instructional passages across baseline and intervention
Figure 2. Students' generalized gains on DIBELS passages
Figure 3. Students' multisyllabic word decoding accuracy on the first and third reads of the instructional passages during the intervention phase
Figure 4. Multisyllabic word decoding accuracy on the DIBELS passages
Figure 5. Multisyllabic word decoding accuracy on the word probes

Acknowledgements

There are countless individuals who have somehow inspired, encouraged and motivated me to pursue graduate studies. I would like to thank Dr. Sterett Mercer for his steady support and guidance throughout this process. I would also like to recognize my cohort for their unfaltering encouragement and support over the last few years. My acknowledgments would not be complete without thanking my husband and two children for their love, support, and patience. Finally, thanks to the students who participated in this project and to the teachers who warmly welcomed me into their school and classrooms.
Chapter 1: Introduction

In today's information-rich environment, it is essential that all students be provided with the skills needed to become proficient readers to function optimally in society. Yet, learning to read is a complex process involving the synthesis of multiple skills and processes (Saunders, 2007) that many students have difficulty acquiring (Daly, Chafouleas, & Skinner, 2005; National Reading Panel [NRP], 2000). The Programme for the International Assessment of Adult Competencies (2014) found that 48% of Canadian adults between 16 and 65 years of age have inadequate literacy skills. Further, the National Assessment of Educational Progress (2013) reported that 30% of students in grade four struggle with reading grade-level texts. These data are particularly discouraging in that students who struggle with reading in fourth grade are likely to stay behind (Stanovich, 2000). As such, poor reading skills can lead to poor long-term educational outcomes (Ricketts, Sperring, & Nation, 2014) and less access to employment opportunities (Saunders, 2007). Aside from the educational and economic implications of inadequate literacy skills, profound social consequences emerge. Individuals who have poor literacy skills tend to be less politically engaged, more likely to report poor health, and more socially isolated (OECD, 2013). Supporting students who struggle with reading is a crucial endeavor when considering both the individual well-being of students and the well-being of our country at large.

In recent decades, how to intervene when foundational reading skills do not develop has been extensively studied. The National Reading Panel (NRP, 2000) outlined a combination of techniques that are effective for teaching children to read, including explicit instruction in phonemic awareness, systematic phonics instruction, methods to improve fluency, vocabulary instruction, and ways to enhance comprehension.

The ability to comprehend and derive meaning from text is the ultimate goal of reading and reading instruction (NRP, 2000) and is supported, like reading in general, by a range of skills and processes (Language and Reading Research Consortium, 2015). The first step in learning how to derive meaning from text is the ability to decode or recognize print. This process of transcoding print into meaningful units at the sound or word level has been labeled decoding, word recognition, word identification, or word reading.

A cohort of students begins to emerge with reading difficulties in the upper-elementary grades (Leach, Scarborough, & Rescorla, 2003) when text levels and curricular reading expectations increase. The current study aimed to provide insight into how to efficiently support this cohort of students. An objective of the current study was to provide knowledge of how to assist students in efficiently developing the flexible decoding skills needed for proficient reading fluency and reading comprehension. Specifically, the current study explored the relation between a multicomponent intervention, combining three flexibly applied 'think aloud' multisyllabic word-decoding strategies with evidence-based fluency strategies, and an improvement in the expository text reading skills of upper-elementary struggling readers.

Chapter 2: Review of the Literature

Development of Decoding

To better understand how to support upper-elementary readers, an understanding of the developmental trajectory of decoding is merited.
Proficient decoding skills typically develop throughout the elementary school years. Ehri (1995) identified four major stages in a reader’s developmental process of learning to identify words: pre-alphabetic, partial alphabetic, full alphabetic, and consolidated alphabetic. The pre-alphabetic stage, which is characterized by an emergent reader’s reliance on context to ‘read’ a word, is followed by the partial alphabetic stage during which an emergent reader will attend to some, often the first and/or final sounds of a word. Eventually, when a reader is able to attend and utilize all the sounds within words, they have progressed to the full alphabetic stage. Finally, in the consolidated stage, a reader is able to process groups of letters simultaneously in meaningful ‘chunks’. This allows for more efficient decoding of unknown words during reading tasks and builds upon fluency.  During the early elementary grades, the expectation is that emergent readers are learning to decode and master the alphabetic principle. Yet, the accurate identification, recognition or decoding of words, though essential, is alone an insufficient prerequisite of reading comprehension (Adams, 1990). In addition to accurate word identification, readers need to develop the ability to automatically recognize single words or word groups in order to support meaningful comprehension from print (Pikulski & Chard, 2005; Shinn & Good, 1992). This is consistent with cognitive theories of reading that propose that the majority of a beginning reader’s mental resources are absorbed by the initially laborious process of decoding a word sound by sound (LaBerge & Samuels, 1974; Samuels, 1979).  As a reader develops more automaticity with basic word reading skills, increasing cognitive energy can be allocated to   4 higher-level reading processes such as deriving meaning from the text read (LaBerge & Samuels, 1974). Early Reading Intervention Many studies have investigated how to best support emergent readers in the acquisition of word decoding skills. Specifically, when learning to read in English, ample opportunity to practice and recognize individual letters and letter groups in print is important due to the deep orthography of English in which the same letter does not always represent the same sound (Seymour, Aro, & Erskine, 2003). There have been numerous studies comparing English decoding acquisition to decoding acquisition in languages containing less orthographic inconsistencies and complexities (Frith, Wimmer, & Landerl, 1998; Goswami, Porpodas, & Wheelwrite, 1997; Wimmer, & Goswami, 1994). These studies consistently reported slower and less efficient decoding in individuals learning to read English than those learning to decode in languages with more consistent orthography.  To increase accuracy in decoding skills the NRP (2000) report expounds the importance of early, explicit systematic phonics instruction as an integrated component of a balanced reading program. Through a systematic approach to phonics instruction, students are explicitly taught the correspondence between the written and spoken sounds of individual letters and groups of letters. Phonics instruction, though necessary, must be combined with reading comprehension, reading fluency and phonemic awareness instruction (NRP, 2000) to support the development of proficient English decoding skills in primary school-aged students.  
Intervention for Later Readers

Although phonemic awareness, phonics, and fluency are effective instructional targets for early reading instruction or intervention, they do not suffice in preventing the manifestation of later reading difficulties as students advance to the upper-elementary years. While significant advances have been made with regard to reading instruction and intervention, reading research has tended to focus upon emergent and early reading skills and has typically ignored older school-age struggling readers (Betourne & Friel-Patti, 2003; Flynn, Zheng, & Swanson, 2012).

Encouragingly, researchers are increasingly exploring how to best support struggling readers in grades 4 to 12 (Scammacca, Roberts, Vaughn, & Stuebing, 2015). Despite increased research interest, interventions supporting older readers appear to be less effective than those interventions targeting emergent and early readers. For example, declining instructional benefits were found in a meta-analysis that explored reading interventions for students in grades 5 to 12 (Flynn et al., 2012). The majority of the effect sizes reported for reading interventions for the older cohort of students in the meta-analysis by Flynn et al. (2012) were smaller than the effect sizes reported for younger lower-elementary students by the NRP (2000). The results of the meta-analysis (Flynn et al., 2012) indicate that the reading strategies that are most effective in the lower elementary grades are not being met with the same level of success with older students. Additionally, a meta-analysis of reading interventions from 1980 to 2004 (Scammacca et al., 2007) found that the effect sizes for reading interventions provided to middle-grade students on both experimental and standardized measures were higher than those for high school students. Further, a recent update to Scammacca et al.'s (2007) meta-analysis extended the previous analysis to include studies from 2004 to 2011. Scammacca et al. (2015) found that the average effect sizes of recent reading interventions for older students, compared to those formerly reported in Scammacca et al. (2007), had declined a clinically significant amount. Although these results are initially discouraging, the authors (Scammacca et al., 2015) attribute the measured effect size decline to an increase in the use of standardized measures, increased research rigor, and an improvement in typical 'business as usual' comparison control classroom practice. As Lemons, Fuchs, and Gilbert (2014) expound, it is important to look beyond 'business as usual' comparisons in order to more comprehensively evaluate the outcome of an intervention and appreciate the changing nature of the contextual factors in which an intervention is evaluated. Regardless, the average effect sizes for upper-elementary reading interventions are still consistently smaller than those for younger struggling readers. Although the smaller effect sizes for reading interventions for older students, compared to early elementary students, reported by Flynn et al. (2012) and the results reported by Scammacca et al. (2007, 2015) strengthen the argument for early intervention, they do not negate the potential impact of a reading intervention for older students, as a small effect is better than no effect.
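The effect sizes discussed above are standardized mean differences. As a point of reference, the sketch below shows one common way such an effect size (Cohen's d with a pooled standard deviation) can be computed from group means and standard deviations; the group sizes, means, and standard deviations are hypothetical and are not drawn from any of the studies cited.

```python
from math import sqrt

def pooled_sd(sd_tx: float, n_tx: int, sd_ctrl: float, n_ctrl: int) -> float:
    """Pooled standard deviation of two independent groups."""
    return sqrt(((n_tx - 1) * sd_tx**2 + (n_ctrl - 1) * sd_ctrl**2) / (n_tx + n_ctrl - 2))

def cohens_d(mean_tx: float, sd_tx: float, n_tx: int,
             mean_ctrl: float, sd_ctrl: float, n_ctrl: int) -> float:
    """Standardized mean difference (Cohen's d) between treatment and control."""
    return (mean_tx - mean_ctrl) / pooled_sd(sd_tx, n_tx, sd_ctrl, n_ctrl)

# Hypothetical post-test word-reading scores (not from the cited studies).
print(round(cohens_d(mean_tx=78.0, sd_tx=12.0, n_tx=21,
                     mean_ctrl=71.5, sd_ctrl=13.0, n_ctrl=12), 2))  # ≈ 0.53
```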
There is increasing consensus that students struggling with reading well into the upper-elementary school years can be supported through targeted, systematic intervention to narrow the gap between their own reading achievement and that of their more typically reading peers (Scammacca et al., 2007). Although recent progress has been made, the findings that reading interventions appear less effective for older students support the need to further explore what reading interventions are most effective for this older cohort of struggling readers.  Characteristics of Later Readers   Students who present with reading difficulties in the upper-elementary years may struggle with word decoding or higher-order text comprehension skills, or both (Compton, Fuchs, Fuchs, Elleman, & Gilbert, 2008). In the upper-elementary years, as the focus of classroom instruction shifts from learning to read to reading to learn (Chall, 1983) students are increasingly required to use reading as a tool for learning and comprehending. Students first   7 identified with reading difficulties in the upper-elementary years are typically identified by emerging difficulties with comprehension (Leach, Scarborugh, & Rescorla, 2003). Typically, students who begin to struggle with reading comprehension in the upper-elementary years receive intervention targeting reading comprehension-related skills (Leach, Scarborugh, & Rescorla, 2003) such as oral language, memory, and direct comprehension strategy instruction.  Yet, not all cases of later emerging reading difficulties originate in comprehension deficits alone (Leach, Scarborugh, & Rescorla, 2003). Rather, in some cases, the reading related difficulty originates from increasingly crucial, advanced decoding skills (Leach, Scarborugh, & Rescorla, 2003). In fact, Perfetti’s Verbal Efficiency Theory (1985) highlights the need for automatic word reading and he purports, very much in line with LaBerge and Samuels (1974), that problems in comprehension are often linked to ineffective word identification processes. The Verbal Efficiency Theory supports what is consistently observed between proficient and poor comprehenders—poor comprehenders tend to be poor decoders (Perfetti, 1985).  Further, the ability to capably decode single-syllable words does not necessarily indicate that a student will seamlessly progress to proficiency with multisyllabic word decoding (Just & Carpenter, 1987). There is a subgroup of struggling readers who are able to decode single-syllable words, and recognize high-frequency irregular words, yet demonstrate persistent difficulty with decoding multisyllabic words and lack systematic strategies to employ when tackling multisyllabic words (Archer et al., 2003). These students’ difficulties with decoding may have been masked in the primary years by whole-word memorization strategies that become increasingly ineffective as text and word complexity increases with grade level (Leach, Scarborugh, & Rescorla, 2003). Thus, students who are still mastering decoding skills or those   8 who employed inefficient decoding strategies may have remained unidentified prior to confronting the word and text-level expectations of the upper-elementary years. 
Although the timing of the aforementioned curricular shift is suitable for students who have developed accurate and fluent word identification skills and reading comprehension strategies during the primary grades, the increased demand for higher-level reading skills in the intermediate grades is problematic for those students still struggling with word identification skills (Abbott & Berninger, 1999). Thus, this transition to the intermediate grades can be associated with a decrease in reading performance known as the “fourth grade slump” (Chall & Jacobs, 2003) which occurs when the curricular expectations shift and students are expected to demonstrate higher order reading skills such as comprehension and advanced decoding skills.  Characteristics of Upper-Elementary Expository Text The importance of flexible and efficient multisyllabic word decoding skills is further supported by the fact that much of the meaning in the content-rich curriculum of the upper-elementary grades, and beyond, is largely carried by multisyllabic words (Archer et al., 2003). Upper-elementary students must be taught systematic procedures for decoding longer words in order to prepare them for the thousands of unknown words they will encounter every year (Nagy & Anderson, 1984). For example, intermediate science and social studies texts are rife with multisyllabic words that carry much of the passages’ meaning. To illustrate read the following passage from a sixth grade British Columbia Science textbook: For many thousands of years, humans have used the resources in their environment for food and clothing and also to build shelters. For example, the Interior Salish people built partially buried pit houses that kept them comfortable during cold, snowy winters. (from BC Science 6 [2005] published by McGraw-Hill Ryerson)  If the multisyllabic words above (defined here as ≥ 3 syllables) are skipped or not accurately decoded, the reader’s ability to comprehend or learn from the above passage is   9 negatively impacted. Given that fluency is related to comprehension (Shinn & Good, 1992), the accurate and smooth decoding of multisyllabic words is a necessary skill for students in the upper-elementary grades when curricular expectations shift and students are expected to increasingly learn from texts read. Thus, the ability to tackle unfamiliar multisyllabic words is a crucial skill for subsequent academic success (Diliberto, Beattie, Flowers, & Algozzine, 2008) given the texts and words typically encountered in the upper-elementary years. Supporting Older Readers: Fluency Reading fluency, which has been described as the ability to decode words accurately and automatically, and with good expression (Daly, Chafouleas, & Skinner, 2005) is a critical target of reading instruction and a key component of overall reading achievement (NRP, 2000). Proficient reading fluency becomes increasingly critical for older students (Wexler, Vaughn, Edmonds, & Klein-Reutenbuch, 2008) due to its association with comprehension (Shinn & Good, 1992). The link between reading fluency and reading comprehension is illustrated by both Perfetti’s (1985) aforementioned Verbal Efficiency Theory and Samuel and LaBerge’s (1974) automatic information processing theory of reading acquisition. Strategies aimed at increasing overall reading fluency, such as repeated readings do improve the speed and accuracy at which lower and upper-elementary school students read (Meyer & Felton, 1999). 
Repeated readings, which are often included in reading fluency interventions, provide students with the opportunity to reread and practice difficult words and phrases (Samuels, 1979). Encouragingly, relatively easily implemented strategies such as repeated readings have been consistently confirmed to improve reading fluency (Polk & Miller, 1994; Rasinski, 1990) and subsequent comprehension (Therrien, 2004).    10 However, the relationship between reading fluency and comprehension appears to decrease in the secondary years (Edmonds et al., 2009) when background knowledge, word knowledge, and strategy use also begin to increasingly contribute to comprehension in the upper grades (Kintsch & Kintsch, 2004; Wexler et al., 2008). Although reading fluency is central to reading comprehension (Pikulski & Chard, 2005), fluency instruction alone may not be sufficient in improving comprehension (Wexler et al., 2008), especially in older students. Supporting Older Readers: Word Reading Strategies Word decoding strategies are an important instructional component to address with older students with reading difficulties who struggle at the word level (Scammacca et al., 2007). Upper-elementary school readers need skills to systematically and flexibly tackle unknown multisyllabic words within connected text (Lovett, Barron, & Frijters, 2013). There is a need to work on accurately and efficiently decoding multisyllabic words in upper-elementary grades and to support students struggling with those skills (Archer et al., 2003). There is some research surrounding how to best support this subgroup of upper-elementary students struggling with reading in gaining essential decoding skills. In a meta-analysis, Wanzek, Wexler, Vaughn, and Ciullo (2010) synthesized findings from 24 reading intervention studies for students with reading difficulties and disabilities in Grades 4 and 5. Of the 24 research studies presented, 10 studies related to the word reading or recognition skills of this cohort of students through either a single subject or experimental/quasi-experimental design. Overall, small to moderate effect sizes were reported on a range of reading outcomes for the word recognition or word reading interventions reported. Interestingly of the ten studies related to word reading in students in Grades 4 and 5 (Wanzek et al., 2010), eight focused upon sight word identification skills, while only two studies addressed the decoding of unknown words.   11 These struggling readers must be equipped with the skills required to tackle longer words so they can progress to more advanced phases of learning to read (Abbott & Berninger, 1999). Phonics and structural analysis, including syllabification and morphemic analysis, offers some instructional direction when teaching decoding. Phonics instruction. The principal aim of phonics instruction is to teach students to acquire alphabetic knowledge so they can proficiently link written letters (graphemes) to sounds (phonemes) so that students can apply this knowledge to read (and write) words (Ehri, 2003).  Although phonics instruction can facilitate reading acquisition in older students, as determined by a meta-analysis conducted by Ehri (2003), effect sizes are lower for older students indicating that phonics instruction exerts its greatest impact early. 
This is mirrored in the NRP (2000) report, which outlines that while phonic skills are essential in order to read words, alone they are not sufficient in ensuring the broader reading competence needed in the upper grades, and as such they should be integrated with fluency and text reading comprehension skills. In a single-subject research design study, Wright and Mullan (2006) explored the effects of a reading program (Phono-Graphix) upon the reading skills of ten 9- to 11-year-old school children previously identified as having a specific learning difficulty. Over an 8-month period students were systematically instructed, over an average of 24.3 hours, in phonics instruction that included both advanced code and multisyllabic word reading skills. The largest gains on standardized measures of phonological awareness, reading accuracy, and comprehension were made on the phonological measure. However, researchers noted that despite the observed improvement in phonological awareness on final reading tests, the number of errors classified as mispronunciations increased. This, researchers suggest, indicates that the students, although more attentive to the decoding of each word, likely gave insufficient attention to the passage's general meaning. Therefore, comprehension of the material read was likely not proficiently demonstrated, as the majority of the students' cognitive energy was exhausted by the decoding of individual unknown words. Again, this is consistent with theories surrounding the cognitive demands of reading (LaBerge & Samuels, 1974), which highlight the freeing of cognitive resources from foundational reading skills and redirecting them to higher-order reading processes such as extracting meaning from text (NRP, 2000). Although students in the latter study (Wright & Mullan, 2006) made gains in phonological awareness, they were, when measured following the intervention, still allocating significant cognitive energy towards the decoding process. Wright and Mullan (2006) suggest that the inclusion of a more balanced approach to reading instruction may have been needed.

Thus, the proposed study will incorporate a phonics-based decoding strategy (decoding of an unknown word from left to right) only as appropriate. Left-to-right decoding is an inefficient strategy to employ when encountering a novel word with many phonemes (Shanker & Ekwall, 1998). As such, this decoding approach will be combined with evidence-based fluency strategies.

Structural analysis. Multisyllabic words can be complex and are difficult to read without analyzing their internal word structure (Abbott & Berninger, 1999). The process in which the syllabic and morphemic structure of a multisyllabic word is employed to assist in decoding tasks is referred to as structural analysis. Structural analysis serves a similar function for upper-elementary school readers as phonics does for primary school readers in that students are breaking words into more manageable pieces. When a proficient reader encounters an unknown word, they break the word apart into manageable units (Adams, 1990). Through structural analysis, students are taught to break down a longer word into meaningful chunks that can be more efficiently blended so the word can be more fluently read. Through structural analysis approaches, students are taught both to analyze the syllables and stress patterns of a word and to examine words through their morphemes or meaning units (Abbott & Berninger, 1999).
Students are taught to count and analyze syllables and to look for prefixes, suffixes, and known words embedded within an unknown multisyllabic word. Regrettably, the structural analysis approach is not typically taught in the upper-primary and intermediate grades (Abbott & Berninger, 1999), when students are increasingly expected to fluently decode multisyllabic words. However, in this study both syllabication and the utility of morphological analysis will be flexibly modeled as two viable multisyllabic word-decoding strategies.

Syllabication. Syllabication, or syllable analysis, the process of breaking spoken or written words into syllables, will be one of the strategies modeled. Although not often taught, partially due to the deep orthography of the English language, syllabication can be an effective strategy to employ when teaching students how to decode unknown multisyllabic words (Bhattacharya & Ehri, 2004). Through randomly assigning students to one of three instructional conditions (syllabication, whole word reading, instruction as usual), Bhattacharya and Ehri (2004) found that teaching students how to chunk multisyllabic words through a structural analysis approach had a large standardized mean difference effect size (ES = 1.40) upon the readers' novel word decoding ability. The encouraging effects of syllable training, the researchers report, may be due to the fact that students gain proficiency in recognizing and reading common syllables in words and blending those units into longer words (Bhattacharya & Ehri, 2004). Additionally, this focus on syllables may have encouraged students to pay more attention to the syllabic units across the word, not just at the beginning or the end of the word (Bhattacharya & Ehri, 2004). This study supports the importance of teaching syllable analysis to older students who struggle with decoding multisyllabic words.

A study that compared a similar structural analysis approach to typical reading instruction yielded a moderate effect size on word reading (ES = 0.43-0.48; Penney, 2002). High-school students (n = 21) received eighteen 1-hour, one-to-one tutoring sessions while a control group of 12 students received instruction as usual. During the intervention, students read a text and their reading errors were noted. After the first reading, the students performed word analysis drills on the words that they read incorrectly, which included three or more words with the same orthographic pattern as that seen in the error word. The students then re-read the original passage. The promising effects of this intervention may be due to the fact that repeatedly drilling words with similar orthographic patterns alerts students to the utility of the regularities [and variations] in spelling and pronunciation (Penney, 2002) and may further promote the efficient retrieval of meaningful and useful word parts. However, it should be noted that although students may be taught five to six specific rules to assist them in breaking words into syllables, this can be difficult for students who are struggling with reading (Shanker & Ekwall, 1998), and syllabication rules often have exceptions and can be difficult to remember (Archer, Gleason, & Vachon, 2003; Shanker & Ekwall, 1998). Thus, this study will incorporate flexible multisyllabic decoding strategies instead of relying upon rigid rules.
Morphemic analysis. Further, instruction in syllabification alone may not suffice, as English words are morphophonemic, which means that words can be simultaneously represented by sounds (phonemes) and units of meaning (morphemes; Carlisle & Stone, 2005; Tinge & Binder, 2015). This means that words, or even parts of words, can sound or look similar but they may not have a shared meaning (Carlisle, 2000). Although word reading in the early elementary years is predicted by phonological awareness, by the time a student is in fifth grade, their decoding ability is better predicted by their morphological skills (Mann & Singson, 2003; Tinge & Binder, 2015). Morphemes are the smallest units of meaning and include prefixes, suffixes, base words, and grammatical inflections (e.g., 's' and 'es' to denote plurality). The recognition and understanding that word parts carry meaning, which is known as morphological awareness, supports the process of learning to decode unknown words (Carlisle, 2000; Nagy & Anderson, 1984). Further, awareness of the morphemic structure of a word becomes increasingly important as students grapple with not only the decoding of unknown words but also the subsequent comprehension of academic texts in the upper elementary years and beyond (Carlisle & Stone, 2005). In fact, Carlisle (2000) found that 43% of the variance in reading comprehension in third grade and 55% of the variance in reading comprehension in fifth grade is accounted for by morphological awareness. Other researchers (Nation, 2004; Tong, Deacon, Kirby, Cain, & Parrila, 2011) similarly report that morphological awareness contributes to some of the variance in text comprehension skills.

The importance of morphological awareness, and its integral role in proficient decoding and subsequent comprehension, is supported by the Lexical Quality Hypothesis (Perfetti, 2007; Perfetti & Hart, 2002). The Lexical Quality Hypothesis (Perfetti, 2007; Perfetti & Hart, 2002) emphasizes knowledge and the importance of efficiently retrieving "word identities that provide the meanings the reader needs in a given context" (Perfetti, 2007, p. 359). Individuals vary in the number of words they can efficiently, reliably, and coherently retrieve through the integration of phonological, orthographic, and semantic-syntactic codes (Perfetti & Hart, 2002). According to this theory, poor readers likely have fewer high-quality word representations that assist in the accurate integration of phonological, orthographic, semantic, and morpho-syntactic information (Gilbert et al., 2014) compared to individuals who have many high-quality word representations. In contrast, when a word has a high-quality lexical representation for an individual, retrieval during reading tasks tends to be instantaneous and dependent upon the word's orthographic form (Gilbert et al., 2014). The implications of this theory rest in the fact that skilled readers have a better chance of efficiently retrieving words and efficiently decoding unknown words due to their higher quality lexical representations.

Through a crossed random-effects item response model, Goodwin, Gilbert, and Cho (2013) explored the role of morphological knowledge and awareness, such as being able to read a root word, upon adolescents' ability to read more complex words typically encountered in subject area reading tasks. The researchers (Goodwin et al., 2013) found that adolescent readers were significantly more likely to be able to read a morphologically complex word if they read the root word accurately.
Thus, interventions that aim to improve older readers' word decoding skills may benefit from "emphasizing root word knowledge and parsing challenging words into morphemic units" (Goodwin et al., 2013, p. 54). Following the identification of the root word, students should be taught to identify prefixes and suffixes (Taylor, Harris, Pearson, & Garcia, 1995) and attempt variable vowel pronunciations until a logical word is read. Similarly, Shanker and Ekwall (1998) recommend looking for suffixes and prefixes and attempting different vowel sounds and pronunciations until a logical word is obtained, or looking for alternative decodable 'chunks' if the first 'chunks' do not result in a meaningful word. Thus, morphological word analysis through the identification of the word's root word, other morphemic units, and variable vowel pronunciation will be flexibly modeled as one of the multisyllabic word decoding strategies in this study.

The usefulness of repeated exposure during morphemic analysis tasks is suggested by a recent study (Good, Lance, & Rainey, 2015). Researchers (Good et al., 2015) compared the effectiveness of an explicit morphological awareness intervention upon the literacy skills of 16 third-grade students with language impairment to a 'business as usual' comparison group. Although a significant difference was not noted between the control and the experimental groups' reading skills, both groups improved their ability to read morphologically complex words (Good et al., 2015). Researchers hypothesized that the lack of difference in decoding skills between the two groups may have occurred due to the fact that both groups were repeatedly exposed to the longer words (Good et al., 2015). Thus, participants in the proposed study will reread passages, thereby having multiple exposures to the multisyllabic words, and will be provided with multiple opportunities to practice the reading of multisyllabic words within meaningful connected text.

Skill Generalization

Children who struggle with reading often exhibit difficulty extending a learned reading skill to uninstructed content (Lovett et al., 1990). Thus, the transfer of reading skills learned within the context of a reading intervention to situations beyond the intervention should be included in effective remediation (Lovett et al., 1994). If the student cannot generalize training gains made during the intervention, the intervention has not proven its social value. One way to foster transfer of learning is to include strategy instruction and provide opportunities to practice the utilization of a range of effective strategies and the metacognitive monitoring of those strategies (Lovett et al., 1994).

In a study by Lovett et al. (1994), 62 children between 7 and 13 years old who demonstrated specific underachievement in reading were assigned to one of two comprehensive instructional programs. One of the programs provided direct instruction in letter-sound correspondence and trained phonological analysis and blending skills while the other provided training in four strategy-based decoding approaches. Although both the phonological and strategy-based instructional programs yielded large positive effects, and transfer to a range of measures, the phonological program resulted in greater generalized gains in the phonological domain (e.g., left-to-right decoding of regular words) while the strategy-based program resulted in broader transfer to real words and flexible strategy use.
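To make "flexible strategy use" more concrete, the sketch below illustrates, in highly simplified form, how several word-attack strategies might be tried in sequence on an unknown multisyllabic word: peel off a recognizable prefix and suffix first, and fall back to crude syllable chunking if no familiar root is exposed. It is an illustration only; the affix lists, the tiny set of known root words, and the vowel-group chunker are assumptions made for the example, not the procedures used by Lovett et al. or in the current study.

```python
import re

# Hypothetical affix and word lists, for illustration only.
PREFIXES = ("un", "re", "im", "en", "dis")
SUFFIXES = ("ment", "tion", "able", "ing", "ed")
KNOWN_WORDS = {"improve", "engage", "prove", "gage"}

def strip_affixes(word: str):
    """Strategy: peel off a known prefix and/or suffix to expose a possible root."""
    prefix = next((p for p in PREFIXES if word.startswith(p)), "")
    suffix = next((s for s in SUFFIXES if word.endswith(s)), "")
    root = word[len(prefix):len(word) - len(suffix) or None]
    return prefix, root, suffix

def rough_syllables(word: str):
    """Strategy: chunk the word around vowel groups (a crude stand-in for syllabication)."""
    return re.findall(r"[^aeiouy]*[aeiouy]+(?:[^aeiouy]*$)?", word)

def attack(word: str):
    """Try affix stripping first; if the root is not recognized, fall back to syllable chunks."""
    prefix, root, suffix = strip_affixes(word)
    if root in KNOWN_WORDS:
        return [part for part in (prefix, root, suffix) if part]
    return rough_syllables(word)

print(attack("improvement"))  # ['im', 'prove', 'ment']
print(attack("engagement"))   # ['en', 'gage', 'ment']
print(attack("partially"))    # crude vowel-group chunks: ['pa', 'rtia', 'lly']
```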
More recently, through a quasi-experimental design Lovett, Lacerenza, De Palma, and Frijters (2012), evaluated the efficacy of PHAST PACES, an instructional reading intervention designed for struggling high-school readers. PHAST PACES offers students explicit strategy instruction in word identification and text comprehension processes (Lovett et al., 2012). Students are taught how to flexibly apply and monitor five different decoding strategies and are taught comprehension and monitoring strategies.  The 268 students who received the intervention for 60-70 hours achieved significant gains, with an average effect size of 0.67 across all measures (Lovett et al., 2012). Medium effect sizes (d = 0.57) for complex multisyllabic word decoding were reported (Lovett et al., 2012), which supports the utility of flexible decoding strategy instruction; even for older readers. The latter reported gains in generalized word-decoding skills to unknown real words and the flexible use of strategies is encouraging and is consistent with the NRP (2000), which highlights the need for instruction in a range of specific reading strategies. This study will include a range of efficiently modeled flexible decoding strategies and ample opportunity to practice and monitor these strategies during the reading of connected text.  Instructional Efficiency   To serve more students with finite resources, the importance of effective instruction that has been shown to improve student achievement has increasingly come to the forefront of   19 educational discourse (Cates, Burns, & Joseph, 2010). Further, the notion of instructional efficiency, which considers both the learning rate of the student in response to an intervention (essentially the intervention’s effectiveness) and the time spent on the intervention, merits careful consideration when recommending school-based interventions (Cates et al., 2010; Erchul & Martens, 2010). Thus, we need to be able to identify effective strategies for targeted small group instruction, and efficient intervention at the individual level. However, more research is needed to establish what instructional methods for learning to decode words in the context of connected text are not only effective but also efficient (Joseph & Schisler, 2007). Thus, in this study, each student’s multisyllabic word reading errors will be tracked and subsequent multisyllabic word reading instruction will be based upon each student’s specific error patterns. The rationale for this is that each participant’s error patterns will be targeted with the intention of maximizing the instructional efficiency of this intervention at the individual level. The Current Study This study expands on previous research on effective reading interventions for students who struggle with reading in the upper elementary grades. The current study aims to efficiently support older struggling readers in the strategic use of orthographic, phonological, semantic and morpho-syntactic information and the integration of these decoding processes to flexibly decode unknown words meaningfully and to increase the quality of their lexical representations and their broad reading skills. Specifically, in this study, an efficient and easily employed error-correction component, which explicitly modeled three word-decoding strategies, is implemented in response to participants’ specific multisyllabic word reading errors. 
Further, the targeted multisyllabic word reading component is implemented within the context of evidence-based fluency instruction to promote generalization of student gains in reading skills by providing multiple examples and frequent opportunities to practice a range of skills across a range of settings (Stokes & Baer, 1977), which, in this case, is a range of decoding strategies across a range of unfamiliar reading passages.

To summarize, the three multisyllabic word reading approaches are: (1) left-to-right phonetic decoding and blending (only when appropriate); (2) syllabication with variable vowel pronunciation; and (3) morphemic analysis of the root word, prefix, and suffix (as appropriate) while attempting variable vowel pronunciations. Students' gains in reading fluency, multisyllabic word decoding, and overall broad reading skills are monitored. Students also complete a social validity measure to gauge the acceptability of the intervention and their perceived reading skill gains.

The following hypotheses are evaluated:
1. The instructional procedures will result in direct, within-session gains in reading fluency and multisyllabic word decoding accuracy on the instructional passages.
2. The instructional procedures will result in generalized gains in reading fluency on untaught instructional passages and standardized passages.
3. The instructional procedures will result in generalized gains in multisyllabic word decoding accuracy when words are presented in lists and within passages.
4. The instructional procedures will result in improvements in participants' broad reading skills.
5. Students will perceive the instructional procedures favorably and will perceive that their reading skills improved following the 8-week intervention sequence.

Chapter 3: Methodology

Participants

Three grade 4 students and one grade 5 student who struggled with reading, had no diagnosed learning disabilities, were not receiving other reading interventions, and attended a public elementary school in Vancouver, British Columbia, were selected to participate in this study. Teachers initially referred a number of students for this study who were perceived to demonstrate social and emotional behaviors conducive to participation in individual academic interventions (e.g., they would be comfortable working with an initially unfamiliar adult). Selected students were required to demonstrate 70% accuracy on multisyllabic word reading, to demonstrate below-grade-level reading fluency, and to perform no more than 1.5 standard deviations below the mean on a brief test of cognitive functioning. Students' names have been replaced with pseudonyms.

Materials

Initially, a set of 22 instructional passages was selected from BC Science textbooks (Chapman, Dawkins & Herrin, 2005; Doyle & Bowman, 2006; Mason & Walsh, 2005) for this study. To help obtain an appropriate instructional fit, the 22 passages were leveled through the use of Lexile® Analyzer (MetaMetrics, 2016). Lexile® Analyzer (MetaMetrics, 2016) is an easily accessible online text-leveling program that analyzes passage complexity and readability through a Lexile quotient. In the current study, the Lexile quotient was used to help ensure that the instructional passages created were at an appropriate level. The 22 passages had an average Lexile quotient of 844 (range = 700-950), which falls within the Grade 4 and 5 Lexile bands (CCSS Text Measures, 2012).
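Beyond the Lexile leveling, the per-passage descriptives reported below (mean word count and mean number of words of three or more syllables) are simple counts. The sketch that follows shows one way such counts could be tallied for a passage; the vowel-group syllable estimator is a rough heuristic assumed for illustration and is not the leveling procedure used in the study (the Lexile quotient is a separate, proprietary metric).

```python
import re

def estimate_syllables(word: str) -> int:
    """Rough syllable estimate: count groups of consecutive vowels (heuristic only)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def passage_stats(text: str, min_syllables: int = 3):
    """Return (total word count, count of words with >= min_syllables estimated syllables)."""
    words = re.findall(r"[A-Za-z']+", text)
    multisyllabic = [w for w in words if estimate_syllables(w) >= min_syllables]
    return len(words), len(multisyllabic)

# Excerpt quoted in Chapter 2 from BC Science 6 (2005).
sample = ("For many thousands of years, humans have used the resources in their "
          "environment for food and clothing and also to build shelters.")
print(passage_stats(sample))  # (22, 2) under this heuristic
```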
The passages averaged 155 words in length (range = 119-191) and contained an average of 22 multisyllabic words of ≥ 3 syllables per passage (range = 12-28). During the 2nd baseline session, the first student read the instructional passage with a fluency rate of 148 WCPM and only 2 multisyllabic word reading errors, compared to the 1st baseline session passage, which was read at a rate of 94 WCPM with 8 multisyllabic word reading errors. Thus, the researcher re-evaluated the consistency of the passages. Based on the metrics above, 16 of the least challenging original passages were expanded. These 16 expanded passages had an average Lexile quotient of 879 (range = 770-990), which also falls within the Grade 4 and 5 Lexile bands (CCSS Text Measures, 2012). The expanded passages averaged 182 words in length (range = 163-225) and contained an average of 24 multisyllabic words of ≥ 3 syllables per passage (range = 13-36). Specifically, following the 2nd baseline session the first student received the expanded passages throughout all subsequent baseline and intervention sessions. The second student read the expanded passages following the 1st baseline session. The third student read the expanded passages throughout intervention and baseline. Based on the lower reading fluency levels determined during screening, the fourth student started baseline on the original passage set. The original passage set provided an appropriate instructional fit, and this student thus continued reading the original passages throughout baseline and intervention.

Measures

Measures of multisyllabic word decoding, oral reading fluency, and cognitive functioning were administered to most appropriately select students to participate in this study. The Dynamic Indicators of Basic Early Literacy Skills, 7th Edition (DIBELS Next; Good & Kaminski, 2011) passages, researcher-created expository passages, and researcher-created multisyllabic word probes were administered throughout the phases of this study to ascertain students' gains in reading fluency and multisyllabic word reading skills. A social validity measure was employed following the study to gauge students' perceptions regarding the appropriateness of the intervention and the perceived impact of the intervention upon their reading skills.

Standardized multisyllabic decoding skills. The Consortium On Reading Excellence Phonics Survey, 2nd ed. (CORE-PS; Park, Benedict, & Brownell, 2014) is a criterion-referenced measure of students' ability to read real and pseudo-words that takes roughly 10-15 minutes to administer per student. The CORE-PS is, according to its developers, capable of informing placement decisions for targeted decoding intervention groupings (CORE, 2008; Reutzel, Brandt, Fawson, & Jones, 2014). A comparison of the content overlap between the Reading Foundation Skills-Phoneme-Grapheme Correspondence section of the Common Core State Standards (NGA & CCSSO, 2010) and the content of the CORE-PS revealed a 97% overlap. Researchers (Park et al., 2014) determined that the CORE-PS significantly predicted students' reading skills on a norm-referenced reading assessment one year later. The internal consistency reliability of the multisyllabic words subtest (subtest L) is high, with a Cronbach's alpha of .97 (Reutzel et al., 2014). In sum, these results help support the use of subtest L of the CORE-PS (Park, Benedict, & Brownell, 2014) as an acceptable measure of a student's multisyllabic decoding skills.

Cognitive functioning.
The WJ-IV BIA is an abbreviated, standardized, norm-referenced and valid screening measure of overall cognitive functioning (LaForte, McGrew, & Schrank, 2014). The WJ IV-BIA is comprised of three subtests which each measure a different cognitive ability from which an overall IQ composite can be derived. One subtest is a measure of verbal intelligence, one is a measure of nonverbal intelligence, and one is a measure of short-term working memory.  These three subtests are closely correlated with measures of overall intelligence with a reported correlation coefficient of 0.86 with the widely used Wechsler Intelligence Scale for Children Fourth Edition (LaForte, McGrew, & Schrank, 2014). The strong magnitude of this correlation supports the use of the WJ IV-BIA as a valid screening measure of   24 general intelligence (LaForte, McGrew, & Schrank, 2014). The WJ-IV is appropriate for individuals aged 2:0 to 90:0+ and takes approximately 10-15 minutes to administer (LaForte, McGrew, & Schrank, 2014). Reading fluency. The DIBELS Next Oral Reading Fluency (DORF) assessments (Good & Kaminski, 2011) are efficient, individually administered assessments of reading fluency. Students were asked to read grade-level passages aloud for one minute while an assessor listened to the student reading and noted errors such as substitutions, mispronunciations and hesitations. DORF passages are a reliable and valid standardized tool with which to measure students’ accurate and fluent reading of connected text (Good, Kaminski, Dewey, Walling, Powell-Smith, & Latimer, 2013). Alternate form, test-retest, and inter-rater reliability is consistently high with fourth and fifth grade DORF reliability coefficients ranging from 0.76 - 0.99 (Good et al., 2013). Criterion-related validity coefficients for the fourth and fifth grade DORF single passages with the National Assessment of Educational Progress 4th Grade Passage range from .89-.96 (Good et al., 2013). DIBELS passage sets include expository passages. As many expository passages as possible per grade level were utilized. The DIBELS expository passages were selected and evenly distributed across the progress monitoring sessions. Where necessary, narrative passages supplemented the expository DIBELS passage probes.  The instructional passages also were employed as a proximal measure of reading fluency. To directly measure reading fluency in each session, students were asked to read one expository instructional passage aloud for 1 minute during which time the assessor listened to the student reading and noted overall errors such as substitutions, mispronunciations, and hesitations. To provide a direct measure of reading fluency the number of words read correctly was divided by the total number or words read per minute. The above procedure was repeated twice more. The   25 students’ performance on the final reading of the expository passage was then compared to their first reading to provide a measure of within session gains in reading fluency. Within passage multisyllabic word decoding.  Using the DIBELS passages, the number of multisyllabic words the student read incorrectly during the 1-minute reading was divided by the total number of multisyllabic words present in the section read during the first 1-minute read. The above procedure was repeated twice more. 
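The two within-session measures just described reduce to simple ratios: words read correctly per minute for fluency, and the proportion of multisyllabic words read correctly within the section read for decoding accuracy. The sketch below shows those calculations; the session data and variable names are hypothetical and are only meant to illustrate the arithmetic, not the scoring rules of the thesis.

```python
def wcpm(words_read: int, errors: int, minutes: float = 1.0) -> float:
    """Words correct per minute for a timed oral reading."""
    return (words_read - errors) / minutes

def multisyllabic_accuracy(ms_errors: int, ms_total: int) -> float:
    """Proportion of multisyllabic words read correctly in the section read."""
    return (ms_total - ms_errors) / ms_total

# Hypothetical first and third reads of one instructional passage.
first_read = {"words_read": 102, "errors": 8, "ms_errors": 5, "ms_total": 14}
third_read = {"words_read": 118, "errors": 3, "ms_errors": 2, "ms_total": 16}

for label, read in (("first", first_read), ("third", third_read)):
    print(label,
          wcpm(read["words_read"], read["errors"]),
          round(multisyllabic_accuracy(read["ms_errors"], read["ms_total"]), 2))
# first 94.0 0.64
# third 115.0 0.88
```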
Multisyllabic word decoding. A researcher-designed multisyllabic word probe list was administered twice weekly to monitor student gains in distinct multisyllabic word reading skills. All participants were administered a list of 10 multisyllabic words derived from a researcher-created word bank. This measure of isolated multisyllabic word reading skills was included to precisely gauge participants' gains in this specific reading skill. The lists of 10 words were randomly assigned to sessions. The word lists had orthographic patterns and syllable lengths of equivalent difficulty level. For example, the words improvement and engagement both have a two-letter phonetically decodable first syllable (im, en), an identical suffix (ment), and a root word with an 'e'-controlled long vowel (prove, gage). Thus, simulating similar task demands at each probing occasion was considered and addressed in this study. Specifically, the number of whole words read correctly and the number of correctly read/pronounced syllables within each 10-word probe were measured.

Broad reading skills. The Formative Assessment System for Teachers (FAST) Adaptive Reading (aReading) is an individually administered computer-adaptive test that offers efficient, reliable, and valid insight into students' reading achievement and comprehension from kindergarten to twelfth grade (Christ, 2014). aReading is a 30-item assessment with a typical administration time of 15 minutes. Items target a range of reading domains including concepts of print, phonological awareness, phonics, vocabulary, and comprehension, and, for students at or above a grade 6 skill level, orthography, morphology, vocabulary, and comprehension (Christ, 2014). Students were presented with questions of varying difficulty, and question presentation was dependent upon students' previous response pattern. This form of computer-adaptive response branching offers advantages over more traditional pen-and-paper tests in that testing times are minimized and test reliability is improved (Christ, 2011). aReading has reliability coefficients that range from 0.75 - 0.95 and validity coefficients ranging from 0.78 - 0.80 (Christ, 2014). Additionally, aReading is rated as a reliable and valid measurement tool by the National Center on Response to Intervention (2010), which offers further evidence for the utility of aReading as a measure of students' broad reading skills.

Social validity. A measure of social validity, which was modeled after the Children's Intervention Rating Profile (CIRP; Witt & Elliott, 1985), was completed by the participants following the final intervention session. As outlined by Lindo and Elleman (2010), the social validity of an intervention should include the perceived value and acceptability of its goals, procedures, and outcomes.
Thus, the 6-point Likert-type scale ranged from 1 (strongly disagree) to 6 (strongly agree) and included the following statements: (a) "I think that the way I am learning to read long words is fun"; (b) "I think that what I am learning to do is important"; (c) "I think that the time this intervention takes is too long" (a reverse-scored item); (d) "I think that these activities are helping me to get better at reading longer words"; (e) "I am getting better at reading quickly and accurately"; (f) "I am getting better at understanding what I am reading"; and (g) "I think that the way I am learning to read longer words will work for other students."

Procedures

Experimental design. A delayed multiple baseline design (Heward, 1978) was employed to examine the effects of combining three flexibly applied, modeled multisyllabic word-decoding strategies with evidence-based fluency strategies on the text reading skills of four upper-elementary school students. The delayed multiple baseline design included a staggered commencement of baseline across participants, instead of continuous and concurrent baseline assessment, and consequently each participant's intervention phase began in a staggered or delayed fashion (Heward, 1978).

The delayed multiple baseline design is particularly useful when a reversal design is undesirable (Heward, 1978) or unachievable. In the current study, the dependent variables of reading fluency, multisyllabic word reading, and broad reading skills (including reading comprehension) were non-reversible. As such, a multiple baseline design was a suitable design. The inclusion of three participants in this study ensured the possibility of a minimum standard of three functional effects at three different times (Horner et al., 2005), which is required for internal validity (Kratochwill et al., 2013). The fourth participant was included to help control against the possibility of attrition.

In accordance with single-case design standards (Kratochwill et al., 2013), the primary dependent variables were monitored during both baseline and intervention phases of this study. However, given the finite number of available DIBELS probes, employing a delayed multiple baseline design ensured a sufficient number of baseline and intervention probes were available across all participants and phases of the study. Six baseline data points were collected over three weeks across participants. This exceeded the five data points recommended by Kratochwill et al. (2013). Additionally, as outlined by Kratochwill et al. (2013), participant reactivity, or improvement in reading skills through exposure to the repeated measure of oral reading fluency, could occur. Thus, reducing the number of probe sessions during baseline across participants was predicted to minimize the potential for participant reactivity prior to intervention implementation. Additionally, because academic data can be highly variable, the phase change occurred on a fixed schedule across participants, following six baseline sessions. In the current study, a response-guided (i.e., longer) baseline was not viewed as ideal for participants because, ethically, the researchers wanted to begin providing intervention as promptly as possible without compromising the credibility of the design. The six baseline data points met design criteria recommendations (Kratochwill et al., 2013) and respected participants' best interests.
Once baseline data were collected, the 8-week intervention phase commenced for the first participant at the beginning of the 4th week of the study. Phase change, or introduction of the intervention, occurred in the 5th and 6th weeks of the study for the second and third participants, respectively. The fourth participant was absent during the initial baseline weeks, entered baseline during the 6th week of the study, and commenced intervention during the 9th week of the study.

Recruitment and screening. A total of 11 students were initially referred by teachers to participate in the current study. Following the initial referral and prior to any student involvement in the study, parents of the referred students provided informed consent. Parents were informed that if their child was not selected for the study following the screening procedures, they could request a sheet of reading support recommendations. The researcher also informed parents that they could withdraw consent on behalf of their child at any point. Student informed assent was obtained from all participants. To assess students' decoding skills, those students initially referred for the intervention by the classroom teacher were screened with the CORE-PS (Park, Benedict, & Brownell, 2014). Four of these students did not meet inclusion criteria because they read with greater than 70% accuracy on the CORE-PS multisyllabic word decoding subtest. To determine students' optimal progress monitoring level and to assess their reading level, the remaining seven students were screened with three grade-level DIBELS Next benchmarking probes (Good & Kaminski, 2011). Students began by reading benchmark passages matched to their grade level. Students were asked to read each passage for one minute, during which student WCPM was recorded, and a median score across the three passages was calculated. If a student's median score did not fall above the fall benchmark for their expected grade level, the procedure was repeated with benchmark passages for the preceding grade level until student performance fell between the fall and spring benchmark levels for that grade level.
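To clarify the survey-level placement logic, the sketch below gives one possible reading of the step-down procedure in Python. The benchmark values and the helper function are hypothetical placeholders, not the published DIBELS Next cut scores, and the stopping rule is simplified to "median at or above the fall benchmark for the grade being tested."

from statistics import median

# Placeholder (fall, spring) WCPM benchmarks per grade; the published
# DIBELS Next cut scores would be substituted in practice.
BENCHMARKS = {3: (70, 100), 4: (90, 115), 5: (110, 130)}

def survey_level_assessment(read_three_passages, enrolled_grade, lowest_grade=1):
    """Step down grade levels until the student's median WCPM reaches the
    fall benchmark for the grade-level passages being read.

    read_three_passages(grade) is assumed to return three WCPM scores from
    benchmark passages at that grade level.
    """
    grade = enrolled_grade
    med = None
    while grade >= lowest_grade:
        med = median(read_three_passages(grade))
        fall, _spring = BENCHMARKS.get(grade, (0, None))
        if med >= fall:      # performance meets the fall benchmark: stop here
            return grade, med
        grade -= 1           # otherwise repeat with the preceding grade's passages
    return lowest_grade, med

# Hypothetical student: grade 4 scores fall below the grade 4 fall benchmark,
# but grade 3 scores fall within the grade 3 benchmark band.
scores = {4: [62, 58, 60], 3: [78, 74, 80]}
level, med = survey_level_assessment(lambda grade: scores[grade], enrolled_grade=4)
print(f"Instructional reading level: grade {level} (median = {med} WCPM)")

Under this kind of placement, a student's instructional reading level could then be compared against the fluency criterion of one to two levels below grade-level expectations, described next.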
Students selected for inclusion were required to demonstrate reading fluency one to two levels below grade-level expectations. Although this range captures a variety of reading proficiency levels, it represents students who struggle with grade-level reading and who were predicted to respond to this intervention. Three students read below grade-level expectations on the DIBELS Next passages and were selected to participate in the current study. The other four students read at their grade level or above, so the student with the highest number of multisyllabic word decoding errors was selected to participate in this study. The four selected students were Lisa, Oscar, Katy, and Iris. Although Katy was reading at her expected level, she struggled so significantly with multisyllabic word-reading accuracy (as highlighted above) that she was selected to participate, as she was expected to benefit from the intervention. Finally, the WJ-COG-BIA was administered as a brief measure of cognitive functioning to confirm average cognitive functioning.

Assessment during baseline and intervention. Before the first baseline session, selected students completed the aReading measure of broad reading skills. Additionally, selected students were measured on the primary dependent variables of reading fluency and multisyllabic word reading accuracy twice per week during the baseline. To monitor reading fluency, students read a novel expository instructional passage and an unfamiliar DIBELS Next passage at each session during the baseline phase. Multisyllabic word reading accuracy was measured via the 10-item multisyllabic word reading probe, and within-text multisyllabic word reading was determined via the multisyllabic word reading accuracy demonstrated while reading the instructional and DIBELS passages. Following three weeks of baseline, participants' broad reading skills were measured again via aReading, and the intervention phase was introduced to students in a staggered fashion.

During the intervention phase, students read one DIBELS Next passage per session as a measure of generalized reading fluency. Additionally, students repeatedly read an expository instructional passage three times per session as part of the twice-weekly intervention. Although the students read the passage three times during the intervention phase, the reading fluency rates of only the first and final passage readings were recorded. Both the DIBELS Next and the instructional passages served as a measure of students' within-text multisyllabic word decoding skills. The accuracy of students' within-text multisyllabic word reading was monitored at the whole-word and syllabic level. The 10-word researcher-created multisyllabic word probe was also administered during the intervention phase. Following the intervention phase, students' overall broad reading skills were individually measured with aReading.

Intervention phase. Directly following the baseline, intervention sessions commenced. The four selected students participated in twice-weekly assessment and intervention sessions at their school. Participants received twice-weekly 15-minute 1:1 intervention sessions administered by the author, a graduate researcher enrolled in the School Psychology program at the University of British Columbia. Each participant received eight weeks of intervention, totaling 16 reading intervention sessions.

The multicomponent intervention combined the modeling of multisyllabic word decoding strategies with specific reading fluency instructional components. The reading fluency instructional components included verbal cueing, repeated reading, goal setting, modeling, and performance feedback. These evidence-based strategies have been shown to improve reading fluency in students (Begeny, Krouse, Ross, & Mitchell, 2009; Chard, Vaughn, & Tyler, 2002; Morgan & Sideridis, 2006; Sindelar, Monda, & O'Shea, 1990; Therrien, 2004). Although the sequence of the intervention was similar to the Helping Early Literacy with Practice Strategies (HELPS) intervention (Begeny, 2009), the HELPS passages were replaced with the researcher-created expository passages. Additionally, the phrase-drill error correction procedure was replaced with the think-aloud multisyllabic word reading component. During the intervention sessions, students were required to complete three one-minute timed readings of the researcher-created expository passage. During the first read, the interventionist noted reading errors and highlighted the multisyllabic words read incorrectly. Following the first read, the interventionist graphed the student's WCPM and prompted the student to set a goal for their third reading. The interventionist then read the passage in full.
Upon encountering a multisyllabic word read incorrectly by the student, the interventionist wrote the error word on a whiteboard and the student was given an opportunity to identify familiar morphemes or syllables. The interventionist then explicitly modelled how to strategically decode the multisyllabic word by vocalizing her thinking and verbally modeling the use of one of the three multisyllabic word decoding strategies on the respective word. Modeling was done for up to five incorrectly, or least fluently, read multisyllabic words per passage reading. If more than five multisyllabic words were incorrectly or dysfluently read, the interventionist modelled the decoding strategy for the multisyllabic words that had been repeatedly read incorrectly. Alternatively, if the student's first read of the expository passage yielded fewer than five errors, the interventionist chose five multisyllabic words evenly distributed across the passage to model. Appendix B outlines an example script that was used to model the various multisyllabic word-reading strategies. After the student's second timed reading, the interventionist repeated the think-aloud multisyllabic word reading component as outlined above. Finally, the student completed their last timed reading for the session, and the interventionist recorded and graphed their WCPM and provided performance feedback to the student in relation to the goal set at the outset of the session. All sessions were audio recorded.

Inter-scorer agreement and procedural integrity. Intervention implementation was audio recorded and monitored for treatment fidelity. The instructional steps and intervention script are included in Appendix C. The graduate student implementing the intervention completed a checklist of the intervention steps after each session. Procedural integrity was determined by a second researcher who listened to audio recordings of 30% of the sessions across baseline and intervention. To determine procedural integrity, the second researcher was given a list of the intervention steps and marked off the completed steps while listening to the audio recordings. Each session was then given a procedural integrity score, calculated by dividing the number of completed steps by the total number of steps and multiplying the resultant quotient by 100%. The mean score for procedural integrity across students was 99% (range: 90% to 100%).

Multisyllabic word reading accuracy measures and reading fluency measures were examined to ensure inter-scorer agreement. Training for the audio reviewers was provided by the student researcher and included practice scoring and co-scoring with the researcher until a competence level of ≥ 95% was demonstrated. Inter-observer agreement (IOA) checks were then conducted on 30% of randomly selected sessions in the baseline and intervention phases, which exceeds the Design Standards' recommendation of collecting IOA data for at least 20% of the data points across all phases of the study (Kratochwill et al., 2013). To determine IOA rates, the reviewer listened to the audio recordings and scored students' reading fluency and accuracy on the multisyllabic word reading lists. Oral reading fluency IOA rates were calculated for each of the first reads per session reviewed by dividing the lower WCPM value by the higher WCPM value and multiplying the quotient by 100. The multisyllabic word reading accuracy IOA rates were determined by dividing the total number of overall agreements by the total number of overall agreements plus overall disagreements and then multiplying the quotient by 100.
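These fidelity and agreement indices are simple proportions; a minimal Python sketch with hypothetical session values is shown below to make the three calculations explicit.

def procedural_integrity(steps_completed, steps_total):
    """Percentage of scripted intervention steps completed in a session."""
    return 100.0 * steps_completed / steps_total

def orf_ioa(wcpm_scorer_a, wcpm_scorer_b):
    """Oral reading fluency inter-observer agreement: the smaller WCPM value
    divided by the larger WCPM value, expressed as a percentage."""
    return 100.0 * min(wcpm_scorer_a, wcpm_scorer_b) / max(wcpm_scorer_a, wcpm_scorer_b)

def word_accuracy_ioa(agreements, disagreements):
    """Multisyllabic word reading accuracy inter-observer agreement:
    agreements divided by agreements plus disagreements, as a percentage."""
    return 100.0 * agreements / (agreements + disagreements)

# Hypothetical values for one audio-reviewed session.
print(procedural_integrity(steps_completed=19, steps_total=20))  # 95.0
print(orf_ioa(wcpm_scorer_a=96, wcpm_scorer_b=92))               # approximately 95.8
print(word_accuracy_ioa(agreements=27, disagreements=3))         # 90.0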
To meet the inter-assessor agreement evidence standards, a minimum acceptable percentage agreement range of 80% – 90% was required (Kratochwill et al., 2013). The mean percent agreement for WCPM across students on the first reads of the instructional passages was 98% (range: 92% to 100%), suggesting good agreement. The mean percent agreement for WCPM across students on the DIBELS passages was 97% (range: 92% to 100%), suggesting good agreement. The mean percent agreement for multisyllabic word reading accuracy on the word lists across students was 93% (range: 80% to 100%) at the whole-word level and 84% (range: 80% to 100%) at the syllabic level, suggesting good agreement overall.

Data analysis. Student performance on the multisyllabic word reading probes, DORF probes, and instructional passages, performance on the computer-adaptive aReading assessment, and responses on a brief measure of social validity were analyzed. To determine the presence of an effect for each tier of the multiple baseline, consistent evidence across visual analysis and statistical analysis, plus evidence of a non-trivial effect size magnitude, was required. Specifically, the oral reading fluency and multisyllabic word probe data collected during the baseline and intervention phases were graphed and visually analyzed for level, trend, and variability. The relative slope change in oral reading fluency and multisyllabic word probe reading between the baseline and intervention phases was calculated. Similarly, data obtained via aReading prior to baseline, at phase change, and following intervention were compared to determine whether participants' broad reading skills improved in response to the intervention. Students' responses on a measure of social validity administered following the completion of the eight-week intervention phase were also analyzed.

Finally, the performances of the four participants were examined to see if a similar pattern of response was documented across participants on the dependent variables. As required by the Evidence Standards, this was done to ascertain whether there were three documented functional effects as evidence of experimental control (Kratochwill et al., 2013). Descriptive statistics include participants' mean ORF performance, mean multisyllabic probe reading accuracy, mean within-passage multisyllabic whole-word reading, and mean whole-word and syllabic reading accuracy of multisyllabic words on a word list. The standardized mean difference effect size (mean difference / baseline SD) was calculated when ceiling effects were evident.

The Tau effect size was employed as an objective measure with which to complement visual analysis. The Tau effect size offers several advantages over other nonoverlap methods in that it can account for positive trends in baseline, is typically consistent with visual analysis (Rakap, 2015), and is appropriate to use even when there are few data points (Parker & Vannest, 2012). Finally, it offers 91-95% of the precision power of regression (Parker & Vannest, 2012). A disadvantage of Tau is the necessity for a statistical program (Rakap, 2015); however, there are readily available online calculators to facilitate this calculation.
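For readers who prefer to see the statistical summaries spelled out, the sketch below computes a basic Tau (A vs. B) nonoverlap index and the standardized mean difference. It is an illustrative simplification: it omits the trend-adjusted Tau variants described next and the accompanying significance test, and the data are invented rather than taken from the study.

from statistics import mean, stdev

def tau_a_vs_b(baseline, intervention):
    """Basic nonoverlap Tau (A vs. B): every baseline point is paired with every
    intervention point; a pair in which the intervention point is higher counts +1,
    lower counts -1, and ties count 0. The net count is divided by the number of
    pairs. The trend-adjusted variants (e.g., Tau A vs. B + Trend(B)) add
    within-phase comparisons and are not shown here."""
    pairs = [(a, b) for a in baseline for b in intervention]
    net = sum((b > a) - (b < a) for a, b in pairs)
    return net / len(pairs)

def standardized_mean_difference(baseline, intervention):
    """Difference in phase means divided by the baseline standard deviation."""
    return (mean(intervention) - mean(baseline)) / stdev(baseline)

# Invented WCPM series for one student (6 baseline and 16 intervention sessions).
baseline = [88, 92, 95, 90, 97, 94]
intervention = [95, 101, 98, 104, 107, 103, 110, 108,
                112, 109, 115, 113, 118, 116, 121, 119]

print(f"Tau (A vs. B): {tau_a_vs_b(baseline, intervention):.2f}")
print(f"Standardized mean difference: {standardized_mean_difference(baseline, intervention):.2f}")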
The variant of Tau utilized in the current study considers improving trend during the intervention phase in addition to nonoverlap between the baseline and intervention phases. If a statistically significant improving baseline trend was detected using the Tau test of statistical significance, then the variant of Tau (Tau A vs. B + Trend(B) - Trend(A)) that corrects for an improving baseline trend was used as the measure of effect size. If the improving trend during baseline was not statistically significant, then the non-corrected Tau variant (Tau A vs. B + Trend(B)) was used as the measure of effect size. In the current study, the non-corrected Tau variant was used across all dependent variables, as no statistically significant improving baseline trend was detected using the Tau test of statistical significance.

Chapter 4: Results

Screening Assessments

Screening results for the selected students are reported in Table 1 below.

Table 1. Student Demographics and Screening Results

Student   Grade   Language    CORE-PS(a)   Instructional Reading Level(b)   Instructional Passage Reading(c) (WCPM)   BIA Score(d) (SS)
Lisa      5       English     70%          4                                98                                        104
Oscar     4       English     70%          3                                59                                        106
Katy      4       Cantonese   50%          4                                110                                       107
Iris      4       English     54%          3                                60                                        98

Note: (a) Section L of the CORE Phonics Survey; (b) based on DIBELS survey-level assessment; (c) an expository passage taken from a BC Science textbook; (d) three subtests from the Woodcock-Johnson IV Tests of Cognitive Abilities: Brief Intellectual Ability.

Results of the multisyllabic word decoding measure indicated that two of the selected students (Lisa and Oscar) each committed 7 decoding errors, with an overall multisyllabic word decoding accuracy of 70%. Iris committed 11 errors, resulting in an overall accuracy of 54%, and Katy committed 12 decoding errors, resulting in an overall multisyllabic word decoding accuracy of 50%. The results of the DIBELS assessment revealed that two of the selected students (Oscar and Iris) read at a Grade 3 oral reading fluency level, while the remaining two students (Lisa and Katy) read at a Grade 4 oral reading fluency level.

Oscar and Iris read 59 and 60 WCPM on the instructional passages, respectively. According to the Hasbrouck and Tindal (2006) fall norms, Oscar and Iris read at a level approximately equivalent to the 10th percentile for their grade on these passages. One of the students, Lisa, read 98 WCPM, which suggested that she read at approximately the 25th percentile. Finally, Katy read 110 WCPM, which suggested that she read the instructional passage at approximately the 50th percentile for her grade level. The results of the WJ-COG-BIA showed that all students scored within the average range on the cognitive screening measure, suggesting age-appropriate cognitive abilities.

Reading Fluency

To assess the instructional procedures, gains in reading fluency were measured and carefully monitored throughout the current study. If consistent within-session gains were observed between the first and third readings of the passages, these gains could reasonably be anticipated to generalize. Students' performance in reading fluency on the instructional passages is presented in Figure 1 below.

Figure 1. Students' oral reading performance on first and third reads of the instructional passages across baseline and intervention.

Within-session reading fluency. A visual analysis of these data shows no instances of overlapping data points.
The results show a clear separation in WCPM between the first and third readings of the instructional passages during the intervention phase across students. In other words, students read a greater average WCPM during the third read than during the first read of the instructional passage during the intervention phase (mean gains: Lisa: M = 30, SD = 6; Oscar: M = 39, SD = 11; Katy: M = 24, SD = 11; Iris: M = 28, SD = 12). These results indicate that all students improved their within-session WCPM from the first to the third reading across the eight-week intervention phase.

Generalized reading fluency. Results for reading fluency are presented first for the instructional passages (based on initial readings in session) and then for the DIBELS passages.

Instructional passages. Performance on the instructional passages, including the means, standard deviations, and slopes (SL) of students' WCPM on the first reading during the baseline and intervention phases, is presented in Table 2 below.

Table 2. Means, Standard Deviations, and Slopes of WCPM on Initial Readings of the Instructional Passages Across Baseline and Intervention

                Mean WCPM (SD)                       Slope
Student   BL         Int        Δ Mean WCPM    BL       Int
Lisa      113 (21)   125 (15)   +12            -1.25    2.27
Oscar     68 (13)    93 (17)    +25            1.14     2.14
Katy      92 (9)     121 (14)   +29            1.31     0.89
Iris      88 (24)    92 (9)     +4             -0.77    0.98

Note: WCPM = words correct per minute; BL = baseline phase; Int = intervention phase; Δ Mean WCPM = difference in mean WCPM between phases.

Although Lisa had one unusually high score during baseline, she experienced a positive change in trend upon implementation of the intervention (Baseline: -1.25 WCPM per session; Intervention: 2.27 WCPM per session) that eventually resulted in an improvement in level (MBL = 113, MInt = 125). The Tau effect size for Lisa's WCPM on the instructional passages was 0.49, 95% CI [.18, .79], demonstrating a moderate, statistically significant (p = .002) effect.

Oscar also experienced a positive change in trend upon implementation of the intervention (Baseline: 1.14 WCPM per session; Intervention: 2.14 WCPM per session). This resulted in a steady improvement in level throughout the intervention (MBL = 68, MInt = 93). The Tau effect size for Oscar's WCPM on the instructional passages was 0.58, 95% CI [.27, .89], demonstrating a moderate, statistically significant (p < .001) effect.

Although Katy did not demonstrate a positive change in trend from baseline to intervention (Baseline: 1.31 WCPM per session; Intervention: 0.89 WCPM per session), she experienced an immediate change in level upon introduction of the intervention that was sustained throughout the intervention phase (MBL = 92, MInt = 121). The Tau effect size for her WCPM on the instructional passages was 0.54, 95% CI [.23, .85], demonstrating a statistically significant (p < .001) effect.

Finally, Iris experienced a negative trend in baseline (Baseline: -0.77 WCPM per session) and higher variability in the data compared to her peers, which was largely attributable to one high score in the first baseline session. However, upon introduction of the intervention she experienced less variability and an eventual but slight improvement in trend (Intervention: 0.98 WCPM per session) that resulted in a very slight increase in level from baseline to intervention (MBL = 88, MInt = 92).
However, due to the variability in her reading fluency performance on the instructional passages and the only very slight improvement in level and trend, a clear functional effect cannot be claimed. Consistent with the visual analysis, the Tau effect size for Iris was 0.31, 95% CI [.00, .62], demonstrating a very weak, statistically nonsignificant effect (p = .052). In sum, visual and statistical analyses indicate three demonstrations of a functional effect. Although one student (Iris) did not demonstrate a functional effect in relation to WCPM on the instructional passages, the three clear demonstrations of a functional effect for the other students provide evidence to support the intervention in promoting generalized reading fluency in upper-elementary struggling readers.

DIBELS passages. Performance on DIBELS passages is presented in Table 3 and Figure 2 below.

Table 3. Means, Standard Deviations, and Slopes of WCPM on the DIBELS Passages Across Baseline and Intervention

                Mean WCPM (SD)                       Slope
Student   BL         Int        Δ Mean WCPM    BL       Int
Lisa      125 (7)    133 (16)   +8             3.2      1.15
Oscar     89 (12)    106 (15)   +17            -1.23    1.11
Katy      127 (9)    133 (20)   +6             1.29     2.58
Iris      100 (14)   114 (11)   +14            0.57     0.73

Note: WCPM = words correct per minute; BL = baseline phase; Int = intervention phase; Δ Mean WCPM = difference in mean WCPM between phases.

Figure 2. Students' generalized gains on DIBELS passages.

Although Lisa experienced a strong acceleration trend in baseline that was not sustained throughout intervention (Baseline: 3.2 WCPM per session; Intervention: 1.15 WCPM per session), there was a gain in level (MBL = 125, MInt = 133). A visual analysis of the data demonstrates overlap and variability, complicating the analysis. Consistent with visual analysis, the Tau effect size for Lisa's WCPM on the DIBELS passages was 0.26, CI 95% [-.05, .57], demonstrating a weak and statistically nonsignificant (p = .109) effect.

Following a slight deceleration trend in the baseline, Oscar experienced a change in trend upon implementation of the intervention (Baseline: -1.23 WCPM per session; Intervention: 1.11 WCPM per session) and an increase in level (MBL = 89, MInt = 106). Visual analysis of Oscar's data reveals minimal overlap, but some variability. Overall, the visual analysis is consistent with statistical analysis of a moderate Tau effect size for Oscar, 0.47, CI 95% [.16, .78], that was statistically significant (p = .003).

Katy experienced a slight acceleration trend in baseline which accelerated further during intervention (Baseline: 1.29; Intervention: 2.58) and resulted in an eventual improvement in level (MBL = 127, MInt = 133). However, variability and overlap are notable and complicate the analysis. The Tau effect size for Katy's WCPM on the DIBELS passages was 0.27, CI 95% [-.03, .58], demonstrating a weak, statistically nonsignificant (p = .091) effect.

Although Iris experienced only a minimal increase in trend from baseline to intervention (Baseline: 0.57; Intervention: 0.73), this resulted in an overall increase in level (MBL = 100, MInt = 114). However, there is overlap and variability in the data, complicating the analysis. The Tau effect size for Iris was weak to moderate at 0.37, CI 95% [.06, .68], yet statistically significant (p = .021).
In sum, the visual and statistical analyses of the students' WCPM on the DIBELS passages are inconsistent. Additionally, the variability and overlap in the DIBELS data across students complicate the analysis. There is insufficient evidence to support the use of this intervention in enhancing generalized reading fluency gains on DIBELS passages in upper-elementary school students.

Multisyllabic Word Decoding

To further assess the instructional procedures, gains in multisyllabic word decoding were measured throughout the current study. If consistent within-session gains were observed between the first and third readings of the passages, these gains could reasonably be anticipated to generalize. Students' performance in multisyllabic word decoding accuracy on the instructional passages is presented in Figure 3 below. Students' performance in multisyllabic word decoding accuracy on the DIBELS passages is presented in Figure 4 below. Student performance on the word list probes was also monitored at the whole-word and syllabic level and is presented in Figure 5 below.

Figure 3. Students' multisyllabic word decoding accuracy on the first and third reads of the instructional passages during the intervention phase.

Within-session multisyllabic word decoding. A visual analysis of these data shows some instances of overlapping data points. Yet, there is generally separation between the first and third reads of the instructional passage during the intervention phase. Except for a few sessions (Lisa & Oscar: 1 session; Katy & Iris: 2 sessions), students either maintained accuracy levels or made gains in within-passage multisyllabic word decoding accuracy within each session. There were a few instances of identical performance in accuracy. Additionally, students read a higher average percentage (Lisa: M = +8%; Oscar: M = +9%; Katy: M = +17%; Iris: M = +14%) of multisyllabic words accurately during the third reading of the instructional passage compared to the first reading within each intervention session. Thus, there were gains in multisyllabic word decoding on the instructional passages that were expected to generalize.

Generalized within-passage multisyllabic word decoding. Results for within-passage multisyllabic word decoding accuracy are presented below for the instructional passages (based on initial readings) and for the DIBELS passages.

Figure 4. Multisyllabic Word Decoding Accuracy on the DIBELS Passages.

Table 4. Mean Percentage of Multisyllabic Words Read Correctly Within Passages Across Baseline and Intervention Phases and Associated Slopes

          Instructional Passages                      DIBELS Passages
Student   BL    Int   Δ%    BL SL   Int SL            BL    Int   Δ%    BL SL   Int SL
Lisa      78    89    +11   -3      0                 90    91    +1    3       0
Oscar     82    88    +6    2       0                 90    92    +2    2       1
Katy      60    74    +14   3       1                 81    80    -1    -1      1
Iris      70    79    +9    4       0                 78    88    +10   6       0

Note: BL = baseline phase; Int = intervention phase; Δ% = percentage difference between the baseline 'cold' read and the first read during intervention; SL = slope.

Instructional passages.
Performance on the instructional passages, including the mean percentage of multisyllabic words read correctly within passages and the associated slopes of students' multisyllabic word decoding accuracy on the first reading during the baseline and intervention phases, is presented in Table 4 and Figure 3 above. Although Lisa experienced a negative trend during the baseline that improved throughout intervention (Baseline: -3; Intervention: 0), as well as an abrupt increase in level upon introduction of the intervention and an overall increase in level (MBL = 78, MInt = 89), there is high variability and overlap present in the data. The Tau effect size for Lisa's multisyllabic word decoding accuracy on the instructional passages was 0.21, CI 95% [-.1, .52], demonstrating a weak, statistically nonsignificant (p = .196) effect. However, the standardized mean difference effect size was 0.73, demonstrating a small effect.

Oscar experienced a slight acceleration trend during the baseline that was not sustained throughout the intervention (Baseline: 2; Intervention: 0) but resulted in a slight improvement in level overall (MBL = 82, MInt = 88). However, according to visual analysis, data variability and data overlap between baseline and intervention were high, minimizing the potential for a meaningful effect. This is consistent with the Tau effect size analysis, which revealed a weak and statistically nonsignificant (p = .154) Tau effect size of 0.24, CI 95% [-.08, .56]. However, there was a small standardized mean difference effect size of 0.5.

Similar to Oscar, Katy experienced an acceleration trend in baseline that was not sustained during intervention (Baseline: 3 per session; Intervention: 1 per session), but there was an overall gain in level (MBL = 60, MInt = 74). However, the overall gain in level may be attributable to a few instances of higher performance in the intervention phase that likely inflated the average gain in level. Data variability across the two phases was also high and complicates the analysis. Statistical analysis revealed a weak and statistically nonsignificant (p = .086) Tau effect size of 0.28, CI 95% [-.03, .59], but a small standardized mean difference effect size of 0.93.

Again, similar to Oscar and Katy, Iris also experienced an acceleration trend in baseline that was not sustained during intervention (Baseline: 4 per session; Intervention: 0 per session), but there was an overall gain in level (MBL = 70, MInt = 79). However, data overlap and variability were present. The variability and data overlap, combined with an accelerating baseline trend that flattened upon introduction of the intervention, resulted in a weak (0.09, CI 95% [-.21, .41]) and statistically nonsignificant (p = .567) Tau effect size. The standardized mean difference effect size was small at 0.38.

DIBELS passages. Performance on DIBELS passages, including the mean percentage of multisyllabic words read correctly within passages and the associated slopes of students' multisyllabic word decoding accuracy on the first reading during the baseline and intervention phases, is presented in Table 4 and Figure 4 above. Lisa experienced an acceleration trend during baseline that was not sustained during intervention (Baseline: 3 per session; Intervention: 0 per session), and no meaningful change in the percentage of multisyllabic words read accurately upon implementation of the intervention (MBL = 90% accurate, MInt = 91% accurate) was observed.
This analysis is consistent with the negligible Tau effect size for Lisa's multisyllabic word reading accuracy on DIBELS passages, which was 0.02, CI 95% [-.3, .35], and statistically nonsignificant (p = .904). The standardized mean difference effect size was negligible at 0.11.

Oscar demonstrated a negligible deterioration in trend regarding multisyllabic word decoding accuracy on DIBELS passages upon implementation of the intervention (Baseline: 2 per session; Intervention: 1 per session), which nonetheless resulted in a trivial increase in level (MBL = 90% accurate, MInt = 92% accurate). This visual analysis is consistent with statistical analysis, which indicated a weak, statistically nonsignificant (p = .273) Tau effect size of 0.20, CI 95% [-.14, .53], and a standardized mean difference effect size of 0.18.

Although Katy demonstrated a slight increase in trend between baseline and intervention (Baseline: -1 per session; Intervention: 1 per session), this is largely attributable to increased variability in her multisyllabic word decoding accuracy immediately following the introduction of the intervention. In fact, the latter change in trend did not meaningfully increase the level of her multisyllabic word reading accuracy on DIBELS passages (MBL = 81% accurate, MInt = 80% accurate). Again, this is consistent with statistical analysis, which indicated that the Tau effect size for Katy's multisyllabic word reading accuracy on DIBELS passages in response to the intervention was weak, 0.14, CI 95% [-.18, .45], and statistically nonsignificant (p = .419). There was a negative standardized mean difference effect size that was negligible at -0.11.

With regard to her multisyllabic word reading accuracy on DIBELS passages, Iris experienced some variable performances during the baseline. This variability, especially early in the baseline phase, produced an acceleration trend in the baseline that was not sustained during the intervention phase (Baseline: 6 per session; Intervention: 0 per session). Additionally, the data reveal an increase in overall level upon implementation of the intervention (MBL = 78% accurate, MInt = 88% accurate); however, the baseline mean percentage accurate is likely impacted by the extremely variable data present during baseline and needs to be interpreted with caution. The statistical analysis points to an overall very weak effect for Iris, with a Tau effect size of 0.05, CI 95% [-.28, .37], that is statistically nonsignificant (p = .809), and a small standardized mean difference effect size of 0.31.

Summary. A visual analysis of level, trend, and variability reveals no compelling evidence for generalized multisyllabic word decoding accuracy gains within the instructional or DIBELS passages. The non-corrected Tau effect sizes revealed either extremely weak or statistically nonsignificant results. However, the Tau values appeared to be impacted by ceiling effects, and the standardized mean difference effect sizes for multisyllabic word reading accuracy on the instructional passages indicated a consistent pattern of small mean improvements. The standardized mean difference effect sizes were not compelling for the DIBELS passages.
Although the DIBELS passage results do not offer support for the use of this intervention in improving within-text multisyllabic word decoding accuracy, there does appear to be, according to the standardized mean difference effect sizes, a pattern of small detectable gains in multisyllabic word decoding accuracy on the instructional passages. Thus, there is some preliminary, albeit mixed, evidence to support this intervention in enhancing multisyllabic word reading accuracy on instructional passages.

Generalized multisyllabic reading accuracy on word lists. The results of students' performance on the researcher-created multisyllabic word lists across the baseline and intervention phases are displayed in Figure 5 and Table 5. These results are divided into the mean number of whole words read correctly and the mean number of syllables read correctly, and are presented with associated standard deviations.

Figure 5. Multisyllabic Word Decoding Accuracy on the Word Probes.

Table 5. Means, Standard Deviations, and Slopes of Multisyllabic Word Decoding Accuracy on the 10-Word Lists Across Baseline and Intervention Phases

          Mean Whole Words Correct (SD) and Slopes       Mean Syllables Correct(a) (SD) and Slopes
Student   BL      Int     Δ    BL SL    Int SL           BL       Int      Δ    BL SL    Int SL
Lisa      6 (2)   6 (1)   0    -0.69    0.12             24 (4)   24 (2)   0    -1.06    0.23
Oscar     5 (2)   6 (2)   +1   -0.31    0.10             21 (2)   24 (3)   +3   0.06     0.20
Katy      5 (2)   7 (1)   +2   -0.06    0.18             22 (3)   24 (3)   +2   0.06     0.20
Iris      5 (1)   5 (2)   0    -0.43    0.13             22 (2)   22 (3)   0    -0.77    0.11

Note: BL = baseline phase; Int = intervention phase; Δ = difference in means; (a) 29 syllables per list.

Whole words. Although Lisa experienced a decreasing trend in whole-word decoding accuracy during the baseline that improved slightly in response to the intervention (Baseline: -0.69; Intervention: 0.12), the overall level did not improve (MBL = 6 words correct, MInt = 6 words correct). The Tau effect size for Lisa's whole-word reading accuracy on the word probes was 0.24, CI 95% [-.10, .56], demonstrating a weak, statistically nonsignificant effect (p = .172). The standardized mean difference effect size was 0.00.

Oscar experienced a decreasing trend in whole-word decoding accuracy during the baseline that improved negligibly in response to the intervention (Baseline: -0.31; Intervention: 0.10) and resulted in a slight improvement in level (MBL = 5 words correct, MInt = 6 words correct). The Tau effect size for Oscar's whole-word reading accuracy on the word probes was 0.37, CI 95% [.04, .7], demonstrating a weak, statistically significant (p = .029) effect. There was a small standardized mean difference effect size of 0.50.

Katy also experienced a decreasing trend in whole-word decoding accuracy during the baseline that improved slightly upon introduction of the intervention phase (Baseline: -0.06; Intervention: 0.18). This change in trend resulted in a slight improvement in level (MBL = 5 words correct, MInt = 7 words correct). This is consistent with statistical analysis, which revealed a moderate, statistically significant (p = .007) Tau effect size of 0.45, CI 95% [.13, .78], for Katy's whole-word decoding accuracy on the word probes in response to the intervention.
This is consistent with a standardized mean difference effect size of 1.00.

Iris experienced a decreasing trend during the baseline that improved upon implementation of the intervention (Baseline: -0.43; Intervention: 0.13) but resulted in no measurable change in level (MBL = 5 words correct, MInt = 5 words correct). This is consistent with statistical analysis, which revealed a negative Tau effect size of -.05, CI 95% [-.39, .28], which is weak and statistically nonsignificant (p = .769). The standardized mean difference effect size was 0.00.

Syllables. Although Lisa experienced a decreasing trend during the baseline in her syllabic reading accuracy on the word probes, the trend improved during the intervention phase (Baseline: -1.06; Intervention: 0.23). However, this improvement in trend did not result in measurable gains in level (MBL = 24 syllables correct, MInt = 24 syllables correct). This is consistent with the Tau effect size for Lisa's syllabic reading accuracy of 0.28, CI 95% [-.05, .6], demonstrating a weak and statistically nonsignificant (p = .098) effect. The standardized mean difference effect size was 0.00.

Oscar demonstrated a flat trend during the baseline that improved during the intervention (Baseline: 0.06 syllables per session; Intervention: 0.20 syllables per session). This improved trend in syllabic reading accuracy resulted in some gains in level (MBL = 21 syllables correct, MInt = 24 syllables correct). Visual analysis is consistent with the small yet statistically significant (p = .009) Tau effect size for Oscar's syllabic reading accuracy, which was 0.44, CI 95% [.11, .76]. The standardized mean difference effect size was moderate at 1.50.

Katy also experienced a flat trend during the baseline that improved during the intervention (Baseline: 0.06 syllables per session; Intervention: 0.20 syllables per session), resulting in some gains in level (MBL = 22 syllables correct, MInt = 24 syllables correct). The Tau effect size for Katy's syllabic word reading accuracy on the word probes was 0.37, CI 95% [.05, .7], demonstrating a small, statistically significant (p = .025) effect. The standardized mean difference effect size was small at 0.66.

Iris demonstrated a decreasing trend in baseline that improved slightly during the intervention (Baseline: -0.77; Intervention: 0.11) but did not result in any measured gains in level (MBL = 22 syllables correct, MInt = 22 syllables correct). Statistical analysis revealed a Tau effect size for Iris' syllabic word decoding accuracy on the word probes of -0.1, CI 95% [-.4, .22], demonstrating a weak, negative, statistically nonsignificant effect (p = .543). The standardized mean difference effect size was 0.00.

Summary. An analysis of level, trend, and variability/nonoverlap, together with statistical analysis, reveals insufficient evidence of multisyllabic word decoding accuracy gains at the whole-word or syllabic level on the word list probes. There are only two demonstrations of an effect at the whole-word and syllabic levels. There is currently insufficient evidence to support this intervention in promoting generalized gains in reading accuracy for isolated multisyllabic words at the whole-word and syllabic level.

Broad Reading Skills

Student performance on the computer-adaptive measure of broad reading skills (aReading) is shown in Table 6.
Table 6. Students' Performance on the Computer-Adaptive Measure(e) of Broad Reading Skills

          aReading Scaled Score and Benchmark Classifications
Student   Pre-Baseline       Phase Change       Post-Intervention
Lisa      503 (low risk)     507 (low risk)     504 (low risk)
Oscar     509 (low risk)     510 (low risk)     517 (low risk)
Katy      485 (high risk)    494 (some risk)    493 (some risk)
Iris      504 (low risk)     500 (some risk)    492 (some risk)

Note: (e) FastBridge Learning aReading assessment, out of a possible 650.

Student performance on aReading, as measured in this study, was variable and fell well within the standard error of measurement for aReading. Thus, overall improvements in broad reading skills were not evident. For example, Oscar demonstrated slight but consistent gains in broad reading skills at the three data collection points (pre = 509; phase change = 510; post = 517), and his associated benchmark classification remained in the 'low risk' category across the study. Prior to baseline, Katy obtained a score (pre = 485) in the 'high risk' classification. Katy's next two performances (phase change = 494; post = 493) both fell in the 'some risk' classification. Lisa's performance fell consistently within the 'low risk' category and showed minimal fluctuation (pre = 503; phase change = 507; post = 504). Finally, Iris' performance declined during the course of the study (pre = 504; phase change = 500; post = 492), with an initial performance falling in the 'low risk' category and the final two performances falling in the 'some risk' category.

Social Validity

Students' perceptions of their reading skills and impressions of the intervention are displayed in Table 7.

Table 7. Students' Perceptions of the Intervention and Their Reading

Statement                                                                                Mean score (out of a maximum of 10)
I think that the way I am learning to read long words is fun.                            8.75
I think that what I am learning to do is important.                                      8.75
I think that the time this intervention takes is too long.                               5
I think that the activities are helping me to get better at reading longer words.        9.2
I am getting better at reading quickly and accurately.                                   9.2
I am getting better at understanding what I am reading.                                  8.3
I think that the way I am learning to read longer words will work for other students.    8.3

The results indicate that students perceived the intervention favourably and believed that their reading skills improved in response to the instructional procedures. Specifically, the highest ranked statements (M = 9.2) were 'I think the activities are helping me to get better at reading longer words' and 'I am getting better at reading quickly and accurately.' The statements 'I think that the way I am learning to read long words is fun' and 'I think that what I am learning to do is important' were also highly ranked (M = 8.75). The students provided a mean score of 8.3 in response to the statements 'I am getting better at understanding what I am reading' and 'I think the way I am learning to read longer words will work for other students.' The statement that was least endorsed (M = 5) was 'I think the time this intervention takes is too long,' indicating ambivalence toward the length of the intervention.

Chapter 5: Discussion

The purpose of the current study was to ascertain whether a think-aloud multisyllabic-word decoding intervention incorporated within evidence-based reading fluency intervention strategies was effective at improving the reading fluency and multisyllabic-word decoding accuracy of upper-elementary school struggling readers.
It was hypothesized that the instructional procedures would promote within-session gains on expository curricular reading materials and improve students' generalized multisyllabic word decoding and reading fluency. The major findings from this study, limitations, implications, and directions for future research are discussed below.

Major Findings

Based upon visual analysis of the graphed data, there were clear and consistent direct within-session gains in reading fluency on the instructional passages across students. Within-session gains in multisyllabic word reading accuracy on the instructional passages were also promising. Further, there were three clear demonstrations of generalized gains in reading fluency on the instructional passages (initial readings). Generalization to the DIBELS passages was less compelling, with only two demonstrations of a functional effect. With regard to gains in multisyllabic word decoding accuracy, there was some evidence of generalized improvements on the instructional passages based on consistent mean gains and standardized mean difference effect sizes, but this was not evident in the Tau values due to ceiling effects. There is insufficient evidence to support the use of this intervention in improving multisyllabic word decoding accuracy on DIBELS passages. Additionally, students' broad reading skills did not appear to improve in response to the intervention. Finally, students rated the intervention favourably and perceived that their reading skills increased in response to the intervention.

Direct gains. Student gains in reading fluency were measured between students' first and third readings of the instructional passages. Encouragingly, and as expected, both visual analysis of the data and analysis of the mean gains in WCPM on the instructional passages between the 1st and 3rd readings for each intervention session yielded consistently positive gains across students. Mean gains in WCPM in the current study were similar to the mean gains in a previous study (Martens et al., 2007) that employed curricular texts as the intervention passages. Along with that previous study (Martens et al., 2007), the current study lends credence to the instructional procedures employed for improving reading fluency in struggling readers. When visually analysing the within-passage multisyllabic word decoding accuracy on the instructional passages between the first and third reads, there were few instances of overlapping data. There were overall within-session improvements in students' within-passage multisyllabic word reading accuracy.

Generalized gains. Generalized gains in reading fluency were evident on the first reads of the instructional passages, but gains were less compelling on the DIBELS passages. There was a pattern of small mean gains in multisyllabic word decoding accuracy on the instructional passages. There is no compelling evidence for generalized gains in multisyllabic word reading on the DIBELS passages or on the word probes at the whole-word and syllabic level.

Reading fluency. There were three clear demonstrations of generalized reading fluency gains on the initial readings of instructional passages in response to the intervention procedures employed. Lisa's reading fluency increased, as did Oscar's and Katy's levels of oral reading fluency.
Anecdotally, the latter two students were very enthused by the intervention's motivational components (e.g., goal setting) associated with the instructional passages, and this may have facilitated their engagement and reading fluency gains. In contrast, Iris's oral reading fluency level increased only slightly between baseline and intervention. Iris did not appear to be as motivated during the intervention, and her rate of reading, even when she was prompted to read quickly and accurately, did not change much.

Additionally, gains in reading fluency on the instructional passages were observed immediately upon implementation of the intervention and may be attributable to the motivating, evidence-based reading-fluency intervention components (e.g., goal setting, prompting) utilized during the intervention phase. There was little evidence of generalized gains in reading fluency on the DIBELS passages. In fact, only two statistically significant effects (one moderate and one weak) were observed for generalized reading fluency on the DIBELS passages. Within the time frame employed for this study, the instructional procedures do not appear effective at promoting generalized gains in reading fluency beyond the instructional passages.

Multisyllabic word reading accuracy. There was some evidence of generalized improvements in multisyllabic word reading accuracy on the instructional passages. According to the Tau values, there was one (Lisa) observable statistically significant, yet weak, effect. This is likely attributable to the declining trend during the baseline. The other three students' data demonstrated a large amount of overlap. Although there was a high degree of data variability, the standardized mean difference effect sizes reveal small positive effects across students, which is consistent with the mean level change of multisyllabic word reading accuracy between baseline and intervention. Thus, there appears to be some evidence to support the procedures employed in the current study in enhancing students' multisyllabic word reading accuracy on the instructional passages.

Additionally, there was not enough evidence to conclusively support this intervention with regard to multisyllabic word decoding skills on the DIBELS passages. According to visual analysis of level, trend, and variability, there were no meaningful effects. According to statistical analysis, there was only one weak statistically significant effect. Aside from some limitations of the current study, described further below, the minimal gains in mean multisyllabic word reading accuracy on the DIBELS passages may be partially attributable to ceiling effects. In other words, aside from an outlier in Iris' baseline data, students were already reading the multisyllabic words within the DIBELS passages with a generally high degree of accuracy during baseline.

There was limited evidence to support the use of this intervention in improving generalized word decoding accuracy at the whole-word or syllabic level on the word list probes. However, the two small, yet statistically significant, effects for word reading accuracy at the whole-word and syllabic level on the word list probes can be explained by anecdotal observation. During the intervention phase, Oscar read the words slowly and often self-corrected his decoding errors. Katy appeared to relax and slow her reading down when reading isolated words, compared to when she was asked to read an entire passage.
This more leisurely pace of reading may have facilitated their performance on this measure. However, multisyllabic word reading gains, even for Oscar and Katy, were not compelling.

Broad reading skills. Student gains in broad reading skills were measured via a computerized adaptive test. Student response was variable, but fell within the standard error of measurement for aReading. Students did not consistently make gains in broad reading skills, or alternatively, these gains were not observable on the measure employed. The latter option is very possible because aReading is intended to be used to monitor progress at three points across an academic year (Christ, 2014), whereas in this study the entire baseline and intervention phase spanned an 11-week duration. Thus, aReading was likely not sensitive enough to measure gains in broad reading skills across students within the time frame of this study. One student (Iris) demonstrated a consistent decline in scores. This may be attributable to the fact that she repeatedly vocalized that she did not enjoy completing the online assessment and voiced frustration that some of the questions recurred across different forms of the test. Perhaps this questionable level of motivation compromised her performance on this measure.

Social validity. Overall, students rated the intervention favourably and perceived that the intervention was helping them improve their reading skills. This was consistent with students' overall positive behaviours and willingness to participate in the intervention throughout the current study. Aside from some of the students indicating an ambivalent response to the length of the intervention sessions (15 minutes), their overall impressions of the intervention and its outcomes were positive.

Limitations

There are several limitations worth noting with regard to the measures, selection procedures, and intervention employed. The first limitation concerns why gains in multisyllabic word decoding did not clearly generalize across measures. This may have occurred for a few reasons. First, as mentioned above, one way to enhance the transfer of learning beyond an intervention is to include strategy instruction and provide opportunities to practice the strategies taught (Lovett et al., 1994). Although the decoding strategies in the current study were taught through modelling, a limitation was the absence of opportunities for students to actively practice decoding words with orthographic and morphemic patterns similar to those of incorrectly read words. For example, a syllabic decoding study by Penney (2002) provided participants the opportunity to practice, through 'word drill', reading three or more words with orthographic patterns similar to those of incorrectly read words. Good and colleagues (2015) similarly emphasize the importance of repeated decoding opportunities during morphemic decoding interventions. Thus, repeated opportunities to practice decoding within word families may be an effective procedure to consider adding to support students' generalized multisyllabic word decoding skills. Incorporating planned opportunities for practice, versus repetition, may better facilitate the transfer of learning to real words within passages beyond those encountered during intervention. Second, the metacognitive monitoring of decoding strategies is considered to support the generalization of this learning (Lovett et al., 1994).
In this study, the interventionist modelled thinking, decoding, and monitoring strategies, but students were likely not given ample opportunity to do so themselves. Thus, planning for the generalization of multisyllabic word decoding by providing opportunities for students to self-monitor and reflect upon the decoding strategies employed appears to be important, and the lack thereof was a limitation of the current study.

The teaching of multisyllabic word decoding may also need to follow a systematic and more explicit procedure rather than following the largely random error patterns made by students. This is consistent with an empirical review (Rayner et al., 2001) and the National Reading Panel Report (2000), which both reported that one component of improving the word recognition and decoding skills of struggling readers involves systematic instruction in letter-pattern mapping. The current study did not incorporate a systematic instructional sequence of orthographic patterns, and the frequency of the orthographic patterns taught was not considered. Rather, the potential instructional efficiency of following student error patterns directed the intervention. Following students' largely random error patterns appears to have contributed to the lack of generalized decoding skills beyond the instructional passages. Some of the words and orthographic patterns targeted during the intervention were not encountered again on the generalization measures, so students did not have ample opportunity to practice decoding those (or similar) words and letter patterns beyond the instructional passages.

Another limitation of this study concerns the fact that the intervention was a combined package (fluency + decoding), making it difficult to ascertain the relative importance of each component. In other words, are the observed reading gains attributable to the multisyllabic word-decoding component, the reading fluency intervention, or a combination of the two? It may have been useful to introduce the reading fluency intervention alone and monitor gains during baseline, and then introduce the multisyllabic word decoding component during the intervention phase. This may have provided more clarity regarding which 'active components' contributed to the observed gains in student reading skills.

A further limitation concerns the student selection procedure. Students with a very specific reading profile were required for this study. The researcher screened 11 teacher-recommended students and then selected the most appropriate students from that sample. It is possible that students not referred for initial screening may have actually benefited more from the intervention than some of the students eventually selected; however, the researcher did not have an opportunity to screen these potentially suitable participants because they were not initially referred. It may have been more effective to screen at the class level with the CORE-PS to obtain a larger initial sample from which to select students for further screening. This would have helped ensure that selected students had maximal and comparable difficulties with multisyllabic word-decoding accuracy at study outset and may have demonstrated more detectable gains on the multisyllabic word probes. Some students, despite meeting the selection criteria, may have been demonstrating reading skills that were already too high to detect changes in multisyllabic word decoding between the baseline and intervention conditions.
For example, Lisa's level of multisyllabic word decoding accuracy ascertained during screening fell exactly at the cut-point for study inclusion (committing a minimum of seven errors on the CORE-PS). This may partially explain her lack of observable gains (at both the whole-word and syllabic level) on the multisyllabic word-reading probe. Also worth considering is that the 'cut score' set for study inclusion may have permitted students whose word-reading skills were already relatively high, resulting in ceiling effects.

No consistent changes were evident in broad reading skills. As mentioned above, this is likely attributable to the fact that aReading is intended to measure student progress at three points throughout the academic year (Christ, 2014), whereas in the current study the measure was used at three points over an 11-week timespan. It is likely that aReading was not sensitive enough to detect gains in students' broad reading skills across such an abbreviated time frame. Another measure of broad reading skills, more sensitive to change over shorter timespans, may have been a better choice for this study.

The final limitation is that the researcher created the multisyllabic word probes. This was necessary because no standardized, repeatable multisyllabic word reading measures were available. Although the researcher strove to create a set of parallel probes by controlling several word-related variables (mean word length, mean word frequency, mean number of morphemes, and total number of syllables) across each probe, the researcher did not pilot the set of 22 probes prior to their use. It is possible that the measure was not sensitive enough to detect gains in multisyllabic word decoding skills or that the forms were not of equivalent difficulty. Creating parallel forms of reading measures can be extremely difficult, and the presence of non-parallel forms can contribute to unstable growth rates (van den Broek & Espin, 2012). An additional issue with the probes could have been that the words selected were not at an appropriate level for all students. Some children may have demonstrated ceiling effects (e.g., Lisa), while for other children (e.g., Iris) the difficulty of the words employed may have been too high to detect change between baseline and intervention. Although the researcher tried to account for this by analysing student performance on the multisyllabic word reading probes at both the whole-word and syllabic level, the fact remains that the measure as employed in the current study was not able to detect meaningful gains across students.

Implications

Although there is some evidence suggesting that reading interventions are not as effective for older students as they are for younger students (Flynn et al., 2012; Scammacca et al., 2007, 2015), the results of the current study offer some evidence to support the use of expository curricular passages to enhance upper-elementary students' reading fluency. There is little extant research on fluency interventions with expository text; reading research employing curricular texts typically focuses on reading comprehension, not fluency. In fact, reading fluency interventions do not typically incorporate curricular reading material (Coulter & Lambert, 2015), and some studies do not indicate gains in student reading fluency.
A study by Vadasy and Sanders (2008) found that the reading fluency rates of 70 fourth- and fifth-grade students randomly assigned to a reading intervention incorporating expository passages did not improve significantly compared to those of 70 students in a control group who did not receive the intervention. However, measures of comprehension were included, and students demonstrated gains in passage comprehension, word comprehension, and vocabulary (Vadasy & Sanders, 2008). Additionally, a recent study by Coulter and Lambert (2015) explored the effects of pre-teaching difficult content words found within social studies texts on the reading accuracy and fluency of three struggling upper-elementary readers. Results indicated that the pre-teaching strategy increased the accuracy, but only minimally increased the fluency, of students' expository passage reading. In contrast to the studies above, the current study found that the reading fluency of upper-elementary struggling readers did improve in response to an intervention employing curricular expository passages. Further, this study demonstrated that participants were able to access reading materials reflective of the general education curriculum while receiving a reading intervention. Thus, this study provides some support for the use of expository curricular passages as an accessible tool for interventionists to use in conjunction with evidence-based reading fluency strategies to increase the reading fluency of upper-elementary struggling readers.

These results are consistent with previous research (Hiebert, 2005; Martens et al., 2007) that also used curricular texts to enhance reading fluency. Hiebert (2005) found that using social studies and science texts for fluency-based instruction resulted in greater gains in student reading rate than using narrative basal readers. A final study, by Kostewicz and Kubina (2011), used a repeated reading fluency approach with science-based curricular passages to support students with disabilities who, according to their teachers, needed support reading science passages. Students made gains in oral reading fluency both within and between passages (Kostewicz & Kubina, 2011). Similarly, the current study demonstrated reading fluency gains both within and between science passages. Thus, the current study supports previous research indicating that combining reading fluency intervention with curricular text appears to be effective in promoting student access to grade-level, content-heavy curricular passages and can increase the reading skills of upper-elementary struggling readers.

The current study also sought to explore the effects of three flexibly applied think-aloud multisyllabic word reading strategies (phonetic left-to-right decoding, syllabification, and morphemic analysis). However, when implementing this study's intervention procedure, it quickly became apparent that left-to-right decoding was not frequently employed as an effective approach to decoding longer words. This study supports previous research highlighting the utility of structural analysis, including syllabification and morphemic analysis, for upper-elementary students' multisyllabic word decoding instruction (Bhattacharya & Ehri, 2004; Mann & Singson, 2003; Tighe & Binder, 2015). Interventionists supporting the decoding skills of upper-elementary struggling readers may therefore want to consider relying primarily on syllabification and morphemic analysis strategies.
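To make the two structural-analysis strategies concrete, the toy sketch below shows how a long word might be chunked by first peeling off familiar word parts and then splitting what remains around its vowels. This is purely illustrative and was not part of the study's intervention materials; the affix lists, function names, and example words are hypothetical.

```python
# Toy illustration of morphemic analysis (peel off familiar word parts)
# followed by rough, syllable-like chunking of the remaining base.
# The affix lists below are short, hypothetical examples only.
import re

PREFIXES = ["re", "un", "in", "dis", "pre"]
SUFFIXES = ["tional", "ation", "ing", "es", "al", "ed"]

def peel_affixes(word):
    """Strip one recognizable prefix and one suffix, if present."""
    prefix = next((p for p in PREFIXES
                   if word.startswith(p) and len(word) > len(p) + 2), "")
    rest = word[len(prefix):]
    suffix = next((s for s in SUFFIXES
                   if rest.endswith(s) and len(rest) > len(s) + 2), "")
    base = rest[:len(rest) - len(suffix)] if suffix else rest
    return prefix, base, suffix

def vowel_chunks(base):
    """Very rough syllable-like chunks: consonants grouped around vowel clusters."""
    return re.findall(r"[^aeiouy]*[aeiouy]+(?:[^aeiouy]+(?![aeiouy]))?", base) or [base]

def decode_plan(word):
    """Return the ordered chunks a reader might try for an unfamiliar long word."""
    prefix, base, suffix = peel_affixes(word.lower())
    return ([prefix] if prefix else []) + vowel_chunks(base) + ([suffix] if suffix else [])

print(decode_plan("resources"))      # ['re', 'sourc', 'es']; compare the think-aloud in Appendix B
print(decode_plan("informational"))  # ['in', 'for', 'ma', 'tional']
```

A usable tool would of course need a far richer affix inventory and rules for spelling changes at morpheme boundaries; the point of the sketch is simply to show why looking for meaningful parts before sounding out the remainder can make long words more manageable.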
A final implication of this research concerns the length of the intervention. Student gains in reading fluency were generally not observed on the DIBELS passages until the final weeks of the intervention; in this study, gains in reading fluency were observed more promptly on the instructional passages than on the DIBELS passages. This lag in generalized improvements in reading fluency on the DIBELS passages was not surprising given that studies have shown that gains in fluency on novel materials are not as strong as gains on trained materials (Therrien, 2004). Reading practice, or time spent reading, has a strong relationship to reading fluency (Torgesen, 2002). Although this intervention was relatively intense (twice weekly), its duration was perhaps not long enough to allow for ample practice and for reading fluency gains to emerge beyond the instructional passages. Thus, interventionists supporting students' development of oral reading fluency may want to consider the dimension of time and lengthen the duration of an intervention in order to maximize generalized gains.

Recommendations for Future Research

The following recommendations are based upon the findings and limitations of the current study. Although students' generalized gains in fluency were apparent on the instructional passages, gains did not consistently generalize to the DIBELS passages. Exploring ways to optimally promote generalized reading fluency gains in struggling upper-elementary readers beyond instructional passages merits further research, as does research into optimal reading fluency intervention lengths for upper-elementary struggling readers.

Generalized multisyllabic word reading accuracy gains were not evident on the DIBELS passages or on the word list probes. However, these findings do not diminish the importance of multisyllabic word reading skills. Multisyllabic words occur frequently in expository texts (Archer et al., 2003), and upper-elementary students need to be able to decode novel multisyllabic words in order to access the curriculum (Nagy & Anderson, 1984). Although the current study's limitations may help shed some light on why multisyllabic word decoding skills did not generalize, there are still many questions to answer regarding how to best support students in gaining these important skills in a manner that is effective and efficient. Thus, reading intervention studies that incorporate expository curricular passages and systematic, explicit multisyllabic word instruction offering multiple opportunities to practice and monitor the learned orthographic patterns may be a valuable avenue for future study. Further, how to do this in a manner that is ecologically valid within the constraints of the school system, and whether expository passages can be used to do so, merits further research.

Additionally, future studies could explore how to better measure comprehension gains when using expository curricular text. Although comprehension, which can be broadly defined as the ability to extract meaning from text (Lapp & Flood, 1983; Snow, 2002), has traditionally not been researched as extensively as phonological awareness and fluency (Alonzo, Basaraba, Tindal, & Carriveau, 2009), the assessment of comprehension is beginning to receive more research attention (van den Broek & Espin, 2012). Assessing comprehension is complex and can require different approaches (Alonzo et al., 2009).
Thus, studies investigating how to accurately gauge gains in the reading comprehension of expository passages when administering a reading intervention to upper-elementary students merit research.

Conclusion

In support of the instructional procedures employed, within-session gains in reading fluency and multisyllabic word reading accuracy were observed on the instructional passages across students during the intervention phase. Thus, students were better able to access the expository texts following each intervention session. Generalized gains in reading fluency were observed on the instructional passages, but were much less compelling on the DIBELS passages. There was some evidence of generalized gains in multisyllabic word decoding accuracy on the instructional passages; however, generalized gains in multisyllabic word reading accuracy were not evident on the DIBELS passages or on the word list probes.

The reading fluency intervention employing expository passages, implemented in conjunction with a flexibly applied multisyllabic word decoding component, appeared to result in direct and generalized gains in reading fluency on the instructional passages. The instructional procedures promoted within-session and some between-session gains in multisyllabic word decoding on the instructional passages, but multisyllabic word reading accuracy gains were not observed beyond the instructional passages. These results underscore the importance of reading fluency interventions for upper-elementary students that incorporate expository passages and raise questions about how to better promote generalized multisyllabic word reading accuracy in upper-elementary struggling readers. Despite some limitations of the current study, a comprehensive reading intervention employing expository text, such as the one profiled in this study, has the potential to enhance the reading skills of students in the upper-elementary years. The findings related to gains in reading fluency and decoding accuracy on expository passages merit replication, and future studies could also explore how to better enhance the generalization of multisyllabic word reading in upper-elementary struggling readers. It is hoped that this study, and similar future studies, will help educators support upper-elementary students in further developing the requisite skills to access the curriculum as passage- and word-level complexity increases throughout the upper-elementary grades and beyond.

References

Abbott, S. P., & Berninger, V. W. (1999). It's never too late to remediate: Teaching word recognition to students with reading disabilities in Grades 4-7. Annals of Dyslexia, 49, 223-250.
Adams, M. J. (1990). Beginning to read: Thinking and learning about print. Cambridge, MA: MIT Press.
Alonzo, J., Basaraba, D., Tindal, G., & Carriveau, R. S. (2009). They read, but how well do they understand? An empirical look at the nuances of measuring reading comprehension. Assessment for Effective Intervention, 35, 34-44. doi:10.1177/1534508408330082
Archer, A. L., Gleason, M. M., & Vachon, V. L. (2003). Decoding fluency: Foundation skills for struggling older readers. Learning Disabilities Quarterly, 26, 89-101.
Bhat, P., Griffin, C. C., & Sindelar, P. T. (2003). Phonological awareness instruction for middle school students with learning disabilities. Learning Disability Quarterly, 26, 73-87.
Bhattacharya, A., & Ehri, L. C. (2004). Graphosyllabic analysis helps adolescent struggling readers read and spell words. Journal of Learning Disabilities, 37, 331-348.
Barth, A. E., Catts, H. W., & Anthony, J. L. (2009). The component skills underlying reading fluency in adolescent readers: A latent variable analysis. Reading & Writing, 22, 567-590. doi:10.1007/s11145-008-9125-y
Begeny, J. C. (2009). Helping Early Literacy with Practice Strategies (HELPS): A one-on-one program designed to improve students' reading fluency. Raleigh, NC: The HELPS Education Fund. Retrieved July 15, 2015. http://www.helpsprogragram.org
Begeny, J. C., Krouse, H. E., Ross, S. G., & Mitchell, R. C. (2009). Increasing elementary-aged students' reading fluency with group-based interventions: A comparison of repeated reading, listening passage preview, and listening only strategies. Journal of Behavioral Education, 18, 211-228.
Betourne, L. S., & Friel-Patti, S. (2003). Phonological processing and oral language abilities in fourth-grade poor readers. Journal of Communication Disorders, 36, 507-527. doi:10.1016/S0021-9924(03)00035-2
Carlisle, J. F. (2000). Awareness of the structure of meaning of morphologically complex words: Impact on reading. Reading and Writing: An Interdisciplinary Journal, 12, 169-190.
Carlisle, J., & Stone, C. A. (2005). Exploring the role of morphemes in word reading. Reading Research Quarterly, 40, 428-449. doi:10.1598/RRQ.40.4.3
Cates, G. L., Burns, M. K., & Joseph, L. M. (2010). Introduction to the special issue: Instructional efficiency and the impact on learning and data-based decision making. Psychology in the Schools, 47(2), 111-113. doi:10.1002/pits.20456
Chard, D. J., Vaughn, S., & Tyler, B. J. (2002). A synthesis of research on effective interventions for building reading fluency with elementary students with learning disabilities. Journal of Learning Disabilities, 35, 386-406. doi:10.1177/00222194020350050101
Chall, J. S. (1983). Stages of reading development. New York: McGraw-Hill.
Christ, T. J. (2011). Adaptive Reading (aReading): Kindergarten to Fifth Grade Broad Measure of Broad Reading Achievement (computer adaptive test based on National Reading Panel). Minneapolis, MN: University of Minnesota. Retrieved July 19, 2016. http://www.fastforteachers.org
Christ, T. J. (2014). Formative Assessment System for Teachers: Technical Manual. Minneapolis, MN: University of Minnesota. Retrieved July 19, 2016. https://d2f8p5dd32sgdr.cloudfront.net/docs/2015-16FastBridgeNormsandBenchmarksAllMeasuresFINAL.pdf
Christ, T. J., Arañas, Y. A., Kember, J. M., Kiss, A. J., McCarthy-Trentman, A., Monaghen, B. D., Newell, K. W., Van Norman, E. R., & White, M. J. (2014). Formative Assessment System for Teachers Technical Manual: earlyReading, CBMReading, aReading, aMath, and earlyMath. Minneapolis, MN: Formative Assessment System for Teachers.
Compton, D. L., Fuchs, D., Fuchs, L. S., Elleman, A. M., & Gilbert, J. K. (2008). Tracking children who fly below the radar: Latent transition modeling of students with late-emerging reading disability. Learning and Individual Differences, 18, 329-337. doi:10.1016/j.lindif.2008.04.00
Consortium on Reading Excellence. (2008). Core phonics surveys. In B. Honig, L. Diamond, & R. Nathan (Eds.), Assessing reading: Multiple measures for kindergarten through eighth grade (2nd ed., pp. 63-80). Novato, CA: Arena.
Coulter, G. A., & Lambert, M. C. (2015). Access to general education curriculum: The effect of preteaching key words upon fluency and accuracy in expository text. Learning Disability Quarterly, 38, 248-256. doi:10.1177/0731948715580438
Daly, E. J., III, Chafouleas, S., & Skinner, C. H. (2005). Interventions for reading problems: Designing and evaluating effective strategies. New York, NY: The Guilford Press.
Diliberto, J. A., Beattie, J. R., Flowers, C. P., & Algozzine, R. F. (2008). Effects of teaching syllable skills instruction on reading achievement in struggling middle school readers. Literacy Research and Instruction, 48, 14-27. doi:10.1080/19388070802226253
Edmonds, M. S., Vaughn, S. E., Wexler, J., Reutebuch, C., Cable, A., Klingler Tackett, K., & Wick Schankenger, J. (2009). A synthesis of reading interventions and effects on reading comprehension outcomes for older struggling readers. Review of Educational Research, 79, 262-300. doi:10.3102/0034654308325998
Ehri, L. C. (1995). Phases of development in learning to read by sight. Journal of Research in Reading, 18, 116-125.
Ehri, L. C. (2003). Systematic phonics instruction: Findings of the national reading panel. Paper presented at the invitational seminar organized by the Standards and Effectiveness Unit, Department for Education and Skills, British Government (London, England, March 17, 2003).
Erchul, W. P., & Martens, B. K. (2010). School consultation: Conceptual and empirical bases of practice (3rd ed.). New York: Springer.
Flynn, L. J., Zheng, X., & Swanson, H. L. (2012). Instructing struggling older readers: A selective meta-analysis of intervention research. Learning Disabilities Research & Practice, 27, 21-32.
Fuchs, L. S., Fuchs, D., Hosp, M. K., & Jenkins, J. R. (2001). Oral reading fluency as an indicator of reading competence: A theoretical, empirical, and historical analysis. Scientific Studies of Reading, 5, 239-256.
Gast, D. L., & Ledford, J. R. (Eds.). (2014). Single case research methodology: Applications in special education and behavioural sciences (2nd ed.). New York: Routledge.
Gilbert, J. K., Goodwin, A. P., Compton, D. L., & Kearns, D. M. (2014). Multisyllabic word reading as a moderator of morphological awareness and reading comprehension. Journal of Learning Disabilities, 47, 34-43. doi:10.1177/0022219413509966
Good, R. H., & Kaminski, R. A. (2011). DIBELS Next. Longmont, CO: Sopris Learning.
Good, R. H., Kaminski, R. A., Dewey, E. N., Walling, J., Powell-Smith, K. A., & Latimer, R. J. (2013). DIBELS Next technical manual. Eugene, OR: Dynamic Measurement Group.
Goodwin, A. P., Gilbert, J. K., & Cho, S-J. (2013). Morphological contributions to adolescent word reading: An item response approach. Reading Research Quarterly, 48(1), 39-60. doi:10.1002/rrq.037
Gough, P. B., & Tunmer, W. E. (1986). Decoding, reading and reading disability. Remedial and Special Education, 7(1), 6-10. doi:10.1177/074193258600700104
Guthrie, J. T., Coddington, C. S., Klauda, S. L., Wigfield, A., & Barbosa, P. (2009). Impacts of comprehensive reading instruction on diverse outcomes of low- and high-achieving readers. Journal of Learning Disabilities, 42, 195-214.
Heward, W. L. (1978, May). The delayed multiple baseline design. Paper presented at the Fourth Annual Convention of the Association for Behavior Analysis, Chicago.
Hintze, J. M., Callahan, J. E., Matthews, W. J., Williams, S. A. S., & Tobin, K. G. (2002). Oral reading fluency and prediction of reading comprehension in African American and Caucasian elementary school children. School Psychology Review, 31, 540-553.
Hoover, W. A., & Gough, P. B. (1990). The simple view of reading. Reading and Writing, 2(2), 127-160. doi:10.1007/BF00401799
Horner, R. H., Carr, E. G., Halle, J., McGee, G., Odom, S., & Wolery, M. (2005). The use of single subject research to identify evidence-based practice in special education. Exceptional Children, 71, 165-179.
Jenkins, J. R., Fuchs, L. S., van den Broek, P., Espin, C., & Deno, S. L. (2003). Accuracy and fluency in list and context reading of skilled and reading disabled groups: Absolute and relative performance levels. Learning Disabilities Research and Practice, 18, 237-245.
Joseph, L. M., & Schisler, R. A. (2007). Getting the most 'bang for your buck': Comparison of the effectiveness and efficiency of phonic and whole word reading techniques during repeated reading lessons. Journal of Applied School Psychology, 24(1), 69-90. doi:10.1300/J370v24n01.04
Just, M. A., & Carpenter, P. A. (1987). The psychology of reading and language. Boston: Allyn and Bacon.
Kintsch, W., & Kintsch, E. (2004). Comprehension. In S. G. Paris & S. A. Stahl (Eds.), Children's reading comprehension and assessment (pp. 71-92). Mahwah, NJ: Lawrence Erlbaum.
Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2010). Single-case designs technical documentation. Retrieved September 29, 2015. https://ies.ed.gov/ncee/wwc/Document/229
Kratochwill, T. R., Hitchcock, J., Horner, R. H., Levin, J. R., Odom, S. L., Rindskopf, D. M., & Shadish, W. R. (2013). Single-case intervention research design standards. Remedial and Special Education, 34, 26-38. doi:10.1177/0741932512452794
LaForte, E. M., McGrew, K. S., & Schrank, F. A. (2014). WJ IV Technical Abstract (Woodcock-Johnson IV Assessment Service Bulletin No. 2). Rolling Meadows, IL: Riverside.
LaBerge, D., & Samuels, S. (1974). Toward a theory of automatic information processing in reading. Cognitive Psychology, 6, 293-323.
Language and Reading Research Consortium (2015). Learning to read: Should we keep things simple? Reading Research Quarterly, 50, 141-169. doi:10.1002/rrq.99
Lapp, D., & Flood, J. (1983). Teaching reading to every child. New York: Macmillan.
Leach, J. M., Scarborough, H. S., & Rescorla, L. (2003). Late-emerging reading disabilities. Journal of Educational Psychology, 95, 211-224. doi:10.1037/0022-0663.95.2.211
Lemons, C. J., Fuchs, D., Gilbert, J. K., & Fuchs, L. S. (2014). Evidence-based practices in a changing world: Reconsidering the counterfactual in education research. Educational Researcher. doi:10.3102/0013189x14539189
Lovett, M. W., Barron, S. L., & Frijters, J. C. (2013). Word identification difficulties in children and adolescents with reading disabilities: Intervention research findings. In H. L. Swanson, K. R. Harris, & S. Graham (Eds.), Handbook of learning disabilities (2nd ed., pp. 329-360). New York: Guilford.
Lovett, M. W., Borden, S. L., DeLuca, T., Lacerenza, L., Benson, N. J., & Brackstone, D. (1994). Treating the core deficits of developmental dyslexia: Evidence of transfer of learning after phonologically- and strategy-based reading training programs. Developmental Psychology, 30, 805-822.
Lovett, M. W., Lacerenza, L., De Palma, M., & Frijters, J. C. (2012). Evaluating the efficacy of remediation for struggling readers in high school. Journal of Learning Disabilities, 45(2), 151-169. doi:10.1177/0022219410371678
Lovett, M. W., Warren-Chaplin, P. M., Ransby, M. J., & Borden, S. L. (1990). Training the word recognition skills of reading disabled children: Treatment and transfer effects. Journal of Educational Psychology, 82, 769-780.
Mann, V., & Singson, M. (2003). Linking morphological knowledge to English decoding ability: Large effects of little suffixes. In E. M. H. Assink & D. Sandra (Eds.), Reading complex words: Cross-language studies (pp. 1-25). New York: Kluwer Academic.
Martens, B. K., Eckert, T. L., Begeny, J. C., Lewandowski, L. J., DiGennaro, F. D., Montarello, S. A., . . . Fiese, B. H. (2007). Effects of a fluency-building program on the reading performance of low-achieving second and third grade students. Journal of Behavioral Education, 16, 39-54. doi:10.1007/s10864-006-9022-x
Meyer, M. S., & Felton, R. H. (1999). Repeated reading to enhance fluency: Old approaches and new directions. Annals of Dyslexia, 49, 283-306.
Morgan, P. L., & Sideridis, G. D. (2006). Contrasting the effectiveness of fluency interventions for students with or at risk for learning disabilities: A multilevel random coefficient modeling meta-analysis. Learning Disabilities Research & Practice, 21, 191-210.
Nagy, W., & Anderson, R. C. (1984). How many words are there in printed school English? Reading Research Quarterly, 19, 304-330.
Nation, K., Clarke, P., Marshall, C.-M., & Durand, M. (2004). Hidden language impairments in children: Parallels between poor reading comprehension and specific language impairment? Journal of Speech, Language, and Hearing Research, 47, 199-211. doi:10.1044/1092-4388(2004/017)
National Center for Response to Intervention (2010). Screening Tools Chart. U.S. Office of Special Education Programs. Retrieved August 2, 2015. http://www.rti4success.org/sites/default/files/Screening%20Tools%20Chart.pdf
National Governors Association Center for Best Practices, Council of Chief State School Officers. (2010). Common Core Standards. Retrieved May 14, 2017. http://www.corestandards.org
National Reading Panel (2000). Teaching children to read: An evidence-based assessment of the scientific research literature on reading and its implications for reading instruction: Report of the subgroups. Bethesda, MD: National Institute of Child Health and Human Development. Retrieved June 25, 2015. http://www.nichd.nih.gve/publications/pubs/nrp/pages/smallbook.aspx
OECD (2013). OECD Skills Outlook 2013: First results from the Survey of Adult Skills. OECD Publishing. Retrieved August 2, 2015. http://dx.doi.org/10.1787/9789264204256-en
Park, Y., Benedict, A. E., & Brownell, M. T. (2014). Construct and predictive validity of the CORE Phonics Survey: A diagnostic assessment for students with specific learning disabilities. Exceptionality, 22, 33-50. doi:10.1080/09362835.2013.865534
Penney, C. G. (2002). Teaching decoding skills to poor readers in high school. Journal of Literacy Research, 34, 99-118.
Perfetti, C. A. (1985). Reading ability. New York: Oxford University Press.
Perfetti, C. A. (2007). Reading ability: Lexical quality to comprehension. Scientific Studies of Reading, 11, 357-383.
Perfetti, C. A., & Hart, L. (2002). The lexical quality hypothesis. In L. Verhoeven, C. Elbro, & P. Reitsma (Eds.), Precursors of functional literacy (pp. 189-213). Amsterdam, the Netherlands: John Benjamins.
Pikulski, J. J., & Chard, D. J. (2005). Fluency: Bridge between decoding and reading comprehension. The Reading Teacher, 58, 510-519.
Polk, A. L., & Miller, A. D. (1994). Repeated readings and precision teaching: Increasing reading fluency and comprehension in sixth through twelfth-grade boys with emotional disabilities. Journal of Precision Teaching, 12, 46-66.
Programme for the International Assessment of Adult Competencies (2014). Adults with inadequate literacy skills. Retrieved August 5, 2017. http://www.conferenceboard.ca/hcp/provincial/education/adlt-lowlit.aspx
Rayner, K., Foorman, B. R., Perfetti, C. A., Pesetsky, D., & Seidenberg, M. S. (2001). How psychological science informs the teaching of reading. Psychological Science in the Public Interest, 2(2), 31-74.
Rasinski, T. V. (1990). Effects of repeated reading and listening-while-reading on reading fluency. Journal of Educational Research, 83, 147-150.
Ricketts, J., Sperring, R., & Nation, K. (2014). Educational attainment in poor comprehenders. Frontiers in Psychology, 5, 1-11. doi:10.3389/fpsyg.2014.00445
Reutzel, D. L., Brandt, L., Fawson, P. C., & Jones, C. D. (2014). Exploration of the consortium on reading excellence phonics survey: An instrument for assessing primary-grade students' phonics knowledge. The Elementary School Journal, 115, 49-72.
Samuels, S. J. (1979). The method of repeated readings. The Reading Teacher, 32, 403-408.
Saunders, K. J. (2007). Word-attack skills in individuals with mental retardation. Mental Retardation and Developmental Disabilities Research Reviews, 13, 78-84. doi:10.1002/mrdd.20137
Scammacca, N., Roberts, G., Vaughn, S., Edmonds, M., Wexler, J., Klein Reutebuch, C., & Torgesen, J. K. (2007). Interventions for adolescent struggling readers: A meta-analysis with implications for practice. Portsmouth, NH: RMC Research Corporation, Center on Instruction.
Scammacca, N. K., Roberts, G., Vaughn, S., & Stuebing, K. K. (2015). A meta-analysis of interventions for struggling readers in grades 4-12: 1980-2011. Journal of Learning Disabilities, 48(4), 369-390. doi:10.1177/0022219413504995
Seymour, P. H. K., Aro, M., & Erskine, J. M. (2003). Foundation literacy acquisition in European orthographies. British Journal of Psychology, 94(2), 143-174.
Shanker, J. L., & Ekwall, E. E. (1998). Locating and correcting reading difficulties (7th ed.). Columbus, OH: Prentice-Hall.
Shapiro, E. S. (2004). Academic skills problems: Direct assessment and intervention (3rd ed.). New York: Guilford Press.
Shinn, M. R., & Good, R. H. (1992). Curriculum-based measurement of oral reading fluency: A confirmatory analysis of its relation to reading. School Psychology Review, 21, 459-479.
Schrank, F., McGrew, K., & Mather, N. (2014). The Woodcock-Johnson IV Tests of Cognitive Abilities. Rolling Meadows, IL: Riverside.
Sindelar, P. T., Monda, L. E., & O'Shea, L. J. (1990). Effects of repeated readings on instructional- and mastery-level readers. Journal of Educational Research, 83, 220-226.
Snow, C. E. (2002). Reading for understanding: Toward an R&D program in reading comprehension. Santa Monica, CA: RAND.
Stokes, T. F., & Baer, D. M. (1977). An implicit technology of generalization. Journal of Applied Behavior Analysis, 10(2), 349-367.
Taylor, B., Harris, L. A., Pearson, P. D., & Garcia, G. (1995). Reading difficulties: Instruction and assessment (2nd ed.). New York: McGraw-Hill.
Therrien, W. J. (2004). Fluency and comprehension gains as a result of repeated reading: A meta-analysis. Remedial and Special Education, 25(4), 252-261. doi:10.1177/07419325040250040801
Tighe, E. L., & Binder, K. S. (2015). An investigation of morphological awareness and processing in adults with low literacy. Applied Psycholinguistics, 36, 245-273. doi:10.1017/S0142716413000222
Torgesen, J. K. (2002). The prevention of reading difficulties. Journal of School Psychology, 40(1), 7-26.
Vadasy, P. F., & Sanders, E. A. (2008). Benefits of repeated reading intervention for low-achieving fourth- and fifth-grade students. Remedial and Special Education, 29, 235-249. doi:10.1177/0741932507312013
Van den Broek, P., & Espin, C. A. (2012). Connecting cognitive theory and assessment: Measuring individual differences in reading comprehension. School Psychology Review, 41, 315-325.
Wanzek, J., Wexler, J., Vaughn, S., & Ciullo, S. (2010). Reading interventions for struggling readers in the upper elementary grades: A synthesis of 20 years of research. Reading & Writing, 23, 889-912. doi:10.1007/s11145-009-9199-5
Wanzek, J., Vaughn, S., Scammacca, N. K., Metz, K., Murray, C. S., Roberts, G., & Danielson, L. (2013). Extensive reading interventions for students with reading difficulties after grade 3. Review of Educational Research, 83, 163-195. doi:10.3102/0034654313477212
Wanzek, J., & Vaughn, S. (2007). Research-based implications from extensive early reading interventions. School Psychology Review, 36, 541-561.
Wexler, J., Vaughn, S., Edmonds, M., & Klein Reutebuch, C. (2008). A synthesis of fluency interventions for secondary struggling readers. Reading & Writing, 21, 317-347. doi:10.1007/s11145-007-9085-7
Witt, J. C., & Elliott, S. N. (1985). Acceptability of classroom management strategies. In T. R. Kratochwill (Ed.), Advances in school psychology (Vol. 4, pp. 251-288). Hillsdale, NJ: Erlbaum.
Wright, M., & Mullan, F. (2006). Dyslexia and the phono-graphix reading programme. Dyslexia and Reading, 21, 77-84.

Appendices

Appendix A

INTRODUCTION OF THE CUE CARD

1. Interventionist says: "Sometimes when I am reading, I get to a word I don't know. When that happens, I like to use a couple of tricks to help me. Sometimes I find it helpful to use one trick; other times, when I find the word really tricky, I use more than one of the tricks. I've written the tricks on a card for you in case you find it useful when you are reading a word you don't know. Let's look at the 3 tricks that I like to use." (Show the card)

3 Tricks
Sound It Out
Break a long word into smaller chunks
Look for parts that are meaningful

2. Interventionist says: (Point to 'Sound It Out') "For sounding it out, you look at the sounds of a word. You start at the beginning of the word and keep sounding them out in order until you get to the end of the word. For longer words you can also use this for parts of words you don't know."

3. (Point to 'Break') "Sometimes when I get to a longer word it is helpful to break the word up into chunks. One way to do this is to look at the vowels and the letters right next to and around the vowels and see if I can read those chunks. I then join the chunks together to read the word and see if it makes sense. When I use this trick, I often find it easier to read longer tricky words."

4. (Point to 'Look') "Other times I might not know the whole long word, but I do recognize parts of words inside the longer words. Sometimes these meaningful chunks are at the beginning of words and sometimes they are at the end. Other times I find them in the middle. I like to look at longer unknown words to check and see if I recognize any meaningful parts before I try to read an unknown word. I use this trick to help me a lot."

5. Interventionist says: (Points to all three strategies simultaneously) "Remember, sometimes using just one trick is enough. Other times you might need to use two or maybe even all three.
I'll leave this card on the table to remind us to use our reading tricks."

Appendix B

MULTISYLLABIC WORD DECODING

Example of how to model the decoding of an unknown multisyllabic word:

Unknown word: resources

Hmmm... okay... I need my card for this. First I'm going to look for parts I recognize. I see that the word starts with <ree> and it ends with <es>. I've seen both of those chunks in other words. Hey, I also see the word <source> in the middle. Now I just need to blend the whole word together. <ree-source-es>. <resources>. That word makes sense in the sentence too.

Appendix C

INSTRUCTIONAL SEQUENCE AND SCRIPT

(*Script adapted from the HELPS protocol, Begeny, 2009)

INTRODUCTION OF THE CUE CARD: Follow the procedure above.

VERBAL CUE*
1. Interventionist says: "<Student Name>, today we're going to do some reading. While we're reading I want you to try and do your best reading. So, try to read smoothly and as quickly as you can without making mistakes."

2. Place the student copy of the Science Probe expository passage in front of the participant. Keep the examiner copy (with the highlighted multisyllabic words) out of the student's view (behind a clipboard).

FIRST PASSAGE READING*
3. Interventionist points to the first word of the passage and says: "When I say 'Begin' I would like you to start reading aloud. Try to read each word. If you come to a word you don't know, try to use the tricks we talked about. If you still can't read the word I'll tell it to you. Do you have any questions? Be sure to do your BEST reading. Begin."

4. Begin timing as soon as the student reads the first word of the passage. The participant reads aloud for one minute. When the participant encounters a multisyllabic word they don't know, underline it for the error correction/modeling procedure. If the participant is not able to read the word after three seconds, or is not working on decoding the word, supply the word to them.

MODELING OF THE MULTISYLLABIC DECODING STRATEGIES
5. The interventionist will now model the fluent reading of the passage. Upon encountering one (and up to five) of the participant's specific multisyllabic word reading errors in the passage, the interventionist will model the most appropriate multisyllabic word decoding strategy, or combination of strategies, via a 'think aloud'.

Interventionist says: "Now I'm going to read this to you. Please follow along with your finger, reading the words to yourself as I read them. When I get to a word I don't know, I'm going to use the word reading tricks to help me. I want you to pay attention to how I am figuring out the tricky longer words."

6. Interventionist then reads the passage aloud and models the decoding strategies as appropriate. Read aloud until the entire passage has been read.

REPEATED READING*
7. The participant is then asked to read the passage another time using abbreviated instructions. Again, the student will be timed for one minute. Remember to underline the multisyllabic words that the student reads incorrectly.

Interventionist says: "Now it is your turn to read the story another time. Please do your best reading again. Begin."

SECOND MODELING OF THE MULTISYLLABIC DECODING STRATEGIES
8. Before beginning the reading, briefly check which multisyllabic words are underlined twice. These are the words to focus on during this round of modeling.

Interventionist says: "Now it is my turn to read again.
I'm going to try and use my reading tricks again to help me. Again, be sure to follow along as I'm reading and read the words quietly to yourself along with me."

9. Interventionist then reads the passage aloud and models the decoding strategies as appropriate. Read aloud until the entire passage has been read.

FINAL REPEATED READING*
10. The participant is then asked to read the passage one final time using abbreviated instructions. Again, the student will be timed for one minute. Remember to underline the multisyllabic words that the student reads incorrectly.

Interventionist says: "Now it is your turn to read the story one last time. Please do your best reading again. Begin."

11. Record the WCPM for the participant's first and third readings, use the student graph to plot performance, and provide brief verbal praise.
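As a companion to step 11, the minimal sketch below shows one straightforward way to compute WCPM from a one-minute timed reading and the within-session gain between the first and third readings. The 60-second timing follows the script above; the numbers and variable names are hypothetical and purely illustrative.

```python
# Minimal sketch: WCPM for a one-minute timed reading and the
# within-session gain between the first and third readings.
def wcpm(words_attempted, errors, seconds=60):
    """Words correct per minute, scaled if the timing was not exactly 60 s."""
    words_correct = max(words_attempted - errors, 0)
    return round(words_correct * 60 / seconds, 1)

# Hypothetical session data for one student (illustrative numbers only).
first_read = wcpm(words_attempted=92, errors=8)    # 84.0 WCPM
third_read = wcpm(words_attempted=113, errors=3)   # 110.0 WCPM
within_session_gain = third_read - first_read      # 26.0 WCPM

print(f"1st read: {first_read} WCPM, 3rd read: {third_read} WCPM, gain: {within_session_gain}")
```

Plotting the first- and third-read WCPM values on the student's graph, as step 11 describes, can then be done by hand or with any charting tool.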

Cite

Citation Scheme:

        

Citations by CSL (citeproc-js)

Usage Statistics

Share

Embed

Customize your widget with the following options, then copy and paste the code below into the HTML of your page to embed this item in your website.
                        
                            <div id="ubcOpenCollectionsWidgetDisplay">
                            <script id="ubcOpenCollectionsWidget"
                            src="{[{embed.src}]}"
                            data-item="{[{embed.item}]}"
                            data-collection="{[{embed.collection}]}"
                            data-metadata="{[{embed.showMetadata}]}"
                            data-width="{[{embed.width}]}"
                            async >
                            </script>
                            </div>
                        
                    
IIIF logo Our image viewer uses the IIIF 2.0 standard. To load this item in other compatible viewers, use this url:
http://iiif.library.ubc.ca/presentation/dsp.24.1-0354258/manifest

Comment

Related Items