UBC Theses and Dissertations


Impact of word prediction & symbol-supported writing software on written output of students with Down syndrome. McCartney, Joanne, 2008


IMPACT OF WORD PREDICTION & SYMBOL-SUPPORTED WRITING SOFTWARE ON WRITTEN OUTPUT OF STUDENTS WITH DOWN SYNDROME

by

JOANNE McCARTNEY

B.A., University of British Columbia, 2001

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE STUDIES (Special Education)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

October 2008

© Joanne McCartney, 2008

ABSTRACT

This study examined the effectiveness of two types of assistive technology for writing instruction of students with Down syndrome in British Columbia. Students received either Clicker 5, a symbol-supported writing software program; or Co:Writer, a word prediction software program designed to support written output. Data collection was conducted between January-June 2007 (Year 1) and October-May 2008 (Year 2). Clicker 5 was provided to 43 students in Year 1 (17 of whom also participated in the study during Year 2) and was designed to support early and emergent literacy development. Co:Writer was provided to 18 students in Year 1 (2 of whom also participated in the study during Year 2) and was designed to support text writing. Each month during both school years, teachers were asked to complete an on-line survey with questions related to their impressions of the impact of the technology and other variables. Students in the Clicker group produced 10-minute monthly writing samples about a selected topic using a Clicker grid designed by the research team. Students in the Co:Writer group produced one handwritten and one Co:Writer-supported 10-minute writing sample every month about the specified topic. Data were analyzed with regard to writing rate, spelling accuracy (Co:Writer group only), and quality (measured both analytically and holistically). Results for dependent measures of writing for the Clicker group were variable but provided some support for the use of symbol-supported writing software for producing meaningful written output.
The Co:Writer group was more accurate with regard to spelling and grammar while using Co:Writer compared to handwriting. The results are discussed in terms of the practical implications, limitations, and areas for future research.

TABLE OF CONTENTS

ABSTRACT
TABLE OF CONTENTS
LIST OF TABLES
LIST OF FIGURES
ACKNOWLEDGEMENTS
CHAPTER 1: REVIEW OF LITERATURE
  Down Syndrome
  Importance of Literacy
  Barriers to Literacy Instruction
  Evolution of Literacy Instruction for Students with Down Syndrome
  Literacy and Assistive Technology
  Statement of the Problem
CHAPTER 2: METHOD
  Recruitment and Informed Consent
  Participants
  Technology Training
  Setting
  Activities and Materials
  Dependent Variables
  Measures
  Research Assistant Training
  Data Analysis
CHAPTER 3: RESULTS
  Clicker 5
  Co:Writer
  Summary
CHAPTER 4: DISCUSSION
  Clicker 5
  Co:Writer
  Implications
  Limitations and Future Research
  Conclusion
REFERENCES
APPENDICES

LIST OF TABLES

Table 1: Holistic quality rating scales (adapted from Needels & Knapp, 1994)
Table 2: Mean inter-rater reliability
Table 3: Number of hours per day of student Clicker 5 use at school
Table 4: Literacy components in common use across Clicker 5 students
Table 5: Number of minutes per day support staff spent programming Clicker 5
Table 6: Percentage of support staff reporting barriers in implementing Clicker 5
Table 7: Correlations between Clicker 5 dependent variables
Table 8: Number of hours per day of student Co:Writer use at school
Table 9: Literacy components in common use across Co:Writer students
Table 10: Number of minutes per day support staff spent programming Co:Writer
Table 11: Percentage of support staff reporting barriers in implementing Co:Writer
Table 12: Correlations between Co:Writer dependent variables
Table 13: Summary of major findings

LIST OF FIGURES

Figure 1: Geographic distribution of students in the Clicker 5 group
Figure 2: Geographic distribution of students in the Co:Writer group
Figure 3: Clicker 5: Mean number of words produced per 10 minutes
Figure 4: Clicker 5: Percent of correct word pairs
Figure 5: Clicker 5: Average scores for conventions and mechanics
Figure 6: Clicker 5: Average scores for meaning
Figure 7: Clicker 5: Reported effect on participation in academic activities
Figure 8: Clicker 5: Reported effect on communication with peers
Figure 9: Clicker 5: Reported effect on participation in social activities
Figure 10: Co:Writer: Mean number of words per 10 minutes
Figure 11: Co:Writer: Percent of correctly spelled words per 10 minutes
Figure 12: Co:Writer: Percent of correct word pairs per 10 minutes
Figure 13: Co:Writer: Average scores for conventions and mechanics
Figure 14: Co:Writer: Average scores for meaning
Figure 15: Co:Writer: Total number of words for Student 1
Figure 16: Co:Writer: Total number of words for Student 2
Figure 17: Co:Writer: Percent of correctly spelled words for Student 1
Figure 18: Co:Writer: Percent of correctly spelled words for Student 2
Figure 19: Co:Writer: Percent of correct word pairs for Student 1
Figure 20: Co:Writer: Percent of correct word pairs for Student 2
Figure 21: Co:Writer: Conventions and mechanics scores for Student 1
Figure 22: Co:Writer: Conventions and mechanics scores for Student 2
Figure 23: Co:Writer: Meaning scores for Student 1
Figure 24: Co:Writer: Meaning scores for Student 2
Figure 25: Co:Writer: Reported effect on participation in academic activities
Figure 26: Co:Writer: Reported effect on communication with peers
Figure 27: Co:Writer: Reported effect on participation in social activities

ACKNOWLEDGEMENTS

I would like to extend sincere thanks to Dr. Pat Mirenda for her continued guidance and encouragement, and to my thesis committee for their advice and support. Thank you to SET-BC for providing me with this project, and to Daphne Mercier for putting so much time and effort into TAP. Thank you to all of the students and school staff who participated in the project, and to Katie, Emily, and Nina for their hard work and dedication. Thank you to Karen for statistical advice, and to the rest of the lab for providing moral support. Special thanks to Andy for inspiring me to work hard, and for his constant support and belief in me.

CHAPTER 1

Review of the Literature

Students with Down syndrome often encounter challenges in producing written output. Computer programs such as word prediction software and symbol-supported writing software are gaining popularity as tools that can be used to enhance the written output of students with disabilities. Clicker 5 (Crick Software, Inc.) is an example of a symbol-supported writing software program that pairs symbols with words in order to promote whole word recognition. Students can write in Clicker 5 using a regular keyboard, or by clicking on letters, words, and/or phrases in a grid containing cells with these elements. Co:Writer (Don Johnston, Inc.)
is a word prediction software program that enables students to type the first letter of a word in their word processor, and then select the desired word from a numbered list by typing in the corresponding number. The current study examined the impact of these two software programs on the written output of students with Down syndrome.

Down Syndrome

Approximately one in every 700-800 births is affected by Down syndrome, a congenital condition (BC Ministry of Education, 2007). Although each individual with Down syndrome is unique, numerous physical and cognitive characteristics are commonly associated with the syndrome, which is caused by the presence of an extra copy of chromosome 21. These characteristics include mild to severe intellectual disability; slow progression through the stages of development; and difficulty mastering grammar, expressive language, and articulation (Dykens, Hodapp, & Finucane, 2000). Down syndrome is also associated with low muscle tone, which contributes to articulation and fine motor difficulties for many individuals. In addition, Down syndrome is often associated with poor motor control, planning, and memory, which often result in difficulty with writing (Oelwein, 1995). Although vision problems are common, areas of relative strength among individuals in this population typically include visual memory and visual-motor skills (Dykens et al., 2000). Despite this strength in visual processing, for many years students with Down syndrome were not provided with literacy instruction because they were thought to be unable to acquire reading and writing skills. However, literacy is becoming an important part of these students' education (Trenholm & Mirenda, 2006).

Importance of Literacy

The ability to read and write is an important skill for everyday life. Literacy skills allow individuals to participate in the world around them by supporting communication and access to information.
The written word pervades our lives—from the labels on food products, to street signs, to e-mail, to bills, to advertisements in newspapers, to textbooks, and even to television, the written word enables us to learn about and communicate with the world around us. Written notes can be used as memory aids, can help focus one's attention, can document past and current events, and can enhance comprehension of a message. Koppenhaver (2000) suggested that "if communication is the essence of human life, then literacy is the essence of a more involved and connected life" (p. 270). In addition, academic ability, which is often a determining factor in educators' perceptions of successful school inclusion, is often judged on the basis of one's literacy and numeracy skills (Kemp & Carter, 2006). Often, as students progress through their school years, the ability to read and write becomes increasingly important for placement decisions in regular vs. special education settings. While academic success is not the only determining factor for inclusive placements, poor academic performance can limit a student's ability to participate in general education classrooms. Additionally, Kemp and Carter (2006) argued that teachers are more likely to use their perception of students' academic ability to judge the success of an integrated placement than they are to use actual performance measures. In order to promote successful academic and social outcomes, literacy should be among the skills taught to all students, both with and without disabilities (Kaderavek & Rabidoux, 2004).

For many years, individuals with Down syndrome were thought to be incapable of becoming literate; however, there now is an increasing awareness of literacy issues for these individuals (Trenholm & Mirenda, 2006).
Researchers have discovered that students with Down syndrome can learn to read (Boudreau, 2002; Byrne, MacDonald, & Buckley, 2002), and that many of these individuals begin to acquire literacy quite early, before entering school (Buckley, 1995). For example, in a recent Canadian survey of the parents of individuals with Down syndrome (N = 224), 47% of parents whose children were between 19-41 years of age reported that their children were able to read at a grade 3-4 level, and an additional 25% reported reading levels between grades 5-8 (Trenholm & Mirenda, 2006). This suggests that the majority of people with Down syndrome can achieve at least some level of literacy skills, if provided with the supports to do so.

Barriers to Literacy Instruction

From an early age, most children are exposed to thousands of literacy activities that allow them to develop the skills necessary for reading and writing. However, many students with disabilities are not given the same literacy opportunities as their typically developing peers. This is often because parents and/or educators have low expectations of their ability to benefit from literacy instruction (Downing, 2006) and/or because educators lack the knowledge and experience required to adapt or modify a literacy curriculum to fit these students' needs (Sturm et al., 2006). Beukelman and Mirenda's (2005) Participation Model described two types of barriers that can be applied to participation in literacy learning: opportunity barriers and access barriers. Opportunity barriers, which are related to policy, practice, knowledge, skill, and attitude, often limit opportunities for literacy learning or instruction.
For example, segregated schools or classrooms for students with disabilities are often devoid of the wide array of literacy activities that are so prominent in typical classrooms, because teachers either assume that students with disabilities cannot learn to read or give priority to teaching "functional" skills in other areas (Kliewer, Biklen, & Kasa-Hendrickson, 2006). Kliewer et al. noted that, in such situations, "disability becomes an idea that precludes the possibility of human development, including, importantly, the development of a literate presence" (p. 175). Furthermore, children with disabilities are often provided with less access to reading and writing activities at home than children without disabilities (Koppenhaver, Hendrix, & Williams, 2007). Trenholm and Mirenda (2006) found that, although 70% of all parents surveyed reported that their children with Down syndrome displayed moderate to high levels of interest in reading and writing, less than 25% of parents rated literacy activities as high priority goals for their children. In addition, more than half of the respondents reported a narrow range of reading materials in the home, consisting primarily of storybooks, picture books, and computers. Although more than half of the children were reported to use these materials on a regular basis (i.e., many times per day), 68% of parents stated that they spent less than 15 minutes per day discussing reading activities with their child with Down syndrome. Despite such barriers to learning reading and writing, many individuals with disabilities, including those with Down syndrome, have demonstrated the ability to develop literacy skills when provided with the opportunity to do so (Kliewer et al., 2006; Oelwein, 1995).

Evolution of Literacy Instruction for Students with Down Syndrome

Over the years, several different approaches have been adopted for teaching literacy to students with Down syndrome.
The most notable of these are sight word instruction, phonological awareness training, and balanced literacy, which are discussed below.

Sight Word Instruction

For many years, literacy opportunities have been provided to many students with disabilities in the form of sight word instruction, which is considered one of the essential building blocks for acquiring functional academic skills (Browder & Xin, 1998). Fossett and Mirenda (2006) argued that sight word instruction "holds promise for individuals without functional speech, who are often excluded from literacy instruction because of their inability to verbalize words in order to demonstrate reading comprehension" (p. 413). Sight words are often targeted for individuals with Down syndrome because many individuals in this population demonstrate strong visual processing skills (Buckley, 1995; Trenholm & Mirenda, 2006). However, Browder and Xin (1998) noted that common limitations of sight word research are a lack of comprehension and ability to generalize to daily routines, which calls into question the validity of focusing on this strategy for literacy instruction. Teaching literacy involves much more than mastering a discrete set of skills like sight word recognition (Downing, 2005). Sight word learning alone limits an individual to a finite number of words, whereas the ability to decode unfamiliar words offers a much larger range of possibilities (Smith, 2005). The notion that students must master a set of prerequisite skills, such as sight word recognition, before receiving more meaningful literacy instruction "can serve as a major barrier to addressing more realistic and attainable literacy goals" (Downing, 2005, p. 66).
On the other hand, when sight word instruction is included as one component of a more extensive and comprehensive literacy program, it can provide individuals with a set of common words that can enhance their ability to function independently in daily life and can act as a foundation for the "concept of word within text" (Browder, Courtade-Little, Wakeman, & Rickelman, 2006, p. 87).

Phonological Awareness

More recently, literacy instruction has placed an emphasis on phonological awareness (Trenholm & Mirenda, 2006). Phonological awareness is described as "an individual's overt knowledge of the sound structure of language" (Fletcher & Buckley, 2002, p. 11), including such tasks as rhyming and alliteration. It has been linked with success in reading and spelling in both typical learners and those with Down syndrome (Fletcher & Buckley, 2002). For example, Kennedy and Flynn (2003) studied the effects of phonological awareness training on three children with Down syndrome and found that all three participants demonstrated significant improvements in spelling, alliteration, and phoneme isolation skills as a result of the intervention. The idea of a balanced approach to literacy, which includes both sight word instruction and phonological awareness training, is rapidly gaining support as an effective method of teaching literacy to students with Down syndrome (Trenholm & Mirenda, 2006).

Balanced Literacy

Recent research in literacy education has adopted the balanced instructional approach to teaching literacy skills as "best practice" (Koppenhaver et al., 2007). Several key components of instruction can be used in combination to achieve a balanced literacy curriculum: (a) phonemic awareness, (b) phonics, (c) sight words, (d) reading comprehension, (e) writing, (f) self-directed reading, (g) fluency, and (h) vocabulary (Browder, Wakeman, Spooner, Ahlgrim-Delzell, & Algozzine, 2006; Erickson, Koppenhaver, & Cunningham, 2006).
Such curricula include a balance of components that incorporate students' strengths, needs, and personal interests in addition to providing students with opportunities for positive, meaningful, dynamic interactions with a variety of printed materials (Colak & Uzuner, 2004; Kaderavek & Rabidoux, 2004). Despite the fact that they experience greater difficulty acquiring literacy skills than typically developing students, children with disabilities can be successful in acquiring such skills if they are provided with rich emergent literacy experiences at home and at preschool. The concept of emergent literacy has replaced the idea that prerequisite skills are needed before teaching literacy (Byrne et al., 2002; Koppenhaver et al., 2007). Emergent literacy is described as "the range of activities and behaviors related to written language that are developed cooperatively by interactive partners or are evident in an individual's creative play or social context" (Kaderavek & Rabidoux, 2004, p. 238). In order to promote emergent literacy skills, parents and teachers are encouraged to provide children with drawing and writing tools, interactive storybook reading, software tools, and the supports required for them to interact with these tools on a regular basis and in natural settings. Vygotsky's model of learning suggests that social interaction is a critical element of learning any new skill, so another important part of a child's acquisition of reading and writing involves adult- and peer-mediated literacy experiences (Kaderavek & Rabidoux, 2004).

While emergent literacy interventions are concerned with students' awareness of all forms of print, beginning writing interventions encourage students to become more independent, using "written language for their own personal and interactive purposes" (Erickson et al., 2006, p. 310). A balanced approach incorporates key components of literacy instruction in order to promote success beyond emergent skills, in beginning writing.
To determine whether or not a student is ready to move beyond emergent literacy learning to beginning writing, Erickson et al. (2006) recommended informal measures of students' skills and awareness of (a) concepts about print, (b) letter identification, (c) word generation, (d) phonological and phonemic awareness, and (e) receptive language. Providing balanced literacy interventions to students with these foundational skills leads to success in learning to read and other improvements in student achievement for students with developmental disabilities (Erickson et al., 2006). For instance, Hedrick, Katims, and Carr (1999) reported gains in reading and writing (based on measures using the Test of Early Reading Ability – 2) in nine students with intellectual disabilities after participating for one year in a Four Blocks balanced literacy curriculum that consisted of guided reading, writing, and self-selected reading. Downing (2005) suggested additional strategies such as "offering choices, following the student's interests, providing opportunities, making literacy activities accessible, ensuring meaningfulness, making literacy practices interactive and fun, and having clear educational goals" (p. 67), in order to augment reading instruction. She described systematic instructional methods that assist students to develop more conventional literacy skills, such as highlighting stimuli and shaping responses, modeling reading and writing behaviours, checking comprehension, allowing sufficient time for a response, providing appropriate reinforcement, and fading instructional prompts. Downing also cautioned that steps must be taken to ensure the generalization of newly learned literacy skills to the many relevant environments in students' lives. For example, she suggested identifying several times in the day to practice important literacy skills and to do this with a variety of people on the student's support team.
Two specific balanced literacy approaches are commonly used with individuals with Down syndrome—Oelwein's (1995) guide for parents and teachers and LATCH-ON (Moni & Jobling, 2001). Both will be discussed briefly in the sections that follow.

Oelwein (1995). Since 1995, Oelwein's guide for teaching reading to students with Down syndrome has been the primary resource used by teachers and parents, at least in North America and other Commonwealth countries. Although Oelwein (1995) placed heavy emphasis on sight word and phonics instruction, her guide also includes balanced literacy components such as reading comprehension, writing, fluency, and vocabulary development. However, despite its widespread use, there is no empirical evidence to support the use of this program.

LATCH-ON. A recent balanced literacy program that has been found to be successful in teaching literacy to students with Down syndrome is the Literacy and Technology Hands-On (LATCH-ON) program designed by Moni and Jobling (2001). They adopted a socio-cultural approach to instruction that emphasizes the different contexts for and purposes of literacy as elements that allow participation in one's community. LATCH-ON was designed as a post-secondary continuing education program for young adults with Down syndrome in order to "develop students' abilities to communicate in written, oral and visual mediums, to foster friendships via literacy and technology, and to support literacy as desirable and valued aspect in the students' quality of life" (Moni & Jobling, 2001, p. 380).

In LATCH-ON, students are guided to engage in a variety of meaningful literacy activities such as sending e-mail messages, writing reviews and critiques about popular culture, doing research on the Internet, and so forth; they are also involved in establishing their own learning goals. Teaching methods include direct instruction, demonstrations, modeling, and scaffolding.
In their research, Moni and Jobling (2001) followed 18 participants (equally spread among three cohorts) with Down syndrome. These participants ranged in chronological age from 17 to 20 years, with age equivalence scores (based on the Neale Analysis of Reading Ability – Revised) ranging from 6 years 4 months to 12 years 2 months for reading accuracy, and 6 years 8 months to 12 years 2 months for reading comprehension. After completing the 24-month LATCH-ON program, students in Cohorts 1 and 2 demonstrated mean gains of 2 years 9 months in reading rate, 14.2 months in reading accuracy, and 8.8 months in comprehension. After completing 12 months of the 24-month program, students in Cohort 3 demonstrated minimal gains in reading rate (all six students), similar gains in accuracy (three of the students), and gains in comprehension (four of the students). Although the gains were not always dramatic and there is little other research to support the LATCH-ON program, Moni and Jobling (2001) demonstrated that these students can continue to acquire literacy skills and engage in sophisticated literacy activities into adulthood if they are provided with ongoing support. This is an important concept to teach educators, who often assume that children who have not acquired literacy skills along with their same-age peers will never learn to read (Gallaher, van Kraayenoord, Jobling, & Moni, 2002).

Literacy and Assistive Technology

Assistive technology is becoming increasingly recognized as a useful tool in literacy programs for students both with and without developmental disabilities (Lange, McPhillips, Mulhern, & Wylie, 2006; Lloyd, Moni, & Jobling, 2005; Wepner & Bowes, 2004). Technology can enable students to express themselves and find enjoyment in literacy activities (Zhang, 2000). In British Columbia, over 2,000 students receive assistive technology from the provincial resource program, Special Education Technology – BC (SET-BC).
Some examples of the tools provided to these students are discussed below.

Word Prediction

One of the assistive technology tools frequently offered to schools by SET-BC is word prediction software such as Co:Writer 4000 (Don Johnston, Inc., 2001). This software allows students to type the first letter of a word and then select the desired word from a numbered list by typing the corresponding number. The list of words becomes more precise as the student types more letters, eliminating words that do not include the combination of letters provided. A synthetic speech option allows students to hear words as they are presented or typed, and linguistic prediction uses linguistic and grammatical rules to predict the word that is most likely to occur next in the sentence. Co:Writer 4000 also offers topic dictionaries, which provide specialized lists of words, and "flexible spelling," which allows word prediction based on phonetic spelling (e.g., if a student types fon, the software will predict the word phone). Several studies have examined the effects of word prediction software on writing skills of students with learning disabilities (MacArthur, 1998) and physical disabilities (Mirenda, Turoldo, & McAvoy, 2006; Tumlin & Heller, 2004). Tumlin and Heller (2004) suggested that "word prediction software can be beneficial for students with fine motor problems who are poor typists, who have difficulty with handwriting, or who need help with spelling tasks" (p. 6).

Rate. Individuals with disabilities, particularly those with physical disabilities, often struggle with writing because of the large amount of effort required. Newell, Arnott, Booth, Beattie, Brophy, and Ricketts (1992) suggested that word prediction software can be used to enhance writing rate (e.g., words per minute). In their case study involving 17 students with physical disabilities, Newell et al. (1992) demonstrated improvements in writing rates when using the PAL word prediction system.
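The core word prediction mechanic described above, typing letters to narrow a numbered list of candidates and selecting by number, can be illustrated with a short sketch. This is not Co:Writer's actual algorithm; the dictionary, its ordering, and the five-choice limit are illustrative assumptions, and the grammatical and frequency-based prediction mentioned in the text is omitted.

```python
# Illustrative sketch of prefix-based word prediction. Real word
# prediction software also ranks candidates using linguistic and
# grammatical rules; here the order is simply the (hypothetical)
# dictionary order.

def predict(prefix, dictionary, max_choices=5):
    """Return up to max_choices candidate words that start with prefix.

    The list narrows as more letters are typed: words that do not
    match the letters typed so far are eliminated.
    """
    matches = [w for w in dictionary if w.startswith(prefix.lower())]
    return matches[:max_choices]

def select(candidates, number):
    """Select a word from the numbered list (1-based, as in Co:Writer)."""
    return candidates[number - 1]

# A small hypothetical dictionary, ordered by assumed frequency.
words = ["the", "they", "there", "phone", "photo", "think"]

choices = predict("th", words)   # ["the", "they", "there", "think"]
word = select(choices, 2)        # "they"
```

Typing a further letter ("the") would shrink the list again, which is the behaviour that reduces the number of keystrokes a student must produce.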
Similarly, Tumlin and Heller (2004) studied the effect of word prediction software on typing speed in four individuals with physical disabilities; two of these individuals showed increased typing rates with the software. However, Mirenda et al. (2006) did not find significant differences in writing rates in a comparative study of 24 individuals with physical disabilities who used Co:Writer, word processing, and handwriting. There is also no research to support the idea that word prediction software benefits the typing rates of students with learning disabilities. For example, MacArthur (1998) studied five students with learning disabilities and noted no difference in the number of words per minute with or without word prediction. Nevertheless, since many students with Down syndrome have poor fine motor skills, word prediction may be a promising tool for improving writing rates in this population.

Spelling accuracy. Another writing skill that is often challenging for students with disabilities is spelling. Errors in spelling may result from poor motor control, limited exposure to spelling activities, or any number of other reasons. Regardless of the reasons for deficits in this area, many studies have found that word prediction software can result in spelling improvement. For example, Tumlin and Heller (2004) noted lower mean percentages of spelling errors when word prediction software was used by two of four individuals with physical disabilities. Similarly, Newell et al. (1992) observed improvements in spelling accuracy in most of their 17 participants with physical disabilities when they had access to a word prediction tool. Mirenda et al. (2006) found that the mean percentage of words spelled correctly was significantly higher with both Co:Writer (96.94%) and word processing (92.14%) than with handwriting (67.27%); however, no significant differences were found between Co:Writer and word processing.
Research about students with learning disabilities has been consistent with these findings. MacArthur (1998) observed increases in the percentage of correctly spelled words from 42%-75% at baseline to 90%-100% with word prediction software in four students with learning disabilities. Despite small sample sizes, these results are encouraging for students with Down syndrome, who may benefit from assistance with spelling for reasons relating to motor and/or cognitive impairments.

Quality. Another commonly used measure of writing is quality. Hasbrouck, Tindal, and Parker (1994) proposed that legibility, spelling, content, vocabulary, and grammar can be used to assess the quality of a student’s writing. There is some evidence to suggest improved quality with word prediction software. In their study, Mirenda et al. (2006) reported significant differences in the mean length of consecutive correct word sequences between Co:Writer (5.55) and word processing (4.13) and between Co:Writer and handwriting (3.31). Additionally, they reported significant differences in the percentage of legible words both between Co:Writer (96.94%) and handwriting (70.83%) and between handwriting and word processing (92.14%). Finally, pairwise comparisons also demonstrated significant differences in the percentage of correct word sequences across Co:Writer (85.57%), word processing (69.21%), and handwriting (50.45%), suggesting that word prediction software has a positive effect on writing quality for individuals with physical disabilities. It is possible that results could be similar for students with Down syndrome, who may struggle with these aspects of writing due to physical and/or cognitive deficits.

Symbol-Supported Writing

There are currently several symbol-supported writing products designed to support reading and writing, such as Clicker 5 (Crick Software) and Writing with Symbols 2000 (Widgit Software).
These programs pair symbols with words in order to promote whole word recognition. For example, Crick Software describes Clicker 5 as a “talking word processor” with which students can write using the keyboard, or by clicking letters, words, and/or phrases in a grid containing cells. These cells often contain both a word and its corresponding symbol. Although the company describes the product as a “powerful tool” for writing, no research has been published to support this claim. In fact, Fossett (2003) suggested that simply pairing symbols with words is not an effective method for teaching word recognition because the presence of the picture interferes with the student’s ability to recognize the text itself. Fossett and Mirenda (2006) found that active picture-to-text matching was more effective than passive pairing of pictures and text for teaching sight words to two students with developmental disabilities. Thus, the effectiveness of symbol-supported writing tools requires focused research attention.

Technology in the Classroom

Educators choosing to use assistive technology should be able to evaluate the quality of the programs and choose those which will most benefit their students (Huttinger, Bell, Daytner, & Johanson, 2006; Lloyd et al., 2005). Huttinger et al. emphasized the importance of teachers being trained to use the technology effectively and incorporate it into the classroom. Clearly, competence in the use and integration of assistive technology develops over time. Huttinger et al. noted that, as students begin to experience success, teachers are more likely to continue to use specific technologies, which in turn results in increased student success.

Statement of the Problem

Historically, individuals with cognitive disabilities have been deprived of the “literate citizenship” (Kliewer et al., 2006, p. 165) that would allow them to participate more fully in their communities.
Countless students with disabilities have been sent to segregated schools, where reading is often not part of the curriculum because “they’re not readers.” Despite the fact that thousands of individuals with disabilities have learned to read and write, many educators still discuss the “possibility” of teaching reading and writing to these individuals, rather than recognizing literate potential in everyone. The question should not be “Should we teach reading and writing to individuals with disabilities?” but rather “How should we teach reading and writing to individuals with disabilities?”

Since technology is used increasingly in balanced literacy curricula, it is important to examine its effectiveness. Students in both special and general education classrooms are using computers for a variety of reading and writing activities, including reading “living books,” designing web pages, and composing e-mails (Lund, Angell, & Atwood, 2000). While the use of technology as a tool for literacy education has been widely researched in relation to students with both learning and physical disabilities, there is little research about its success specifically with students with Down syndrome. Buckley (1995) theorized that the use of computers can benefit students with Down syndrome, allowing them to work at their own pace without fear of failure. Oelwein (1995) described fear of failure as one of the possible reasons why some children with Down syndrome do not learn to read, despite having opportunities to do so. Other possible reasons (aside from a lack of appropriate instructional opportunities) are that they may not be motivated to read or may be physically unable to write. Oelwein suggested that using computers may help children overcome these obstacles. The purpose of this study was to examine the impact of word prediction and symbol-supported writing technologies for supporting literacy education in students with Down syndrome.
The specific questions addressed in the study were as follows:

Research Questions Related to the Use of Clicker 5

1. In monthly writing samples, how did Clicker 5 affect students’ writing rates, as indicated by the number of words written per 10 minutes?

2. In monthly 10-minute writing samples, how did Clicker 5 affect the quality of students’ writing, as indicated by the percentage of correct word pairs and by holistic scores for conventions and mechanics, and for meaning?

Research Questions Related to the Use of Co:Writer

1. In monthly writing samples, how did Co:Writer affect students’ writing rates, as indicated by the number of words written per 10 minutes?

2. In monthly 10-minute writing samples, how did Co:Writer affect the quality of students’ writing, as indicated by the percentage of correct word pairs and by holistic scores for conventions and mechanics, and for meaning?

3. In monthly 10-minute writing samples, how did Co:Writer affect the spelling accuracy of students’ writing, as indicated by the percentage of correctly spelled words?

4. Was there a significant difference between handwritten samples and computer samples (using Co:Writer) for these students, with regard to writing rate, percentage of correct word pairs, percentage of correctly spelled words, conventions and mechanics, and meaning?

Research Questions Related to the Monthly Surveys

1. What did teachers report about the impact of Clicker 5 (symbol-supported software) and Co:Writer (word prediction software) on students’ academic, social, and communication skills over 2 years?

2. According to teacher reports, how were Clicker 5 and Co:Writer used in the context of students’ overall reading and writing programs?

   a) How many minutes per week did students spend using the technology?

   b) How many minutes per week did teachers spend programming the technology?

   c) What were the other components of students’ literacy programs?

3. What barriers to implementing the technology did teachers report?

4.
Was there a relationship between improvement on any of the dependent measures and (a) mean hours of student use of the software, (b) mean amount of time teachers spent programming the software, and (c) mean number of components in a student’s overall literacy program?

CHAPTER 2

Method

Recruitment and Informed Consent

In September 2006, SET-BC began the Technology Access Project (TAP), a pilot project that was established to provide assistive technology to 170 students with moderate/severe intellectual disabilities in British Columbia, Canada. Students were eligible if they (a) had Down syndrome; (b) were enrolled in a public school in BC, in kindergarten through Grade 12; (c) were exposed to English as their primary language at home and at school; (d) had no significant visual or hearing impairments; and (e) were supported by a school team with access to a high speed internet connection and the ability to devote at least 2 days per month of planning and release time to the project. School district personnel throughout BC nominated students for the project based on SET-BC guidelines, and selected one of three technology “bundles” for each student. Bundle 1 was designed to support early and emergent literacy development and included Clicker 5 with Picture Communication Symbols (Crick Software, Inc.) and Balanced Literacy software (Intellitools, Inc.). Bundle 2 was designed to support emergent writing and included Clicker 5 with Cloze Pro (Crick Software, Inc.) and Co:Writer (Don Johnston, Inc., 2001). Bundle 3 targeted communication development and included Speaking Dynamically Pro/Boardmaker software with Functionally Speaking (Mayer Johnson, LLC). Only students using software that targeted writing (Clicker 5 and Co:Writer) were included in the current study.
In September 2006, the parents of all TAP students were sent a package containing (a) a letter of invitation from the Provincial Coordinator of SET-BC (Appendix B); (b) a letter from the research team explaining the purpose of the research and inviting their child to participate (Appendix C); and (c) a stamped, self-addressed envelope. In May of 2007, the project was extended for another year. The parents of students included in the first year of the study were sent packages containing a letter from the research team inviting their child to continue participating in the research component of TAP (Appendix D) and a stamped, self-addressed envelope.

Participants

Year 1

Informed consent was obtained from the parents of 101 students for Year 1 (January – June, 2007). Sufficiently complete data sets were obtained during Year 1 for 61 of these students (see the data analysis section for details). Participants were categorized by the technology they received. Students who received Clicker 5 had early and emergent literacy skills and were unable to write prior to receiving the technology. This group consisted of 29 males and 14 females (n = 43) ranging in age from 6;0 to 18;0, with an average age of 11;0. All grades were represented in this group, from kindergarten to Grade 12: 30% of students were in kindergarten to Grade 3, 54% in Grades 4 to 7, and 16% in Grades 8 to 12. For this group, the school contact person was a classroom teacher (2.6%), a district worker or speech-language pathologist (13.1%), a special education teacher (44.8%), or a special education assistant (39.5%). The majority of students were from urban school districts (67%), whereas only 33% were from rural districts (Figure 1).

The Co:Writer group targeted students who had some beginning writing skills but required additional support for producing written output.
This group consisted of 9 males and 9 females (n = 18) ranging in age from 9;8 to 15;6, with an average age of 12;3. There were no students in the primary grades; 79% of students were in Grades 4 to 7 and 21% were in Grades 8 to 12. School contacts consisted of classroom teachers (4.5%), district workers/speech-language pathologists (13.6%), special education teachers (50%), and special education assistants (31.8%). Students were spread across the province, with 88.9% in urban school districts and 11.1% in rural districts (Figure 2).

Year 2

Informed consent to participate in the TAP research for Year 2 (October 2007 – May 2008) was obtained for 38 of the 43 Year 1 students who used Clicker 5 (88.3%). However, sufficiently complete Clicker 5 data sets were obtained for only 17 of the 38 students (44.7%). These students included 13 males and 4 females, between the ages of 7;1 and 15;7, with an average age of 10;4. Forty-one percent of the students were in Grades 1 to 3, 47% were in Grades 4 to 7, and 12% were in Grades 8 to 12. School contacts were district personnel/speech-language pathologists (17.7%), special education teachers (52.9%), and special education assistants (23.5%). Seventy-one percent of students in the Clicker 5 group attended schools in urban districts, and 29% went to school in rural districts (Figure 1).

Informed consent was obtained for 12 of the 18 Year 1 students who used Co:Writer (66.7%). However, only 2 of the 12 Co:Writer students (16.7%) submitted sufficiently complete data sets to be included in the Year 2 analysis. These students were a 13-year-old male in Grade 8 in an urban school district and a 17-year-old female in Grade 11 in a rural school district (see Figure 2). School contacts were a resource teacher and a special education assistant, respectively. Data for these two students were evaluated qualitatively.

Figure 1. Geographic distribution of students in the Clicker 5 group.
[Map figure: numbers indicate the count of Clicker 5 students per school district, shown as Year 1/Year 2.]

Figure 2. Geographic distribution of students in the Co:Writer group.

[Map figure: numbers indicate the count of Co:Writer students per school district, shown as Year 1/Year 2.]

Technology Training

In Year 1, in order to train teachers and support workers to use the technology provided through TAP, SET-BC offered a series of synchronous and asynchronous on-line training sessions to all individuals supporting TAP students. Previously recorded (i.e., asynchronous) training sessions were available to teams through SET-BC’s online Learning Centre, 24 hours a day, 7 days a week. These sessions were 30-45 minutes in length and were produced by the SET-BC TAP coordinator to demonstrate key aspects of each piece of software. Synchronous on-line training was also provided by the SET-BC TAP coordinator and consisted of four 1-hour training sessions that were available at designated times every Monday to Friday for three weeks in November and December of 2006. In addition, for the three largest regions in British Columbia (i.e., Vancouver Island, the Okanagan Valley, and the Lower Mainland), face-to-face training workshops were offered by the SET-BC TAP coordinator in January and February 2007. Five sessions of 5 hours each were offered to teams on Vancouver Island and in the Lower Mainland, and three 5-hour sessions were provided for teams in the Okanagan. Finally, additional support was also available from the SET-BC TAP coordinator via phone and e-mail, or from a regional SET-BC consultant. In Year 2, returning support workers and those who were new to TAP were supported on request by SET-BC regional consultants through a variety of training formats; data were not collected about the nature or quantity of Year 2 training.

Setting

All literacy instruction with TAP software and all data collection took place in the students’ schools.
For the writing samples collected as outcome data, school contact personnel were instructed to position the student in front of his or her computer in a quiet, non-distracting environment before proceeding. Monthly survey forms were posted on SET-BC’s web site so that the school contacts could complete the survey on-line.

Activities and Materials

Year 1

In January 2007, e-mail messages were sent to all school contacts introducing the research component of TAP and informing them of the data collection dates for the 2006-07 school year. Data collection occurred during two weeks of each month: the on-line survey was available in the first week, and both the writing sample topic for Co:Writer and the Clicker 5 grid set were provided in the third week of the month. Directions for data collection were posted on the SET-BC web site (Appendix E). Reminder e-mails were sent prior to each data collection week, and thank-you e-mails were sent to the contact person after each submission.

Monthly writing samples. Students were asked to submit monthly writing samples on assigned topics each month from January-June 2007. The monthly topics were selected by the co-investigators and the SET-BC TAP coordinator and (in order) were pets, weather, transportation, food, music, and people at school. Instructions for completing the writing samples were posted on the SET-BC web site for one week each month. School district contacts received an e-mail at the beginning of the week, prompting them to visit the web site and complete the monthly activities. The instructions consisted of (a) the procedure for obtaining assent from the student, (b) a link to the monthly Clicker 5 grid, (c) the monthly topic, and (d) instructions for collecting the writing sample (Appendix E). Students who received Clicker 5 with Picture Communication Symbols completed a monthly 10-minute writing sample using a pre-made Clicker 5 grid on the assigned topic that was created by the SET-BC TAP coordinator.
Students using Co:Writer were asked to complete two 10-minute writing samples about the assigned topic. One of these samples was handwritten and the other was completed on the computer, using Co:Writer.

Monthly surveys. An online survey was constructed to gather information about each student’s experiences with TAP technology. It was also aimed at learning teachers’ perceptions of how the technology affected their students’ academic, social, and communication skills (Appendix F). Survey items were written in multiple choice and checklist formats to facilitate completion within a short amount of time. Teachers submitted the survey on-line, and it was forwarded directly to the secure TAP Research e-mail account.

Literacy survey. At the beginning of the 2007-08 school year, surveys were sent via e-mail to all school contacts from Year 1. The survey was designed to gather information about students’ overall literacy programs during the previous (2006-07) school year (Appendix G). Survey items were written in checklist and multiple choice formats to facilitate quick and easy completion. Teachers were asked to return the survey via e-mail or regular post.

Year 2

In September 2007, e-mail messages were sent to the school contacts of all students whose parents agreed to continue with the TAP research, re-introducing the research project and informing them of the data collection dates for the 2007-08 school year. Data collection in Year 2 occurred during the third week of each month for both the on-line survey and the writing sample. Modified directions for data collection were posted on the SET-BC web site (Appendix H). Reminder e-mails were sent prior to each data collection week, and thank-you e-mails were sent to the contact person after each submission.
Additionally, in order to encourage a higher rate of return, each time a submission (writing sample or survey) was received, the student’s name was entered into a monthly draw for a Super Hero comic book and CD-ROM, provided by SET-BC.

Monthly writing samples. Students enrolled in the second year of the TAP research used the same technology from SET-BC as during the first year. Students in the Clicker 5 group were asked to complete a 10-minute writing sample using one of two pre-made Clicker 5 grids on a monthly topic. Both a beginner grid and an advanced grid were created by the researcher, and support staff were asked to select the grid that best reflected each student’s current literacy ability. Students using Co:Writer were asked to complete two 10-minute writing samples about the monthly topic. One of these samples was handwritten and the other was completed on the computer, using Co:Writer. The monthly topics were as follows: super heroes, rock stars, the Olympics, Valentine’s Day, spring break, recycling, and nutrition.

Monthly surveys. For Year 2, the on-line survey was revised to gather information about each student’s experiences with TAP technology and their overall literacy programs. It was also aimed at learning teachers’ perceptions of how the technology affected their students’ academic, social, and communication skills (Appendix I). Survey items were written in checklist and multiple choice formats to facilitate completion within a short amount of time. Teachers submitted the survey on-line, and it was forwarded directly to the secure TAP Research e-mail account.

Dependent Variables

Writing Samples

Clicker 5. In order to evaluate the effects of Clicker 5 on students’ writing rates and quality, there were four dependent variables: (a) number of words written per 10 minutes, (b) percent of correct word pairs per 10 minutes, (c) a score for conventions and mechanics, and (d) a score for overall meaning.

Co:Writer.
The dependent variables related to the effects of Co:Writer on writing rates, spelling accuracy, and quality included (a) number of words written per 10 minutes, (b) percent of correctly spelled words per 10 minutes, (c) percent of correct word pairs per 10 minutes, (d) a score for conventions and mechanics, and (e) a score for meaning.

Survey

Clicker 5. The dependent variables related to the data obtained from the monthly surveys included (a) number of minutes per week students spent using Clicker 5, (b) number and nature of components in students’ overall literacy programs, (c) number of minutes per week teachers spent programming Clicker 5, and (d) teachers’ perceptions of the impact of Clicker 5 on students’ academic, communication, and social participation.

Co:Writer. The dependent variables related to the data obtained from the monthly surveys included (a) number of minutes per week students spent using Co:Writer, (b) number of components in students’ overall literacy programs, (c) number of minutes per week teachers spent programming Co:Writer, and (d) teachers’ perceptions of the impact of Co:Writer on students’ academic, communication, and social participation.

Measures

Writing samples for Clicker 5 were analyzed using two of the analytic measures described by Hasbrouck, Tindal, and Parker (1994)—namely, writing rate and writing quality. In addition, holistic quality ratings adapted from Needels and Knapp (1994) were used to assess writing conventions and mechanics as well as overall writing quality. Writing samples for Co:Writer were analyzed using all of these measures as well as a measure of spelling accuracy from Hasbrouck et al. (1994). If a student’s teacher indicated that the student wrote for more or less than the 10 minutes specified for the writing sample (see Appendices E and H), the results for the sample were converted proportionately.
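The proportional conversion just described can be illustrated with a short sketch (the function name is hypothetical, not part of the study’s materials): a raw count from a session of any length is rescaled to the standard 10-minute window.

```python
def scale_to_ten_minutes(count, actual_minutes):
    """Rescale a raw count (e.g., number of words written) from a
    session of actual_minutes to the standard 10-minute window."""
    if actual_minutes <= 0:
        raise ValueError("session length must be positive")
    return count * 10 / actual_minutes

# A student who wrote 36 words in a 12-minute session is credited
# with 30 words per 10 minutes.
print(scale_to_ten_minutes(36, 12))  # 30.0
```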
Writing Rate (Hasbrouck et al., 1994)

Writing rate was determined by calculating the number of words written per 10 minutes. For Clicker 5, each cell in the monthly Clicker grid was counted as one word, regardless of the number of words contained in the cell. For Co:Writer, clusters of letters separated by spaces were considered to be words, including single-letter words such as “a” and “I,” common abbreviations, and compound words, regardless of correct or incorrect spelling. Numerals were not counted as words.

Writing Quality (Analytic) (Hasbrouck et al., 1994)

An analytic measure of sentence structure quality was obtained by calculating the percent of correct word pairs. A correct word pair is a sequence of two correctly spelled adjacent words that is acceptable to a native speaker of the English language within the context of the phrase or sentence in which it is found. For example, the sentence “The dog ate my homewrk last night” contains four correct word pairs (the dog, dog ate, ate my, last night) and two incorrect word pairs (my homewrk, homewrk last). Word sequences stopped at the end of a sentence, indicated by a period. If there was no period but it was clear that a new sentence had begun, raters treated that point as a sentence boundary and began a new word sequence there. The percentage of correct word pairs was calculated by dividing the number of correct word pairs by the total number of possible word pairs (correct word pairs + incorrect word pairs).

Writing Quality (Holistic) (Needels & Knapp, 1994)

Two scales adapted from the work of Needels and Knapp (1994) were used to examine holistic writing quality. Table 1 displays these two scales.

Table 1.
Holistic quality rating scales (adapted from Needels & Knapp, 1994)

Conventions/Mechanics
  Acceptable (Score = 3): correct sentence structure; correct punctuation and capital letters; accurate word choice or forms.
  Neutral (Score = 2): occasional errors in sentence structure; occasional errors with punctuation or capital letters; occasional errors with word choice or forms.
  Unacceptable (Score = 1): frequent errors in sentence structure; frequent errors in punctuation and capitalization; frequent errors with word choice or forms.

Meaning
  Acceptable (Score = 3): good topic development; clearly organized; uses appropriate detail in sufficient quantity.
  Neutral (Score = 2): adequate topic development; somewhat organized; uses a few details that are mostly appropriate.
  Unacceptable (Score = 1): poor topic development; mostly disorganized; uses few details or inappropriate details.

Conventions and mechanics. Three elements comprised the scale for conventions and mechanics: sentence structure, punctuation and capitalization, and word choice or form. Trained research assistants rated each of these elements as unacceptable (1), neutral (2), or acceptable (3) for each writing sample. An overall conventions and mechanics score was obtained by assigning the most common rating across the three elements. For example, if sentence structure = 2, punctuation and capitalization = 3, and word choice or form = 2, the overall score assigned was 2. If all three elements received different ratings (i.e., scores of 1, 2, and 3), a score of 2 was assigned.

Meaning. As with conventions and mechanics, three elements were identified to assess meaning: topic development, organization, and amount and appropriateness of details. These elements were each rated in the same way as for conventions and mechanics (unacceptable, neutral, or acceptable), with one overall meaning score assigned per sample.
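The word-pair measure and the overall holistic scoring rule described in this section can be sketched as follows. This is a simplified illustration, not the study’s actual procedure: in the study, the correctness of each word pair was judged in context by trained raters, whereas here it is approximated by a caller-supplied set of misspelled words.

```python
from collections import Counter

def percent_correct_word_pairs(words, misspelled):
    """Percent of adjacent word pairs in which both words are correct.
    Simplification: contextual acceptability, which raters also
    judged, is not modelled here."""
    pairs = list(zip(words, words[1:]))
    if not pairs:
        return 0.0
    correct = sum(1 for a, b in pairs
                  if a not in misspelled and b not in misspelled)
    return 100 * correct / len(pairs)

def overall_holistic_score(element_ratings):
    """Overall score = the most common of the three element ratings
    (1-3); if all three differ, a score of 2 is assigned."""
    rating, freq = Counter(element_ratings).most_common(1)[0]
    return rating if freq > 1 else 2

# The thesis example: 4 of the 6 word pairs are correct.
sample = "the dog ate my homewrk last night".split()
print(round(percent_correct_word_pairs(sample, {"homewrk"}), 1))  # 66.7
print(overall_holistic_score([2, 3, 2]))  # 2
print(overall_holistic_score([1, 2, 3]))  # 2
```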
Spelling Accuracy (Hasbrouck et al., 1994)

For Co:Writer only, spelling accuracy was measured by calculating the percent of correctly spelled words per 10 minutes. Spelling was evaluated within the context of the sentence: a word was considered misspelled if it was in the incorrect tense or form for the context of the sentence. For example, “I eat pizza last night” would consist of four correctly spelled words (“I,” “pizza,” “last,” and “night”) and one incorrectly spelled word (“eat”). Compound words written as two separate words were counted as one misspelled word (e.g., snow man). Likewise, two words written as one word were counted as one incorrectly spelled word (e.g., eachother). The percent of correctly spelled words was calculated by dividing the number of correctly spelled words by the total number of words in the sample.

Research Assistant Training

Three undergraduate students at the University of British Columbia were trained to code the writing samples for both Clicker and Co:Writer, using the Hasbrouck et al. (1994) conventions and the holistic scales in Table 1. Two 1-hour sessions were conducted by the researcher with each student to demonstrate the scoring procedures and to observe as the research assistants scored several writing samples from each software application. Training was terminated when the researcher and coder agreed 100% across three consecutive samples for all dependent measures.

Data Analysis

Writing Samples

Only TAP students with sufficiently complete data sets were included in the analysis. Sufficiently complete data sets included a minimum of four of six possible writing samples in Year 1, and four of seven in Year 2. Due to low response rates, all data for months 1, 6, and 8 were excluded from the analysis. Data for month 12 were also excluded because the writing sample analyses suggested that the topic for that month (recycling) was unusually difficult or unmotivating for the students.
Altogether, 4 months of data were included in the analysis for Year 1 (Y1) and 9 months of data were included in the analysis for Years 1-2 combined (Y1-2). Data from the writing samples were entered into the Statistical Program for the Social Sciences (SPSS, version 15.0). For Co:Writer data, paired samples t-tests were used to compare handwritten and Co:Writer samples for all dependent measures. Repeated measures analyses of variance (ANOVAs) were conducted to examine change over time for the dependent measures related to Clicker 5 and Co:Writer. Finally, mean scores were calculated for rate, spelling accuracy (Co:Writer only), and all measures of writing quality. These were used in Pearson product moment correlations to examine associations between the mean scores and (a) the number of hours the students spent using TAP technology, (b) the number of literacy activities included in students’ overall literacy programs, and (c) the number of minutes per week teachers spent programming TAP technology.

Monthly Surveys

For both Y1 and Y1-2, data from the monthly surveys were also entered into SPSS. Descriptive statistics (e.g., frequencies, means, and standard deviations) were calculated for all of the survey data.

Inter-Rater Reliability

Three undergraduate research assistants scored 20% of the writing samples from each group (Clicker 5 and Co:Writer) to establish inter-rater reliability, which was calculated using the formula: number of agreements divided by (number of agreements + number of disagreements), multiplied by 100. In addition, 10% of the data entered from surveys and samples was randomly selected and re-entered to determine data entry reliability. Disagreements were resolved by consensus. The mean reliability scores for all dependent measures are displayed in Table 2.

Table 2.
Mean inter-rater reliability

Dependent measure                                      Percent agreement
Total number of words                                  99.4%
Percent of correct word pairs                          95.4%
Percent of correctly spelled words (Co:Writer only)    93.6%
Conventions and mechanics                              92.6%
Meaning                                                92.9%

The overall inter-rater agreement across all dependent measures was 94.8%. Reliability for survey data entry was 99.7%.

CHAPTER 3
Results

The purpose of this study was to investigate the use of assistive technology as part of literacy curricula for students with Down syndrome. The students were recruited through SET-BC and represented both urban and rural communities in British Columbia. The data were analyzed according to the software that was used, either Clicker 5 or Co:Writer. Writing samples for both groups were analyzed with a series of one-way repeated measures analyses of variance (ANOVAs) to determine whether or not there were significant differences over time in writing rate, spelling accuracy (Co:Writer only), and quality. Data from the monthly surveys were analyzed graphically and with a correlational analysis to determine whether other components of the students' literacy curricula (such as the number of hours spent using the technology, the number of literacy components in the curriculum, and the amount of time teachers spent programming the technology) were associated with the results of the writing samples, as analyzed by rate, spelling accuracy, and quality. The results of these analyses are summarized in this chapter.

Clicker 5 Writing Sample Results

Writing samples were analyzed using one-way repeated measures ANOVAs for writing rate and quality.

Rate. The mean total number of words produced during Year 1 (Y1) and Years 1-2 combined (Y1-2) is shown in Figure 3.

Figure 3.
Clicker 5: Mean number of words produced per 10 minutes. [Figure: mean number of words across time points 1-9 for All - Y1 and Subset - Y1-2.]

No significant change over time was found for Y1 (n = 43) or for the subset, Y1-2 (n = 17), although there is clearly an increasing trend in mean number of words for the latter data set. The lack of significance is likely due to large inter-participant variability (SD = 32.39) and the small sample size.

Quality (analytic). The mean percent of correct word pairs for Y1 and Y1-2 participants is displayed in Figure 4.

Figure 4.
Clicker 5: Percent of correct word pairs. [Figure: percent of correct word pairs per 10 minutes across time points 1-9 for All - Y1 and Subset - Y1-2; significant contrasts marked p < .05.]

Significant increases in the mean percent of correct word pairs for Y1 participants (n = 43) were found between month 1 (60.6%) and months 2 (72.2%), 3 (78.2%), and 4 (83.6%) (p < .05). In contrast, significant decreases in the mean percent of correct word pairs for the subset of participants in Y1-2 (n = 17) were found between month 4 (89.1%) and both months 7 (65.6%) and 9 (65.8%) (p < .05). It appears that, as these students wrote more total words (see Figure 3), some sentence structure correctness was sacrificed.

Conventions and mechanics. The mean scores for conventions and mechanics for Y1 and Y1-2 are displayed in Figure 5.

Figure 5.
Clicker 5: Average scores for conventions and mechanics. [Figure: average score across time points 1-9 for All - Y1 and Subset - Y1-2; significant contrast marked p < .01.]

For the Y1 group, mean conventions and mechanics scores increased significantly between month 1 (1.77) and months 3 (2.02) and 4 (2.12) (p < .01). For the subset of students included in Y1-2, there was a significant increase in mean conventions and mechanics scores between month 1 (1.71) and month 6 (2.29) (p < .01).
Although this holistic measure of grammar and correctness showed some improvement over time, it appears that this improvement "peaked" in Y2 and remained relatively stable for the remainder of that year.

Meaning. Figure 6 displays the mean scores for meaning for Y1 and Y1-2.

Figure 6.
Clicker 5: Average scores for meaning. [Figure: average score across time points 1-9 for All - Y1 and Subset - Y1-2; significant contrast marked p < .05.]

Average scores for meaning were significantly higher in month 4 (2.3) than in month 1 (2.0) (p < .05) for Y1 participants only. As with the total number of words, although there was a general increasing trend across all time points, no other significant effects were found for this measure. This is likely related to the small sample size for Y1-2.

Monthly Survey Results: Reported Impact

Clicker 5 data for Y1 were analyzed once for all students (n = 43) together and separately for the subset of students (n = 17) who also participated in Y2.

Reported effect on participation in academic activities. Survey questions asked support staff to rate the extent to which the technology affected or enhanced students' ability to participate in academic activities or lessons in the classroom. Possible responses were (a) no effect, (b) minimal effect, (c) moderate effect, (d) substantial effect, or (e) exceptional effect. Teacher reports of the effect of the technology on participation in academics are displayed in Figure 7.

Figure 7.
Clicker 5: Reported effect on participation in academic activities. [Figure: percent reporting at least a moderate effect across time points 1-9 for All - Y1 and Subset - Y1-2.]

The percent of teachers reporting at least a moderate effect on their student's participation in academics as a result of Clicker 5 use ranged from 25-57% in Y1 and 0-78% for the Y1-2 subset.

Reported effect on communication with peers.
Support staff were also asked to rate the extent to which Clicker 5 use affected students' ability to communicate with their peers in the classroom. The responses to this question are displayed in Figure 8.

Figure 8.
Clicker 5: Reported effect on communication with peers. [Figure: percent reporting at least a moderate effect across time points 1-9 for All - Y1 and Subset - Y1-2.]

For the Clicker 5 group, between 12-34% of teachers reported at least a moderate effect in Y1 and 0-33% reported at least a moderate effect for the Y1-2 subset. Compared with the data in Figure 7 for academics, these results suggest that students benefited more from Clicker 5 use for academics than for communication with peers.

Reported effect on social interactions. The monthly survey also asked respondents to report the extent to which the technology affected students' ability to participate in the social activities of their classrooms. These results are summarized in Figure 9.

Figure 9.
Clicker 5: Reported effect on participation in social activities. [Figure: percent reporting at least a moderate effect across time points 1-9 for All - Y1 and Subset - Y1-2.]

For the Clicker 5 group, 13-34% of teachers reported that the technology had at least a moderate effect on students' participation in social activities in Y1, and 0-33% reported the same for the Y1-2 subset.

Monthly Survey Results: Overall Technology Use

Students' use of technology. In the monthly survey, school support staff were asked to report the number of hours per day each student used Clicker 5 at school. A summary of the responses is provided in Table 3.

Table 3.
Number of hours per day of student Clicker 5 use at school

Time                       Year 1 (n = 43)    Year 2 (n = 17)
Less than 30 minutes/day   32.4%              25.0%
30 min - 1 hour/day        48.6%              75.0%
1-2 hours/day              18.9%              0%
More than 2 hours/day      0%                 0%

On average, students were reported to spend 30-60 minutes per day using Clicker 5, with no students using the technology more than 2 hours per day in either year.

Other literacy components. Respondents were asked to complete a questionnaire (Appendix G) at the beginning of Y2 to report on components of their Y1 students' literacy programs in addition to Clicker 5. These questions were embedded in the monthly on-line surveys for Y2. A summary of the overall literacy components reported by school staff is provided in Table 4.

Table 4.
Literacy components in common use across Clicker 5 students

Component                        Year 1 (n = 43)    Year 2 (n = 17)
Sight word instruction           79%                100%
Structured phonics instruction   70%                67%
Teacher read-alouds              67%                93%
Buddy reading                    67%                60%
Balanced Literacy software       64%                67%
Guided reading                   61%                67%
Self-selected reading            58%                67%
Journal writing                  52%                67%
Interactive writing              39%                33%
Other formal reading program     4%                 40%

On average, students were reported to have seven components in their overall literacy programs. At least half of the students received one or more of sight word instruction, structured phonics instruction, teacher read-alouds, buddy reading, Balanced Literacy software, guided reading, self-selected reading, and/or journal writing.

Teacher programming. In the monthly survey, support staff were also asked to report the amount of time they spent programming the software. Their responses are summarized in Table 5.

Table 5.
Number of minutes per day support staff spent programming Clicker 5

Time                       Year 1 (n = 43)    Year 2 (n = 17)
Less than 15 minutes/day   45.9%              37.5%
15 - 30 min/day            43.2%              50.0%
30 min - 1 hour/day        10.8%              12.5%
More than 1 hour/day       0%                 0%

On average, teachers spent 15-30 minutes per day programming the software (e.g., making Clicker grids). Reported programming time ranged from less than 15 minutes to 1 hour per day in both Y1 and Y2.

Monthly Survey Results: Barriers

A checklist included in the monthly surveys asked support staff to identify barriers they experienced in implementing the Clicker 5 software. Table 6 summarizes the barriers that were noted most commonly.

Table 6.
Percentage of support staff reporting each barrier in implementing Clicker 5

Barrier                                            Y1 (n = 43)    Y2 (n = 17)
Insufficient programming/planning time             66%            62%
Insufficient training                              26%            68%
Insufficient classroom time devoted to literacy    12%            9%
Computer did not work properly at times            14%            18%
Software did not work properly at times            6%             9%

Insufficient time for planning and programming was the most commonly reported challenge to implementing Clicker 5. Interestingly, many more teachers supporting students in the Clicker 5 group reported insufficient training as a barrier in Y2 than in Y1. This is likely the result of changes in how training was provided in Y2; teachers who were new to the project in Y2 were offered support only by request through SET-BC regional consultants and may not have sought out this training.

Correlations

Pearson product-moment correlations were calculated to determine the strength of associations between the mean scores for all dependent measures and the survey variables, for the entire Y1 sample (n = 43) and, for the subset of students who participated over both years (n = 17), for Y1 alone (months 1-4), Y2 alone (months 5-9), and Y1-2 combined (months 1-9). Table 7 provides a summary of these results.

Table 7.
Correlations between Clicker 5 dependent variables

                               Mean Number of Words              Mean Percent of Correct Word Pairs
Variable                       All Y1  Subset  Subset  Subset    All Y1  Subset  Subset  Subset
                               (1-4)   (1-4)   (5-9)   (1-9)     (1-4)   (1-4)   (5-9)   (1-9)
Mean hours of student use      -.039   -.032   .240    .319      .147    .110    .065    -.107
Mean time spent programming    -.047   -.057   .285    .049      .121    -.219   .049    .077
Mean number of literacy
  components                   .053    .252    -.265   .000      .203    .523*   .431    .567*

*p < .05

The results indicated no significant correlations between the means for any of the dependent variables and the mean number of hours students used the technology or the mean time teachers spent programming the software. However, there were significant correlations (p < .05) between the mean percent of correct word pairs and the mean number of components in students' overall literacy programs, for the Y1-2 subset across months 1-4 and months 1-9 only.

Co:Writer Writing Sample Results: Year 1

Writing samples were analyzed using repeated measures ANOVAs for rate, spelling accuracy, and quality to measure change over time. Paired samples t-tests were used to compare Co:Writer and handwritten samples.

Rate. The total number of words per 10 minutes for handwritten and Co:Writer samples is displayed in Figure 10.

Figure 10.
Co:Writer: Mean number of words per 10 minutes. [Figure: time points 1-4, Handwritten vs. Co:Writer; significant contrast marked p < .01.]

Figure 10 shows that no significant change over time was found for the total number of words produced per 10 minutes. However, in month 3, students wrote significantly more words using handwriting than Co:Writer (p < .01).

Spelling accuracy. The mean percent of correctly spelled words is shown in Figure 11.

Figure 11.
Co:Writer: Percent of correctly spelled words per 10 minutes. [Figure: time points 1-4, Handwritten vs. Co:Writer; significant contrasts marked p < .01.]

Although no significant change over time was found for spelling accuracy, the mean percent of correctly spelled words was significantly higher for Co:Writer than for handwritten samples in months 2, 3, and 4 (p < .01).

Quality (analytic). Figure 12 shows the mean percent of correct word pairs for Co:Writer and handwritten samples.

Figure 12.
Co:Writer: Percent of correct word pairs per 10 minutes. [Figure: time points 1-4, Handwritten vs. Co:Writer; significant contrasts marked p < .01.]

Again, although the proportion of correct word pairs did not improve significantly over time, Co:Writer samples, on average, had significantly higher percentages of correct word pairs compared to handwritten samples in months 2, 3, and 4 (p < .01). This indicates that students' sentence structure was better with Co:Writer than handwriting.

Conventions and mechanics. Mean scores for conventions and mechanics for handwritten and Co:Writer samples are displayed in Figure 13.

Figure 13.
Co:Writer: Average scores for conventions and mechanics. [Figure: time points 1-4, Handwritten vs. Co:Writer; significant contrast marked p < .05.]

Although scores for conventions and mechanics were generally higher for Co:Writer across all four time points, no significant change over time was found, and the difference between Co:Writer and handwritten samples was significant only in month 3 (p < .05).

Meaning. Finally, Figure 14 displays the average meaning scores for Co:Writer and handwritten samples.

Figure 14.
Co:Writer: Average scores for meaning. [Figure: time points 1-4, Handwritten vs. Co:Writer.]

Average meaning scores were slightly higher for Co:Writer samples at all time points and there was a slight increase from months 1-4, but no significant effects were found for this measure. The broad scope of this measure may account for this, as subtle differences in the quality of the samples could be missed by assigning one of only three scores.

Writing Sample Results: Year 2

Rate. Figures 15 and 16 display the total number of words per 10 minutes for the two Co:Writer students in Y2.

Figure 15.
Co:Writer: Total number of words for Student 1. [Figure: words/10 min. across time points 1-6, Handwritten vs. Co:Writer.]

Figure 16.
Co:Writer: Total number of words for Student 2. [Figure: words/10 min. across time points 1-6, Handwritten vs. Co:Writer.]

The writing rates for these two students were quite different. While Student 1 wrote increasingly more through the school year and generally wrote more using Co:Writer than handwriting, the writing rate for Student 2 was quite low with both Co:Writer and handwriting.

Spelling accuracy. The percent of correctly spelled words for the two Co:Writer students is displayed in Figures 17 and 18.

Figure 17.
Co:Writer: Percent of correctly spelled words for Student 1. [Figure: time points 1-6, Handwritten vs. Co:Writer.]

Figure 18.
Co:Writer: Percent of correctly spelled words for Student 2. [Figure: time points 1-6, Handwritten vs. Co:Writer.]

As was the case for rate, results for spelling accuracy were quite different for these two students. Student 1 demonstrated slight improvement in spelling over time using Co:Writer and generally spelled better using Co:Writer than handwriting. However, the percent of correctly spelled words was variable for Student 2.
Spelling was better with handwriting in the first 3 months but better with Co:Writer in months 4-6. This variability may be accounted for by the student's low writing rate.

Quality (analytic). The percent of correct word pair results for both Co:Writer students are displayed in Figures 19 and 20.

Figure 19.
Co:Writer: Percent of correct word pairs for Student 1. [Figure: time points 1-6, Handwritten vs. Co:Writer.]

Figure 20.
Co:Writer: Percent of correct word pairs for Student 2. [Figure: time points 1-6, Handwritten vs. Co:Writer.]

Again, the results for these two students were quite different. Student 1 had consistently better sentence structure with Co:Writer than with handwriting, but Student 2 had variable percentages of correct word pairs, perhaps as a result of not writing very much at all.

Conventions and mechanics. Figures 21 and 22 display the conventions and mechanics scores for both students using Co:Writer and handwriting.

Figure 21.
Co:Writer: Conventions and mechanics scores for Student 1. [Figure: overall score across time points 1-6, Handwritten vs. Co:Writer.]

Figure 22.
Co:Writer: Conventions and mechanics scores for Student 2. [Figure: overall score across time points 1-6, Handwritten vs. Co:Writer.]

Student 1 displayed some variability with regard to conventions and mechanics scores but always performed either better or the same using Co:Writer compared to handwriting. On the other hand, Student 2 displayed quite poor use of conventions and mechanics with both handwriting and Co:Writer, showing no improvement over time.

Meaning. Overall meaning scores for the two Co:Writer students are shown in Figures 23 and 24.

Figure 23.
Co:Writer: Meaning scores for Student 1. [Figure: overall score across time points 1-6, Handwritten vs. Co:Writer.]

Figure 24.
Co:Writer: Meaning scores for Student 2. [Figure: overall score across time points 1-6, Handwritten vs. Co:Writer.]

Although neither student showed improvement over time with regard to raters' impressions of meaning in their stories, Student 1 demonstrated consistently better meaning scores with Co:Writer than handwriting. Scores for Student 2 remained low throughout the study.

Monthly Survey Results: Reported Impact

Data from the monthly on-line survey for the Co:Writer group were analyzed for Y1 (n = 18) only.

Reported effect on participation in academic activities. Teacher reports of the effect of Co:Writer on students' participation in academic activities are displayed in Figure 25.

Figure 25.
Co:Writer: Reported effect on participation in academic activities. [Figure: percent reporting at least a moderate effect across time points 1-4.]

For the Co:Writer group, the percentage of teachers reporting at least a moderate effect ranged from 46-67%, indicating that many students were able to use the technology to enhance their participation in the academic curriculum of their classrooms.

Reported effect on communication. Teacher ratings of the extent to which the TAP technology affected students' ability to communicate with their peers in the classroom are outlined in Figure 26.

Figure 26.
Co:Writer: Reported effect on communication with peers. [Figure: percent reporting at least a moderate effect across time points 1-4.]

In this group, 0-18% of teachers reported at least a moderate effect on students' communication with peers. These results indicate that most students did not use Co:Writer to enhance communication participation.

Reported effect on social interactions. Teachers were also asked to report the extent to which Co:Writer affected students' ability to participate in the social activities of their classrooms. These results are summarized in Figure 27.

Figure 27.
Co:Writer: Reported effect on participation in social activities. [Figure: percent reporting at least a moderate effect across time points 1-4.]

Between 0-18% of teachers reported at least a moderate effect of Co:Writer on their student's participation in social activities. This is not surprising, since word prediction software was not designed to be socially oriented.

Monthly Survey Results: Overall Technology Use

Students' use of technology. A summary of the number of hours per day teachers reported that their students used the TAP technology at school is provided in Table 8.

Table 8.
Number of hours per day of student Co:Writer use at school

Time                       Year 1 (n = 18)
Less than 30 minutes/day   38.9%
30 min - 1 hour/day        5.6%
1-2 hours/day              44.4%
2-3 hours/day              11.1%
More than 3 hours/day      0%

Students were divided into two groups, with one group averaging less than 30 minutes per day and the other averaging 1-2 hours per day of Co:Writer use. No students spent more than 3 hours per day using the technology.

Other literacy components. Respondents were asked to complete a questionnaire (Appendix G) at the beginning of Y2 to report all other components of their Y1 students' literacy programs. A summary of the overall literacy components reported by school staff is provided in Table 9.

Table 9.
Literacy components in common use across Co:Writer students

Component                        Year 1 (n = 18)
Journal writing                  100%
Guided reading                   85%
Teacher read-alouds              77%
Sight word instruction           69%
Self-selected reading            69%
Structured phonics instruction   62%
Interactive writing              62%
Buddy reading                    46%
Other formal reading program     46%
Balanced Literacy software       8%

On average, students were reported to have seven components in their overall literacy programs. Components reported for half or more of the students included journal writing, guided reading, teacher read-alouds, sight word instruction, self-selected reading, structured phonics instruction, and interactive writing.
Few students in this group used Balanced Literacy software, but they were reported to use other formal reading programs, such as those that were school district-specific.

Teacher programming. In the monthly survey, support staff were also asked to report the amount of time they spent programming the software. Their responses are summarized in Table 10.

Table 10.
Number of minutes per day support staff spent programming Co:Writer

Time                       Year 1 (n = 18)
Less than 15 minutes/day   50.0%
15 - 30 min/day            50.0%
30 min - 1 hour/day        0%
More than 1 hour/day       0%

On average, teachers spent 15 minutes per day programming Co:Writer (e.g., writing topic dictionaries). No teachers reported spending more than 30 minutes per day programming the software.

Monthly Survey Results: Barriers

A summary of the most commonly reported barriers to implementing Co:Writer is provided in Table 11.

Table 11.
Percentage of support staff reporting each barrier in implementing Co:Writer

Barrier                                            Y1 (n = 18)
Insufficient programming/planning time             76%
Insufficient training                              67%
Insufficient classroom time devoted to literacy    19%
Computer did not work properly at times            31%
Software did not work properly at times            14%

Insufficient time for planning and programming and insufficient training were the most commonly reported challenges to implementing the technology. Hardware failure was also reported as a barrier by several teachers.

Correlations

Pearson product-moment correlations were calculated to determine the strength of associations between the means of all dependent measures for Y1 and the mean hours of student use, the time teachers spent programming Co:Writer, and the number of literacy components for Y1. These results are summarized in Table 12.

Table 12.
Correlations between Co:Writer dependent variables

Variable                      Mean Number      Mean Percent of      Mean Percent of
                              of Total Words   Correctly Spelled    Correct Word Pairs
                                               Words
Mean hours of student use     .118             .264                 .149
Mean time spent programming   .407             .326                 .357
Mean number of literacy
  components                  -.221            .048                 .178

None of the relationships between the dependent measure means and the mean hours of student use, time teachers spent programming Co:Writer, or number of literacy components were statistically significant.

Summary

This study offers a first look at the effectiveness of assistive technology for students with Down syndrome and provides some preliminary insight into the use of Clicker 5 and Co:Writer for this population. A summary of the major findings is provided in Table 13.

Table 13.
Summary of major findings

Writing rate
  Clicker 5, Y1 (n = 43): No significant change over time.
  Clicker 5, Y1-2 (n = 17): No significant change over time, but upward trend.
  Co:Writer, Y1 (n = 18): No significant change over time; significantly higher with handwriting than with Co:Writer in month 3 only.

Spelling accuracy
  Clicker 5, Y1: Not measured.
  Clicker 5, Y1-2: Not measured.
  Co:Writer, Y1: No significant change over time, but significantly better with Co:Writer than with handwriting.

Quality (percent correct word pairs)
  Clicker 5, Y1: Significant improvement over time.
  Clicker 5, Y1-2: Significant decrease over time.
  Co:Writer, Y1: No significant change over time, but significantly better with Co:Writer than with handwriting.

Quality (conventions and mechanics)
  Clicker 5, Y1: Significant improvement over time.
  Clicker 5, Y1-2: Significant improvement over time.
  Co:Writer, Y1: No significant change over time; significantly better with Co:Writer than with handwriting in month 3 only.

Quality (meaning)
  Clicker 5, Y1: Significant improvement over time.
  Clicker 5, Y1-2: No significant change over time, but upward trend.
  Co:Writer, Y1: No significant change over time; scores slightly but not significantly higher with Co:Writer than with handwriting.

CHAPTER 4
Discussion

This study explored the use of assistive technology for literacy education of students with Down syndrome.
It examined variables such as writing rate, spelling accuracy, writing quality, teacher impressions of the effectiveness of the technology, the number of other literacy program components in the student's curriculum, and the amount of time students and teachers spent using and programming the technology, respectively. Conclusions were based on the analyses of these variables for the Clicker 5 and Co:Writer groups. Conclusions, limitations of the study, implications of the findings, and areas for future research are discussed in this chapter.

Clicker 5 Writing Samples

Rate. In the first year of the study, there was no significant change in the mean number of words students produced per 10 minutes (i.e., writing rate). The small subset of participants included in the second year of data analysis did appear to demonstrate an increased writing rate; however, due to a small sample size and large standard deviation, statistical significance was not achieved. Thus, no firm conclusions can be made about the effects of this software on writing rate. Unfortunately, since there have been no other studies of the impact of symbol-supported writing in this or in other populations, it is not possible to compare these results with those from previous research. One reason for the delayed improvement in writing rate that was observed in the Y1-Y2 students could be that they were at the emergent literacy stage and thus required more time to familiarize themselves with literacy activities in general before acquiring fluency in writing. Perhaps, over a longer period of time, symbol-supported writing technology might be found to result in enhanced writing rate for these beginning writers with Down syndrome. Another potential reason for the delay could be that the students were more concerned with grammar and correctness in the first year than they were in the second year and, in order to write more correctly, they wrote more slowly.
Finally, the grids provided in the first year were quite simple, whereas the grids provided for the study in the second year were slightly more complex so that they would be more motivating for all students. It is possible that students wrote more in the second year simply because they were more motivated to do so. Students may also have benefited from increased scaffolding from the more complex grid.

Quality (grammar and correctness). Students demonstrated improved quality (as measured by the percent of correct word pairs, an analytic measure; and ratings for conventions and mechanics, a holistic measure) in the first year, steadily increasing over time. There was a statistically significant improvement from the beginning to the end of the first year for both of these measures. Based on these results, it seems that continued use of Clicker 5 enhances writing grammar and correctness for students with Down syndrome. In contrast, in the Y1-2 data set, the percent of correct word pairs decreased over time. This difference was statistically significant for percent correct word pairs for two months during Year 2, but not for conventions and mechanics. Nevertheless, the results of both the analytic and holistic measures seem to mirror each other closely. Seemingly, as students produced more writing during Year 2, the quality of their writing was negatively affected. For students who have had little exposure to literacy activities, it is not surprising that they would compromise some quality for quantity, or vice versa. It would be interesting to know if the quality of students' writing improved again after their writing rate became more consistent. Since no previous research has been published about symbol-supported writing, there is no way to know whether this result is typical.
Given these mixed results, it is difficult to draw strong conclusions about the effectiveness of symbol-supported writing for enhancing writing quality, at least as measured by the percent of correct word pairs and by conventions and mechanics.

Quality (meaning). Writing quality based on raters' overall impressions of the meaning of written text showed improvement over time. Nonetheless, as with writing rate, statistical significance was only achieved between two time points (the first and last months of Year 1), despite the fact that the results for Y1-2 students showed an upward trend. Again, the failure to obtain significance is likely a result of the large standard deviation and small sample size for Y1-2. Nonetheless, as students wrote more, their writing samples appeared to become more meaningful and well organized. The results show promise for the use of Clicker 5 in supporting students to produce meaningful text.

Monthly Surveys: Reported Impact

Effect on academic participation, communication, and social interactions. The results of the monthly surveys did not provide strong evidence that Clicker 5 had a significant impact on students' participation in academics. On average, less than half of the support staff surveyed reported at least moderate effects. In some months, a relatively large percentage of staff reported at least a moderate impact, while in other months very few or no teachers reported this effect. Based on teacher reports in the monthly surveys, Clicker 5 had a positive effect on communication and social interactions for some of the students in the study. Teacher anecdotes suggested that some students enjoyed showing their peers how to use the software, and that it helped boost students' self-confidence. Very few teachers (18.6% in Y1, 29.4% in Y1-2) reported that their student was not interested in the technology. It appears from these results that most students found the technology appealing and were motivated to use it.
Monthly Surveys: Overall Technology Use

Student use. The majority of students were reported to use Clicker 5 between 30-60 minutes per day. Although no information was collected regarding the nature of the activities during which the students used Clicker, this indicates that students were not using the software for all of their class work, although they were provided with opportunities for literacy activities on a daily basis. It can be quite time consuming for teachers to program the software, so it should be no surprise that students used it for a limited amount of time each day. Since no significant correlation was found between the time students spent using the technology and any of the dependent measures, it seems that increased use of the software might not have resulted in improved writing, as measured by rate, spelling accuracy, and quality.

Teacher programming. Almost all of the teachers surveyed spent less than 30 minutes per day programming the software or planning lessons related to its use. Many teachers reported that this was not a sufficient amount of time, so it is reasonable to assume that teachers supported the implementation of the technology but simply did not have a great deal of time to spend planning or programming. This suggests the need for additional resources, support, and training in order to allocate additional planning time to enhance software use. Although no significant relationship was found between teacher programming time and students' performance on the dependent variables, very few teachers spent more than 30 minutes per day on programming and/or planning. It is possible that a different result would have been seen if there had been more variability in the amount of time teachers reported.

Literacy components. The majority of students in the present study were reported to participate in a literacy curriculum that included several components in addition to the technology received from SET-BC.
These consisted of a balance of components such as sight word instruction, phonics, reading, and writing activities. Correlational analyses suggested that the number of components in a student’s literacy program was related to the quality of their written output, as measured by the percent of correct word pairs. This suggests that literacy programs that are more balanced and robust can promote more accurate writing, which is congruent with previous findings that rich and varied literacy experiences enhance the acquisition of literacy skills (Erickson et al., 2006).

Monthly Surveys: Barriers

Overall, insufficient programming and/or planning time, insufficient training, and technical difficulties were the most frequently cited barriers to implementing Clicker 5 for both groups of students. Interestingly, in Y2 the number of teachers reporting insufficient training as a barrier more than doubled from Y1. In the second year, teachers new to the project were not offered the same level of training as those in the first year. Sturm et al. (2006) reported similar barriers among primary school teachers whose students were using AAC to enhance literacy learning. This suggests that, in order to gain support from teachers for using assistive technology in literacy curricula, teachers must be provided with adequate time, training, and technical support to implement the software.

Co:Writer Writing Samples

Rate. Similar to the results reported by Mirenda et al. (2006) and MacArthur (1998), word prediction software did not result in improved writing rates for the students in this study. While students in Y1 consistently wrote more using handwriting than Co:Writer, this difference was only statistically significant in the third month. Writing rates for both of the Co:Writer students in Y2 were variable, with no pattern emerging with regard to rate for samples using either handwriting or word prediction.
One possible reason for this result is that each time a student wishes to type a word with word prediction software, there are a number of steps that must be executed. First, the student must decide whether or not to use the list of predicted words. Next, if the student chooses to use the list, he or she must search the list for the desired word. If the word is on the list, the student must push the corresponding number to input the word into the text being composed; and if the word is not on the list, the student must type the next letter of the word and repeat the search. Thus, in the end, it is likely that the cognitive effort required to use word prediction software counterbalances the amount of time saved by using fewer keystrokes (Mirenda et al., 2006).

Spelling accuracy. In the first year of the study, students demonstrated better spelling accuracy with Co:Writer than with handwriting in all four months (the difference was statistically significant in three of these months). Similar results were obtained for the two students in Y2 of the study. Student 1 spelled better using Co:Writer in five of the six sets of writing samples, while Student 2 spelled better using Co:Writer in three of the six sets of samples. However, the writing rate was quite low for Student 2, so the percent of spelling errors may have been slightly inflated. Overall, the results for the percent of correctly spelled words support the hypothesis that the use of word prediction software improves spelling. These results are consistent with previous research that has found that word prediction software can result in improvements in spelling accuracy for students with a wide range of disabilities. MacArthur (1998) noted increases in the percent of correctly spelled words for students with learning disabilities, and Mirenda et al. (2006) observed higher mean percentages of correctly spelled words using word prediction software compared to handwriting for students with physical disabilities.
Quality (grammar and correctness). Improvements in writing quality as measured by both the percent of correct word pairs and ratings for conventions and mechanics were also observed. Students wrote more correctly using Co:Writer than handwriting in all four months of Y1, although this difference was only significant in three of the four months for the percent of correct word pairs and in one of the four months for conventions and mechanics. The lack of statistical significance for conventions and mechanics may be due to the broad nature of the three-point scale that was used to rate this variable, making it difficult to assess minor differences in quality. The Y2 results for Student 1 also demonstrate improved quality with Co:Writer compared to handwriting, while the results for Student 2 were quite varied and showed no consistent pattern. However, the results from Y1 support previous research that found that word prediction software enhances writing quality for individuals with physical disabilities (Mirenda et al., 2006). One possible explanation for this result is that Co:Writer’s linguistic prediction feature (which uses linguistic and grammatical rules to predict the next word in a sentence) provides helpful cues to students about the most logical next word in their composition. This allows students using word prediction software to produce written output that is more grammatically correct than they might have produced using handwriting.

Quality (meaning). The results from the holistic meaning scores suggest that word prediction software did not improve the quality of students’ writing sample content. One possible reason for this is that students were asked to write both the handwritten and Co:Writer samples about the same topic, so the samples were often quite similar in meaning. Another reason for the lack of significance in meaning scores could be the length of the writing samples.
Students did not write a great deal in the 10 minutes allotted for each sample; perhaps if their stories had been longer, a more noticeable difference in quality would have been evident. Finally, it is also likely that students who benefit from word prediction software for spelling accuracy and grammar already have sufficient ability to construct a meaningful story, so this aspect of their writing was not improved with Co:Writer use.

Monthly Surveys: Reported Impact

Effect on academic participation, communication, and social interactions. Approximately half of the support staff believed that Co:Writer had at least a moderate effect on their student’s participation in academics. This is not unexpected, since Co:Writer can be used to complete any assignment requiring written output. It is somewhat surprising, however, that more teachers did not report at least a moderate effect of Co:Writer on academics; 38.9% of students reportedly used the technology for 15 minutes or less per day, which likely minimized its impact on academics. The responses from the surveys indicate that Co:Writer did not enhance either communication or social interactions between students with Down syndrome and their peers. Since students used the technology exclusively for writing, improved communication and/or social interaction with peers were not expected. However, it would be interesting to know if peer attitudes toward Co:Writer students changed for the better as a result of their use of this software.

Monthly Surveys: Overall Technology Use

Student use. There appeared to be two groups of students in the Co:Writer group: one group used the technology very little (i.e., 15 minutes or less per day), while the other group used it between 1-2 hours per day. It was clear from the handwritten samples that some students wrote quite proficiently without Co:Writer and may not have needed to use the software as often as those who relied on Co:Writer for the majority of their written output.
Regardless, there was no correlation between the amount of time students spent using Co:Writer and writing rate, spelling accuracy, or writing quality.

Teacher programming. No teachers supporting students in the Co:Writer group reported spending more than 30 minutes per day programming the software or planning lessons related to its use. As with the Clicker 5 group, many teachers reported that the time available to them for programming was insufficient, so it is reasonable to presume that this barrier is related to a lack of resources rather than to teachers’ unwillingness to spend more time on planning. Teachers may require additional resources, support, and training in order to be able to implement the technology successfully.

Literacy components. According to teacher reports, students in this group participated in a wide variety of literacy activities such as journal writing, guided reading, and teacher read-alouds, in addition to using Co:Writer. This is encouraging because such balanced literacy environments set the stage for the acquisition of reading and writing beyond emergent skills (Erickson et al., 2006).

Monthly Surveys: Barriers

Similar to the Clicker 5 group, the most frequently reported barriers to implementing Co:Writer were insufficient programming and/or planning time, insufficient training, and technical difficulties. This result is consistent with previous research findings (Lund et al., 2000; Sturm et al., 2006) that have noted common barriers related to teacher time and training. This suggests that teachers implementing assistive technology benefit from extra time, training, and technical support.

Implications

Regarding symbol-supported writing software (i.e., Clicker 5), this study did not find significant improvements in the analytic dependent measures used to assess the written output of students with Down syndrome. However, positive outcomes in holistic measures were found for the larger sample of students in Y1.
These included improvements in sentence structure, punctuation, capitalization, word choice, organization, and topic development. In addition, based on teacher reports, the majority of Y1-2 students accumulated approximately 225 hours of writing practice using Clicker 5 across both school years (assuming that they used it between 30-60 minutes per day, 5 days per week, over 30 weeks in the school year). Virtually none of the students who used Clicker 5 in this study had handwriting skills that were adequate for text composition; hence, it is unlikely that they would have accumulated anything near the 225 hours of practice had Clicker 5 not been available to them. This level of involvement in written activities is likely to yield positive benefits in the long term with regard to both students’ overall literacy acquisition and enhanced motivation to produce written text (Downing, 2005). In order for Clicker 5 to have a substantial effect, teachers must have sufficient time to design or adapt symbol grids that support the ongoing classroom curriculum, which may not always be possible. In addition, if teachers do not feel comfortable with the technology, it may be difficult for them to have their students use it. In spite of these potential barriers, anecdotal reports of students’ motivation to use the software were encouraging. For example, two teachers noted high motivation and student self-confidence because their students enjoyed using the technology and sharing their work with peers. Another teacher said that the student for whom she was responsible displayed a great deal of excitement in relation to using the technology. When this student was asked about his use of the computer, he responded: “It helps me learn stuff.” In addition, several teachers reported that their students were engaged and motivated to read and write with their computers.
Regarding word prediction software (i.e., Co:Writer), both analytic and holistic measures provide support for its use by students with Down syndrome who are able to write by hand but for whom spelling and writing quality (i.e., grammar and correctness) are issues. Almost half of the students in the Co:Writer group reportedly spent between 1-2 hours per day using the software. Over the course of one school year, this could add up to more than 225 hours of written output. For students who were reported to use Co:Writer less than 30 minutes per day, this would result in up to 75 hours of writing. Regardless, access to Co:Writer provided important opportunities for these students to write on a regular basis.

Limitations and Future Research

There are several limitations to consider when interpreting the results of this study. First, although specific instructions were provided to teachers (Appendices E & H) regarding data collection, no measurement was made of treatment integrity. Thus, it is possible that some samples were collected in ways that were different from those specified in the directions. Second, no data were collected about students’ writing abilities before the project began, so comparisons cannot be made between their literacy skills with and without the software. Third, no data were collected about students’ previous experience with computers. It is possible that some students performed better than others because they were more “computer-savvy” from the outset. Fourth, Clicker 5 grids in Y2 differed from those in Y1. In Y1, all of the grids were designed using the same formula and using very similar wording every month. While a similar formula was used in Y2, the wording was more variable because some teachers reported that the grids were becoming too easy for their students. As noted previously, this provides a potential explanation for the increase in the percent of correct word pairs in Y1, followed by a decrease in Y2.
Fifth, no information was gathered about the nature of students’ literacy programs before receiving the TAP technology. Knowing how much time students devoted to literacy activities and the nature of these activities prior to the study would have allowed for more in-depth analysis of the impact of the assistive technology on these students’ literacy programs. Sixth, the sample size decreased in Y2 for both groups, limiting the power that was available for data analysis. While the Y1-2 subset appeared to be comparable to the Y1 group based on demographic and outcome data, caution must be used in interpreting the Y2 data. Finally, no control group was used in this study, so it is not clear how much progress might have occurred for these students without the use of assistive technology. Future research is needed to correct for these limitations and to establish more solid conclusions about the effectiveness of both symbol-supported and word prediction software programs on the writing ability of students with Down syndrome. Another important avenue for future research is to address the extent to which assistive technology designed to enhance written output impacts reading skills for students with disabilities.

Conclusion

Clicker 5 appears to show some promise for students with Down syndrome in augmenting their ability to produce meaningful written output, a useful skill for any student. Co:Writer appears to be effective at enhancing spelling accuracy and writing quality compared to handwriting for students in this population. Accordingly, word prediction provides a useful tool for students with Down syndrome to compose written work of higher quality. Teacher reports about the impact of the technology were positive in terms of students’ participation in academic activities for both the Co:Writer and the Clicker 5 groups, and for communication and social interactions with peers for the Clicker 5 group.
Furthermore, students were generally interested in the technology and motivated to use it. Overall, reported barriers to implementing the technology were related to teacher time and training. This suggests that in order to achieve optimal results with these technologies, additional support for teachers may be needed. The results from this study encourage further examination of literacy strategies for these students, promoting this important skill among all individuals with Down syndrome.

References

BC Ministry of Education. (2007). Special education services: A manual of policies, procedures and guidelines. Retrieved September 12, 2007, from http://www.bced.gov.bc.ca/specialed/awareness/43.htm
Beukelman, D.R., & Mirenda, P. (2005). Augmentative & alternative communication: Supporting children and adults with complex communication needs, 3rd ed. Baltimore: Paul H. Brookes.
Boudreau, D. (2002). Literacy skills in children and adolescents with Down syndrome. Reading and Writing: An Interdisciplinary Journal, 15, 497-525.
Browder, D.M., Courtade-Little, G., Wakeman, S., & Rickelman, R.J. (2006). From sight words to emerging literacy. In D.M. Browder, & F. Spooner (Eds.), Teaching language arts, math, & science to students with significant cognitive disabilities (pp. 63-91). Baltimore: Paul H. Brookes.
Browder, D.M., Wakeman, S.Y., Spooner, F., Ahlgrim-Delzell, L., & Algozzine, B. (2006). Research on reading instruction for individuals with significant cognitive disabilities. Exceptional Children, 72, 392-408.
Browder, D.M., & Xin, Y.P. (1998). A meta-analysis and review of sight word research and its implications for teaching functional reading to individuals with moderate and severe disabilities. Journal of Special Education, 32, 130-153.
Buckley, S. (1995). Teaching children with Down syndrome to read and write. In L. Nadel, & D. Rosenthal (Eds.), Down syndrome living and learning in the community (pp. 158-169). New York, NY: Wiley-Liss.
Byrne, A., MacDonald, J., & Buckley, S. (2002). Reading, language, and memory skills: A comparative longitudinal study of children with Down syndrome and their mainstream peers. British Journal of Educational Psychology, 72, 513-529.
Colak, A., & Uzuner, Y. (2004). Special education teachers and literacy acquisition in children with mental retardation: A semi-structured interview. Educational Sciences: Theory & Practice, 4, 264-270.
Crick Software. (1993-2006). Clicker 5 (version 5.20) [Computer software]. Northampton, UK: Author.
Don Johnston, Inc. (2001). Co:Writer 4000 (version 4.0) [Computer software]. Wauconda, IL: Author.
Downing, J.E. (2006). Building literacy for students at the presymbolic and early symbolic levels. In D.M. Browder, & F. Spooner (Eds.), Teaching language arts, math, & science to students with significant cognitive disabilities (pp. 39-61). Baltimore: Paul H. Brookes.
Downing, J.E. (2005). Teaching literacy to students with significant disabilities. Thousand Oaks, CA: Corwin Press.
Dykens, E.M., Hodapp, R.M., & Finucane, B.M. (2000). Genetics and mental retardation syndromes: A new look at behavior and interventions. Baltimore: Paul H. Brookes.
Erickson, K.A., Koppenhaver, D.A., & Cunningham, J.W. (2006). Balanced reading intervention and assessment in augmentative communication. In R.J. McCauley, & M.E. Fey (Eds.), Treatment of language disorders in children (pp. 309-345). Baltimore: Paul H. Brookes.
Fletcher, H., & Buckley, S. (2002). Phonological awareness in children with Down syndrome. Down Syndrome Research and Practice, 8(1), 11-18.
Fossett, B. (2003). Sight word reading in children with developmental disabilities using picture communication symbols: A comparative study. Unpublished master’s thesis, University of British Columbia, Vancouver, British Columbia, Canada.
Fossett, B., & Mirenda, P. (2006).
Sight word reading in children with developmental disabilities: A comparison of paired associate and picture-to-text matching instruction. Research in Developmental Disabilities, 27, 411-429.
Gallaher, K.M., van Kraayenoord, C.E., Jobling, A., & Moni, K.B. (2002). Reading with Abby: A case study of individual tutoring with a young adult with Down syndrome. Down Syndrome Research and Practice, 8, 59-66.
Hasbrouck, J., Tindal, G., & Parker, R. (1994). Objective procedures for scoring students’ writing. Teaching Exceptional Children, 26(2), 18-22.
Hedrick, W.B., Katims, D.S., & Carr, N.J. (1999). Implementing a multimethod, multilevel literacy program for students with mental retardation. Focus on Autism and Other Developmental Disabilities, 14, 231-239.
Huttinger, P.L., Bell, C., Daytner, G., & Johanson, J. (2006). Establishing and maintaining an early childhood emergent literacy technology curriculum. Journal of Special Education Technology, 21, 39-54.
Kaderavek, J.N., & Rabidoux, P. (2004). Interactive to independent literacy: A model for designing literacy goals for children with atypical communication. Reading & Writing Quarterly, 20, 237-260.
Kemp, C., & Carter, M. (2006). The contribution of academic skills to the successful inclusion of children with disabilities. Journal of Developmental and Physical Disabilities, 18, 123-147.
Kennedy, E.J., & Flynn, M.C. (2003). Training phonological awareness skills in children with Down syndrome. Research in Developmental Disabilities, 24, 44-57.
Kliewer, C., Biklen, D., & Kasa-Hendrickson, C. (2006). Who may be literate? Disability and resistance to the cultural denial of competence. American Educational Research Journal, 43, 163-192.
Koppenhaver, D.A. (2000). Literacy in AAC: What should be written on the envelope we push? AAC: Augmentative and Alternative Communication, 16, 270-279.
Koppenhaver, D.A., Hendrix, M.P., & Williams, A.R. (2007).
Toward evidence-based literacy interventions for children with severe and multiple disabilities. Seminars in Speech and Language, 28, 79-90.
Lange, A.A., McPhillips, M., Mulhern, G., & Wylie, J. (2006). Assistive software tools for secondary-level students with literacy difficulties. Journal of Special Education Technology, 21(3), 13-22.
Lloyd, J., Moni, K.B., & Jobling, A. (2005). Breaking the hype cycle: Using the computer effectively with learners with intellectual disabilities. Down Syndrome Research and Practice, 9(3), 1-7.
Lund, D., Angell, V., & Atwood, K. (2000, July). Integrating computer technology into a balanced literacy approach: Beyond interactive books and word processors. Paper presented at the 18th World Congress on Reading of the International Reading Association.
MacArthur, C.A. (1998). Word processing with speech synthesis and word prediction: Effects on the dialogue journal writing of students with learning disabilities. Learning Disability Quarterly, 21, 151-166.
Mirenda, P., Turoldo, K., & McAvoy, C. (2006). The impact of word prediction software on the written output of students with physical disabilities. Journal of Special Education Technology, 21(3), 5-12.
Moni, K.B., & Jobling, A. (2001). Reading-related literacy learning of young adults with Down syndrome: Findings from a three year teaching and research program. International Journal of Disability, Development and Education, 48, 377-394.
Needels, M.C., & Knapp, M.S. (1994). Teaching writing to children who are underserved. Journal of Educational Psychology, 86, 339-349.
Newell, A.F., Arnott, J.L., Booth, L., Beattie, W., Brophy, B., & Ricketts, I.W. (1992). Effect of the “PAL” word prediction system on the quality and quantity of text generation. Augmentative and Alternative Communication, 8, 304-311.
Oelwein, P.L. (1995). Teaching reading to children with Down syndrome: A guide for parents and teachers. Bethesda, MD: Woodbine House.
SET-BC. (2006). Annual Report 2005-06.
Retrieved October 23, 2007, from http://www.setbc.org/Download/Public/0506_Annual_Report.pdf
Smith, M. (2005). Literacy and augmentative and alternative communication. Burlington, MA: Elsevier.
Sturm, J., Spadorcia, S.A., Cunningham, J.W., Cali, K.S., Staples, A., Erickson, K., et al. (2006). What happens to reading between first and third grade? Implications for students who use AAC. Augmentative and Alternative Communication, 22, 21-36.
Trenholm, B., & Mirenda, P. (2006). Home and community literacy experiences of individuals with Down syndrome. Down Syndrome Research and Practice, 10, 30-40.
Tumlin, J., & Heller, K.W. (2004). Using word prediction software to increase typing fluency with students with physical disabilities. Journal of Special Education Technology, 19(3), 5-14.
Wepner, S.B., & Bowes, K.A. (2004). Using assistive technology for literacy development. Reading and Writing Quarterly, 20, 219-223.
Widgit Software. (2000). Writing With Symbols 2000 [Computer software]. Leamington Spa, UK: Author.
Zhang, Y. (2000). Technology and the writing skills of students with learning disabilities. Journal of Research on Computing in Education, 32, 467-478.

Appendix A
UBC Ethics Approval

Appendix B
Letter of Support from SET-BC

October, 2006

Dear Parents/Guardians:

In this package, you will find a letter describing a research project being done through SET-BC in conjunction with the University of British Columbia, along with a request for your son or daughter to participate. This study will help us understand how to meet the needs of students with Down syndrome and other intellectual disabilities who use assistive technology to participate in schools. We would like to get this project underway as quickly as possible so that we can use the information to plan future SET-BC supports for students who need them. Please read the enclosed letter carefully, as it explains what your child will be asked to do and what will be done with the information.
We would appreciate it if you could mail the permission slip in the enclosed stamped self-addressed envelope within one week. Thank you in advance for helping SET-BC to make school an even better place to be for all students.

Yours Sincerely,

Mike Bartlett
Provincial Coordinator, SET-BC

Appendix C
Year One Letter of Invitation and Consent Form

October, 2006

Dear Parent/Guardian:

In collaboration with the staff of Special Education Technology-BC (SET-BC), I am conducting a research project entitled “Effectiveness of Assistive Technology for Students with Down Syndrome” in schools throughout the province. Your child has been selected for participation in this study because he or she is part of the SET-BC Technology Access Project (TAP) during the 2006-2007 school year and has received computer technology to support communication and/or literacy development. I am writing to ask your permission for your child’s participation in a research study associated with TAP. I have described below several aspects of this project that you need to know.

Purpose: The purpose of this project is to collect information about the effectiveness of assistive technology for supporting communication and/or literacy development in students with Down syndrome. All children in BC who are part of the SET-BC Technology Access Project are invited to participate. We hope that the results of this study will help SET-BC and educators to understand how technology can be used to support learning in students with Down syndrome. This project is funded by SET-BC.

Study Procedures: Once a month during the 2006-2007 school year, your child will be asked to use his/her TAP technology to produce a writing or language sample at school. Some students will write only one 10-minute story using a computer. Others will write two 10-minute stories, one using a pencil and paper and a second using a computer.
Still other students will use their computers to talk to a teacher for 10 minutes about a specific topic. The type of sample your child produces will depend on the TAP software he or she received at school. The total time required for your child to produce the writing or language sample will be between 15-30 minutes each month. In addition, your child’s teacher will be asked to complete a short on-line form each month to answer questions about how often your child used the TAP technology at school and how it affected his/her learning and social interactions with peers. The names of all children who participate in the study will be entered into a monthly draw for an educational software program that can be used at home or at school.

Confidentiality: All children will be assigned a code number that will be used instead of their names on all of the writing and language samples. All of the samples will be completely confidential and will not be available to teachers, parents, or other school personnel. No specific child will be referred to by name or identified in any way in a report of the results. All information will be kept in a locked file cabinet at UBC and on password-protected computers at UBC and SET-BC.

Contact: If you have any questions, please do not hesitate to call me at (604) 822-6296. If you have any concerns about the treatment of your child or his/her rights as a research subject, you may contact the Research Subject Information Line in the Office of Research Services at the University of British Columbia, (604) 822-8598.

Participation in this study is entirely voluntary. You may refuse to have your child participate or may withdraw your child from the study at any time, even after signing this consent form. We will also ask your child each month if he or she wants to participate in the writing or language sample and will respect his/her response.
Refusing to participate or withdrawal from the study will not jeopardize your child’s education or his/her participation in the SET-BC Technology Access Project in any way.

Please indicate on the consent form provided on the next page whether or not your son/daughter has permission to participate. Please sign and date the form and return it in the enclosed stamped, self-addressed envelope within the next 7 days, regardless of whether or not you give consent.

Thank you very much for considering this request.

Sincerely,

Pat Mirenda, Ph.D.
Professor

PARENT CONSENT FORM

EFFECTIVENESS OF ASSISTIVE TECHNOLOGY FOR STUDENTS WITH DOWN SYNDROME
Principal Investigator: Pat Mirenda, Ph.D., Professor
Department of Educational and Counseling Psychology & Special Education
University of British Columbia
2125 Main Mall
Vancouver, B.C. V6T 1Z4
(604) 822-6296
------------------------------------------------------------------------------------------
(KEEP THIS PORTION FOR YOUR RECORDS)

I have read and understand the attached letter regarding the study entitled “Effectiveness of Assistive Technology for Students with Down Syndrome.” I have also kept copies of both the letter describing the study and this permission slip.

_____ Yes, my son/daughter has my permission to participate.

_____ No, my son/daughter does not have my permission to participate.

Parent’s Signature     School:
Son or Daughter’s Name (please print)     Date

(CUT HERE AND RETURN THE FORM BELOW IN THE ENCLOSED ENVELOPE)

I have read and understand the attached letter regarding the study entitled “Effectiveness of Assistive Technology for Students with Down Syndrome.” I have also kept copies of both the letter describing the study and this permission slip.

_____ Yes, my son/daughter has my permission to participate.
_____ No, my son/daughter does not have my permission to participate.

Parent’s Signature     School:
Son or Daughter’s Name (please print)     Date

Appendix D
Year Two Letter of Invitation and Consent Form

September, 2007

Dear Parent/Guardian:

In the 2006-07 school year, you consented to your child’s involvement in a UBC research project entitled “Effectiveness of Assistive Technology for Students with Down Syndrome.” Your child received computer technology to support literacy development through SET-BC in conjunction with this project. I want to thank you for allowing your child to be involved in this study. We are currently finishing data collection and will be analyzing the results over the summer months. You will receive a summary report of the research outcomes once this has been completed.

We would like to continue this research for one additional school year with the children who received SET-BC technology to support literacy development, in order to see if they continue to make progress. In addition, Joanne McCartney, a graduate student at UBC, would like to use the data collected for the study for her thesis. I have described below several aspects of the second year of this project that you need to know.

Purpose: The purpose of the second year of this project is to collect additional information about the effectiveness of assistive technology for supporting literacy development in students with Down syndrome. All children who were part of the SET-BC Technology Access Project in 2006-2007 are invited to participate. We hope that the results of this study will help SET-BC and educators to understand how technology can be used to support literacy learning in students with Down syndrome. This project is funded by SET-BC.

Study Procedures: Once a month during the 2007-2008 school year, your child will be asked to use his/her assistive technology to produce a writing sample at school. Some students will write only one 10-minute story using a computer.
Others will write two 10-minute stories, one using a pencil and paper and a second using a computer. The type of sample your child produces will depend on the software he or she received at school. The total time required for your child to produce the writing sample will be between 10-15 minutes each month. In addition, your child’s teacher will be asked to complete a short on-line form each month to answer questions about how often your child used the technology at school and how it affected his/her learning and social interactions with peers.

Confidentiality: All children will be assigned a code number that will be used instead of their names on all of the writing samples. All of the samples will be completely confidential and will not be available to teachers, parents, or other school personnel. No specific child will be referred to by name or identified in any way in a report of the results. All information will be kept in a locked file cabinet at UBC and on password-protected computers at UBC and SET-BC.

Contact: If you have any questions, please do not hesitate to call me at (604) 822-6296. If you have any concerns about the treatment of your child or his/her rights as a research subject, you may contact the Research Subject Information Line in the Office of Research Services at the University of British Columbia, (604) 822-8598.

Participation in this study is entirely voluntary. You may refuse to have your child participate or may withdraw your child from the study at any time, even after signing this consent form. We will also ask your child each month if he or she wants to participate in the writing sample and will respect his/her response. Refusing to participate or withdrawal from the study will not jeopardize your child's relationship with UBC, his or her school, or the supports provided to him or her by SET-BC in any way.
Please indicate on the consent form provided on the next page whether or not your son/daughter has permission to participate. Please sign and date the form and return it in the enclosed stamped, self-addressed envelope within the next 7 days, regardless of whether or not you give consent.

Thank you very much for considering this request.

Sincerely,

Pat Mirenda, Ph.D.
Professor

PARENT CONSENT FORM

EFFECTIVENESS OF ASSISTIVE TECHNOLOGY FOR STUDENTS WITH DOWN SYNDROME, 2007-2008
Pat Mirenda, Ph.D., Professor, Principal Investigator
Joanne McCartney, Graduate Student (Masters), Co-Investigator
Department of Educational and Counseling Psychology & Special Education
University of British Columbia
2125 Main Mall
Vancouver, B.C. V6T 1Z4
(604) 822-6296

------------------------------------------------------------------------------------------
(KEEP THIS PORTION FOR YOUR RECORDS)

I have read and understand the attached letter regarding the study entitled “Effectiveness of Assistive Technology for Students with Down Syndrome.” I have also kept copies of both the letter describing the study and this permission slip.

_____ Yes, my son/daughter has my permission to participate during the 2007-2008 school year.

_____ No, my son/daughter does not have my permission to participate during the 2007-2008 school year.

Parent's Signature     School:

Son or Daughter's Name (please print)     Date

CUT HERE AND RETURN THE FORM BELOW IN THE ENCLOSED ENVELOPE

I have read and understand the attached letter regarding the study entitled “Effectiveness of Assistive Technology for Students with Down Syndrome.” I have also kept copies of both the letter describing the study and this permission slip.

_____ Yes, my son/daughter has my permission to participate during the 2007-2008 school year.
_____ No, my son/daughter does not have my permission to participate during the 2007-2008 school year.

Parent's Signature     School:

Son or Daughter's Name (please print)     Date

Appendix E
Year One Directions for Data Collection

SET-BC TAP Project
BUNDLE 1: Early/Emergent Literacy

Dear SET-BC Support Person: Please read these instructions carefully and thoroughly before collecting the monthly writing sample from the student.

Step 1: Getting Ready

a) Check the TAP section of the SET-BC website for the Clicker grid for this month’s writing sample. Save it onto a memory stick and then download it to the student’s computer. PLEASE HAVE THE STUDENT USE THIS GRID ONLY.

Step 2: Introduction to the Student

Say or read this to the student each month before collecting the writing sample. It’s okay to use your own words, as long as the student understands that he or she can decide whether or not to write the story:

“Your (parents/guardians) signed a letter saying that it’s okay for you to be part of a project about how computers help kids to write better. If you want to be part of the project, we’ll look at the story about _________ (monthly topic) together. Then, you’ll write a story about ________ (topic) on the computer. You can write for as long as you want. If you think you want to write and then change your mind, that’s okay – we’ll stop if you say so. Do you want to write a story now?”

The student must agree to participate in order to continue. If the student agrees, go on to Step 3. If the student refuses, discontinue the session for the day. Conduct the “Introduction to the Student” again 1-4 days later. If the student agrees after the second invitation, go on to Step 3. If the student refuses after the second invitation, send an e-mail to Joanne McCartney at UBC-TAP (tap@interchange.ubc.ca) with the student’s name, school, the dates you attempted the sample, and the message STUDENT DID NOT AGREE TO PARTICIPATE.
Step 3: Conduct the Practice Writing Activity and Collect the Writing Sample

a) Make sure the student is positioned optimally to write with the TAP computer and software, in a quiet, non-distracting environment.

b) Open the Clicker grid for this month. You will see that the first part of the file is a practice writing activity on the month’s topic. Follow the instructions, using the red “information” buttons. You may assist the student to complete the practice writing activity.

c) At the end of the practice writing activity, you will see a page that says “Research Writing Sample”. The Clicker grids for the writing sample follow this page.

d) Ask the student to write a story about the monthly topic, using ONLY the Clicker grids provided, NOT the word bank or other Clicker features. The student should write for as long as he/she wants about the topic. Please note the times he/she starts and ends the writing sample and record them after you print it out (START TIME: _______; END TIME: _____).

e) PLEASE DO NOT TELL THE STUDENT WHAT TO WRITE OR PROVIDE ASSISTANCE WITH WORD ORDER – the sample should represent the student’s writing ability, not yours! You can help with navigation around the Clicker grid if you need to, but not with content.

f) When the student indicates that his/her story is finished, please save it, print it, write the student’s name on it along with the date and start time/stop time, and mail it to:

Pat Mirenda, UBC, 2125 Main Mall, Vancouver, BC V6T 1Z4

PLEASE MAIL THE SAMPLE IMMEDIATELY! Thank you for your assistance!!!!

BUNDLE 2: Beginning Writing

Dear SET-BC Support Person: Please read these instructions carefully and thoroughly before collecting the monthly writing samples from the student.

Step 1: Getting Ready

a) Check the TAP section of the SET-BC website for the topic of this month’s writing sample.
PLEASE HAVE THE STUDENT WRITE ON THIS TOPIC ONLY.

Step 2: Introduction to the Student

Say or read this to the student each month before collecting the monthly writing sample. It’s okay to use your own words, as long as the student understands that he or she can decide whether or not to write the story:

“Your (parents/guardians) signed a letter saying that it’s okay for you to be part of a project about how computers help kids to write better. If you want to be part of the project, the first thing you’ll do is write a short story with a pencil and paper about _______________ (topic for the month). Then, you’ll write another story about _______________ (topic for the month) on your computer. If you think you want to help with this project and then change your mind, that’s okay – we’ll stop if you say so. Do you want to help with this project?”

The student must agree to participate in order to continue. If the student agrees, go on to Step 3. If the student refuses, discontinue the session for that day. Conduct the “Introduction to the Student” again 1-4 days later. If the student agrees after the second invitation, go on to Step 3. If the student refuses after the second invitation, send an email to Joanne McCartney (tap@interchange.ubc.ca) at UBC with the student’s name, school, the dates you attempted the sample, and the message STUDENT DID NOT AGREE TO PARTICIPATE.

Step 3: Hand-Written Story

a) Make sure the student is positioned optimally to write and is in a quiet, non-distracting environment. Give the student a paper and a writing implement that is appropriate for him or her to use (pencil, pen, marker, etc.).

b) Say: “Now, I need you to try to write a story about _______________ (topic for the month). Try to write for 10 minutes, but if you get tired and have to stop sooner, that’s okay. Do the best you can, and don’t worry if it’s hard to read or if you spell words wrong – this isn’t a test and you won’t get any grades on it!
No one will see the story you write except me and the people who are doing this project. You can start writing when you’re ready.”

c) Give the student 10 minutes to write; set a timer so that you know when this time is completed. PLEASE DO NOT HELP THE STUDENT WHILE HE OR SHE IS WRITING, UNLESS ABSOLUTELY NECESSARY. The writing sample should represent the student’s writing ability, not yours! After 10 minutes (or less, if the student indicates that he or she needs to stop writing), collect the story and label it “Hand-written Sample by (Student’s Name) on (Date) after (# of Minutes) of Writing.” Thank the student for his or her work and give him/her a short break.

Step 4: Computer Story

a) This story can be written on either the same day or on another day in the same week as the hand-written story. Make sure the student is positioned optimally to write with the TAP computer and software, in a quiet, non-distracting environment.

b) DO NOT give the student access to a special Topic Dictionary in Co:Writer. Please use the Intermediate Dictionary. Please turn on BOTH the Predict Ahead and FlexSpell features in Co:Writer before the student begins to write.

c) Say: “This time, I want you to write a story on the computer about ____________ (topic for the month). I’d like you to write for 10 minutes, but if you get tired and have to stop sooner, that’s okay. Do the best you can, and don’t worry if you spell words wrong. Remember, this isn’t a test, so you won’t be graded on it, and no one will see the story you write except me and the people who are doing this project. You can start writing when you’re ready.”

d) Give the student 10 minutes to write; set a timer so that you know when this time is completed. PLEASE DO NOT HELP THE STUDENT WHILE HE OR SHE IS WRITING, UNLESS ABSOLUTELY NECESSARY. The writing sample should represent the student’s writing ability, not yours!
After 10 minutes (or less, if the student indicates that he or she needs to stop writing), collect the story and label it “Computer Sample by (Student’s Name) on (Date) after (# of Minutes) of Writing.” Thank the student for his or her work and end the session.

Please mail both writing samples to Pat Mirenda, UBC, 2125 Main Mall, Vancouver BC V6T 1Z4.

PLEASE MAIL THE SAMPLES IMMEDIATELY! Thank you for your assistance!!!!

Appendix F
Year One On-Line Survey

University of British Columbia and SET-BC TAP Project
Monthly Data Collection Form

Student’s Code Number:
Name of person completing this form:
Position of person completing this form (e.g., resource teacher, special education teacher, etc.):
Today’s date:

1. Which technology bundle did this student receive through TAP? (check one)

 Early/Emergent Literacy (Clicker 5 with Picture Communication Symbols and Balanced Literacy)
 Writing (Clicker 5 with Cloze Pro and Co:Writer)
 Communication (Speaking Dynamically Pro/Boardmaker with Functionally Speaking)

2. Over the past month, on average, how many hours per day did the student use TAP technology at school?

 Less than 30 minutes/day
 30 min-1 hour/day
 1-2 hours/day
 2-3 hours/day
 3-4 hours/day
 4-5 hours/day
 more than 5 hours/day

Please state the educational goals related to the TAP technology and rate this student’s progress with regard to each goal, over the past month:

3. Goal #1:
 no progress (1)
 minimal progress (2)
 moderate progress (3)
 substantial progress (4)
 student has met the goal (5)

4. Goal #2:
 no progress (1)
 minimal progress (2)
 moderate progress (3)
 substantial progress (4)
 student has met the goal (5)

5. Goal #3:
 no progress (1)
 minimal progress (2)
 moderate progress (3)
 substantial progress (4)
 student has met the goal (5)

6. Goal #4:
 no progress (1)
 minimal progress (2)
 moderate progress (3)
 substantial progress (4)
 student has met the goal (5)

7.
To what extent did the TAP technology affect/enhance this student’s ability to participate in the academic activities/lessons of his/her classroom?

 no effect (1)
 minimal effect (2)
 moderate effect (3)
 substantial effect (4)
 exceptional effect (5)

8. To what extent did the TAP technology affect/enhance this student’s ability to communicate with his/her peers in the classroom?

 no effect (1)
 minimal effect (2)
 moderate effect (3)
 substantial effect (4)
 exceptional effect (5)

9. To what extent did the TAP technology affect/enhance this student’s ability to participate in the social activities of his/her classroom (e.g., recess, other social opportunities)?

 no effect (1)
 minimal effect (2)
 moderate effect (3)
 substantial effect (4)
 exceptional effect (5)

10. Over the past month, how many hours per day on average did teachers and other staff spend programming and planning student use of TAP technology?

 Less than 15 minutes/day
 15-30 minutes/day
 30 minutes-1 hour/day
 1-2 hours/day
 more than 2 hours/day

11. Over the past month, what challenges did the educational team face in implementing the TAP technology?

 Insufficient planning/programming time for staff
 Insufficient training on use of the technology by staff
 Insufficient classroom time devoted to literacy/writing/communication instruction
 Hardware (computer) did not work properly and required repair
 Software did not work properly and required replacement
 Student was not interested in the technology
 Student was frequently absent from school because of illness/other reasons
 Other challenges (describe):

Thank you for completing this report. Please click Save to submit it on-line as a password-protected file.
Appendix G
Overall Literacy Program Survey for the 2006-07 School Year

University of British Columbia and SET-BC Technology Access Project (TAP)
Literacy Program Survey 2006-07

Please complete this brief survey about the literacy instruction (reading, writing, spelling) that the TAP student whom you supported LAST YEAR received during the 2006-2007 school year. If you supported more than one TAP student last year, please complete a separate form for each student.

If you want to complete the survey on your computer, just click on a grey text field to type, or double-click on a check box and change the Default Value to “Check.” Remember to save your completed form and email it to us at tap@interchange.ubc.ca. Or, you may print the form, complete it, and then mail it to the address at the end of the survey.

TAP Student’s Name:
Name of person completing this form:
Position of person completing this form:
Today’s date:

1. Which technology bundle did this student receive through TAP? (check one)

 Early/Emergent Literacy (Clicker 5 with Picture Communication Symbols and Balanced Literacy)
 Writing (Clicker 5 with Cloze Pro and Co:Writer)
 Communication (Speaking Dynamically Pro/Boardmaker with Functionally Speaking)

2. Which components of the TAP technology bundle did this student use regularly?

Bundle 1
 Balanced Literacy
 Clicker 5
 Boardmaker/Picture Communication Symbols

Bundle 2
 ClozePro
 Co:Writer
 Clicker 5
 Boardmaker/Picture Communication Symbols

Bundle 3
 Speaking Dynamically Pro/Boardmaker with Functionally Speaking
 Speaking Dynamically Pro/Boardmaker without Functionally Speaking

3. Please list any other literacy software programs used by or with this student last year (if any).

4. In addition to the TAP software and other software for literacy, what other types of literacy instruction did your student receive last year as part of his or her educational program?
 Structured phonics instruction (e.g., learning the sounds of letters, rhyming)
 Word Wall activities
 Sight word instruction (Dolch words, functional words, etc.)
 Guided reading with basal readers or other books selected by the teacher
 Teacher read-alouds
 Self-selected reading, either individually or in small groups
 Buddy reading (students read to one another)
 Reader’s Chair
 Writer’s Workshop
 Journal writing
 Interactive writing (teacher and student write together)
 Other:____________________________________________________________

5. Did your student receive literacy instruction using a specific program or formal approach such as one of the following? (check all that apply):

 Balanced Literacy
 Corrective Reading
 Edmark Reading Program
 Firm Foundations
 Four Blocks
 Language!
 MEville to WEville
 Open Court
 Reading A to Z
 Reading First
 Reading Mastery
 Reading Recovery
 Success for All
 Other formal reading program:____________________________________

6. (BUNDLE 1 STUDENTS ONLY) On average, how many times did this student practice with the Clicker 5 grid before doing the monthly writing sample?

 Not at all
 Once
 Twice
 Three times
 More than three times

7. (BUNDLE 3 STUDENTS ONLY) Overall, did you notice any speech benefits from this student’s use of the technology (that is, did your student talk more or talk more clearly as a result of the technology)?

 No
 Yes
 If yes, please describe.

THANK YOU FOR COMPLETING THIS SURVEY!

Please email it to tap@interchange.ubc.ca or snail mail it to UBC/TAP, c/o Pat Mirenda, 2125 Main Mall, Vancouver, BC V6T 1Z4.

Appendix H
Year Two Directions for Data Collection

SET-BC TAP Research Project: CLICKER 5 GROUP
Instructions for Collecting Monthly Writing Sample

Dear SET-BC Teacher:

The purpose of this writing sample is to assess your student’s progress in using Clicker 5 to write about various topics.
The student’s educational team, together with the SET-BC consultant assigned to the student, is responsible for designing an appropriate literacy program. The Clicker 5 grids used for the writing sample are not meant to be used as the student’s entire reading and writing curriculum – they should be just one component of the overall literacy program.

Please read these instructions carefully and thoroughly before collecting each monthly writing sample from the student.

Section 1: Introducing the Monthly Clicker Grid

a) On the Research section of the TAP website (http://www.setbc.org/tap/research.asp), you will find TWO Clicker 5 grids related to the monthly writing topic; one is labeled “Beginner” and the other is labeled “Advanced.” If your student was just learning to use Clicker last year or is a beginning writer, use the Beginner grid. If your student was fairly proficient with Clicker last year and is using it to write short sentences, use the Advanced grid. If you are not sure which one to use, download them both and explore before deciding! PLEASE HAVE THE STUDENT USE ONE OF THESE GRIDS to produce the writing sample.

b) Each Clicker grid has three parts: a Reading section that introduces the topic for the month; a Practice section that allows the student to practice writing about the topic; and a Writing section, which is what you will print and send to us as the writing sample. Please have the student use the first two sections at least once before producing the writing sample. If you want to have the student read and practice more than once, that’s okay – as long as he or she completes the writing sample within one week.

Section 2: Collect the Writing Sample

1. Say or read this to the student each month before collecting the writing sample:

“Your (parents/guardians) signed a letter saying that it’s okay for you to be part of a project about how computers help kids to write better.
If you want to be part of the project, you’ll write a story about (computer topic for the month) on your computer. If you think you want to help with this project and then change your mind, that’s okay – we’ll stop if you say so. Do you want to help with this project?”

The student must agree to participate in order to continue. If the student agrees, proceed with the next steps. If the student refuses, discontinue the session for that day. Read the instructions to the student again 1-4 days later. If the student agrees after the second invitation, proceed with the next steps. If the student refuses after the second invitation, send an email to tap@interchange.ubc.ca to tell us that you will not be submitting a sample for that month.

2. Make sure the student is positioned optimally to write with the TAP computer and software, in a quiet, non-distracting environment.

3. When the student is ready to produce the writing sample, say:

“Now, I’d like you to try to write a story on the computer about (computer topic for the month). I’d like you to write for 10 minutes, but if you get tired and have to stop sooner, that’s okay. Do the best you can. Remember, this isn’t a test, so you won’t be graded on it, and no one will see the story you write except me and the people who are doing this project. You can start writing when you’re ready.”

4. Give the student 10 minutes to write; set a timer so that you know when this time is completed. Provide no assistance to the student with the story.

5. After 10 minutes (or less, if the student indicates that he or she needs to stop writing), collect the story and label it “Writing Sample by (Name) on (Date) after (# of Minutes) of Writing. The student used the Practice section on Clicker for this topic (#) times before writing this story. The student used the Beginner/Advanced grid.” Thank the student for his or her work and end the session.
Please mail the writing sample to Pat Mirenda, UBC, 2125 Main Mall, Vancouver BC V6T 1Z4. PLEASE MAIL THE SAMPLE IMMEDIATELY! Thank you for your assistance!!!!

SET-BC TAP Research Project: CO:WRITER GROUP
Instructions for Collecting Monthly Writing Samples

Dear SET-BC Teacher:

The purpose of this writing sample is to assess your student’s progress in using Co:Writer to write about various topics. The student’s educational team, together with the SET-BC consultant assigned to the student, is responsible for designing an appropriate overall literacy program.

Please read these instructions carefully and thoroughly before collecting each monthly writing sample from the student.

Section 1: Introduction to the Student

Say or read this to the student each month before collecting the writing samples:

“Your (parents/guardians) signed a letter saying that it’s okay for you to be part of a project about how computers help kids to write better. If you want to be part of the project, the first thing you’ll do is write a short story with a pencil and paper about (handwriting topic for the month). Then, you’ll write another story about (computer topic for the month) on your computer. If you think you want to help with this project and then change your mind, that’s okay – we’ll stop if you say so. Do you want to help with this project?”

The student must agree to participate in order to continue. If the student agrees, go on to Section 2. If the student refuses, discontinue the session for that day. Read the instructions to the student again 1-4 days later. If the student agrees after the second invitation, go on to Section 2. If the student refuses after the second invitation, send an email to tap@interchange.ubc.ca to tell us that you will not be submitting a sample for that month.

Section 2: Hand-Written Story

a) Please check the Research section of the TAP website (http://www.setbc.org/tap/research.asp) for the writing sample topic for this month. PLEASE HAVE THE STUDENT WRITE ON THIS TOPIC ONLY.

b) Make sure the student is positioned optimally to write and is in a quiet, non-distracting environment. Give the student a paper and a writing implement that is appropriate for him or her to use (pencil, pen, marker, etc.).

c) Say: “Now, I need you to try to write a story about (topic for the month). I’d like you to write for 10 minutes, but if you get tired and have to stop sooner, that’s okay. Do the best you can, and don’t worry if it’s hard to read or if you spell words wrong – this isn’t a test and you won’t get any grades on it! No one will see the story you write except me and the people who are doing this project. You can start writing when you’re ready.”

d) Give the student 10 minutes to write; set a timer so that you know when this time is completed. Provide no assistance to the student with the story. After 10 minutes (or less, if the student indicates that he or she needs to stop writing), collect the story and label it “Hand-written Sample by (Name) on (Date) after (# of Minutes) of Writing.” Thank the student for his or her work and give him/her a short break.

Section 3: Co:Writer Story

a) PLEASE HAVE THE STUDENT WRITE ABOUT THE SAME TOPIC THAT WAS USED FOR THE HAND-WRITTEN SAMPLE.

b) For this writing sample, DO NOT give the student access to a special Topic Dictionary in Co:Writer. Please use the Intermediate Dictionary. Please turn on BOTH the Predict Ahead and FlexSpell features in Co:Writer before the student begins to write.

c) This story can be written on either the same day or on another day in the same week as the hand-written story. Make sure the student is positioned optimally to write with the TAP computer and software, in a quiet, non-distracting environment.

d) Say: “Now, I want you to write a story on the computer about (topic for the month). I’d like you to write for 10 minutes, but if you get tired and have to stop sooner, that’s okay. Do the best you can, and don’t worry if you spell words wrong. Remember, this isn’t a test, so you won’t be graded on it, and no one will see the story you write except me and the people who are doing this project. You can start writing when you’re ready.”

e) Give the student 10 minutes to write; set a timer so that you know when this time is completed. Provide no assistance to the student with the story. After 10 minutes (or less, if the student indicates that he or she needs to stop writing), collect the story and label it “Computer Sample by (Name) on (Date) after (# of Minutes) of Writing.” Thank the student for his or her work and end the session.

Please mail both writing samples to Pat Mirenda, UBC, 2125 Main Mall, Vancouver BC V6T 1Z4. PLEASE MAIL THE SAMPLES IMMEDIATELY! Thank you for your assistance!!!!

Appendix I
Year Two On-Line Survey

University of British Columbia and SET-BC Technology Access Project (TAP)
Monthly Survey, 2007-08

Student’s Name:
Name of person completing this form:
Position of person completing this form (e.g., resource teacher, special education teacher, etc.):
Today’s date:

1. Which TAP software bundle is the student currently using? (check one)

 Early/Emergent Literacy (Clicker 5 with Picture Communication Symbols and Balanced Literacy)
 Writing (Clicker 5 with Cloze Pro and Co:Writer)

2. Over the past month, on average, how many hours per day did the student use TAP technology at school?

 Less than 30 minutes/day
 30 min-1 hour/day
 1-2 hours/day
 2-3 hours/day
 3-4 hours/day
 4-5 hours/day
 more than 5 hours/day

3. To what extent did the TAP technology affect/enhance this student’s ability to participate in the academic activities/lessons of his/her classroom?

 no effect
 minimal effect
 moderate effect
 substantial effect
 exceptional effect

4. To what extent did the TAP technology affect/enhance this student’s ability to communicate with his/her peers in the classroom?
 no effect
 minimal effect
 moderate effect
 substantial effect
 exceptional effect

5. To what extent did the TAP technology affect/enhance this student’s ability to participate in the social activities of his/her classroom (e.g., recess, other social opportunities)?

 no effect
 minimal effect
 moderate effect
 substantial effect
 exceptional effect

6. Over the past month, how many hours per day on average did teachers and other staff spend programming and planning student use of TAP technology?

 Less than 15 minutes/day
 15-30 minutes/day
 30 minutes-1 hour/day
 1-2 hours/day
 more than 2 hours/day

7. Over the past month, what challenges did the educational team face in implementing the TAP technology?

 Insufficient planning/programming time for staff
 Insufficient training on use of the technology by staff
 Insufficient classroom time devoted to literacy/writing/communication instruction
 Hardware (computer) did not work properly and required repair
 Software did not work properly and required replacement
 Student was not interested in the technology
 Student was frequently absent from school because of illness/other reasons
 Other challenges (describe): ______________________________________

8. In addition to the TAP software, what other types of literacy instruction did your student receive last year as part of his or her educational program?

 Structured phonics instruction (e.g., learning the sounds of letters, rhyming)
 Word Wall activities
 Sight word instruction (Dolch words, functional words, etc.)
 Guided reading with basal readers or other books selected by the teacher
 Teacher read-alouds
 Self-selected reading, either individually or in small groups
 Buddy reading (students read to one another)
 Reader’s Chair
 Writer’s Workshop
 Journal writing
 Interactive writing (teacher and student write together)
 Other:_________________________________________________________

9.
Did your student receive literacy instruction using a specific program or formal approach such as one of the following? (check all that apply):

 Balanced Literacy
 Corrective Reading
 Edmark Reading Program
 Firm Foundations
 Four Blocks
 Language!
 MEville to WEville
 Open Court
 Reading A to Z
 Reading First
 Reading Mastery
 Reading Recovery
 Success for All
 Other formal reading program:___________________________________

10. On average, how much TOTAL TIME was spent on literacy instruction for this student per week, over the past month?

 Less than 30 minutes/week
 30 min-1 hour/week
 1-2 hours/week
 2-3 hours/week
 3-4 hours/week
 4-5 hours/week
 more than 5 hours/week
