
Special education administrators and workload determination for teachers of students with visual impairments… Wilton, Adam 2017



Full Text

SPECIAL EDUCATION ADMINISTRATORS AND WORKLOAD DETERMINATION FOR TEACHERS OF STUDENTS WITH VISUAL IMPAIRMENTS: A DELPHI STUDY

by

ADAM WILTON

B.Sc., The University of Toronto, 2007
M.A., The University of Toronto, 2009

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF DOCTOR OF PHILOSOPHY in THE FACULTY OF GRADUATE AND POSTDOCTORAL STUDIES (Special Education)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

April 2017

© Adam Wilton, 2017

ABSTRACT

Itinerant teachers of students with visual impairments (TSVIs) support access to the curriculum for students with visual impairments by promoting the use of adaptive tools, materials, and strategies to mitigate the impact of visual impairment on learning and development. Traditionally, the number of students served by the TSVI is referred to as a “caseload” and is used as an indicator of the breadth of the TSVI’s professional responsibilities. This study uses “workload” as a more inclusive term that encompasses the full scope of the itinerant TSVIs’ professional practice, including direct, consultative, and indirect service to students with visual impairments and their educational teams. Given the low incidence of visual impairment among children and youth, many special education administrators responsible for staffing TSVI positions do not have an awareness of the specialized educational programming needs of these learners that factor into TSVI workload. Resources targeted to special education administrators that consider a broad scope of educational programming, personnel, and policy factors are required to support data-driven workload determinations for itinerant TSVIs.

The purpose of the study was to develop a set of factors that experts rate as important considerations in the process of TSVI workload determination. This study was conducted using the Delphi approach, an iterative process through which consensus is built among a panel of knowledgeable experts on a topic of specialized interest. Panelists rated the importance of 45 initial educational programming, personnel, and policy-level factors, with 22 panelist-nominated factors added in the second survey round. Four survey rounds were required to arrive at a set of 45 confirmed factors. Each factor included in the final set of confirmed factors had a rating at a high level of importance, strong consensus among panelists, and stability across survey rounds. After adjusting for the total number of initial and nominated factors in each thematic cluster, educational programming factors accounted for the greatest proportion of confirmed factors, followed by personnel-level factors and policy-level factors. The results of the study are intended to provide special education administrators with a set of evidence-based factors to inform the process of workload determination for itinerant TSVIs.

PREFACE

This dissertation is submitted for the degree of Doctor of Philosophy at the University of British Columbia. The research described herein is the unpublished, independent work of the author, A. Wilton. The work is, to the best of the author’s knowledge, original except where references are made to previous work.

This research, entitled Special Education Administrators and Workload Determination for Teachers of Students with Visual Impairment: A Delphi Study, was issued a Certificate of Approval (H14-03350) by the Behavioural Research Ethics Board of the University of British Columbia on September 21, 2015.
The certificate was renewed on July 25, 2016 and expires on July 25, 2017.

TABLE OF CONTENTS

ABSTRACT
PREFACE
TABLE OF CONTENTS
LIST OF TABLES
LIST OF FIGURES
ACKNOWLEDGEMENTS
DEDICATION
CHAPTER ONE
    Introduction
    Background to the Problem
    Statement of the Problem
        Identification of the Problem in Professional Literature
    Background to the Research Questions
        Research Questions
    Goals of the Study
    Significance of the Study
        Significance for the TSVI
        Significance for Personnel Preparation Programs
    Delimitations of the Study
    Overview of Research Design and Methodology
    Organization of the Dissertation
CHAPTER TWO
    Current Profile of Students with Visual Impairments in Inclusive Settings
        Incidence of Visual Impairment in the K-12 Population
        Increasing Heterogeneity Among Students with Visual Impairments
    The Tradition of Inclusive Education for Students with Visual Impairments
        Early Promotion of Inclusion in Community Settings
        Population Shift and Steps Toward Mainstreaming
        The Pine Brook Report
        From Mainstreaming to Inclusion and Universal Access
        Recent Trends in Inclusive Education for Students with Visual Impairments
    TSVI Workload Determination: The Role of the Special Education Administrator
    The Role and Function of the Itinerant TSVI
        Direct instruction
        Consultative Service
            Working with paraprofessionals
            Accessibility of the classroom environment
        Indirect service
            Materials in alternate formats
    Distinguishing Between Caseload and Workload
        Caseload Analysis Tools
    Teacher Challenges in the Itinerant Model of Service Delivery
        Travel and Non-Instructional Duties
        Meeting the Disability-Specific Learning Needs of Students
        Number of Students Served - Implications for Workload
        Summary of Challenges in the Itinerant Model of Service Delivery
    Policy-Level Determinants of Itinerant TSVI Workload
        Educational Policy Factors
            Stakeholder position statements and standards
            Educational Service Guidelines
        Legislative Factors
            Caseload Policy
        Personnel Factors
        Summary of Administrative Determinants of Itinerant TSVI Workload
    Consequences of Unmanageable TSVI Workloads
        Shift from Direct to Consultative Service
        Adverse Effects on Curriculum Accessibility and Achievement
        Teacher Role Dissonance, Stress, and Attrition
        Summary of Consequences of Unmanageable Workloads
    Summary of the Literature Review
CHAPTER THREE
    The Delphi Approach
        Delphi Research in the Field of Visual Impairment
    Research Sample
        Sample Size
        Sampling Criteria
        Sampling Procedure
    Study Administration
        Prior to Round One
        Round One Survey Development
            Participant Demographics
            Delphi Survey – Round One
        Round One Data Analysis
        Round Two Survey Development
            Participant Demographics
            Delphi Survey – Round Two
        Round Two Data Analysis
        Round Three Survey Development
            Participant Demographics
            Delphi Survey – Round Three
        Round Three Data Analysis
        Round Four Survey Development
        Round Four Data Analysis
    Methodological Limitations
        Validity
        Reliability
CHAPTER FOUR
    Round One Data Analysis
        Educational Programming Factors
        Personnel Factors
        Policy Factors
        Nominated Factors
    Round Two Data Analysis
        Round Two Data Analysis – Initial Factors
            Educational Programming Factors
            Personnel Factors
            Policy Factors
        Round Two Analysis – Nominated Factors
            Educational Programming Factors
            Personnel Factors
            Policy Factors
    Round Three Data Analysis
        Round Three Analysis – Initial Factors
            Educational Programming Factors
            Personnel Factors
            Policy Factors
        Round Three Analysis – Nominated Factors
            Educational Programming Factors
            Personnel Factors
            Policy Factors
    Round Four Data Analysis
    Additional Data Analysis
        Conditional Ratings of Initial and Nominated Factors
    Final Set of Confirmed Factors
CHAPTER FIVE
    Summary of Findings
    Curricular Access
        Literacy Media
        Availability of Materials in Alternate Format
    Assessment
        Clinical and Functional Data
        Data Collection
        Caseload Analysis
    The Expanded Core Curriculum
        Orientation and Mobility
        ECC in the Community
    Developmental Profile of the Learner
        Students with Deafblindness
        Early Intervention
    Consultative Service
        Paraprofessionals
        The Educational Team
        TSVI Succession and Mentorship
    Administrative Resources
    An Ecological Framework for TSVI Workload Determination
    Summary of Study Implications
    Recommendations
    Limitations of the Study
    Conclusion
REFERENCES
APPENDICES
    Appendix A – Consent Form
    Appendix B – Round One Email to Potential Panelists and Round One Survey Tool
    Appendix C – Round Two Email to Panelists and Round Two Survey Tool
    Appendix D – Round Three Email to Panelists and Round Three Survey Tool
    Appendix E – Round Four Email to Panelists and Round Four Survey Tool

LIST OF TABLES

2.1 References for Initial Educational Programming Factors
2.2 Number of States Employing Criteria for Caseload Policy, Per Criterion
2.3 References for Initial Policy Factors
2.4 References for Initial Personnel Factors
3.1 Timeline for Data Collection, Rounds One to Four
3.2 Distribution of Round One Panelists’ Professional Roles
3.3 Geographic Distribution of Round One Panelists
3.4 Distribution of Round Two Panelists’ Professional Roles
3.5 Distribution of Round Three Panelists’ Professional Roles
3.6 Distribution of Round Four Panelists’ Professional Roles
4.1 Round One Results for Initial Educational Programming Factors
4.2 Round One Results for Initial Personnel Factors
4.3 Round One Results for Initial Policy Factors
4.4 Round Two Results for Initial Educational Programming Factors
4.5 Round Two Results for Initial Personnel Factors
4.6 Round Two Results for Initial Policy Factors
4.7 Round Two Results for Nominated Educational Programming Factors
4.8 Round Two Results for Nominated Personnel Factors
4.9 Round Two Results for Nominated Policy Factors
4.10 Round Three Results for Initial Educational Programming Factors
4.11 Round Three Results for Initial Personnel Factors
4.12 Round Three Results for Initial Policy Factors
4.13 Round Three Results for Nominated Educational Programming Factors
4.14 Round Three Results for Nominated Personnel Factors
4.15 Round Three Results for Nominated Policy Factors
4.16 Round Four Results for All Factors
4.17 Factors with the Greatest Differential Between ImpLOA Conditional Ratings
4.18 Complete Listing of the Set of Confirmed Factors by Final ImpLOA Percentage Ratings

LIST OF FIGURES

3.1 Diagram of the Delphi Process
3.2 Example of Likert-type Scale for All Items
5.1 An Ecological Framework for TSVI Workload Determination

ACKNOWLEDGEMENTS

Inspiration and support for this research sprang from many. While these few paragraphs will not allow me to fully express my gratitude to those who have encouraged me on my career path thus far, I will acknowledge those whose impact has left a lasting impression. First and foremost, my deepest gratitude to my research supervisor and mentor, Dr. Cay Holbrook. Thank you for being such a strong model of professionalism and strength and for inspiring me to always do better for all students with visual impairments. To Dr. Kim Zebehazy, thank you for your expert guidance as both a teacher and collaborator. All our work together over the years has made me a more skilled vision professional. Thanks also to Dr. Janet Jamieson for your input and advice throughout the research process.

I would like to acknowledge and thank the expert panelists who participated in the study. Their engagement and expertise provided a rich foundation upon which further inquiry into the process of workload determination can be built. Also, I would like to acknowledge grants from the Social Sciences and Humanities Research Council of Canada and the Canadian National Institute for the Blind. This financial support greatly aided the completion of this research.

To my family, my unending gratitude not only for your support every step of the way but also for being that great constant wherever my life may lead. Finally, profoundest thanks to ALM for her encouragement and for challenging me to aim higher from the outset.

DEDICATION

This work is dedicated to the students and families that I have had the privilege to serve. I am eternally grateful and humbled to have a place in the lives of these learners.

CHAPTER ONE

Introduction

One of the hallmarks of special education policy in North America is the prevailing belief that every effort should be made to educate students with disabilities in classrooms with typically developing peers located in community schools. Students with visual impairments have a long tradition of placement in community schools in both the United States (Hatlen, 2000) and Canada (Aylesworth, 1938). Early experiments in inclusive education included resource rooms and special classrooms for students with visual impairments in local education agencies and school districts (Lowenfeld, 1941/1983). These integrated placements existed in addition to those at specialized schools for students with visual impairments, which was the predominant model of special education service delivery for these learners in both countries at the turn of the last century (Hatlen, 2000). At this early point, special classrooms for students with visual impairments housed in local education agency (LEA) schools, known as "braille classes," offered the opportunity for some interaction with non-disabled peers (Wallace, Wrighstone, & Gall, 1954).
Over the course of the twentieth century, broad changes in educational and social policy, as well as changes in the population of students with visual impairments, resulted in a substantial shift away from placements perceived as less inclusive (e.g., day programs, specialized schools) to those perceived as more inclusive (e.g., general education classrooms with support; Kavale & Forness, 2000). Current estimates of the number of students with visual impairments placed in general education classrooms range from 90-100% of the total population of school-aged children and youth with visual impairment in North America, depending on the breadth of placement options available under the jurisdiction of a given State Department of Education or Ministry of Education (Wall & Corn, 2004).

Of the many options for integrated service delivery in use in the 1950s and 1960s in North America, the itinerant model has become the predominant choice of service delivery for students with visual impairments in inclusive settings (Wall & Corn, 2004). In the itinerant model, the specialist teacher travels to multiple school sites within an LEA or is contracted across LEAs (Bullard, 2003). At each school, the itinerant teacher of students with visual impairments (TSVI) provides service to students through one of three general modes: via direct instruction, consultation with the school-based team (e.g., working with the classroom teacher), or through indirect service (e.g., connecting the family to community service organizations).

Background to the Problem

More inclusive practices in special education service delivery require that the school take greater responsibility to create accessible learning environments for students with special needs (Erten & Savage, 2011). This is in contrast to the expectation that students will adapt to the school environment, an outlook associated with the earlier process of mainstreaming (Hatlen, 2000). In North America, LEA or school district administration is primarily responsible for ensuring that legal and policy obligations regarding inclusion are met (Crockett, 2002). Administrators play an instrumental role in fostering and promoting inclusive practices at both the school and LEA levels (Di Paola & Walther-Thomas, 2003). With an increasing number of students with visual impairments placed in inclusive settings located in LEAs, administrators assume greater responsibility in assuring appropriate educational programs are in place for these learners (Alonso, 1990). At this point, an important distinction should be made. While the school-based administrator (e.g., principal) is a part of the educational team for individual students with visual impairments, typically the administrator does not directly oversee the work of the itinerant staff (Furman, 1988). Since the itinerant TSVI works at multiple school sites across the LEA, this responsibility typically falls under the purview of district-level administration in the LEA, namely the special education administrator (Lashley & Boscardin, 2003; McCarty, Hazelkorn, & Boreson, 2003). The total number of students with visual impairments within the LEA to which the TSVI is assigned constitutes the TSVI's caseload (Seitz, 1994).
Specifically, an itinerant caseload is the number of students with visual impairments to whom the TSVI provides service in accordance with goals and objectives collaboratively developed by the educational team and stated in the Individualized Education Plan or Program (IEP; Russ, Chiang, Rylance, & Bongers, 2001). Across Canada (e.g., BC Ministry of Education, 2016) and in the United States under the Individuals with Disabilities Education Act (IDEA), any student meeting criteria for "visual impairment" must have an IEP outlining the goals and objectives for his or her educational program, as well as a listing of the professionals tasked with delivering that program (Lewis & Allman, 2000). Special education guidelines and legislation, as well as professional standards for TSVIs, require that the level of TSVI service be determined by the assessed needs of the student (National Coalition for Vision Health, 2003; Pugh & Erin, 1999). There is, however, ample evidence indicating that there are factors external to the educational needs of the student that determine service levels in inclusive settings (Mason & Davidson, 2000).

Statement of the Problem

Surveys and observational studies of the professional practice of itinerant TSVIs indicate that these teachers believe that ensuring adequate instructional time with students is the most challenging aspect of itinerant service delivery (Correa-Torres & Howell, 2004; Olmstead, 1995). Time concerns extend not only to instructional duties, but also to non-instructional duties such as paperwork, travel between school sites, and professional development activities (Griffin-Shirley et al., 2004). Taken together, these professional obligations amount to an itinerant TSVI's workload. In the field of special education, there is growing recognition that the number of students on an itinerant caseload (i.e., caseload size) is an unsuitable metric for defining the total professional responsibilities of itinerant staff (American Speech-Language-Hearing Association, 2002; Katz et al., 2010). "Workload" has been proposed as an alternate term that encompasses both the instructional and non-instructional duties of the itinerant professional (ASHA, 2002). The current study uses a workload approach to seek greater accuracy in characterizing the sum of the professional duties of the itinerant TSVI. In short, workload describes the sum and scope of the professional responsibilities of the itinerant TSVI, encompassing both teaching and non-teaching duties. Despite "workload" being the more informative and accurate means of conceptualizing the scope of the professional role of the TSVI, "caseload" is currently the predominant term in both research and professional literature in visual impairment. As a result, the term "caseload" appears in the chapters that follow when referring to professional writing and research in which "caseload" is used by the author to account for the total set of students served by the itinerant TSVI.

As an education professional, the TSVI determines the strategies and tools best suited to meet the needs of the student in accordance with the goals and objectives outlined in the IEP (Pugh & Erin, 1999). However, special education administrators oversee the work of teachers and are responsible for decision-making in LEA staffing and other issues related to personnel requirements (McCarty, Hazelkorn, & Boreson, 2003; Tyler & Brunner, 2014).
While the special education administrator plays a central role in the determination of TSVI workloads, he or she faces a number of challenges in doing so. Several educational variables (e.g., knowledge of unique educational needs, available services for students) may have an impact on the effectiveness of TSVI workload determinations. Given the low incidence of visual impairment in relation to other exceptional populations, special education administrators are not likely to have had any experience or contact with students with visual impairments (Praisner, 2003). Furthermore, special education administrators without a background in visual impairment are more likely to be unaware of the programs, policies, and services that exist for these students in the LEA (Brown & Glaser, 2014; Smith, Geruschat, & Huebner, 2004).

In addition to educational variables, there may be a number of personnel variables that have an impact on workload determinations. For example, estimates of the number of qualified TSVIs required to provide service to students in inclusive settings consistently exceed estimates of the actual supply of TSVIs (Kirchner & Diament, 1999; Mason, McNerney, & Davidson, 2000). Thus, administrators may contend with a shortage of qualified personnel when making decisions regarding TSVI workloads. Finally, legislative variables may have an impact on administrators' TSVI workload determinations. In the United States, IDEA does not offer policy guidance on workload determination or the maximum size of special education caseloads (Russ et al., 2001). There is great variation in states' caseload policy, ranging from prescribed teacher-to-student ratios to placing the responsibility for all caseload policy with the LEA (Jackson, 2003). In Canada, where there is a shared responsibility between the respective provincial Ministry of Education and LEA, there are currently no provinces with mandated limits to caseload size for TSVIs (Zuvela, 2009). According to Zuvela (2009), "existing educational policies recognize the need for services and supports for students with vision loss, but typically they do not specify the level, intensity, or the type of support. How the service is delivered is generally left to the discretion of school boards" (p. 108).

The paucity of evidence-based resources to inform TSVI workload determination underscores a larger issue: relatively little is known regarding the process by which special education administrators determine workloads for itinerant TSVIs. Since the TSVI is charged with providing service in inclusive settings, the complexity of the TSVI's workload has implications for the frequency and intensity of service that the TSVI can provide to students with visual impairments and their educational teams.

Identification of the Problem in Professional Literature

Authors in the field of visual impairment note the potential impact that administrative variables have on the delivery of high-quality educational programming to students with visual impairments. In 2003, Corn and Spungin issued a report for the Center for Personnel Studies in Special Education titled "Free and Appropriate Public Education and the Personnel Crisis for Students with Visual Impairments and Blindness." Corn and Spungin outlined the current state of supply and demand for TSVIs in the United States as well as the factors that influence growth in each state. They suggested avenues for future research to investigate factors that contribute to unmanageable TSVI workloads.
One of their research questions directly addresses the relationship between administrative-level variables, time issues, and service delivery for students: "What administrative factors result in case loads so large that students with visual impairments are not receiving sufficient time with qualified professionals to meet IEP goals and objectives?" (p.24).

This question has significant historical context in the field. Writing 62 years before Corn and Spungin, Berthold Lowenfeld (1941/1983), an administrator and scholar in the field of visual impairment, told an audience of school administrators that "the effectiveness of work in braille classes is frequently hampered by the excessive load of the braille class teacher, the result of having too many pupils" (p. 7). Lowenfeld's remarks indicate that the unmanageable workload of the specialist teacher has historically been of concern to leaders in the field.

Using Corn and Spungin's question as a starting point, the current study seeks to develop consensus ratings from an expert panel regarding administrative-level variables that have an impact on workload determinations for itinerant TSVIs. Panelists have professional credentials in visual impairment (i.e., bachelor/graduate degree or applicable credential to qualify to serve as a TSVI), have worked as a TSVI, and are currently serving, or have recently served, in an administrative role overseeing programming for students with visual impairments at the LEA- or state/provincial-level in North America. An additional group of recognized experts in service delivery for students with visual impairments were also included in the research sample.

Background to the Research Questions

The original impetus for the current study was derived from the professional experience of the researcher. Over each of the last several years working as a TSVI and one year as a Certified Orientation and Mobility Specialist, the researcher has been engaged in advocacy efforts to ensure adequate service levels for students with visual impairments in a large suburban school district in British Columbia, Canada. Working with a series of special education administrators over that period, the researcher recognized that there was a dearth of resources to support TSVI workload determination that could be readily shared with these administrators, none of whom had a prior understanding of the unique educational requirements of students with visual impairments. The researcher contacted several colleagues seeking guidance around advocacy efforts, and realized that significant expertise existed in the field.
Now working as an administrator of a provincial resource program in British Columbia, the researcher is more keenly aware of the need for perspectives derived from expert opinion on the process of workload determination for itinerant TSVIs. By examining the issue from a provincial scope and by working with special education administrators without a background in visual impairment, the researcher determined that the need for this expert-driven perspective is clear. Special education administrators face a challenging task when determining workloads for itinerant TSVIs, especially when they lack adequate information on the unique educational needs of students with visual impairments.

Research Questions

Three primary research questions and three secondary research questions frame the current study. The key distinction made by the research questions is between experts' perceptions of current conditions (i.e., factors that do influence workload determination) and optimal conditions (i.e., factors that should influence workload determination).

1. How do experts in special education administration and visual impairment rate the level of importance of factors that influence actual workload determinations for itinerant TSVIs?
2. What factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs?
   a. What educational programming factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs?
   b. What policy-level factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs?
   c. What personnel factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs?
3. How do experts in special education administration and visual impairment rate the level of importance of factors they believe should influence workload determinations for itinerant TSVIs?

Goals of the Study

By addressing the research questions, two research goals were realized: (1) to address Corn and Spungin's (2003) call to identify the administrative variables that have a significant influence on the determination of itinerant TSVI workloads; and (2) to develop a series of consensus ratings regarding the factors that experts believe should impact the determination of workloads. These ratings provide insight into current practice as identified by an expert panel. These experts were drawn from three groups of special education administrators and one group of recognized experts in service delivery: 1) administrators at state/provincial education agencies devoted to supporting educational programs for students with visual impairments; 2) special education administrators in individual LEAs with professional experience as TSVIs; 3) administrators of outreach programs at state/provincial specialized schools for students with visual impairments; and 4) recognized experts in the area of service delivery for students with visual impairments. All panelists were recruited with the help of a nomination panel comprised of leaders in the field to ensure that highly qualified individuals were invited to participate.

To ensure that the findings of the current study are of value to professionals in applied settings, the consensus ratings regarding factors important to the process of workload determination may be formatted as a series of guidelines. These guidelines will provide special education administrators with a current account of important factors, confirmed by an expert panel, that should enter into workload determinations for TSVIs. This information is intended to supplement existing educational service guidelines for students with visual impairments and to serve as a reference for special education administrators.

Significance of the Study

The dearth of recent research in the field of visual impairment relating to special education administration may have contributed to the persistence of the research problem.
In 1999, the National Association of State Directors of Special Education (NASDSE) sponsored the publication of educational service guidelines for students with visual impairments in the United States (i.e., Pugh & Erin, 1999). No guidelines exist that are specifically tailored for Canadian administrators. The NASDSE guidelines are intended to "provide a road map for agencies, education service providers, and parents" in the implementation and administration of high-quality educational programming for students with visual impairments (Pugh & Erin, 1999, p. xii). Since the publication of these guidelines over 15 years ago, few resources that address administrative factors in itinerant service delivery have been published. As a result, special education administrators are left with little up-to-date guidance on fostering and sustaining high-quality educational programs for students with visual impairments in LEAs. The findings of the study will provide administrators, many of whom may have little knowledge of the unique needs of students with low-incidence disabilities, with expert-driven guidance to inform the process of determining TSVI workloads (Rude et al., 2005).

Significance for the TSVI

It is hypothesized that expert-driven guidance for special education administrators will increase the likelihood that TSVIs will be assigned manageable workloads. Unmanageable special education teacher workload has been cited as a key contributor to low job satisfaction, teacher burnout, and attrition (Billingsley, 2004; Fore, Martin, & Bender, 2002). As a result of the relatively limited number of TSVIs compared to other specialist teacher populations, TSVIs may be excluded from the research samples of studies examining special education teacher burnout/attrition (e.g., Zabel & Zabel, 2001). A limited number of studies have examined burnout/attrition in samples of TSVIs and have identified similar contributors to low job satisfaction (e.g., Seitz, 1994). By extension, more manageable workloads for TSVIs will increase the likelihood that students’ unique programming needs will be met. Unmanageable workloads for teachers are associated with depressed academic growth (ASHA, 2000) and poorer outcomes for students (Algozzine et al., 1993). Special education administrators can work to mitigate the potential impact of unmanageable TSVI workloads by following available service guidelines, including data-driven implications for special education leadership resulting from the current study, to ensure manageable workloads for TSVIs and appropriate service levels for students.

Significance for Personnel Preparation Programs

TSVI workloads also have implications for university programs preparing new TSVIs for service in the field. In their report, Corn and Spungin (2003) highlighted the reciprocal relationship that exists between "the expressed needs of LEAs for personnel and the ability of universities to sustain programs to supply personnel" (Corn & Spungin, 2003, p.16). Therefore, if workloads for TSVIs were unmanageable, this would artificially deflate demand for new TSVIs in the LEA. Manageable workloads for TSVIs provide LEA administration with an accurate estimate of the personnel required to maintain high-quality programs for students with visual impairments. If current staff cannot meet this estimate and administrators publicize this need, personnel requirements can be filled by TSVIs trained through the aforementioned programs (Silberman, Ambrose-Zaken, Corn, & Trief, 2004).
The ultimate significance of the study will be realized if special education administrators can engage with the study's findings to promote more manageable workloads for TSVIs and to ensure that students with visual impairments receive appropriate levels of service in inclusive settings. In their report on personnel issues in the field of visual impairment, Mason and Davidson (2000) commented on the global impact of enhancing services for students with visual impairments. The motivation of the current study echoes that of Mason and Davidson (2000):

    Improving services for students with visual impairments will not only enhance their quality of life, but will also benefit the whole of society. If individuals who are visually impaired receive appropriate educational services as infants and children, they will flourish into independent adults with an increased capacity to contribute to their community and beyond. (p. 9)

Delimitations of the Study

The goals of the study are (1) to identify factors that have a significant influence on the determination of workloads for itinerant TSVIs, and (2) to develop a set of expert-confirmed factors that administrators should consider when making these determinations. Workload determination for itinerant staff is a multifaceted, complex process and it is beyond the scope of this study to account for all of the factors that enter into the process of determining workloads. This study is intended to expand upon existing educational service guidelines by directly addressing key variables that enter into the process of TSVI workload determination (i.e., educational programming, personnel-, and policy-level factors).

Overview of Research Design and Methodology

The scope and content of the research questions together with the specialized expertise of the panel warranted the use of the Delphi approach. The Delphi approach is an "iterative, multistage process designed to combine opinion into group consensus" (Hasson, Keeney, & McKenna, 2000, p.1010). A group of experts was asked to complete a series of iterative questionnaires, and was provided feedback on the aggregated results from previous "rounds" (Powell, 2003). The findings of studies using the Delphi approach are based on systematically organized judgments from experts that establish priorities and consensus, and as a result, provide significant value to complex decision-making processes (Clayton, 1997). Finally, there are notable logistic advantages to the Delphi approach in that the geographic dispersion of panelists has no impact on sampling (Hsu & Sandford, 2007). In requiring anonymity between panelists and implementation via correspondence, the Delphi approach is particularly useful when sampling highly specialized experts who are working in disparate regions.

Since many panelists hold an administrative position at the LEA or state/provincial-level, the research sample was composed of a relatively small group of highly specialized experts from across North America. These experts were asked to elucidate variables that have an impact on administrative decision-making processes as well as provide their perception of the importance of each. Since the current study is an examination of expert opinion on a topic that has received little research attention, and since these experts are located in various states and provinces across North America, the Delphi approach is uniquely suited to both the research topic and sample of the current study.
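The round-by-round consensus logic described above can be illustrated with a brief computational sketch in Python. The sketch below is purely illustrative and is not drawn from the study's procedures: the ratings, the treatment of 4 and 5 as "high importance," and the 75% agreement threshold are hypothetical placeholders.

    # Illustrative sketch only: summarizing hypothetical Likert-type importance
    # ratings for one factor in a single Delphi round. The cutoff and threshold
    # below are examples, not the criteria used in this study.
    from statistics import median, quantiles

    def summarize_factor(ratings, high_cutoff=4, agreement_threshold=0.75):
        """Return simple consensus indicators for one factor's 1-5 ratings."""
        q1, _, q3 = quantiles(ratings, n=4)              # quartiles of the ratings
        pct_high = sum(r >= high_cutoff for r in ratings) / len(ratings)
        return {
            "median": median(ratings),
            "iqr": q3 - q1,                              # narrow IQR suggests agreement
            "pct_high_importance": round(pct_high, 2),   # share rating the factor 4 or 5
            "consensus": pct_high >= agreement_threshold,
        }

    # Hypothetical ratings from a nine-member panel for a single factor
    print(summarize_factor([5, 4, 4, 5, 3, 4, 5, 4, 4]))

Summaries of this kind, fed back to panelists between rounds, are what allow ratings to stabilize across iterations and a confirmed set of factors to emerge.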
Organization of the Dissertation

This dissertation is organized into five chapters. This chapter provides an introduction to the design and purpose of the study. Chapter Two is a review of the relevant literature, including research detailing the role and function of the itinerant TSVI, the challenges of itinerant service delivery, and the myriad consequences of unmanageable itinerant workloads. In Chapter Three the methodology and evolution of the research design are outlined in detail. This chapter includes a rationale for applying the Delphi approach to the research questions, as well as procedures for data collection, analysis, and instrumentation. Methodological limitations are discussed in the context of evidence-based practice in the field of special education. Chapter Four outlines the application of the Delphi approach in the current study by detailing the findings of each successive survey round. The results of data analyses following each round detail the development of consensus across the panel, culminating in a final set of confirmed factors at the conclusion of Round Four. Chapter Five presents a summary of findings and situates these findings in the broader evidence base devoted to service delivery for students with visual impairments in inclusive settings. Implications for the process of workload determination for itinerant TSVIs are provided, specifically tailored to special education leadership.

CHAPTER TWO

LITERATURE REVIEW

Students with visual impairments were historically among the first exceptional learners to be educated alongside typically developing peers in community schools (Hatlen, 2000). The tradition continues today, with the majority of students with visual impairments placed in inclusive settings and served by specialist teachers of students with visual impairments (TSVIs). As a low incidence disability, visual impairment presents TSVIs with the challenge of delivering specialized instruction to a relatively small, heterogeneous group of learners widely dispersed in urban, suburban, and rural/remote communities across the United States and Canada (Griffin-Shirley et al., 2004; MacCuspie, 2002). In order to maximize the efficiency of service delivery, the TSVI often serves students in inclusive settings on an itinerant basis – travelling to multiple school sites within the LEA to provide direct service to students and consultative service to students' school-based teams (Bullard, 2003; Seitz, 1994).

Despite a long-standing tradition of inclusive education for students with visual impairments, there are a number of challenges inherent in providing comprehensive special education services to this unique population. One of the most frequently cited is the challenge of achieving manageable teacher workloads to sustain appropriate service levels for individual students (Corn & Spungin, 2003; Olmstead, 1995). To date, the issue of workloads for itinerant staff has largely been examined from the perspective of TSVIs. There is comparatively little research examining the administrative processes that determine these workloads. This requires that the perspective of special education administrators be documented, in particular those who oversee the work of TSVIs and are tasked with maintaining a viable special education workforce (Corn & Spungin, 2003; Voltz & Collins, 2010).

This chapter outlines the various roles and functions of the TSVI and the special education administrator as they relate to workload determination and service delivery.
The educational programming, personnel, and policy contexts for administrative decision-making in Canada and the United States are reviewed to provide a starting point for the content of the initial survey sent to panelists. Finally, the consequences of unmanageable TSVI workloads are reviewed to validate the rationale of the current study in the context of outcomes for TSVIs and students. First, however, the major stakeholders of the current study are presented, beginning with students with visual impairments educated in the LEA.

Current Profile of Students with Visual Impairments in Inclusive Settings

Yearly estimates of the incidence of visual impairment among children and adolescents living in Western nations typically range from 5 to 10 per 10,000 (Mervis, Boyle, & Yeargin-Allsopp, 2002; Rahi & Cable, 2003). By comparison, the mean overall estimate of the incidence of autism spectrum disorders in the United States is 90 per 10,000 (Rice, 2009). Visual impairment is considered a low incidence disability category under the Individuals with Disabilities Education Act (IDEA) in the United States, which defines "low incidence" as "a visual or hearing impairment, or simultaneous visual and hearing impairments; […] for which a small number of personnel with highly specialized skills and knowledge are needed in order for children with that impairment to receive early intervention services or a free appropriate public education" (Ludlow, Conner, & Schecter, 2005, p. 16). Under IDEA, a "free and appropriate education" must (a) be provided at public expense, (b) meet the standards of the state educational agency, (c) include an appropriate preschool, elementary, or secondary school education, and (d) conform with the IEP (Sumbera, Pazey, & Lashley, 2014, p. 299). In Canada, there are similar requirements in provincial Education Acts that mandate that educational programming for the student with special needs conform to the goals, objectives, and timelines outlined in the IEP (Dworett & Bennett, 2002). For students with visual impairments, an "appropriate" educational program will be one that is delivered by a team that includes a specialist teacher with advanced training in strategies and techniques to mitigate the impact of visual impairment on learning and development. This specialist teacher is the TSVI.

Incidence of Visual Impairment in the K-12 Population

Visual impairment is one of the lowest incidence exceptional conditions among students served federally under the IDEA in the U.S. (Heumann, 1996) and in Canada through provincial Ministries of Education (Zuvela, 2009). In the United States, less than 1% of students are eligible for special education services under the primary disability category of visual impairment (U.S. Department of Education, 2012). Similarly low estimates are found in Canada. A Statistics Canada (2006) report on the Participation and Activity Limitation Survey of 2006 noted that 7.6% of students with disabilities aged 5-14 were identified as having a visual impairment, including blindness. The result is a relatively small number of students with visual impairments dispersed widely across urban, suburban, and rural/remote communities in North America. Research into the social experiences of students with visual impairments finds that when these students are placed in inclusive settings, they are often the only student with a visual impairment at their community school (Rosenblum, 2000).
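To put these incidence figures in administrative terms, the short calculation below estimates how many eligible students a single LEA might expect to identify. The district enrolment used here is a hypothetical value chosen only for illustration; the 5-10 per 10,000 range is the estimate cited above.

```python
# Back-of-the-envelope illustration of what "low incidence" means at the LEA level.
# The enrolment figure is a hypothetical assumption; the rate range is cited above.
enrolment = 20_000                       # hypothetical K-12 enrolment for one LEA
low_rate, high_rate = 5 / 10_000, 10 / 10_000

expected_low = enrolment * low_rate      # 10 students
expected_high = enrolment * high_rate    # 20 students
print(f"Expected students with visual impairments: "
      f"{expected_low:.0f} to {expected_high:.0f}")
# Roughly 10 to 20 students, typically dispersed across many school sites, which
# is why a single itinerant TSVI may be responsible for an entire district.
```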
Increasing Heterogeneity Among Students with Visual Impairments Further complicating the process of service delivery for students with visual impairments in inclusive settings is the changing nature and scope of the population itself. As scientific understanding of the human visual system has increased, so has knowledge of the brain's central role in visual functioning (Jan et al., 2013). Thus, conceptualizations of visual impairment have      18 expanded beyond traditional consideration of ocular visual impairment to include neurological disorders of the visual pathways and centers in the brain. As the incidence of ocular forms of childhood visual impairment (e.g., cataracts, glaucoma) have decreased in North America over the last century, the incidence of childhood visual impairment resulting from neurological disease or disorder has increased (Hatton, Ivy, & Boyer, 2013).  Given that widespread neuronal networks are responsible for visual functioning, the impact of a visual condition resulting from neurological disease or disorder is more likely to extend to functions beyond the visual system such as cognition and motor functioning (Jan et al., 2013). Therefore, many students with neurological visual impairments also have additional disabilities (e.g., cerebral palsy). Hatton and colleagues (Hatton, 2001; Hatton, Schweitz, Boyer, & Rychwalski, 2007) examined a U.S. national registry of preschool children with visual impairments (i.e., American Printing House for the Blind's Babies Count registry) and noted that in 1998-1999, 45% had only visual impairment as a disabling condition. By contrast, only 32% of children registered between 2000-2004 were identified as such. Therefore, descriptive data on the developmental profiles of young children with visual impairments indicate that children with visual impairments entering school may have complex educational needs that extend beyond those resulting from vision loss.    As the population of students with visual impairments in North America has changed over successive decades, so have the models of service delivery charged with educating these learners. Many of the current challenges to providing a free and appropriate public education to students with visual impairments have been previously mentioned. Inclusive placement for students with visual impairments predates inclusive placement of students from most other exceptional populations (Ajuwon & Oyinlade, 2008). Thus, a more complete understanding of      19 the current issues facing administrators of educational programs for students with visual impairments requires a historical perspective.  The Tradition of Inclusive Education for Students with Visual Impairments As mentioned in the previous section, there is an established history of educational programming for students with visual impairments in general education settings in North America. Over the past century, there have been a number of landmark changes that have shifted greater responsibility for the oversight of these programs to special education administrators in the LEA. These changes are outlined in the sections that follow. Early Promotion of Inclusion in Community Settings Education for students with visual impairments has long been of interest to the general public. 
From Valentin Hauy's public demonstrations of his pupils' talents at the first school for students who are blind to online articles extolling new technologies to create three-dimensional models, there are numerous examples of technological advances and individuals (e.g., Louis Braille, Helen Keller) that have captured the popular imagination. Historically, much of the public's exposure to the education of students with visual impairments came through media reports of these exceptional individuals and public visitations at specialized schools for students with visual impairments in the late 19th and early 20th centuries (Lowenfeld, 1973). However, students with visual impairments have a long history of placement in local community schools, beginning with "braille classrooms" in public schools in 1908 and "sight saving" classrooms for students with low vision (Aylesworth, 1938; Hatlen, 2000). The extent to which students with visual impairments were integrated into classrooms with their sighted peers varied greatly between programs, and so it is difficult to identify a typical or predominant model of service delivery for students educated outside of the specialized school setting at that point in time (Hatlen, 2000). Despite variation in time spent in the general education classroom, there was an early recognition that the braille or sight-saving teacher and the general education classroom teacher had a shared responsibility in delivering educational programming (Kornitzer, 1947).

Population Shift and Steps Toward Mainstreaming

The first half of the 20th century saw a major shift in the etiology of visual impairment in the population of students in North America. The use of high-oxygen incubation in the case of premature birth was, unbeknownst to medical professionals of the day, resulting in hyper-vascularization of the retina in a small proportion of surviving infants – a condition originally known as retrolental fibroplasia (RLF) and today as retinopathy of prematurity (ROP; Hatlen, 2000). When these infants came of school age in the 1940s, 50s, and 60s, educators and administrators were presented with a dilemma: contemporary models of service delivery could not effectively program for such large cohorts of students. Previously, students in small or rural communities might attend a specialized school setting, and students from larger urban centers might be congregated in one of the full- or part-time day programs mentioned earlier. There were now sufficient numbers of students across the United States and Canada to warrant the provision of specialized instruction in local schools, where students with visual impairments would be integrated into classrooms with their same-aged sighted peers.

The Pine Brook Report

In 1954, the American Foundation for the Blind published The Pine Brook Report – a monograph of the proceedings of one of the first professional conventions of leaders in the field of education of students with visual impairments. At the time of publication, The Pine Brook Report was unique in that it detailed best practice for TSVIs and administrators across models of service delivery for students with visual impairments, from full integration in local schools to full-time placement in a specialized school setting. However, with service delivery trends shifting toward integration to meet the needs of successive cohorts of students with RLF/ROP, the integrated models in The Pine Brook Report were of significance to the majority of practicing TSVIs.
Estimates from 1960 put the percentage of students with visual impairments educated in community schools at approximately 53% of all students with visual impairments enrolled in public education at the time, a figure that would climb to 76% in 1987 (Hatlen & Curry, 1988). The Pine Brook Report outlined three models of service delivery for implementation in local education settings:  • The Cooperative Plan. Here, the student is enrolled in a special classroom with other students with visual impairments for part of the day. A TSVI provides adaptive instruction in this classroom, and the student is integrated into classrooms with her same-aged sighted peers for the balance of the academic day.  • The Integrated Plan. Here, the student is enrolled full-time in the general education classroom. A TSVI is based at the school in a resource room, which the student attends as necessary. The TSVI is also available to consult in the general education classroom.  • The Itinerant Teacher Plan. In this model, the student is enrolled full-time in the general education classroom. The TSVI travels from school to school, serving a caseload of students from kindergarten through to the end of secondary school. Service is delivered via the cooperative efforts of the classroom teacher and TSVI. Various iterations of these three models of integrated service delivery remain in use today, alongside non-integrated placement options such as full- or part-time placement at a specialized school for students with visual impairments. However, the Itinerant Teacher Plan is employed in the majority of LEAs, as evidenced by the number of students with visual      22 impairments placed in general education classrooms for the entire school day (Ferrell, 2007). In a report to the U.S. Congress on the implementation of IDEA, the U.S. Department of Education (2014) noted that 64.7% of students with visual impairments served under IDEA in 2012 spent, on average, 80% or more of the school day inside the general education classroom. Legislative shifts in the 1970s and 1980s, mirroring the wider societal movement toward greater integration of individuals with disabilities, cemented a trend toward maximizing inclusive placement for students with visual impairments (Hatlen & Curry, 1988; Kavale & Forness, 2000). It is important to note that a general move toward greater integration of students with visual impairments predates these laws, and represented one of the first applications of a more inclusive, democratic outlook in special education. In 1941, Lowenfeld (1941/1983) stated that “the inclusion of blind children in the normal educational process is an outgrowth of democratic principles and constitutes the last great step in the history of the education of the blind” (1981, p. 6). Perspectives such as Lowenfeld's were in keeping with the later process of mainstreaming. Federal and provincial laws, such as P.L. 94-142 in the United States (Larrivee & Cook, 1979) and Bill 83 in Ontario (Morgan, 2003) are early examples of legislation that supported the process of mainstreaming. Mainstreaming refers to the "placement of a child with disabilities for some portion of the school day in a regular classroom with nondisabled classmates, often, for nonacademic subjects" (Hatlen, 2000, p. 21). The concept of mainstreaming is consistent with the integrated modes of service delivery outlined in The Pine Brook Report, most notably in the contexts of the cooperative and integrated plans. 
These models of service delivery enable the educational team to flexibly distinguish between content that the student can learn alongside sighted peers, and that which requires more intensive instruction from the TSVI in a separate environment (i.e., resource room). Mainstreaming places much of the onus on the student to      23 adapt to the physical, pedagogical, and socioemotional context of the general education classroom (Lindsay, 2007).  Thus, if the unique needs of the student cannot be met in the mainstream classroom, external supports are required to provide the student with the skills, knowledge, and tools needed to be able to be an equal participant in the general education classroom program.  From Mainstreaming to Inclusion and Universal Access  Beginning in the early 1980s in North America, further legislated shifts in special education policy refined the concept of mainstreaming by shifting the onus for adaptation from the student to the general education environment (Erten & Savage, 2011). Known chiefly as inclusion, this philosophical outlook on special education stresses that "schools are responsible for examining environmental factors such as regular classroom dynamics rather than focusing merely on the deficits of individual students" (Erten & Savage, 2011, p. 222). The principle of inclusion seeks to create accessible learning environments for all learners, viewing the educational implications of exceptionality as the result of the complex interplay of individual differences and environmental barriers. Thus, the onus for adaptation to promote inclusion resides with both the learner and the learning environment. This contrasts with the earlier process of mainstreaming, where special education programming is charged with providing the student with a visual impairment with the tools, strategies, and knowledge to adapt to the demands of the general education environment without a corresponding requirement that the environment adapt to the needs of the learner.  Recent Trends in Inclusive Education for Students with Visual Impairments As the legal mandate of inclusion has found wider application in North American LEAs, the itinerant model has become the prevailing means of special education service delivery for      24 students with visual impairments. In a survey of teachers in 62 U.S. LEAs in the late 1980s, Harley, Garcia, and Williams (1989) found that the most commonly cited educational placement for students with visual impairments was in a general education classroom served by an itinerant TSVI. The proportion of students with visual impairments placed in inclusive settings continues to increase. In a survey of special education administrators from across the United States, Arick and Krug (1993) found that of those supervising the educational programming of students with visual impairments, 53% noted that the primary placement for these students in their district was in the general education classroom. In a more recent review of student enrolment data in one U.S. state, Wall and Corn (2004) noted that 98% of students with visual impairments in that state attended inclusive classrooms in public LEAs in the 2001-2002 academic year. When placed in the LEA, students with visual impairments spend an increasing proportion of the academic day in the general education classroom. According to the U.S. Department of Education's (2008) report to the U.S. 
Congress, 57% of students with visual impairments spent 80% or more of the school day in the general education classroom, compared to 48.1% in 1997. Corresponding federal data from Canada are not available.

The trend toward educating a greater proportion of students with visual impairments in inclusive settings has continued to the present day. The fact that the majority of students with visual impairments are placed in general education classrooms suggests that these students are, for the most part, interacting with professionals who have little or no experience in serving students with visual impairments (Wall, 2002). This also implies that administrators with a similarly low level of familiarity with the educational implications of visual impairment oversee the educational programming of the majority of students with visual impairments (Alonso, 1990; Brown & Glaser, 2014).

TSVI Workload Determination: The Role of the Special Education Administrator

Special education administration is positioned at the intersection of the fields of special education, general education, and educational administration (Lashley & Boscardin, 2003). Special education administrators are charged with "supervising and evaluating educational programs in general, and individual programming in particular" (Crockett, 2002, p. 163). According to Thompson and O'Brian (2007), the administrator's role goes beyond financial and procedural concerns in that he or she is "responsible for cultivating an organizational culture where professional staff are committed to teaching students with special needs using the best available instructional practices and achieving the best possible educational outcomes" (p. 34). In general, the special education administrator "work[s] in school districts to lead, supervise, and manage the provision of special education and related services for students with disabilities" and is responsible for overseeing the implementation of federal, state/provincial, and local "policies and procedures that stipulate a free appropriate public education in the least restrictive environment for all students with disabilities" (Lashley & Boscardin, 2003, p. 63).

For the purpose of the current study, "special education administrator" refers to administrative personnel responsible for supervising the work of itinerant TSVIs to ensure that high-quality educational programs are in place for students, and that these programs are in compliance with local policies and procedures as well as state/provincial and federal regulations (Bakken, O'Brien, & Shelden, 2006). Special education administrators are typically responsible for overseeing the practice of itinerant personnel, as these staff members operate across several school sites. In her survey of special education administrators in one U.S. state, Isaac (2014) noted that 84.2% of respondents indicated that they were directly responsible for evaluating the work of itinerant staff. Prior to examining the various contexts in which special education administrators make decisions regarding TSVI workload, it is first necessary to clearly define the content of that workload. The following section details the role and function of the itinerant TSVI.

The Role and Function of the Itinerant TSVI

The majority of TSVIs in North America serve students on an itinerant basis - approximately 90% of students with visual impairments in the United States are educated via the itinerant model of service delivery (Hatlen, 2000; Spungin & Corn, 2003).
Within the itinerant model of service delivery, the TSVI provides services to students with visual impairments via three modes, as described by Koenig and Holbrook (2000b): (1) direct instruction, (2) consultative service, and/or (3) indirect service. The following sections draw on peer-reviewed research, best practice guidelines, and foundational textbooks to provide a detailed profile of the professional responsibilities of the itinerant TSVI in North American LEAs. The purpose of these sections is to provide the reader with an account of the typical workload of the itinerant TSVI and to provide scope to the educational programming factors that may inform the process of TSVI workload determination (see Table 2.1).   Direct instruction An itinerant teacher provides direct instruction when he or she spends a portion of time actively engaged with the student (Allinder, 1994). Given his or her specialized training and expertise in visual impairment, the TSVI is primarily responsible for providing direct instruction in the use of adapted tools and strategies designed to mitigate the impact of vision loss on learning and development (Lewis & Allman, 2000). Visual impairment constrains the student's ability to learn incidentally by observing other peers and adults, as well as on his or her ability to gain access to new learning opportunities (Koenig & Farrenkopf, 1997). As a result, the TSVI      27 provides direct instruction in nine disability-specific skill areas to ensure that students with visual impairments have access to the same academic and non-academic learning opportunities as their sighted peers (Hatlen, 2009). Together, these skill areas are referred to as the Expanded Core Curriculum (ECC; Hatlen 1996).  The TSVI provides direct instruction in areas of the ECC since most classroom teachers possess neither the instructional time nor expertise to deliver the specialized intervention required by students with visual impairments (Koenig & Holbrook, 2000b). By placing a student with a visual impairment in an inclusive classroom, the educational team is committing to maximize that student's potential for learning as well as his or her meaningful participation in that setting. While this commitment is fully supported by the ideals of inclusive education, research notes equivocal findings on the realization of this commitment and on the efficacy of full-time placement in general education classrooms for students with disabilities (Hocutt, 1996; Zigmond, 2003). Full-time placement in inclusive settings can be effective for some students, particularly those with high-incidence disabilities (Manset & Semmel, 1997). However, emphasizing the "place" of instruction over other factors "leads one to accept the mainstream curriculum […] as immutable and defines the goal of special education as access" (Zigmond & Baker, 1995, p. 246). Given that the unique needs of students with visual impairments extend beyond simple access to the learning environment, more intense and specialized intervention is required than that which can be provided in the general education classroom (i.e., "push-in" support; Spungin & Ferrell, 2007). Some specialized instruction will require that the TSVI and student work in a separate space in the school (i.e., "pull-out" support). 
Pull-out instruction may be required for a number of reasons, including the functional limitations of the classroom environment (e.g., lack of fixtures/appliances for instruction in kitchen safety), concerns about      28 attention/distraction (e.g., student will be less able to focus in the busy classroom, or the content of the lesson will be distracting to sighted peers), or socio-emotional concerns (e.g., the student does not want to draw attention to his or her visual impairment; Rosenblum, 2000).  The frequency and intensity of direct instruction is determined by the assessed needs of the student, and is explicitly outlined in the student's IEP (Lewis & Allman, 2000). This process is highly individualized and as a result, there are currently no data-driven estimates on the time required to provide specialized instruction (Sapp & Hatlen, 2010). For instruction in areas of the core curriculum, there are service level guidelines derived from surveys of experts in teaching braille literacy skills (i.e., Koenig & Holbrook, 2000a) and teaching literacy skills to students with low vision (i.e., Corn & Koenig, 2002). The TSVI has a mandate to consider the instructional needs of the student requiring direct instruction, and to work with administration to ensure that an appropriate type and level of service is achieved (Lewis & Allman, 2000). Consultative Service In addition to direct instruction, the TSVI also provides consultative support to students' interdisciplinary teams (Koenig & Holbrook, 2000b). The composition of the interdisciplinary team for a student with a visual impairment placed in an inclusive setting will vary based on the individual needs of the student, but is typically composed of the student's family, school-based staff (e.g., classroom teacher, resource teacher) and other itinerant specialists/therapists (e.g., orientation and mobility specialist, occupational therapist; Topor, Holbrook, & Koenig, 2000). The TSVI consults with the classroom teacher and other members of the interdisciplinary team to ensure the accessibility of the student's programming at school. For example, the TSVI works with the classroom teacher to promote social inclusion (Brown, Packer, & Passmore, 2013), or with paraprofessionals to ensure the student receives a level of additional support that is      29 consistent with the student's abilities and needs as a learner (Lewis & McKenzie, 2009; McKenzie & Lewis, 2008).  Working with paraprofessionals. Students with visual impairments, particularly those with additional disabilities, may be assigned service hours from a paraprofessional for all or part of the instructional day (McKenzie & Lewis, 2008). The role of the paraprofessional can be complex and depends on the unique needs of the student, as well as on institutional and administrative issues (e.g., to compensate for insufficient TSVI service hours; MacCuspie, 2002). The results of paraprofessional surveys indicate, however, that most are responsible for material adaptation and assisting the classroom teacher or TSVI by providing instructional support to students (Griffin-Shirley & Matlock, 2004; McKenzie & Lewis, 2008). The TSVI trains paraprofessionals in specific instructional skills and strategies, models these skills and strategies, and maintains active communication with the paraprofessional to monitor progress (Lewis & McKenzie, 2009).  Accessibility of the classroom environment. 
The TSVI will support inclusion by working with the classroom teacher to ensure the continued use of adaptive tools and strategies by the student (Olmstead, 2005). The TSVI will also consult with the classroom teacher to make the learning environment accessible for the student with a visual impairment (Koenig & Holbrook, 2000b). The nature of this consultative support will depend on the needs of the student and classroom teacher, as well as on the physical layout and design features of the classroom (e.g., lighting, storage). In addition to providing material support to the classroom teacher, the TSVI will also consult with the classroom teacher on instructional strategies. For example, the TSVI may work with the classroom teacher to ensure that instructional language in the classroom is accessible to the student and includes sufficient detail to promote comprehension (Perez-Pereira & Castro, 1997).

Indirect service

Finally, TSVIs are responsible for conducting and interpreting specialized assessments related to visual functioning, and for disseminating results and recommendations to the educational team (Koenig et al., 2000). In addition to specialized assessments, the TSVI works with the student's family to ensure that strategies and tools that are used at school can also be used at home. The TSVI will also work to connect the family with service organizations and community-based groups serving individuals with visual impairments (Lewis & Allman, 2000).

Materials in alternate formats. To ensure that students with visual impairments have access to the same learning opportunities as their sighted peers, the TSVI will procure materials in alternate formats (e.g., braille, digital audio files) based on the assessed needs of the student (Koenig & Holbrook, 1995). In addition to procuring materials in alternate formats, the TSVI may also be responsible for producing these materials (Herzberg & Stough, 2007). In the United States, TSVIs obtain materials in alternate formats primarily from state instructional materials resource centers (IMRCs), specialized schools for students with visual impairments, or directly from LEAs (Wall & Corn, 2002). Despite some jurisdictional differences (e.g., provincial resource centres vs. IMRCs), TSVIs working in Canada obtain materials in alternate formats from a similar range of sources (Zuvela, 2009).

Distinguishing Between Caseload and Workload

As noted in the previous section, the role and function of the itinerant TSVI are delineated by the three modes of service delivery (i.e., direct instruction, consultative service, and indirect service). However, mode is only one of two important considerations for itinerant service delivery. The other is the level of service, or the frequency and intensity of service provided to students with visual impairments. Traditionally, service level has been examined in the context of the raw number of students served by an itinerant TSVI, as well as by the degree of students' vision loss and/or presence of additional disabilities (Olmstead, 2005). However, there is growing recognition in the field of special education that the total number of students with special needs served is not an accurate indicator of the sum of the professional duties of a special education professional (Suter & Giangreco, 2009).

There are two central reasons for which the number of students served (i.e., caseload) is problematic as a means of quantifying the workload of TSVIs.
First, as detailed in an earlier section, the population of students with visual impairments is increasingly heterogeneous and, as a result, there is significant variation across students in terms of individualized educational priorities. A tally of students served does not adequately represent the heterogeneity of student need the TSVI will encounter. Second, the role of the TSVI includes some duties that are not subsumed under direct and consultative service delivery. For example, the itinerant TSVI will spend a significant portion of the academic day in transit, particularly those TSVIs working in large rural districts (Bina, 1987). The average duration of travel time for an itinerant TSVI is 1.4 hours per day (Griffin-Shirley et al., 2004). Assuming a seven-hour academic day, travel time amounts to an average of 20% of the academic day. This significant use of time is not accounted for in a simple caseload figure. The issue of travel between school sites highlights another important dimension of itinerant service delivery – time. The workload of an itinerant TSVI varies not only in terms of its function (i.e., modes of service delivery) but also in terms of the instructional time allotted to carry out that function. This is another inadequacy of caseload size as a metric of TSVI professional practice, in that it provides no indication of the time the TSVI has to discharge his or her professional duties.

The inadequacy of caseload size as a means of quantifying the professional work of TSVIs is not unique to service delivery for students with visual impairments. Speech-language pathologists (SLPs) working in the K-12 education system serve students predominantly on an itinerant basis (Hutchins, Howard, Prelock, & Berlin, 2010). SLPs report that in the last decade caseload demands have become increasingly complex as the scope of practice shifts to meet the growing demands of diverse school populations (Woltmann & Camron, 2009). In 2002, the American Speech-Language-Hearing Association (ASHA) published a position statement that differentiated between SLP caseloads and workloads. A "caseload" refers to the number of students with IEPs or Individualized Family Service Plans (IFSPs) served by SLPs through direct or indirect service delivery options, while a "workload" refers to all professional activities required of SLPs in the K-12 education system (ASHA, 2002). The position statement discourages administrative efforts to set maximum caseloads, noting that an "arbitrary caseload maximum is inconsistent with a workload analysis approach to setting caseload standards" (ASHA, 2002). Instead, special education administration must "implement a workload analysis approach to setting caseload standards that allow SLPs to engage in the broad range of professional activities necessary to meet individual student needs" (ASHA, 2002). Therefore, there is growing recognition among professional organizations that a distinction should be made between the caseloads and workloads of itinerant related service professionals and teachers in special education.

Caseload Analysis Tools

In the professional TSVI community, several caseload analysis tools may be used to determine appropriate service levels for students with visual impairments. A caseload analysis tool in wide use across the United States is The Michigan Vision Services Severity Rating Scale (VSSRS; Michigan Department of Education, 2013a).
The purpose of the VSSRS is to assist in the determination of TSVI service levels by “correlate[ing] the degree of need for intervention/instruction from a [TSVI], based on the severity of a student’s visual impairment and educational needs” (Michigan Department of Education, 2013a, p. 4). The VSSRS consists of seven categories on which the TSVI rates the severity of the student’s need for service: Level of vision, functional near vision, reading medium, low vision devices/technology, material preparation, compensatory skills, and communication with student’s team/pertinent individuals. Severity ratings on a scale from NONE to PROFOUND each have an associated score (i.e., 0-4). The sum of these scores is combined with the score from a list of contributing factors (e.g., whether the student’s visual condition is degenerative or stable) to arrive at the final severity of need score. This final score is then converted to a frequency of TSVI service estimate using a conversion table listed in the VSSRS. A second set of ratings scales, The Michigan Vision Services Severity Rating Scale for Students with Additional Needs (VSSRS+), is similar to the VSSRS but is specifically tailored to estimate TSVI service levels for students with visual impairments and additional disabilities (Michigan Department of Education, 2013b). There is evidence supporting the reliability and validity of the VSSRS and VSSRS+. Wall Emerson and Anderson (2014) asked 65 TSVIs to apply the VSSRS and VSSRS+ to two fictional student scenarios and report on how closely rating scale results matched service levels provided to actual students. For the VSSRS, 86% of participants responded that the rating scale      34 results were “very close” or “somewhat close” to service levels for actual students, compared to 71% for the VSSRS+. When participants were asked to account for the inability to enact the service level estimates obtained from the VSSRS, 24% indicated that the primary reasons were unexpected variables and 31% due to “other” variables. Using the VSSRS+, 14% attributed the inability to realize the service delivery estimate to unexpected variables and 33% to “other” variables. These results indicate that while the VSSRS and VSSRS+ provided mostly valid service level estimates, there may be some factors that are not considered in each document that impact service delivery for students with visual impairments.  In addition to the VSSRS and VSSRS+, researchers have sought to determine the validity of another caseload analysis tool, the Visual Impairment Scale of Service Intensity of Texas (VISSIT; Pogrund, Darst, & Munro, 2015). The VISSIT is a scale used “to determine visual impairment service time based on the needs of students with visual impairments in all areas of the [Expanded Core Curriculum]” (Pogrund  et al., 2015, p. 435). Twenty-five TSVIs were asked to complete at least one VISSIT scale on an actual student whom they currently served. In assessing the consequential validity of the VISSIT, the researchers asked participants to rate the degree to which the results of the tool corresponded to their practice as TSVIs. Seventy-five percent of participants agreed that the results of the VISSIT corresponded to their professional judgement of recommended TSVI service time for that student. Estimates of social validity were high, with 96% of participants indicating that they would use the VISSIT to estimate service levels for students in the future.  
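The general scoring pattern described above for the VSSRS (severity ratings summed across categories, adjusted for contributing-factor points, and converted to a service-frequency estimate) can be expressed compactly in code. The sketch below is illustrative only: the category names follow the description above, but the example ratings and the conversion bands are hypothetical placeholders, not the published VSSRS conversion table, which should be consulted for actual determinations.

```python
# Sketch of the general scoring pattern described for the VSSRS: sum severity
# ratings (0-4) across the seven categories, add contributing-factor points,
# and convert the total to a service-frequency estimate. The conversion bands
# below are hypothetical placeholders, NOT the published VSSRS table.
CATEGORIES = [
    "level of vision", "functional near vision", "reading medium",
    "low vision devices/technology", "material preparation",
    "compensatory skills", "communication with team",
]

def estimate_service_level(severity_ratings, contributing_factor_points):
    """Combine category ratings and contributing factors into a service estimate."""
    if set(severity_ratings) != set(CATEGORIES):
        raise ValueError("Provide one rating (0-4) for each category.")
    total = sum(severity_ratings.values()) + contributing_factor_points
    # Hypothetical conversion bands, for illustration only.
    if total >= 25:
        return total, "direct service several times per week"
    if total >= 15:
        return total, "direct service approximately weekly"
    if total >= 8:
        return total, "regular consultative service"
    return total, "periodic monitoring/consultation"

# Example: invented ratings for a braille reader with a stable eye condition.
ratings = {
    "level of vision": 4, "functional near vision": 4, "reading medium": 4,
    "low vision devices/technology": 3, "material preparation": 4,
    "compensatory skills": 3, "communication with team": 2,
}
print(estimate_service_level(ratings, contributing_factor_points=1))
```

The value of tools such as the VSSRS lies less in the specific numbers than in making the basis for a service-level recommendation explicit and reviewable by the educational team and administration.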
Preliminary studies have found these caseload analysis tools to be moderately reliable and valid. However, with the VISSIT intently focused on students’ need for instruction in areas of the ECC and the VSSRS/VSSRS+ rating the severity of need based mostly on student-focused      35 factors, it may be that these tools do not consider the broader scope of the itinerant TSVI’s workload. Caseload analysis has been incorporated into program accountability for itinerant TSVI service delivery. Quality Programs for Students with Visual Impairments (QPVI) is a multifaceted program accountability process that engages students, parents, teachers, and administrators in ongoing self-assessment of service levels (Toelle & Blankenship, 2008). The analysis of caseload size is only one of several criteria that enter into determinations of TSVI workloads. The process of QPVI recognizes that the TSVI workload is not accurately reflected as solely the sum of individual service levels for students, and has been implemented in school districts across several U.S. states (Toelle & Blankenship, 2008). In order to use the most accurate terminology that reflects both current trends in the field and the realities of itinerant service delivery, the current study will follow the "caseload" and "workload" distinction outlined in the ASHA (2002) position statement. It should be noted, however, that the term “caseload” is used throughout the chapters that follow. While the term “workload” is the more informative term that refers to the sum of the TSVI’s professional responsibilities, “caseload” has use in describing the set of students to whom the TSVI is assigned. Despite its limitations, the term “caseload” continues to have value when reporting the results of research studies that employ “caseload” as a means of labelling the set of students currently served by the TSVI. For this reason, “caseload” will appear occasionally in the sections that follow.  Teacher Challenges in the Itinerant Model of Service Delivery Given the various modes of itinerant service delivery (i.e., direct, consultative, and indirect) in addition to non-instructional duties (e.g., paperwork, travel), the workload of the itinerant TSVI is diverse (Lewis & Allman, 2000). The execution of professional duties within      36 this workload, in turn, presents several unique challenges to itinerant personnel. Several researchers have surveyed TSVIs (e.g., Brown & Beamish, 2012; Suvak, 1999) in order to elucidate and contextualize these challenges. Correa-Torres and Howell (2004) interviewed 23 itinerant TSVIs on the positive/rewarding aspects of itinerant teaching, as well as the inherent challenges. The following sections outline Correa-Torres and Howell's findings and connect them to those of other studies of itinerant service delivery. Travel and Non-Instructional Duties Travel is inherent to the job of the itinerant teacher. These teachers frequently identify travel and non-instructional duties (e.g., paperwork) as factors that limit the number of hours devoted to direct or consultative service delivery (Correa-Torres & Howell, 2004; Luckner & Ayantoye, 2013). As mentioned earlier, travelling between school sites can occupy a significant portion of the itinerant teacher's workday, particularly for TSVIs working in rural areas (Bina, 1987). In addition to travel time, other non-instructional duties require the attention of the TSVI. 
Many of these tasks (e.g., obtaining clinical reports from ophthalmologists) can be characterized as "indirect service" according to the tripartite model of TSVI service delivery (Koenig & Holbrook, 2000). TSVIs in Correa-Torres and Howell's (2004) sample indicated that the task of completing administrative paperwork and report writing occupied a significant portion of non-instructional time. In general, special education teachers characterize the completion of excessive reports and other paperwork as burdensome (Vittek, Floyd, & Hayes, 2013). In a large-scale observational study of special education teachers in the Southwestern United States, Vannest, Hagan-Burke, Parker, and Soares (2011) found that teachers spent an average of 17% of the academic day completing paperwork. A specific estimate on the average time spent completing paperwork is not available for itinerant TSVIs. However, Griffin-Shirley et al. (2004) noted      37 TSVIs reported an average of eight hours per week devoted to non-teaching responsibilities (e.g., meetings, completing paperwork).  Meeting the Disability-Specific Learning Needs of Students According to the teachers interviewed by Correa-Torres and Howell (2004), the students they served had a wide range of educational needs – from minor adaptations related to low vision to more extensive and intensive support for students who are blind or have a visual impairment and additional disabilities. The researchers noted that teachers' "love for the job seemed to be intensified by the challenging nature of the work" (p. 429). Interestingly, the most rewarding aspect of itinerant teaching was directly linked to the most pervasive challenge identified by teachers – lack of instructional time with students. Teachers enjoyed providing direct instruction to students with visual impairments, but when faced with the challenging diversity of student needs, did not feel that they had enough time with each student to provide effective instruction to adequately address those needs (Correa-Torres & Howell, 2004).  Research into the roles and responsibilities of TSVIs indicates that these teachers may not have sufficient time to provide instruction to address students' disability-specific needs. Wolffe et al. (2002) conducted an observational study of 18 itinerant TSVIs from several U.S. states, all of whom provided service to students with visual impairments. Analyses of the observational data indicated that most of teachers' instructional activities were devoted to academic support in areas of the core curriculum. These included academically-oriented activities (27%), tutoring (14%), and communication skills (e.g., use of computers, instruction in the braille code; 18%). As mentioned earlier, students with visual impairments have additional learning needs beyond those of the core academic curriculum (i.e., the ECC; Hatlen, 1996). Wolffe et al. (2002) noted that teachers spent comparatively little time addressing students' learning in areas of the ECC –      38 socio-emotional skill development (9% of observed activities), sensory motor skills (8%), skills for independent living (7%). Wolffe et al. (2002) attributed these findings to the limited instructional time the TSVI had with the student, as well as a lack of recognition of students’ unique learning needs on the part of parents, classroom teachers, and administrators. According to Wolffe et al. 
(2002), TSVIs working in inclusive settings are spending more time supporting students' core academic needs to the detriment of instructional time in the areas of the ECC. These authors noted that TSVIs "are not uniformly providing the quality and quantity of disability-specific services that are deemed appropriate for educating students with visual impairments in public school programs" (p. 302). Other surveys of TSVIs focused on instruction in specific areas of the ECC (i.e., self-determination skills) have also concluded that teachers are not providing instruction in these areas, often because students are perceived to have more pressing instructional needs elsewhere (Agran, Blankenship, & Hong, 2007). In the context of the current study, factors relating to students' instructional needs in the ECC, as well as other educational programming factors implicated in itinerant TSVI workloads, are listed in Table 2.1.

Table 2.1
References for Initial Educational Programming Factors

EDU1. The total number of new students entering the LEA who qualify for service from a TSVI. [Michigan Department of Education (2013a)]
EDU2. The total number of students who are currently receiving service from a TSVI in the LEA. [Corn & Spungin (2003); Griffin-Shirley et al. (2004)]
EDU3. The number of students who use braille as their primary literacy medium in the LEA. [Koenig & Holbrook (2000a)]
EDU4. The number of students who use print as their primary literacy medium in the LEA. [Corn & Koenig (2002)]
EDU5. The number of students with deafblindness in the LEA. [Riggio & McLetchie (2008)]
EDU6. The number of students with visual impairment and additional disabilities in the LEA. [Griffin-Shirley et al. (2004); Michigan Department of Education (2013b)]
EDU7. The amount of preparation time required by TSVIs in the LEA. [Correa-Torres & Howell (2004)]
EDU8. The amount of time needed for TSVIs to complete indirect service tasks (e.g., report writing, team meetings, liaising with community-based organizations). [Correa-Torres & Howell (2004); Griffin-Shirley et al. (2004)]
EDU9. The amount of time needed for TSVIs to complete grant proposals for curriculum expansion, including the acquisition of new teaching materials/technology. [Spungin & Ferrell (2007)]
EDU10. Input from advocacy groups regarding the level of service for individual students with visual impairment in the LEA. [Pugh & Erin (1999); Spungin & Ferrell (2007)]
EDU11. Input from parents regarding the level of service for individual students with visual impairment in the LEA. [Lewis & Allman (2000)]
EDU12. Results of a formal caseload analysis process conducted at LEA-level. [Wall Emerson & Anderson (2014); Pogrund, Darst, & Munro (2015)]
EDU13. Results of specialized assessments of student functioning conducted at LEA-level (e.g., Functional Vision Assessment, Learning Media Assessment). [Bowen & Ferrell (2003); Michigan Department of Education (2013a); VISSIT (2014)]
EDU14. Information on the current visual functioning of individual students from medical reports. [Michigan Department of Education (2013a)]
EDU15. Information on the prognosis for the visual conditions of individual students in the LEA (e.g., progressive vision loss). [Michigan Department of Education (2013a)]
EDU16. Information on the core academic needs (e.g., Mathematics, Science) of individual students in the LEA. [Wolffe, Sacks, Corn, Erin, Huebner, & Lewis (2002)]
EDU17. Information on the disability-specific (i.e., Expanded Core Curriculum) needs of individual students in the LEA. [Pogrund, Darst, & Munro (2015); Wolffe, Sacks, Corn, Erin, Huebner, & Lewis (2002)]
EDU18. The availability of assistive technology for students accessing learning materials through vision (e.g., ZoomText, MAGic). [Smith et al. (2009)]
EDU19. The availability of assistive technology for students accessing learning materials through non-visual modalities (e.g., braille notetaker, text-to-speech software). [Smith et al. (2009)]
EDU20. The availability of opportunities for non-academic instruction (i.e., Expanded Core Curriculum) in the home provided by community-based organizations. [Pugh & Erin (1999)]
EDU21. The availability of opportunities for individual students in the LEA to attend camps and short-term programming provided by community-based organizations. [Pugh & Erin (1999)]
EDU22. The availability of short-term placement opportunities for individual students in the LEA at a specialized school or center for students with visual impairment. [Pogrund, Darst, & Boland (2013)]

Number of Students Served - Implications for Workload

As noted earlier, caseload size is a poor proxy for a TSVI's total workload on account of the individualized programming needs of students and the non-instructional duties that are not reflected in caseload figures. However, researchers have made use of caseload size as an expedient means of quantifying special education workloads for the purpose of statistical analysis (e.g., Algozzine, Hendrickson, Gable, & White, 1993; Russ et al., 2001). This approach is also found in the literature devoted to TSVI service delivery, where researchers routinely gather data on teachers' caseload size as an indication of workload (e.g., Griffin-Shirley et al., 2003; Johnstone, Thurlow, Altman, Timmons, & Kato, 2009) or as an indicator of the scope of teachers' professional practice in terms of the number and profiles of students served (e.g., chronological age and/or level of functional vision of students; Murphy, Hatton, & Erickson, 2008). Reported in this research is the perception of unmanageable caseloads as intuitively connected to educators' perceived inability to meet the disability-specific needs of learners. As the number of students on a special education caseload increases, the teacher has less instructional time to devote to each student (Katz et al., 2010).

Despite being a poor indicator of the actual workload of itinerant TSVIs, caseload size has traditionally been reported in the research literature. While there is a growing movement among organizations of itinerant professionals to widen the scope of how workload is measured and assessed, most extant studies use caseload size as the predominant metric for estimating TSVI workload. Unmanageable caseload size is a challenge noted by many of the teachers in Correa-Torres and Howell's (2004) sample. The average caseload size among these teachers was 17 students. Most researchers have reported a wide range in caseload sizes (e.g., 1-100 students; McKenzie & Lewis, 2008), with means ranging from 14 (Kirchner & Diament, 1999) to 22 students (Griffin-Shirley et al., 2004). However, these figures do not convey the complexity of each student's educational programming requirements.
Instead, these caseload figures provide only a superficial indication of a TSVI's workload by listing the total number of students to which the teacher must devote some amount of instructional/consultative service.      42 Summary of Challenges in the Itinerant Model of Service Delivery Travel/non-instructional duties, number of students served, and programming for students' unique learning needs are all fundamentally related to time. Challenges arise when there are insufficient service hours available to the TSVI to address learning needs. Recent surveys of TSVIs continue to highlight time allocation as the most challenging aspect of itinerant service delivery. Griffin-Shirley, Pogrund, and Grimmett (2011) conducted an online survey of 108 TSVIs who are also certified Orientation and Mobility (O&M) specialists. When asked about what they found most challenging about their positions, "time issues" were most frequently identified. Griffin-Shirley et al. (2011) also conducted telephone surveys with 30 participants who consented to a more in-depth interview. Only five of these interviewed participants reported spending more than 10 hours per week teaching skills in the areas of the ECC. While these findings may be attributed to these professionals' dual role as a TSVIs and O&M specialists, observational studies of TSVIs have reported similarly low levels of instructional time devoted to the areas of the ECC (Wolffe et al., 2002).  Policy-Level Determinants of Itinerant TSVI Workload Of the various challenges to effective itinerant service delivery identified in the research literature, most are intrinsically related to instructional time. Time is one of the most critical factors underlying a TSVI's workload, since the amount of time that the TSVI can devote to the specialized needs of students will, in part, determine the effectiveness of that student's educational program (Corn & Spungin, 2003). Studies of itinerant TSVIs (e.g., Correa-Torres & Howell, 2004; Griffin-Shirley et al., 2003; Wolffe et al., 2002) indicate a lack of instructional time devoted to the disability-specific learning needs of students. Thus, the scope and overall effectiveness of educational programming for students with visual impairments in inclusive      43 settings may be compromised, in part, by unmanageable TSVI workloads. As an education professional, the TSVI engages his or her expertise and experience to determine the most effective use of his or her instructional time with the students on his or her caseload. However, the TSVI's workload, that is, the sum of his or her professional responsibilities and the time in which he or she has to discharge them, is determined by staffing and workforce decisions made by administration. Special education administrators have an obligation, under IDEA in the U.S. and various Provincial Education Acts in Canada, to ensure that the placement of students with visual impairments into educational programs, and the quality of programming therein, not be based "solely on factors such as category of disability, significance of disability, availability of special education and related services, availability of space, configuration of the service delivery system, or administrative convenience [italics added]" (Heumann, 1996, p. 77).  As outlined in Chapter One, there is little extant data on the factors that enter into administrative decision-making processes regarding TSVI workload. 
The following sections outline the educational policy and programming, and special education legislation that apply to the determination of TSVI workload. These factors, drawn from relevant legislation and research literature, inform the content of the first survey iteration of the current Delphi study (see Table 2.3). Educational Policy Factors  Given the low incidence of visual impairment among students receiving special education services in the United States and Canada, it is likely that most administrators will not have experience supervising the professional work of a TSVI (Ban & Masoodi, 1980; Brown & Glaser, 2014). In a study of 408 administrators, Praisner (2003) found that 36.1% of the sample reported no previous experience working with students with visual impairments. Therefore, it is      44 unsurprising that researchers have noted a general dearth of policy and programming knowledge among special education administrators related to visual impairment. Müller (2006) conducted a policy analysis on state infrastructures and programs for students with visual impairments in the United States. Müller interviewed state-level administrative personnel with oversight of state assets and programs devoted to students with visual impairments. Interviewees were asked to provide details on the programs available to these learners in his or her respective state, and to identify barriers to effective service delivery. Interviewees identified "[a] lack of expertise on the part of LEA-level administrators in evaluating teachers of the visually impaired or identifying features of a high quality vision program" (Müller, 2006, p. 20). Therefore, sufficient expertise may not exist among district-level administrators to adequately supervise the educational programs of students with visual impairments in the LEA.  One evident resource for the special education administrator is consultation with the TSVI(s) under his or her supervision. However, special education administrators may have limited access to these professionals. In her survey of 103 early career TSVIs, Seitz (1994) indicated that while 71% of TSVIs reported that they had access to an administrator, only 16% noted that they had an opportunity to interact with that administrator more than once per month. While the TSVI possesses specialized knowledge of the educational implications of visual impairment, the special education administrator may not have sufficient opportunity for meaningful consultation with the TSVI. In response to this lack of expertise among administrators, stakeholder groups and professional organizations have created a number of guideline documents and goal statements that reflect standards for high-quality educational programming for students with visual impairments.       45 Stakeholder position statements and standards. The National Agenda for the Education of Children and Youths with Visual Impairments, Including Those with Multiple Disabilities, first published in 1994 and updated in 2004, is a framework based on the collaboration of professionals and parents of students with visual impairments in the United States (Corn & Huebner, 1998). This document outlines eight goal statements around which advocacy efforts at local, state, and federal levels can be organized. These goal statements allow parents and professionals to align reform movements in the field of education for students with visual impairments with the tenets of the Individuals with Disabilities Education Act (IDEA; Corn & Huebner, 1998). 
To address the size and composition of TSVI workloads, Goal Statement #4 states that "service providers will determine caseloads based on the needs of students" (Corn & Huebner, 1998, p. 20). Therefore, according to the guidelines of the National Agenda, the size of a TSVI's caseload and, by extension, the amount of time he or she can devote to individual students, should be determined by the assessed needs of those students. In Canada, the Education Committee of the National Coalition for Vision Health (2003) drafted similar guidelines. The Canadian National Standards for the Education of Children and Youth who are Blind or Visually Impaired, Including those with Additional Disabilities outlines 13 standards to achieve quality educational programs for all students with visual impairments. Pragmatic examples are provided to contextualize each standard. To address the issue of instructional time devoted to students in Canada, Standard Two prescribes that "the type and frequency of instruction and the services provided by the teacher of students who are blind or visually impaired will be based on the assessed needs of the student and the level of support required within both the home and school environments" (National Coalition for Vision Health, 2003, p. 12). To achieve this standard, the use of a formal caseload analysis process is recommended: "The caseloads assigned to teachers of students who are blind or visually impaired are determined by using a formal caseload analysis which considers the needs of the students, the direct instruction required for each student, preparation time, travel time, related duties such as classroom teacher and parent consultation, organizational and administrative responsibilities, and time for participation in continuing professional development" (National Coalition for Vision Health, 2003, p. 13).  These goal statements and standards recognize the importance of placing the assessed requirements of individual students at the center of the process for determining TSVI workload. Furthermore, the National Coalition (2003) standards recognize other components that better reflect the sum of TSVI professional responsibilities (e.g., travel, preparation time) that stakeholders believe should factor into workload determination. Educational Service Guidelines. In 1999, Pugh and Erin authored a set of educational service guidelines for students with visual impairments. These guidelines are the product of "an advocacy project for the low incidence group of students whose visual impairments present extraordinary challenges in achieving optimum educational growth" (Pugh & Erin, 1999, p. ix). Published under the auspices of the National Association of State Directors of Special Education (NASDSE), the service guidelines detail "needed program elements outlining comprehensive quality programs for students with visual impairments" (Pugh & Erin, 1999, p. xii). Sections such as "Foundations for Education of Students Who Are Blind or Visually Impaired" and "Supportive Structure and Administration" provide special education administrators with an awareness of the components of high-quality educational programs for students with visual impairments in inclusive settings. Despite the availability of these documents, there is evidence that administrators are unlikely to be familiar with the policies and guidelines that apply to high-quality educational programs for students with visual impairments.
Smith, Geruschat, and Huebner (2003) surveyed TSVIs and administrators on the implementation of national and state policies in the United States pertinent to ensuring curriculum access for students with low vision. Sixty-four administrators and 138 itinerant TSVIs were surveyed. When asked to list relevant policies, 74% of administrators and 38% of teachers responded. At the federal level, respondents listed legislation such as the IDEA, Americans with Disabilities ACT (ADA), or Section 504 of the Rehabilitation Act. At the state level, requirements for learning media assessment and "braille bills" were most commonly cited. However, 71% of administrators and 31% of teachers indicated that they believed that policies were being implemented in practice (Smith et al., 2003). There appears to be a significant discrepancy between administrators' views on the implementation of policies guiding educational programs for students with low vision and the views of TSVIs implementing those policies. In summarizing the overall findings of the study, Smith et al. (2003) noted that "important policies and guidelines are not systematically and clearly finding their way into the knowledge base of administrators of these programs, beyond the level of general awareness" (p. 620).  Therefore, special education administrators may lack the specific awareness of the policies and practices that characterize high quality, highly specialized educational programs for students with visual impairments in the Local Education Authority (LEA) or school district. There is concern among researchers that this cursory understanding may result in formulaic solutions to administrative issues (e.g., staffing) that do not take adequate account of the unique needs of students with visual impairments (Corn, 2007). However, there is reason for some optimism. In a qualitative study of special education administrators, Benson (2001) noted that these administrators valued a greater awareness of the professional practice of the itinerant teachers      48 under their supervision. Specifically, administrators believed that a more complete understanding of the specialized content areas taught by these teachers would lend them greater credibility when evaluating teacher effectiveness and workloads (Benson, 2001). Special education administrators in Smith et al.'s (2003) sample who lacked a background in visual impairment echoed this sentiment. These findings indicate an inclination on the part of special education administrators to obtain a greater awareness of the characteristics of high-quality educational programs for students with visual impairments and to apply that knowledge to TSVI workload determination.  Legislative Factors  The scope of the current study incorporates LEAs from both the United States and Canada. However, there are significant structural and legal differences in special education legislation between the two countries (Dworett & Bennett, 2002). As a result, the following section examines legislation in each jurisdiction separately. Caseload Policy. The IDEA does not set specific requirements for caseload size and as a result, policy development and implementation are left to individual state departments of education (Jackson, 2003). In 2003, 31 states had regulations or policies that addressed the size and composition of special educator caseloads (Jackson, 2003). 
Policy reviews concluded that there was significant variation among states in terms of the scope and language of these regulations/policies (Russ et al., 2001). In 2003, NASDSE commissioned a review of state policies on special education caseload and class size. The review found that some states set regulations on the ratio of students to special education teachers, while others left these determinations to the discretion of administrators in LEAs. In cases where there was a prescribed ratio of students to special education teachers, some states held this as a fixed ratio, while others allowed the proportion to shift depending on the presence of a paraprofessional or on the severity of the disability. Jackson (2003) noted that six general factors form the basis for caseload policy in the 31 states where these policies were in effect. Table 2.2 outlines these individual factors and the corresponding number of states that employ each in setting caseload policy.

Table 2.2
Number of States Employing Criteria for Caseload Policy, Per Criterion (Jackson, 2003)
Factor/Characteristic: Number of States
Age/grade of student: 24
Presence of paraprofessional: 23
Educational setting (e.g., self-contained classroom): 22
Type of service (e.g., PT, OT): 20
Federal disability category: 20
Severity of disability: 15

Of note are the 20 states that rely on the type of service, federal disability category, or severity of disability, or some combination of these factors. Policies that rely on the type of service (e.g., itinerant), disability category (e.g., blindness or visual impairment), or severity of disability (e.g., mild/moderate/profound) distinguish the unique service requirements of students with visual impairments from those of students from other exceptional populations. However, only in those states that set maximum caseload size by IDEA disability category can specific caseload size requirements be ascribed to the work of the TSVI. Some states set a maximum caseload size requirement to cover all students with low-incidence disabilities, while others emphasize the type of service (e.g., itinerant, resource room) to set the maximum caseload size requirement for a special education teacher (Jackson, 2003). Caseload policy that differentiates between IDEA disability categories recognizes that students with visual impairments, as an exceptional population, have educational needs that are unique (e.g., from those of students with other low-incidence disabilities) and highly specialized. At the time that Jackson (2003) completed his review of caseload policy, 20 states used IDEA disability category (i.e., visual impairment) as one factor to set maximum caseload sizes for TSVIs. Jackson (2003), in comparing the results of his review to those of an earlier policy review (i.e., Project FORUM, 2000), noted that caseload policy is constantly evolving: eight states had made significant adjustments to their policy since the Project FORUM (2000) report. Four of these states had completely removed caseload considerations from statewide regulations/policies. This trend appears to have continued, with some state regulations that relied on IDEA disability categories in 2003 now emphasizing caseload sizes that are determined by the sum of students' individual service needs, as outlined by the IEP. Inconsistent policies that guide caseload ratios for TSVIs are met with skepticism within the field of professionals serving students with visual impairments.
Mason and Davidson (2000) noted that due to “the need for research on the benefits of varying intensity, frequency, and duration of specialized teacher […] services, professionals in the field of blindness and low vision agree that recommendations for the national average service provider to student ratio are highly speculative, and provide little guidance for specific caseload criteria” (pp. 29-30). In the context of the current study, the Delphi process will elucidate the policy variables of significance to an expert panel, since the extant professional and peer-reviewed literature provides little guidance. Since professionals from both Canada and the United States made up the expert panel, workload policy should also be considered in the Canadian context. As mentioned in Chapter One, special education policies and procedures fall under the purview of the Ministry or Department of Education of respective provincial and territorial governments (Dworett & Bennett, 2002). Ontario is the only province that requires specialist credentials for TSVIs through legislation; other provinces mandate specialist qualifications through special education guidelines (McBride, 2008). This dearth of provincial-level policy guidance extends to workload      51 determination for TSVIs. There are no legislated teacher-to-student ratios or any overarching policy guidance to set maximum caseload sizes for TSVIs in Canada. According to Zuvela (2009), "[i]tinerant teacher caseloads are determined differently from province to province and from one area of a province to another" (p. 43). In the U.S. context, there are several approaches taken by U.S. State Departments of Education to provide policy guidance on the number of students served by TSVIs, all under the aegis of IDEA. By comparison, there is no overarching special education policy in Canada under which provinces are required to maintain an explicit policy regarding TSVI-to-student ratios or TSVI workloads.  Table 2.3 References for Initial Policy Factors Policy Factors Reference(s) POL1. The overall budget for special education services in the LEA.  Russ et al. (2001); Sebald et al. (n.d.) POL2. Federal/State/Provincial per-student funding formulae.  Dhuey & Lipscomb (2013) POL3. The total number of students qualifying for special education services in the LEA.   McLesky, Tyler, & Flippin (2004) POL4. TSVI-to-student ratio stipulated in state/provincial legislation or special education policy document.  Jackson (2003) POL5. Resources available through a state/provincial deafblind project/program.    Riggio & McLetchie (2008) POL6. Annual registration data available from state/provincial-level material resource centers.  American Printing House for the Blind (2015) POL7. Position statements from professional organizations in the field of visual impairment.  Sapp, Blades, & Cernkovich (2013); Spungin & Ferrell (2007)       52 Policy Factors Reference(s) POL8. Educational service guidelines published by national/state/provincial associations of special education administrators/directors.  Pugh & Erin (1999) POL9. National statements of standards for the education of students with visual impairment published by stakeholder groups.  Corn & Huebner (1998); National Coalition for Vision Health (2003)  Personnel Factors  In addition to educational programming and policy-level factors, personnel-level factors may impact TSVI workload determination. 
One of the best documented of these factors is the supply of special education teachers needed to fill positions in both the United States (Arick & Krug, 1993) and Canada (Zaretsky, Moreau, & Faircloth, 2008). These shortages are particularly acute in highly specialized branches of special education service delivery, such as that provided by the TSVI (Harley, 1990). McLeskey, Tyler, and Flippin (2004) detailed the findings of the 24th American Association for Employment in Education (AAEE) study of teacher supply and demand, conducted in 2000. Visual impairment was tied for ninth on a ranked list of teaching fields with the greatest national shortage (U.S.). McLeskey et al. noted that this shortage is likely to have been felt more acutely since, during the three years between the publication of the AAEE report and the publication of their article, supply of qualified TSVIs continued to shrink.  There are few estimates of the number of TSVIs currently working in North American LEAs. Based on analyses by Kirchner and Diament (1999), there were approximately 6,700 full-time equivalent TSVIs working in the U.S. in 1998-1999. Achieving a precise count of students is challenging, but the authors estimated that 93,600 American students received special education services for vision loss in 1998-1999. The resulting ratio of one teacher for every 14      53 students is a conservative one, since it is not possible to be certain that all students who qualified for service ultimately received special education services for visual impairment. By all accounts, this ratio is increasing and an additional 5,000 TSVIs are needed in the U.S. to address chronic TSVI shortages resulting from an aging workforce, with many teachers nearing retirement (Pogrund & Wibbenmeyer, 2008). At this time, few current statistics or estimates exist on the number of TSVIs currently working in Canadian LEAs. In British Columbia, there are approximately 62 itinerant TSVIs working in LEAs across the province, where there are also a number of vacant positions resulting from retirements and a lack of qualified personnel (C. Marshall, personal communication, Dec. 12, 2015). As a result of chronic workforce shortages, special education administrators may be challenged to find qualified candidates to fill vacant positions. The Delphi approach applied in the current study probed experts' personnel considerations and determined which were the most significant determinants of TSVI workloads, and which factors should be considered to promote effective workloads. See Table 2.4 for a complete listing of personnel-level factors drawn from the review of the literature from the fields of special educational leadership and service delivery for students with visual impairments.   Table 2.4 References for Initial Personnel Factors Personnel Factors Reference(s) PERS1. The professional development needs of TSVIs in the LEA (i.e., conference/travel costs, release time).  Correa-Torres & Howell (2004) PERS2. The time required for TSVIs to travel between school sites.  Olmstead (2005); Suvak (1999)      54 Personnel Factors Reference(s) PERS3. The availability of a TSVI to serve students in more than one capacity in the LEA (dually-certified TSVI/O&M specialist vs. TSVI only).  Griffin-Shirley, Pogrund, & Grimmett (2011) PERS4. The total number of TSVIs currently employed by the LEA as permanent staff.  Kirchner & Diament (1999); Mason, McNerney, & Davidson (2000) PERS5. 
The number of years of experience of individual TSVIs currently employed by the LEA.  Koenig (2000); Pogrund & Cowan (2013) PERS6. Data from performance reviews of current TSVIs in the LEA.  Billingsley (2004); Pogrund & Wibbenmeyer (2008) PERS7. The availability of qualified Orientation and Mobility (O&M) specialists in the LEA.  VISSIT (2014) PERS8. The number of qualified intervenors for students who are deafblind currently employed by the LEA.     Parker & Nelson (2016) PERS9. The availability of braille transcribers in the Local Education Authority (LEA) to produce materials in alternate formats (e.g., braille, tactile graphics, text in electronic format).  Hertzberg & Stough (2007); Wall & Corn (2002) PERS10. The availability of qualified paraprofessionals to support individual students with visual impairment for the entire school day (i.e., one-to-one assignment to the student).  McKenzie & Lewis (2008); Lewis & McKenzie (2009) PERS11. The availability of state/provincial centers to provide material resource support to the LEA.  Wall & Corn (2002) PERS12. The TSVI service needs of neighboring LEAs, in the case of multiple LEAs sharing a TSVI's time.  Pugh & Erin (1999) PERS13. The capacity of the LEA to sponsor current LEA teachers to train to be TSVIs.  Pugh & Erin (1999) PERS14. The geographic proximity of the LEA to the closest university program training new TSVIs.  Pugh & Erin (1999)       55 Summary of Administrative Determinants of Itinerant TSVI Workload There are few data on the factors that enter into the administrative decision-making process to determine the workload of itinerant TSVIs. Surveys of administrators indicate that they are more familiar with relevant state or federal legislation than educational service guidelines or other evidence-based or professionally developed resources (Smith et al., 2004). Legislative guidance on TSVI caseload size is inconsistent, and there are no current data to confirm adherence to these standards. In Canada, the legislative landscape varies by province. Instead, TSVI workloads are largely determined at the level of the LEA. As a result, special education administrators must make decisions regarding TSVI workloads with very little guidance. This lack of guidance and lack of awareness of the indicators of high-quality programming for students with visual impairments increase the risk that administrators will not have an accurate conception of what constitutes a manageable workload for an itinerant TSVI. Consequences of Unmanageable TSVI Workloads After reviewing the challenges that TSVIs face in meeting the demands of itinerant workloads in LEAs and outlining the administrative contexts in which these workloads are determined, the final consideration of the literature review is to examine the consequences of not addressing these challenges. Researchers note a diverse array of consequences for both students and teachers in the face of unmanageable workloads, ranging from shifts away from direct service to students (Corn, 2007) to role dissonance (Gersten, Keating, Yovanoff, & Harniss, 2001) and emotional burnout for teachers (Embich, 2001). 
While the sections that follow examine student and teacher consequences separately, it is important to note that in an ecological systems framework, the hypothesized bidirectionality of influences between systems implies that what may be a direct consequence of unmanageable workload for the teacher is, in effect, an      56 indirect consequence for the student (e.g., teacher burnout influencing the quality of the teacher-student interaction).  Shift from Direct to Consultative Service Unmanageable caseload sizes that surpass recommended teacher-to-student ratios have a direct impact on the frequency and quality of service provided by the TSVI (Griffin-Shirley et al., 2004). Greater numbers of students on TSVI caseloads are associated with a shift away from the provision of direct service to increased time spent exclusively in consultation with these students' school-based teams (e.g., general education teacher, paraprofessionals; Corn, 2007). Olmstead (1995) noted this trend operating on a personal level – "In the 'old days,' I would teach 80 percent and consult 20 percent, now I teach only about one-third of the day" (p. 547). Griffin-Shirley et al. (2004) averaged the composition of survey respondents' caseloads (e.g., number of blind students served, number of students with low vision served) and found that these TSVIs were providing direct service and consultative services to equal numbers of students, with the exception of students who are blind, who were more likely to receive direct instruction.  This shift toward greater consultation is evident among itinerant teachers and related service professionals serving other populations of exceptional students. National surveys in the United States have concluded that larger caseloads among speech-language pathologists (SLPs) are associated with less individualized programming, and significantly higher rates of group sessions as a means of providing service (ASHA, 2002). Kluwin, Morris, and Clifford (2004) conducted an ethnographic study of itinerant teachers of students with hearing impairment in the United States and reported that teachers who made greater than 12-15 visits per week to individual school sites identified more as "consultants" or "caseworkers" than as teachers. More recent research has confirmed the trend of increased consultation by itinerant teachers of students      57 with hearing impairments. Foster and Cue (2009) surveyed teachers of students with hearing impairments and concluded that "consultation and workshop development are emerging as key skills for itinerants" (p. 160). In contrast to other itinerant teachers of students with low-incidence disabilities (e.g., severe/profound disabilities), itinerant teachers of students with sensory impairments spend comparatively less time providing direct instruction, and more time providing consultation-based services (Sebald et al., n.d.).  The trend toward a greater proportion of TSVI time spent in consultation is problematic when one considers the unique educational needs of students with visual impairments, especially in relation to the specialized instruction required in areas of the core curriculum (Koenig & Holbrook, 2000b) and in areas of the ECC, such as skills for independent living (Lewis & Iselin, 2002). Given that the majority of qualified TSVIs in the U.S. 
and Canada have undertaken advanced graduate training to gain the necessary knowledge and skills to teach students with visual impairments, other members of the educational team cannot be relied upon to provide a comparable level of knowledgeable service (Ferrell, 2007; Mason & Davidson, 2000). Adverse Effects on Curriculum Accessibility and Achievement Research with other populations of exceptional learners indicates that there is an inverse relationship between student learning outcomes and caseload size (Russ et al., 2001). Russ et al. (2001) reviewed research literature in special education related to caseload size and student achievement and concluded that "larger caseloads […] minimize opportunities for individualization and academic success" (p. 169) for students from diverse exceptional populations. Research from the field of speech-language pathology provides additional insight into the consequences of unmanageable caseloads for student achievement (Woltmann & Camron, 2009). Analyses of data from ASHA's National Outcomes Measurement System (ASHA, 2000) demonstrated that students served by SLPs with caseloads of less than 40 were significantly more likely (87%) to improve one functional level on articulation skills than students served by SLPs with caseloads of 60 and above (63%). In a review of research examining the relationship between SLP caseload size and students' language outcomes, Cirrin et al. (2003) noted that "large caseloads appear to minimize opportunities for individualization of intervention" and that "[s]tudents on larger caseloads appear to take longer to make progress on communication skills" (p. 164). There is currently no research that connects TSVI caseload size directly to achievement outcomes for students with visual impairments. However, surveys and interviews with itinerant TSVIs confirm that these teachers associate large caseload sizes and a lack of instructional time with less individualized programming (Correa-Torres & Howell, 2004; Olmstead, 1995). Based on evidence from research with other exceptional populations, it is reasonable to speculate that unmanageable TSVI workloads result in fewer opportunities for individualized instruction for students with visual impairments, subsequently increasing the risk for lower achievement. Despite the dearth of outcome research related to perceptions of workload manageability, other work has examined the relationship between caseload size and the use of adaptations and specialized equipment by students with visual impairments. Johnstone et al. (2009) surveyed 197 TSVIs on the accommodations used by students on state-wide literacy assessments. These researchers reported a significant negative correlation between caseload size and the percentage of students using reading accommodations on these assessments (e.g., braille and audio formats, CCTVs, screen-reading software). A more recent survey of TSVIs by Hume (2011) noted a significant negative relationship between the size of TSVIs' caseloads and the use of "high-tech" assistive technology by students (e.g., laptop computers, braille notetakers). Taken together, these findings indicate that students of teachers with larger caseloads are less likely to use advanced assistive technology than are students of teachers with smaller caseloads.
As a result, students of teachers with less manageable workloads may be less likely to gain access to the assistive technology devices and instruction that is increasingly essential for success in the workplace and/or post-secondary education (Smith et al., 2009).  Teacher Role Dissonance, Stress, and Attrition Special education teachers frequently identify caseload size and unmanageable workloads as reasons for leaving the field (Billingsley, 2004). The Study of Personnel Needs in Special Education (SPENSE), a nationally (U.S.) representative study of administrators and service providers in the U.S. conducted in 2001, found that of teachers who planned to leave the field as soon as possible, 17% indicated that their caseloads were not at all manageable (Carlson et al., 2002). Specific information related to certain professional groups within the sample is also available. Among SLPs in this sample, caseload size and intent to remain in the profession were negatively associated. SLPs who planned to stay in school-based positions until retirement had a median caseload of 46.2 students, while SLPs who planned to leave as soon as possible had a mean caseload size of 59.7 students (Carlson et al., 2002). Therefore, caseload size and perceptions of workload manageability may be at least one of the reasonable predictors of attrition in inclusive settings (Edgar & Rosa-Lugo, 2007).  In addition to caseload size, researchers have examined the effect of overall workload on special education teacher attrition. The relationship between teachers' perceptions of their workload as 'unmanageable' and burnout or attrition has been well-documented (Billingsley, 2004). In a study of early career special education professionals, Billingsley, Carson, and Klein (2004) found that approximately one quarter (28%) of participants rated their workload as either      60 not at all manageable, or manageable to a minimal extent. Teachers' workload perceptions are directly connected to ratings of factors associated with burnout and attrition, such as emotional exhaustion (Embich, 2001). When special education teachers perceive their workload as unmanageable, they are at greater risk for such outcomes (Brunsting, Sreckovic, & Lane, 2014).  Special education teacher attrition is also predicted by poor job design. Job design refers to the "degree to which the structure and processes established for doing […] work facilitate the successful completion of assigned tasks and responsibilities" (Gersten et al., 2001, p. 552). Gersten et al. (2001) conducted a large-scale survey of 887 special education teachers and used path analysis to examine the relationship between intent to stay in the field and various aspects of job design. The resulting path structure indicated that role dissonance (i.e., dissonance between educators' expectations about the job and the actual requirements of the job) was a strong predictor (r  = .42) of stress related to job design and satisfaction with one's current position (r  =  -.28). Stress related to job design refers to stress caused by factors associated with role dissonance – the perceived discrepancy between professionals’ perceptions of their expected role and their actual role in practice. Stress related to job design (r = -.21) and satisfaction with one's current position (r = .30) both predicted teachers' commitment to the profession. Therefore, role dissonance is significantly associated with increased stress and lower job satisfaction. 
These factors, in turn, predict teachers' intent to remain in special education.  Based on the results of the research reviewed in the previous section, TSVIs are at risk for experiencing role dissonance. Surveys and interviews demonstrate that these teachers have certain expectations of the role of the TSVI, such as providing appropriate levels of direct instruction (Griffin-Shirley et al., 2004) and meeting students' instructional needs in areas of the ECC (Lohmeier, Blankenship, & Hatlen, 2009). TSVIs find that a significant portion of their      61 workload is devoted to non-instructional duties that fall outside of direct service delivery to students (Griffin-Shirley et al., 2011). Certain challenges such as burdensome paperwork requirements impede the realization of TSVIs' role expectations (Correa-Torres & Howell, 2004). In the context of the models constructed by Gersten et al. (2001), TSVIs experiencing role dissonance are more likely to experience increased job design stress and lower job satisfaction, increasing the likelihood of attrition. In a survey of 103 TSVIs, Seitz (1994) noted that 31% of those with less than five years of experience indicated that if given another opportunity to choose a teaching specialization, they would not seek TSVI certification. Itinerant TSVIs “blamed their large caseloads and heavy travel schedules for preventing them from […] developing meaningful relationships with colleagues, administrators, and members of the community” (Seitz, 1994, p. 303). Thus, low job satisfaction is also associated with feelings of isolation associated with unmanageable workloads for TSVIs. With the field already experiencing well-documented chronic shortages of qualified TSVIs, there is a clear need to limit teacher attrition resulting from unmanageable workloads (Corn & Spungin, 2007; Dignan, 2012; Kirchner & Diament, 1999).  Summary of Consequences of Unmanageable Workloads Research with TSVIs and other special education service providers indicates that caseload size is positively associated with perceptions of workloads as unmanageable. Unmanageable workloads have a direct impact on student outcomes through less individualized programming, less access to adaptive tools and materials, and ultimately, may result in lower academic achievement. Furthermore, unmanageable workloads have an indirect impact on students' educational programming by increasing the likelihood of TSVI stress related to job design and lowered job satisfaction, and, ultimately, may result in attrition.       62 Summary of the Literature Review There are several adverse outcomes associated with unmanageable workloads for students and TSVIs. Unmanageable workload is associated with shifts in the nature of service delivery (i.e., from direct to consultative service) and lowered academic outcomes for students with special needs, and role dissonance, burnout, and attrition among teachers. These outcomes are of unique concern to special education administrators in the LEA, since they are responsible for ensuring high-quality programming for students with visual impairments and for maintaining a viable TSVI workforce.  This review of the literature indicates that several factors may account for greater complexity in TSVI workloads. Shifts to a more multi-exceptional student population and increased non-instructional duties (e.g., paperwork, travel) contribute to more complex workloads for TSVIs. TSVIs are also responsible for serving a greater number of students. 
Several researchers have noted a trend toward more complex workloads for TSVIs in Canada and the United States (Bozeman & Zebehazy, 2014; Corn, 2007; Zuvela, 2009). While the TSVI develops and delivers educational programming in conjunction with the student's school-based team, the TSVI discharges his or her professional duties under the supervision of a special education administrator. The special education administrator makes decisions regarding staffing and resources that, in effect, determine the workload of TSVIs in the school district or LEA.  Administrator support is one of the most significant controllable influences on teacher attrition (Bettini et al., 2014). However, special education administrators may lack the appropriate tools for determining manageable workloads for TSVIs. Where educational policy and caseload legislation exist, they are often under-utilized (e.g., educational guidelines) or inconsistent in both content and application (e.g., caseload ratios). The current study attempted to      63 address these shortcomings by systematically gathering experts' consensus ratings on the educational programming, personnel, and policy factors that should determine TSVI workloads. Chapter Three outlines the research design and methodological considerations of the study.           64 CHAPTER THREE RESEARCH DESIGN AND METHODOLOGY The purpose of the current study was to identify factors that experts believe should be taken into consideration when determining workloads for itinerant TSVIs. The research questions guiding the current study were as follows: 1. How do experts in special education administration and visual impairment rate the level of importance of factors that influence actual workload determinations for itinerant TSVIs? 2. What factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs? a. What educational programming factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs? b. What policy-level factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs? c. What personnel factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs? 3. How do experts in special education administration and visual impairment rate the level of importance of factors they believe should influence workload determinations for itinerant TSVIs?      65 In order to investigate these research questions, a quantitative survey design was employed. Panelists rated the importance of the initial factors identified through a review of the literature, while also nominating new variables that are relevant to his or her practice as an administrator. The Delphi approach was applied to the research problem. The Delphi approach allowed for the systematic collection of experts' judgments of the suitability of administrative-level variables as determinants of TSVI workload. This chapter outlines the research design and methodology of the study and provides a rationale for the use of the Delphi approach, along with detailed information on panelists, procedures, and the development and implementation of each iterative survey round.  
The Delphi Approach The Delphi approach has found wide application in human service fields where research is motivated by the need to gather expert opinion on a particular topic (Clayton, 1997). Using the Delphi approach, "the goal of reaching consensus about the topic of interest is attained by sending several iterations of the same survey to respondents knowledgeable about the topic, and making gradual modifications in the questionnaires according to their judgments" (Bruininks, Wolman, & Thurlow, 1990, p. 9). Where traditional surveys attempt to describe "what is," the Delphi approach is most suitably applied when researchers seek to describe "what could/should be" (Hsu & Sandford, 2007, p. 1). The Delphi approach was first used as a tool for forecasting systems change in highly specialized sectors related to national security and defense (Landeta, 2006). Since its declassification and introduction to the public as a means of "facilitat[ing] an efficient group dynamic process" in 1963, the number of peer-reviewed studies using the Delphi approach has increased steadily over subsequent decades (von der Gracht, 2012, p.1526).      66  While the Delphi approach has been applied across diverse academic and professional fields, researchers have identified four necessary characteristics: anonymity, iteration, controlled feedback, and statistical aggregation of panelists’ responses (Rowe & Wright, 2001). Anonymity is achieved through the confidential collection of survey data from individual panelists. This enables panelists to express their judgements and perspectives privately and minimizes the likelihood of social desirability bias factoring into the survey results (Hallowell & Gambatese, 2010; Rowe & Wright, 2001).  Another advantage of the Delphi approach in examining expert opinion is the structure of the process itself. Over several iterations, or "rounds," this process enables the researcher to refine the content of the survey to best reflect the most reliable consensus among expert panelists (Powell, 2003). The number of rounds in a study using the Delphi approach is not fixed and varies between studies according to several factors, the most important of which is the stopping criterion (Okoli & Pawlowski, 2004). After each round, panelists are provided with controlled feedback on each survey item that reflects the group's aggregated results from the previous round. Feedback in studies employing the Delphi approach is characterized as controlled, as the researcher selects the type of feedback that is provided to panelists (von der Gracht, 2012). Panelists may refine their perspectives based on these data in subsequent rounds. This feature, in combination with anonymity for expert panelists, makes the Delphi approach particularly useful for structuring and organizing group communication (Powell, 2003). In addition to these various advantages, there are some disadvantages to the use of the Delphi approach. Most notable are threats to the reliability and validity of the findings. These threats are discussed at the end of this chapter alongside the steps taken to mitigate their impact.      67  Finally, the Delphi approach requires the statistical aggregation of panelists’ responses. A priori criteria must exist to determine both the inclusion and exclusion of items at the end of the study. According to von der Gracht (2012), there is little consensus on the essential criteria for the inclusion and exclusion of items. 
However, a statistic that captures the panelists’ consensus on ratings of survey items across iterative rounds is common to studies using the Delphi approach (Rowe & Wright, 2001).  Delphi Research in the Field of Visual Impairment The Delphi approach has been applied to a diverse set of educational research topics in the field of visual impairment. These topics span the core curriculum (e.g., literacy; Koenig & Holbrook, 2000a) and Expanded Core Curriculum (e.g., O&M; Wall Emerson & Corn, 2006) for students with visual impairment. Koenig and Holbrook (2000a) used the Delphi approach to generate consensus statements from experts regarding the service level required to provide high quality literacy instruction for braille-reading students. Koenig and Holbrook (2000a) collected data over three rounds, with the cut-off point for consensus set at 85% agreement. In a study of the content of high quality O&M programs, Wall Emerson and Corn (2006) used the Delphi approach to survey experts over three rounds, setting the minimum criterion for consensus at 85% agreement. From an original sample of 30 experts, 20 completed all three rounds. More recently, Smith, Kelley, Maushak, Griffin-Shirley, and Lan (2009) utilized the Delphi approach to generate a set of assistive technology competencies for TSVIs. The researchers conducted five rounds, with low attrition over the course of the study (35 panelists in Round 1 to 34 panelists in Round 5). Resulting assistive technology competencies were clustered according to their placement in ranges by percentage agreement (e.g., 90% and greater, 80-90%), with the minimum criterion for percentage level of agreement set at 75%. This is consistent with the      68 larger corpus of studies utilizing the Delphi approach. Diamond and colleagues (2014) reviewed a random sample of 100 studies and found the most common definition for consensus was percentage level of agreement, with 75% as the median threshold across studies.  Research Sample The current study employed purposive sampling techniques since expert opinion was sought and population representativeness was not required (Skulmoski et al., 2007). However, given the disparate legislative contexts that exist between states/provinces in the United States and Canada, regional representativeness was a consideration. Sampling procedures were intended to carefully select panelists with expertise in itinerant service delivery for students with visual impairments and special education administration. This specific intersection of professional backgrounds and areas of expertise was expected to furnish panelists that would be uniquely knowledgeable as to the administrative process of workload determination for itinerant TSVIs.  Sample Size There are no established rules governing sample size for studies using the Delphi approach (Hung, Altschul, & Lee, 2008). Sample sizes vary based on the complexity of the research topic and the heterogeneity of the sample (Powell, 2003). More complex research questions applied to panels of experts from various professional or academic backgrounds require larger sample sizes. The current study investigated diverse issues related to workload determination and service delivery, and thus, the expert panel was a relatively heterogeneous group to reflect the complexity of the factors under study (Hasson et al., 2000).  
Workload determinations for itinerant staff generally fall under the purview of special education administrators, and so professionals in these positions are uniquely positioned to      69 address the research topic in question from an LEA-level perspective (McCarty, Hazelkorn, & Boreson, 2003). In addition to education professionals currently serving as special education administrators in the LEA, the current study also sought the expertise of leaders in the field who hold supervisory roles at centers and agencies operating at the state/provincial level. In the U.S., many states have state-level personnel dedicated to issues related to the education of students with visual impairment (Müller, 2006). These leaders have diverse roles, ranging from providing technical assistance to LEAs to representing the interests of students with visual impairment on state-level task forces and working groups (Müller, 2006). Similarly, in Canada several provinces (e.g., British Columbia, Nova Scotia) employ administrators to oversee provincial/regional resource centers or agencies that serve TSVIs and students in LEAs (Zuvela, 2009). It should be noted that while state/provincial-level administrators do not determine workloads for TSVIs in individual LEAs, they have a unique perspective on trends and issues across their state/province relating to service delivery (Smith et al., 2004; Zambone & Allman, 1988). In addition to these professional groups of administrators, a group of recognized experts was included. These experts were identified by the nomination panel as researchers who have made significant contributions to the research and professional literature related to service delivery and educational programming for students with visual impairment. The final category of panelists included in the study were administrators based at provincial- or state-level specialized schools for students with visual impairments who manage state- or province-wide outreach services based at their respective schools. This category of panelists was identified by the nomination panel as a category of professionals who would have unique insight into workload determination for TSVIs in their given state or province.       70 Researchers recommend that for homogenous samples, the final round should include, at minimum, 10-15 panelists (Skulmoski et al., 2007). Skulmoski et al.'s (2007) recommendation for sample size is used to set the lower limit of the required sample size. The goal of the current study was to achieve a sample size of 15-20 panelists. Clayton (1997) advised that between 15 and 30 panelists are required to develop adequate reliability and validity. By applying Skulmoski et al.'s (2007) criteria for Delphi study participation (e.g., sufficient time, willingness/capacity to participate), the sample size in the final round was expected to remain between 25-30 panelists. Sample size at the conclusion of the fourth and final survey round was 31 participants.    Sampling Criteria  When using the Delphi approach, well-defined criteria for panelist selection are required for purposive sampling (Powell, 2003). Hsu and Sandford (2007) recommended that panelists should "be highly trained and competent within the specialized area of knowledge related to the target issue" (p. 3). Given the highly specialized nature of expert sampling in studies using the Delphi approach, only generic guidelines for selecting experts are provided in the literature (Adler & Ziglio, 1996). Skulmoski et al. 
(2007) noted that, in the context of Delphi research, experts should possess: (1) knowledge and expertise relevant to the issues under study, (2) the capacity and willingness to participate, (3) sufficient time to participate in all rounds of the study, and (4) effective communication skills. Email messages to solicit panelists and notices of informed consent included these criteria. During the course of a study employing the Delphi approach, experts can become fatigued after two or three rounds (Fink, Kosechoff, Chassin, & Brook, 1984). Thus, panelists were made aware of the necessary time commitment from the outset. If the outcome of a Delphi study has a potentially ameliorative effect on variables of importance to panelists, attrition is minimized (Hasson, Keeney, & McKenna, 2000). Since a potential outcome of the current study (i.e., data to support special education administrators) was hypothesized to be a positive step toward ensuring more informed decisions regarding the process of TSVI workload determination, a high degree of investment in the study was anticipated. Most critical to the internal validity of the study is the knowledge and expertise of panelists. Panelists were fully qualified TSVIs who had served as itinerant TSVIs for no less than the equivalent of 5 full-time academic years in North America (i.e., at 1.0 FTE equivalent; Koenig & Holbrook, 2000; Pogrund & Wibbenmeyer, 2008). Basic TSVI qualifications in the United States and Canada vary considerably across states/provinces, and so requiring panelists to possess a singular form of qualification (e.g., a graduate degree) was not feasible (Pogrund & Wibbenmeyer, 2008; Zuvela, 2009). Instead, panelists were required to be fully qualified to work as a TSVI in the LEA and to have a minimum of 5 years of professional experience. According to Palmer, Stough, Burdenski, and Gonzales' (2005) review of the literature, a minimum of five years of teaching experience is a consistent criterion applied in other studies of expert special education teachers. Panelists with five or more years of experience working as TSVIs in resource room or specialized school settings were not included in the sample unless they also had five years of itinerant experience. While these professionals may be qualified TSVIs, the experience of itinerant TSVIs differs significantly from that of teachers working in other service delivery models. In addition to TSVI credentials, panelists also had professional experience in one, or both, of the following capacities: (1) a special education administrator in an LEA in the United States or Canada, and/or (2) an administrator overseeing an agency or program with a state- or province-wide mandate to serve TSVIs and students with visual impairment in the LEA (Müller, 2006; Smith et al., 2004). A third group of experts consisted of administrators of state- or province-wide resource programs based at a specialized school for students with visual impairments. Finally, a fourth panelist group of recognized experts in service delivery and educational programming for students with visual impairment was included in the study. These experts did not currently hold an administrative position at the LEA or state/provincial level but had a proven record of scholarship in the topic of the current study.
Experts included in this group met at least one of the following criteria: (a) author or co-author of peer-reviewed articles on service delivery and educational programming for students with visual impairment; (b) author or co-author of publications in use at the state, provincial, or national level to provide guidance on educational service for students with visual impairment; or (c) identified, to the best of the nominator's knowledge, as possessing unique knowledge or insight into the itinerant model of service delivery for students with visual impairments by virtue of their experience, professional contributions, and leadership. It was not required that panelists in this category possess credentials or professional experience as itinerant TSVIs. No certification criteria for special education administration were included in the criteria for participation in the current study. Vast inconsistencies in the licensure and certification requirements for special education administrators in the United States are well documented (Lashley & Boscardin, 2003; Rude & Sasso, 1988; Wigle & Wilcox, 2002). Therefore, insisting on a base level of formal qualification was not pragmatic. Similar data were not available for Canadian administrators, but it is reasonable to surmise that similar conditions exist in Canada. As a result of this inconsistency, years of administrative experience were required for participation in the relevant participant categories. LEA-level administrators had to be directly responsible for supervising the professional practice of TSVIs and determining the workload of TSVIs through FTE assignment. State- or province-level administrators had to be directly responsible for the services provided by a state or provincial resource center, agency, or outreach program serving TSVIs and students with visual impairment in the LEA. Administrators of centers or agencies with state- or province-wide mandates also participated in the study. These services had to extend beyond the provision of materials in adapted/alternate formats (i.e., Instructional Materials Resource Centers) to increase the likelihood that administrators had sufficient knowledge of current TSVI staffing trends in the LEAs in their jurisdictions (Müller, 2006; Wall & Corn, 2002).
Nonrandom sampling methods are inherent to the Delphi approach, as the research sample is purposefully composed of individuals possessing some expertise or qualification that is required to address the research questions (Okoli & Pawlowski, 2004). Consistent with other Delphi research in the field (e.g., Smith et al., 2009), professional and academic leaders in the blindness field were contacted to nominate potential panelists. Three to four leaders were originally expected to comprise the nomination panel, but this was expanded to seven nominators to achieve a sufficient overall number of non-overlapping nominations. These leaders (i.e., the nomination sample) did not participate in the research sample, since some did not meet the criteria for study participation (i.e., TSVI and administrative qualifications/experience) and to preserve anonymity among experts in the research sample. The nomination panel was composed of recognized experts in the field of visual impairment and education. A total of eight nominators were contacted, and seven provided nominations. At the time of data collection, five nominators held or had held leadership positions in personnel preparation programs at the university level (four in the United States, one in Canada). Three nominators held leadership positions at state or provincial instructional material resource centers or agencies (two in the United States, one in Canada). Some nominators had been retired for more than 5 years and thus could not have participated in the study; however, none of the nominators participated in the Delphi study, in order to maintain the anonymity of panelists. Each nominator received an email message outlining the current study and the criteria for participation and was asked to forward a list of a maximum of 20 potential panelists. Overlap among nominators within a given state or province was limited in order to increase the regional representation of the research sample. It was possible for professionals not currently working as district-level or state/provincial-level administrators to be nominated to participate in the research sample if they qualified for participation but were currently in another position or retired, provided they had held an administrative position within the last 5 years. This provision allowed for the participation of experts who had retired or moved on to other positions, and was made in light of survey data noting the growing population of professionals in the field of visual impairment nearing retirement (Ambrose-Zaken & Bozeman, 2010; Mason, McNerney, & Davidson, 2000). After duplicate nominations were removed from the total set of nominations received, 70 unique nominations remained. Nominated individuals were contacted directly by the researcher via email. An initial email was sent to inform the 70 potential panelists that they had been nominated and to introduce the study. This email was followed by a more detailed email containing a hyperlink directing the nominee to the consent documentation for the study and the Round One survey (see Appendices A and B).
The minimum number of rounds for the current study was set at three, since the stability criterion requires that at least two ratings be obtained for each survey item and nominated factors were rated for the first time in the Round Two survey. Assuming three survey rounds, the total timeline for the study was anticipated to be between 5 and 6 months: 6 to 8 weeks for the nomination process, 4 weeks for each survey round, and 1 week for analysis and instrument construction between rounds. Table 3.1 illustrates the actual study timeline and a summary of data collection. See Figure 3.1 for a diagram of the Delphi process as it was applied in this study.

As is the case with the timeline of the Delphi approach, there is significant variation between Delphi studies in terms of quantitative techniques for data analysis (Holey et al., 2007). This is particularly evident among the test statistics used by researchers to determine the stability and level of agreement of responses. Numerous analytic approaches have been proposed, and while there is no clear consensus among researchers using the Delphi approach, recent literature reviews and statistical papers inform the strategies for data analysis in the current study (e.g., von der Gracht, 2012; De Vet et al., 2004). Descriptive statistics, test statistics, and significance testing for each survey item were obtained using the SPSS statistical package (Version 20).

Table 3.1 outlines the timeline for the administration of the study over four iterative rounds.

Table 3.1
Timeline for Data Collection, Rounds One to Four

Round One: 01/24/2016 – 03/02/2016 (38 days); response rate 42/70 (60%). Data collected: initial (actual and ideal) importance ratings of 45 items drawn from the literature review plus corresponding qualitative data (optional); raw data of 241 participant-nominated factors across three thematic categories.

Round Two: 03/21/2016 – 04/28/2016 (38 days); response rate 34/42 (79%). Data collected: ideal importance ratings of the 45 items from the initial set of factors plus corresponding qualitative data (optional); initial (actual and ideal) ratings of 22 participant-nominated factors plus qualitative data (optional).

Round Three: 06/12/2016 – 07/14/2016 (32 days); response rate 34/34 (100%). Data collected: ideal importance ratings of 14 items from the initial set of factors plus corresponding qualitative data (optional); ideal importance ratings of the 22 participant-nominated factors plus qualitative data (optional).

Round Four: 11/16/2016 – 12/03/2016 (17 days); response rate 31/34 (91%). Data collected: ideal importance ratings of 3 items from the initial and nominated sets of factors plus corresponding qualitative data (optional).

Figure 3.1. Diagram of the Delphi Process. The diagram depicts the sequence of the study: groundwork (review of the literature and development of the research methodology); development of criteria for the expert panel; recruitment of the nomination panel and expert panelists; development, launch, and analysis of the Round One through Round Four surveys; finalization of the set of confirmed factors; and development of the final report.

In each round, data were collected via online surveys hosted on Canadian servers by FluidSurveys.com. A new online survey was created for each round and included feedback information from the previous iteration.
Panelists’ qualitative responses were aggregated into tables in Microsoft Word documents and stored on the cloud-based Sync.com platform. Sync.com is a Canadian company with servers based in Canada.  The sections that follow outline the procedure for study administration, as well as the data analyses conducted following each survey round. To determine the consensus of panelists’ ratings on a given factor, the interquartile range was calculated. A composite variable of percentage importance ratings was used to gauge the relative importance of each factor, and the Wilcoxon signed-rank test was used to determine the stability of panelists’ ratings across rounds for a given factor.  Prior to Round One  Prior to beginning the survey, panelists accessed the documentation of consent to participate in the study. The panelist acknowledged that he or she had read the document and gave his or her consent to participate by clicking a checkbox. All indications of consent were verified by the researcher prior to including data gathered from that panelist in any subsequent analyses. Round One Survey Development Participant Demographics. Following the documentation of consent to participation, the panelist was directed to the Round One survey. The Round One survey opened with several items to gather information on the professional credentials and background of panelists. Panelists were first asked to indicate their current professional role. Of the 42 participants in Round One, 5 (11.9%) identified themselves as a special education administrator in a local education agency/district (LEA), 10 (23.8%) as administrators of a state/province-wide resource center or      79 agency, 5 (11.9%) as administrators of a state/province-wide outreach program based at a specialized school for students with visual impairment, and 17 (40.5%) as “Other.” Analysis of the “Other” response category revealed a category of panelists not originally identified in the survey item. State- or provincial-level consultants in visual impairment were not included as a response category based on Müller (2006), who identified an acute shortage of these positions in state-level infrastructure and programs. However, 5 (11.9%) panelists responding in the “Other” category indicated that they worked as a state-level consultant in visual impairment. Given that these panelists emerged as an identifiable participant sub-group, they were henceforth recognized as a separate participant category. All other panelists responding in the “Other” category met criteria as “Recognized Experts” (e.g., authors of well-known professional resources in service delivery for students with visual impairments in inclusive settings). Within the category of “Recognized Experts” (n = 17), five self-identified (9.5%) as university professors working in a personnel preparation program, while the balance of panelists in this category identified themselves as “education consultants” or “education specialists” in visual impairment. Table 3.2 displays the complete distribution of panelists’ professional roles.  
Table 3.2 Distribution of Round One Panelists’ Professional Roles Professional Role  Number of Panelists  Percentage of Panel (%) Special Education Administrator in a Local Education Authority/District 5  11.9 Special Education Administrator at a State/Provincial Resource Centre or Agency 10 23.8 State/Provincial-Level Consultant in Visual Impairment  5 11.9 Administrator of a State/Province-Wide Outreach Program Based at a Specialized School 5 11.9 Recognized Experts 17 40.5      80 Professional Role  Number of Panelists  Percentage of Panel (%) - University Professor in a Personnel Preparation Program - Education Consultant/Specialist in Visual Impairment  5  12  Total Round One Panelists   42  100  Panelists were then asked to indicate their initial level of qualification as a TSVI. Two (4.7%) panelists had obtained bachelors’ degrees while eight (18.6%) had obtained a bachelor’s degree plus an additional endorsement/certificate to qualify to work as TSVIs. At the graduate level, ten (23.3%) panelists had obtained master’s degrees and 19 (45.2%) had obtained master’s degrees plus an additional endorsement/certificate. Two (4.7%) panelists had become qualified as TSVIs through their doctoral studies and two (4.7%) indicated an “Other” credential. Analysis of the “Other” category indicated that these panelists were not qualified as TSVIs. However, both participants met criteria as recognized experts and so they remained in the research sample. After indicating their initial certification as a TSVI, participants were asked to indicate any additional qualifications they possessed in addition to their initial certification. Four (11.8%) noted that they possessed an additional endorsement/certificate. At the graduate level, seven (20.6%) indicated that they had obtained an additional master’s degree, nine (26.5%) an additional master’s degree plus an endorsement/certificate, and nine (26.5%) had achieved doctoral degrees. Twelve (35.3%) indicated the “Other” response category. Within the category of additional certifications, panelists indicated various certificate and degree programs. It is likely that many of the certifications that were noted would have qualified under one of the assigned response categories, but the researcher was not in a position to evaluate these programs for the purpose of post hoc categorization. Finally, in this section of the Round One survey,      81 panelists were asked to indicate their number of years of experience working as a TSVI. Across the 40 panelists in Round One that indicated they had obtained certification, there was a mean of 22.60 years (SD = 12.88) of experience working as a TSVI, with a median of 25 years.  Once level of qualification and years of experience as a TSVI were determined, panelists were asked to indicate their level of qualification and years of experience as a special education administrator. The high variability in levels of qualifications for administration reflect what is commonly reported in descriptive studies of special education administrators (Lashely & Boscardin, 2003). Thirteen (31%) of panelists had no additional qualifications in educational administration, while another 13 (31%) had obtained a special endorsement or certificate in educational administration. At the undergraduate level, one (2.4%) had obtained a bachelor’s degree. At the graduate level, six had achieved master’s degrees and one had completed a doctorate in educational administration. 
Of the participants identifying as special education administrators, the mean length of experience in special education administration was 14.3 years (SD = 10.87). Years of experience in special education administration ranged from 3 to 31 years, with a median of 15.5 years.

The expert panel was intended to have regional representation from across Canada and the United States. This distribution is displayed in Table 3.3. Out of the Round One total of 42 panelists, seven were currently working or had worked in Canada prior to retirement, and 35 were based in the United States.

Table 3.3
Geographic Distribution of Round One Panelists (Number of Panelists; Percentage of Panel)

Northern Territories, Canada (YT, NT, NU): 0 (0.0%)
Western Provinces, Canada (MB, SK, AB, BC): 1 (2.4%)
Central Provinces, Canada (ON, QC): 3 (7.1%)
Eastern Provinces, Canada (NB, NS, PE, NL): 3 (7.1%)
Northeastern United States (ME, VT, NH, MA, CT, RI): 2 (4.8%)
Middle Atlantic United States (NY, PA, NJ): 6 (14.3%)
South Atlantic, United States (MD, DE, WV, VA, NC, SC, GA, FL): 7 (16.7%)
East South Central, United States (KY, TN, MS, AL): 2 (4.8%)
West South Central, United States (OK, AR, TX, LA): 4 (9.5%)
East North Central, United States (WI, IL, MI, IN, OH): 3 (7.1%)
West North Central, United States (ND, SD, MN, IA, NE, KS, MO): 2 (4.8%)
Mountain West, United States (MT, ID, WY, NV, UT, CO, AZ, NM): 4 (9.5%)
Pacific West, United States (WA, OR, CA, HI, AK): 5 (11.9%)

Delphi Survey – Round One. Following the survey items related to demographics and professional profile, panelists were asked to rate the importance of factors drawn from the extant literature devoted to itinerant service delivery and educational programming for students with visual impairments. A unique feature of the Round One survey was the requirement that panelists provide two ratings for each survey item. In Round One, panelists were asked to provide two ratings of the importance of each factor to the process of workload determination for TSVIs: 1) based on their professional experience (i.e., the importance of that item in practice); and 2) based solely on their expert judgement (i.e., the ideal importance of that item). Subsequent survey rounds sought only the latter rating for initial factors, as the expert judgement of the panel was most germane to the research questions. The Round One survey contained a total of 45 factors identified in the review of the literature as having potential significance in the process of workload determination for itinerant TSVIs in inclusive settings. The researcher clustered these 45 factors into three thematic areas based on the topic of the research: Educational Programming, Personnel, and Policy-level factors. See Appendix B for a copy of the Round One survey tool.

In addition to the factors drawn from the review of the literature, experts were asked to write in other factors that currently have an impact on the process of TSVI workload determination. This allowed for group "brainstorming" in Round One (Okoli & Pawlowski, 2004). Panelists were asked to generate a minimum of six factors, since multiple experts are likely to nominate the same factor using different terminology (Schmidt, 1997). These nominated factors were analyzed during the Round Two development phase to remove duplicate, redundant, or invalid options. See Chapter Four – Round One Analysis for more detailed information on these analyses.

Round One Data Analysis

The first iteration of the survey has a unique purpose in the Delphi approach (Skulmoski et al., 2007).
As outlined in the previous section, descriptive information that is directly relevant to panelists' eligibility as experts (i.e., professional role, years of experience) was tabulated and used to report on the professional profile of the research sample. Statistical analyses in Round One were limited to the calculation of measures of central tendency (i.e., mean, standard deviation) and consensus (i.e., interquartile range) since this was the first opportunity that panelists had to rate individual factors. Panelists viewed the aggregated results of these ratings in Round Two, along with a complete listing of panelists’ qualitative responses.    An essential methodological consideration of the Delphi approach is determining the statistic used for establishing the consensus of expert opinion. There is little agreement as to the most appropriate statistic to measure consensus in Delphi studies (Hsu & Sandford, 2007; Landeta, 2006). In a recent review of consensus measurement in Delphi studies, von der Gracht      84 (2012) concluded that "a general standard of how to measure consensus in Delphi studies does not yet exist" (p. 1533). However, in making recommendations for future research employing the Delphi approach, von der Gracht (2012) suggested the selection of a more robust descriptive statistic, namely the interquartile range (IQR). The IQR is a measure of statistical dispersion for the median and consists of the central 50% of total observations per survey item per round (von der Gracht, 2012). The use of the IQR is "an objective and rigorous way of determining consensus" when utilizing the Delphi approach (Rayens & Hahn, 2000, p. 314). The IQR must be carefully applied, since most Delphi studies use ordinal scales, and the number of points on an ordinal scale can influence the size of the IQR. For a four- or five-point scale, an IQR of one or less is an adequate indicator of consensus (von der Gracht, 2012). An IQR of one or less indicates that more than 50% of panelists' ratings for the survey item fall within one point on the Likert-type scale for that item (De Vet et al., 2004). Thus, experts achieved a sufficient degree of consensus on a particular survey item when the IQR for that item was less than or equal to one.  The IQR, as a measure of consensus, indicates the degree to which panelists’ responses cluster around a point on the Likert-type scale for a particular survey item. The research questions emphasize experts’ perceptions of the importance of individual factors to workload determination for TSVIs. Specifically, experts’ ratings of positive significance (i.e., importance) are of interest. Therefore, an additional statistic is required to report the nature of the level of importance for individual survey items. Consistent with other studies in the field of visual impairment utilizing the Delphi approach (e.g., Smith et al., 2009), the percentage level of importance is used as an indication of the nature of agreement among panelists’ importance ratings for individual factors. Setting a percentage level of agreement for inclusion or exclusion of survey items is a common feature of consensus measurement in studies using the Delphi      85 approach (Powell, 2003). Analyses of panelists’ ratings used only valid percentages from ratings on the five-point Likert-type scale for level of importance; see Figure 3.2 for a representation of the scale used in all surveys. 
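To make the consensus criterion concrete before turning to the importance ratings, the following minimal sketch shows how the IQR check could be computed for a single survey item. The ratings are hypothetical, and the use of Python rather than the SPSS procedures actually employed in the study is purely illustrative of the calculation described above.

```python
# Minimal sketch (hypothetical ratings): consensus check for one survey item
# rated on the five-point Likert-type scale (1 = Not at all important ...
# 5 = Very important). Consensus is indicated when the IQR is one or less.
import numpy as np

ratings = np.array([5, 4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 4])  # hypothetical valid ratings

q1, q3 = np.percentile(ratings, [25, 75])  # first and third quartiles
iqr = q3 - q1                              # spread of the central 50% of ratings

print(f"IQR = {iqr:.2f}; consensus criterion met: {iqr <= 1}")
```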
Response options “I don’t know” and “Not applicable” were coded as missing values, not as valid ratings, and were not included in the calculation of importance level percentages in SPSS. Since the research questions for the study emphasized only those factors rated as important to informing the process of workload determination for TSVIs, a variable accounting for “Very important” and “Important” ratings was required. Importance level of agreement (represented henceforth as ImpLOA) is a composite variable that is calculated by combining the raw valid percentages of “Very Important” or “Important” ratings per survey item. ImpLOA is reported throughout Chapter Four as an indicator of the level of agreement among panelists’ importance ratings for each factor.       Figure 3.2: Example of Likert-Type Response Scale for All Items.  Round Two Survey Development Participant Demographics. A total of 34 panelists completed the Round Two survey. This represents a 19% decrease (n = 8) in the size of the panel from Round One. Of the eight participants who left the study between Rounds One and Two, four were state/provincial-level administrators, two were district/LEA-level administrators, and two were recognized experts. All were from the United States. Table 3.4 displays the distribution of remaining panelists’ professional roles in Round Two.       86 Table 3.4  Distribution of Round Two Panelists’ Professional Roles Professional Role  Number of Panelists  Percentage of Panel (%) Special Education Administrator in a Local Education Authority/District 3  8.8 Special Education Administrator at a State/Provincial Resource Centre or Agency 9 26.5 State/Provincial-Level Consultant in Visual Impairment  2 5.9 Administrator of a State/Province-Wide Outreach Program Based at a Specialized School 5 14.7 Recognized Experts - University Professor in a Personnel Preparation Program - Education Consultant/Specialist in Visual Impairment  15 5  10 44.1  Total Round Two Panelists   34   100  Delphi Survey – Round Two. A second hyperlink to the Round Two online survey was emailed to panelists two weeks after the close of the Round One survey. The second round provided experts with their first opportunity to examine their own beliefs in the context of aggregated feedback from Round One. For each item rated in Round One, panelists reviewed controlled feedback in Round Two. These data included both raw number and percentage agreement for each response option on the Likert-type scale for that item, as well as the total number of responses, mean, and standard deviation. Complete data from both Round One ratings were displayed (i.e., actual and ideal ratings) for each item. Each survey item also contained a hyperlink to panelists’ aggregated qualitative responses for each factor in the Round One survey. For each factor in Round One, panelists’ responses were copied directly from the online survey tool and arranged in a table in a Microsoft Word document. By following the hyperlink, panelists      87 accessed the aggregated qualitative data from Round One via an additional browser tab that opened to display the data table.   Items from the Round One survey were randomized within their appropriate thematic category (e.g., policy factors that currently impact workload determinations) to minimize any potential sequencing or practice effects in Round Two (Hasson & Keeney, 2011). Each item was assigned a number corresponding to the sequential order of items within each thematic category in Round One. 
A random order was then generated using Microsoft Excel 2013 and items were arranged accordingly. The same process was repeated for all three thematic categories of items. In addition to all 45 items from Round One, 22 panelist-nominated factors were included in the Round Two survey. These 22 items were the result of qualitative analyses conducted following Round One, drawn from an original pool of 241 participant-nominated factors. Nominated factors appeared in random order in each thematic category – nine Educational Programming factors, eight Personnel factors, and five Policy-level factors. See Appendix C for a copy of the Round Two survey tool. Round Two Data Analysis The second iteration of the survey was the first opportunity that panelists had to verify or modify ratings based on aggregated panelist ratings from Round One. The Round Two survey contained detailed feedback on results from the previous round. Feedback for each initial factor included: - The total number of ratings for that factor in Round One; - measures of central tendency (i.e., mean and standard deviation); - the distribution of responses on the Likert-type scale in addition to the percentage of overall responses for each response option;      88 - and a hyperlink to a table of the aggregated qualitative comments entered by panelists in the Round One survey. Quantitative feedback reviewed by panelists in the Round Two survey featured complete data from the previous round. Therefore, percentage calculations accounted for all ratings, including those not on the five-point Likert-type scale – “I don’t know” and “Not applicable.” These response options were not coded with scalar values in FluidSurveys (i.e., not factored into measures of central tendency displayed to panelists) but were included in percentage calculations for each response option per factor. Therefore, while these responses were not valid for the statistical calculation of level of importance, consensus, and stability indicators, these responses were included in controlled feedback to panelists via percentages per response option (e.g., “Somewhat important;” “Not at all important”) for a given factor. To have presented only results from valid response options may have artificially inflated the percentage ratings per response option of factors where a significant number of panelists reported that the factor did not apply to their practice (i.e., “Not applicable”), or they did not know what the importance of that factor should be to the process of workload determination for itinerant TSVIs (i.e., “I don’t know”). See Appendices B-E for copies of the Rounds Two through Four surveys, which include the controlled feedback viewed by panelists in each of these rounds.       After reviewing controlled feedback, panelists provided a second rating of each of the initial factors from Round One. A second rating enabled the researcher to ascertain the stability of ratings across rounds. Since stability could not be determined with only one round of data, no survey items were removed, added, or otherwise modified between Rounds One and Two. In addition to level of agreement and consensus, the stability of consensus is an important analytic consideration. Stability refers to the consistency of responses across rounds of the study (Dajani      89 et al., 1979). To calculate stability for a survey item between rounds, the Wilcoxon signed-ranks test was applied. 
Given that most Delphi studies use ordinal scales for survey items (e.g., "strongly agree" to "strongly disagree"), the Wilcoxon signed-rank test is most appropriate since it provides researchers with a nonparametric statistic that "works with paired data of the same group of individuals as in a 'before and after' situation" (von der Gracht, 2012, p. 1532). A nonparametric statistic is preferable since the relatively small expert sample makes the assumption of normality in the data questionable. The Wilcoxon signed-rank test assesses whether or not there is a significant discrepancy in the paired difference of panelists' rankings between rounds (De Vet et al., 2004). For the null hypothesis of the stability test to be accepted, there must be no significant discrepancy in the median difference between panelists' rankings between rounds. Therefore, acceptance of the null hypothesis means that there is no statistically significant difference in ratings for a given factor from those obtained in the previous survey round. Mathematically, this is represented as:

W_Nt < W_(Nt, α = 0.05)

where W_Nt is the calculated test statistic, Nt is the number of matched pairs, and W_(Nt, α = 0.05) is the critical value of the test statistic at the α = 0.05 level of significance. To reject the null hypothesis, the median difference must be significant (i.e., W_Nt > W_(Nt, α = 0.05)). If the null hypothesis is rejected, a statistically significant difference exists between panelists' ratings between rounds; in short, the ratings are not stable between survey rounds. This indicates a significant adjustment in panelists' ratings compared to those for the same factor in the previous round.

Stability itself does not imply a specific level of agreement, but it is an indicator of the validity of level of agreement analyses (Dajani et al., 1979). If panelists' ratings of an item demonstrate stability across survey rounds, this indicates a strong conviction in the validity of that rating, across experts over time. However, it should be noted that while panelists' responses to a given survey item may be consistent across iterations, this does not provide an indication of the nature of stable agreement. For example, panelists may consistently rate a factor as unimportant to the determination of TSVI workloads. The Wilcoxon signed-rank test provides an indication of stability in panelists' responses across survey rounds, but does not indicate the level of importance for a given factor. For this reason, the ImpLOA percentage rating remains the primary criterion for determining which factors are included in the final set of confirmed factors (i.e., those that meet inclusion criteria).

As the primary criterion for determining whether a factor is included in the final set of confirmed factors, the ImpLOA percentage rating is an important statistic, as the research questions of the current study are concerned only with important factors. For example, a given factor might achieve consensus and stability criteria but, ultimately, is of interest only if that factor also achieves a high ImpLOA percentage rating. As mentioned previously, Round Two analyses presented the first opportunity for all three criteria to be evaluated simultaneously.
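As an illustration of how the three criteria could be evaluated together for a single factor across two rounds, the sketch below applies the importance, consensus, and stability checks described above to hypothetical paired ratings. The data and the use of Python with pandas and SciPy (rather than the SPSS procedures actually used) are assumptions made for illustration only; the handling of "I don't know"/"Not applicable" responses as missing values and the 75% importance threshold mirror the procedures described in this chapter and the next section.

```python
# Minimal sketch (hypothetical data): evaluating importance (ImpLOA), consensus
# (IQR), and stability (Wilcoxon signed-rank test) for one factor across two rounds.
import numpy as np
import pandas as pd
from scipy.stats import wilcoxon

# Paired ratings from the same panelists in two consecutive rounds; np.nan stands
# in for "I don't know" / "Not applicable" responses, which are coded as missing.
prev_round = pd.Series([5, 4, 5, 4, 3, 5, 4, 4, 4, np.nan, 3, 4, 5, 2, 4, 5, 4, 3, 4, 5])
curr_round = pd.Series([5, 5, 4, 4, 4, 5, 4, 5, 5, 5,      4, 4, 4, 3, 4, 5, 5, 4, 4, 4])

# Importance: percentage of valid current-round ratings that are "Important" (4)
# or "Very Important" (5) -- the ImpLOA composite described earlier.
valid = curr_round.dropna()
imp_loa = (valid >= 4).mean() * 100

# Consensus: interquartile range of the valid current-round ratings.
iqr = valid.quantile(0.75) - valid.quantile(0.25)

# Stability: Wilcoxon signed-rank test on panelists rated in both rounds; a
# non-significant difference (p >= .05) indicates stable ratings across rounds.
paired = pd.concat([prev_round, curr_round], axis=1).dropna()
statistic, p_value = wilcoxon(paired[0], paired[1])

meets_criteria = (imp_loa >= 75) and (iqr <= 1) and (p_value >= 0.05)
print(f"ImpLOA = {imp_loa:.1f}%, IQR = {iqr:.1f}, stability p = {p_value:.3f}; "
      f"meets all three criteria: {meets_criteria}")
```

In the study itself, these computations were carried out in SPSS for every factor in every round; the sketch simply shows how the three inclusion criteria interact for a single item.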
At this stage, only initial factors (i.e., those appearing in the Round One survey) can be included in, or excluded from, the final set of confirmed factors. Inclusion or exclusion from the final set of confirmed factors was possible only after each factor had received a minimum of two consecutive ratings from the panel, in order to satisfy the stability criterion for inclusion/exclusion.       91 In the design phase of the study, it was determined that a mechanism was required to exclude or include (i.e., confirm) factors. With a set of 45 initial factors and without being able to predict the number of nominated factors that would result in Round One, a rule was needed to ensure that panelists were not required to persistently re-rate factors that had already achieved ImpLOA percentage, consensus, and stability criteria. Factors with an ImpLOA percentage rating of 85% or greater that also met consensus and stability criteria after Round Two were confirmed and were not referred to Round Three.  Factors with low ImpLOA percentage ratings were excluded as a result of Round Two analyses so that items with consistently low ImpLOA percentage ratings could be excluded from further rating by the panel. By excluding only factors with an ImpLOA percentage rating of less than 65%, the panel had an opportunity to review factors approaching the 75% ImpLOA threshold (i.e., those with a Round Two ImpLOA percentage rating between 65% and 74.5%) to either confirm that these factors were of lower importance to the process of TSVI workload determination or to promote them into the final set of confirmed factors if Round Three ratings surpassed the 75% ImpLOA threshold. Conversely, factors with ImpLOA percentage ratings between 75% and 84.5% were also referred to Round Three, to provide an equal opportunity for factors having marginally achieved the 75% ImpLOA threshold to be rated again. These factors would either be confirmed by panelists or excluded if Round Three ratings fell below the 75% ImpLOA threshold. Following Round Two, the inclusion criteria for the final set of confirmed factors were applied for the first time.  • Importance - ImpLOA percentage rating above 75% (as stated previously, 85% or above following Round Two); • Consensus - Interquartile range (IQR) of one or less; and      92 • Stability – Non-significant difference between ratings across rounds according to Wilcoxon signed-rank test (i.e., accept null hypothesis).    As mentioned above, factors with ImpLOA percentage ratings within 10% of the 75% ImpLOA threshold were referred to Round Three for further rating. Since initial factors were not directly nominated by panelists, a conservative approach to including or excluding these factors in the final set of confirmed factors was required. Factors with ImpLOA percentage ratings of 85% or above that had also achieved consensus and stability criteria were not referred to Round Three as they had met criteria for the final set of confirmed factors with a high importance ranking and, thus, further ratings were not required. Factors with ImpLOA percentage ratings of less than 65% were also not referred to Round Three and excluded from the final set of confirmed factors. As stated earlier, the research questions of the study prioritize factors rated by panelists as important considerations in the process of TSVI workload determination. 
Therefore, factors that failed to achieve an ImpLOA percentage rating greater than 65% after two survey rounds were not subjected to subsequent ratings by panelists. Taken together, these provisions enabled the researcher to refine survey content by limiting the requirement that panelists continue to rate consistently high- and low-rated factors across subsequent surveys. This enabled the panel to focus on factors with ratings around the 75% ImpLOA percentage threshold as well as those factors with ImpLOA percentage ratings over 75% that had not also achieved consensus and/or stability criteria. Round Three Survey Development  Participant Demographics. A total of 34 panelists completed the Round Three survey for a response rate of 100%. As a result, the distribution of professional roles across panelists was      93 unchanged between rounds. Table 3.5 displays the distribution of panelists’ professional roles in Round Three. Table 3.5 Distribution of Round Three Panelists’ Professional Roles Professional Role  Number of Panelists  Percentage of Panel (%) Special Education Administrator in a Local Education Authority/District 3  8.8 Special Education Administrator at a State/Provincial Resource Centre or Agency 9 26.5 State/Provincial-Level Consultant in Visual Impairment  2 5.9 Administrator of a State/Province-Wide Outreach Program Based at a Specialized School 5 14.7 Recognized Experts - University Professor in a Personnel Preparation Program - Education Consultant/Specialist in Visual Impairment  15 5  10 44.1  Total Round Three Panelists  34              100  Delphi Survey – Round Three. In Round Three, results from Round Two were displayed to panelists in a third online survey, accessed via a hyperlink transmitted through email. The period between Rounds Two and Three presented the first opportunity for initial items to be evaluated against the criteria for the final set of confirmed factors of the study (i.e., percentage agreement, consensus, and stability criteria). Eleven of the initial 45 factors were referred to the Round Three survey since they did not meet all of these three criteria. All 22 nominated factors were referred to the Round Three survey since they were not yet rated for a second time. The same randomization procedure as in previous rounds was used in the design and formatting of the Round Three survey. Initial items were randomized within each thematic category with      94 nominated factors interspersed in random order. See Appendix D for a copy of the Round Three survey tool. Round Three Data Analysis Statistical procedures from Round Two were applied in Round Three. Measures of central tendency were obtained for responses in Round Three, and results from this iteration were compared with Round Two results to determine stability, consensus, and level of agreement statistics. Round Three ratings for three factors indicated that three items did not meet the stability criterion. For these three factors, the 75% ImpLOA threshold and consensus criterion were met, but stability was not achieved (WNt > WNa, a = 0.05). A fourth Delphi survey round was required to see if these factors could achieve the full criteria for inclusion in the final set of confirmed factors (i.e., the 75% ImpLOA threshold is achieved; IQR < 1; WNt < WNt, a=0.05).  Round Four Survey Development    Based on analyses of Round Three results, three factors did not meet the stability criterion (i.e., rejection of the null hypothesis). 
Both ImpLOA percentage and consensus criteria were met by these factors following Round Three, but further ratings were required for these factors to meet criteria for inclusion in the final set of confirmed factors. See Appendix E for a copy of the Round Four survey tool. A hyperlink to the online survey was sent to each panelist who had completed the Round Three survey. Thirty-one panelists returned ratings (91%). Table 3.6 displays the distribution of panelists’ professional roles in Round Four. Table 3.6 Distribution of Round Four Panelists’ Professional Roles Professional Role  Number of Panelists  Percentage of Panel (%) Special Education Administrator in a Local Education Authority/District 3  9.7      95 Professional Role  Number of Panelists  Percentage of Panel (%) Special Education Administrator at a State/Provincial Resource Centre or Agency 7 22.6 State/Provincial-Level Consultant in Visual Impairment  2 6.5 Administrator of a State/Province-Wide Outreach Program Based at a Specialized School 5 16.1 Recognized Experts - University Professor in a Personnel Preparation Program - Education Consultant/Specialist in Visual Impairment  14 5  9 45.2  Total Round Four Panelists   31  100  Round Four Data Analysis  A Round Four survey hyperlink was emailed to each of the 34 panelists who had completed the Round Three survey. This survey contained a total of three items – two nominated factors and one from the initial set of factors. Thirty-one panelists completed the Round Four survey. Since all three items achieved importance, consensus, and stability criteria for inclusion in the final set of confirmed factors, no subsequent survey rounds were required and data collection was terminated.  Methodological Limitations Despite the advantages of the Delphi approach in generating consensus among expert panelists, there are some noteworthy limitations. There are several challenges to establishing methodological rigour in Delphi studies (Hasson & Keeney, 2011). The following section outlines these challenges and the steps taken, where possible, to mitigate potential impacts on study findings.       96 Validity One of the most critical threats to the internal validity of studies using the Delphi approach is a set of selection criteria that do not sufficiently establish the expert status of panelists (Hasson et al., 2000; Mullen, 2003). The current study operationally defines an expert according to guidelines derived from reviews of the literature on special education professional experience and the identification of "experts" (e.g., Stough & Palmer, 2007). In addition to self-identification as meeting certification and professional experience criteria for expert status, leaders in the field of visual impairment and blindness nominated potential panelists.  In addition to issues surrounding expert selection, there is an inherent threat from bias to the validity of studies using the Delphi approach. The Delphi approach is generally not applied to research problems that are straightforwardly addressed by precise analytic techniques (Pollard & Pollard, 2003). Instead, it is applied to problems that "can benefit from subjective judgments on a collective basis" (Pollard & Pollard, 2004, p.147). Despite the group feedback mechanism in the Delphi approach meant to attenuate the effects of individual biases on the panel's overall level of agreement, these effects are worth considering.  
Hallowell and Gambatese (2010) outlined several threats to the validity of studies utilizing the Delphi approach resulting from bias. First, panelists' responses may be influenced by the order in which survey items are presented. The primacy effect occurs when panelists unconsciously assign greater meaning to initial survey items by virtue of the order of their appearance. In the current study, bias from the primacy effect was minimized through the randomization of both the order of presentation of each thematic category and the order of survey items within each category across survey rounds. Similarly, bias may result from the "contrast effect," where panelists' ratings for a given survey item may be enhanced or diminished by the perceived contrast between that item and an adjacent survey item. For example, if a survey item is perceived as very important to the process of workload determination, the importance of the survey item that is next encountered may be minimized by comparison. Like the primacy effect, bias from the contrast effect was minimized by item order randomization in each round.

Second, Hallowell and Gambatese (2010) cautioned against "myside bias," which occurs when panelists evaluate survey items in a way that is biased toward their preexisting opinions and attitudes (Stanovich, West, & Toplak, 2013). By asking panelists to provide qualitative entries to elaborate on their quantitative ratings and making these entries available to the entire panel in the subsequent survey round, the current study worked to minimize the potential effects of myside bias. When reviewing the controlled feedback from the previous round, panelists had access to the full range of viewpoints expressed by the panel for a given factor.

Finally, Hallowell and Gambatese (2010) identified "dominance" as a potential threat to validity. Dominance occurs when the views and opinions of a vocal or intimidating panelist override those of other panelists. Bias resulting from dominance was minimized in the current study through full anonymity between panelists. Anonymity of panelists is a central feature of studies using the Delphi approach. In the current study, any potentially identifying information (e.g., locations, program names, professional titles) was removed from qualitative responses prior to these data being made available to panelists. According to Hallowell and Gambatese (2010), strict anonymity between panelists helps to ensure that any bias effects from dominance are mitigated.

Reliability

Common measures of reliability, such as test-retest, cannot be appropriately applied to studies using the Delphi approach, as it is assumed that panelists' responses will change over the
The concluding sample size in Round Four was 31 panelists. Therefore, the expert panel of the current study was well above the recommended range for studies using the Delphi approach.  The Delphi approach was applied over four iterative survey rounds. This chapter delineated the methodology of the study and analytic procedures used in survey development. Chapter Four outlines the results of the application of this methodology to the research questions.        99 CHAPTER FOUR FINDINGS  This chapter presents survey results from the four iterative Delphi rounds. Survey data are presented in chronological order by thematic cluster (i.e., Educational Programming, Personnel, and Policy factor) per round. For ease of reference and the purpose of organization, individual factors are numbered within their thematic category (e.g., EDU12, PERS7, POL9). In each of the data tables, a consistent set of column headings is used to indicate the results of the statistical analyses outlined in Chapter Three. “Actual Percentage Agreement” refers to the ImpLOA percentage rating for panelists’ ratings of that factor’s perceived importance to the current practice of workload determination for itinerant TSVIs. “Ideal Percentage Agreement” refers to the ImpLOA percentage rating of that factor’s perceived importance in ideal circumstances – essentially, panelists’ perceptions of how important that factor should be to the actual practice of workload determination for itinerant TSVIs. IQR refers to the interquartile range. The IQR is calculated by subtracting the first quartile from the third quartile of rankings, and is a measure of variability of responses for a given factor. An IQR of one or less on a five-point Likert-type scale indicates low variability in panelists’ ratings and, thus, high consensus (von der Gracht, 2012). ImpLOA refers to the Importance Level of Agreement percentage rating for a given factor and is a composite of the total percentages of “Very Important” and “Important” ratings for that factor.  “Difference sig.” refers to the p-value resulting from the Wilcoxon signed-rank test. As outlined in Chapter Three, for any p-value less than .05, the null hypothesis (i.e., there is no statistically significant difference between two sets of ratings) is rejected and it is concluded that a statistically significant difference exists between the two sets of ratings in question. Round One features only Wilcoxon signed-rank tests on “Actual” and      100 “Ideal” rantings within the same factor. Round Two features Wilcoxon signed-rank tests between Round One and Round Two ratings for initial factors, and “Actual” and “Ideal” ratings for nominated factors. Rounds Three and Four feature only Wilcoxon signed-rank tests within factors, across rounds, as “Actual” and “Ideal” ratings were collected only at the initial rating of a factor.   Round One Data Analysis  Panelists were asked to provide two ratings per item in Round One across three thematic clusters of factors. Each sections that follows displays the results of data analyses conducted within each thematic cluster in Round One.  Educational Programming Factors Table 4.1 displays the complete results for Educational Programming factors (EDU1 – EDU22) in Round One.  Table 4.1 Round One Results for Educational Programming Factors  Educational Programming Factors Round One Round One  Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. **p<.05 EDU1. 
The total number of new students entering the LEA who qualify for service from a TSVI.  77.5 1 79.5 1 .068 EDU2. The total number of students who are currently receiving service from a TSVI in the LEA.  77.5 1 74.3 2 .644 EDU3. The number of students who use braille as his or her primary literacy medium in the LEA. 89.8 1 91.1 0 .008**      101  Educational Programming Factors Round One Round One  Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. **p<.05  EDU4. The number of students who use print as his or her primary literacy medium in the LEA.  64.8 2 82.4 1 .122 EDU5. The number of students with deafblindness in the LEA.  59.4 3 81.9 1 .006** EDU6. The number of students with visual impairment and additional disabilities in the LEA.   61.6 3 88.2 1 .002** EDU7. The amount of preparation time required by TSVIs in the LEA.  65.0 3 91.1 0 .001** EDU8. The amount of time needed for TSVIs to complete indirect service tasks (e.g., report writing, team meetings, liaising with community-based organizations).  45.0 2 82.3 1 .001** EDU9. The amount of time needed for TSVIs to complete grant proposals for curriculum expansion, including the acquisition of new teaching materials/technology.  15.1 2 35.7 1 .007** EDU10. Input from advocacy groups regarding the level of service for individual students with visual impairment in the LEA.  25.0 1 40.7 2 .034** EDU11. Input from parents regarding the level of service 73.6 2 87.9 1 .068      102  Educational Programming Factors Round One Round One  Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. **p<.05 for individual students with visual impairment in the LEA.  EDU12. Results of a formal caseload analysis process conducted at LEA-level.  57.1 3 93.4 1 .001** EDU13. Results of specialized assessments of student functioning conducted at LEA-level (e.g., Functional Vision Assessment, Learning Media Assessment).  81.6 1 94.0 0 .007** EDU14. Information on the current visual functioning of individual students from medical reports.  79.5 1 88.2 2 .030** EDU15. Information on the prognosis for the visual conditions of individual students in the LEA (e.g., progressive vision loss).  92.4 1 100.0  1 .004** EDU16. Information on the core academic needs (e.g., Mathematics, Science) of individual students in the LEA.    83.7 1 90.6 1 .650 EDU17. Information on the disability-specific (i.e., Expanded Core Curriculum) needs of individual students in the LEA.  64.8 3 100.0 0 .000** EDU18. The availability of assistive technology for students accessing learning materials through vision (e.g., ZoomText, MAgic). 79.0 1 84.9 1 .012**      103  Educational Programming Factors Round One Round One  Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. **p<.05  EDU19. The availability of assistive technology for student accessing learning materials through non-visual modalities (e.g., braille notetaker, text-to-speech software).  86.9 1 87.9 0 .009** EDU20. The availability of opportunities for non-academic instruction (i.e., Expanded Core Curriculum) in the home provided by community-based organizations.   31.6 2 70.6 2 .000** EDU21. The availability of opportunities for individual students in the LEA to attend camps and short-term programming provided by community-based organizations.  52.7 2 76.5 1 .000** EDU22. 
The availability of short-term placement opportunities for individual students in the LEA at a specialized school or center for students with visual impairment.  42.4 2 70.0 2 .004**   The mean ImpLOA percentage rating of ideal condition ratings was higher (81.42%; SD = 16.23%) than the mean ImpLOA percentage rating for actual practice ratings (63.91%; SD = 21.39%). The average differential between ideal and actual practice conditions for educational programming factors is 17.80% (SD = 12.52%), with the overall difference in favour of ideal      104 ImpLOA percentage ratings. Four factors obtained an ideal importance level of agreement (ImpLOA) percentage rating of 90% or greater in Round One. One factor in the “actual practice” scenario achieved this rating. Nine factors obtained an ideal ImpLOA percentage rating between 80 – 89% in Round One. Five factors in the “actual practice” scenario achieved this rating. Three factors obtained an ideal ImpLOA percentage ratings between 70 – 79%. Four factors in the “actual practice” scenario achieved this rating. Finally, two factors in the “ideal” scenario were rated at an ImpLOA percentage rating of 69% or less compared with 12 in the “actual practice” scenario. With the exception of EDU2 (The total number of students who are currently receiving service from a TSVI in the LEA), all other ImpLOA percentage ratings are greater in the ideal condition over that which is reflected in panelists’ perceptions of current practice. Wilcoxon signed rank tests were used to examine the statistical significance of the difference between ideal and “actual practice” ratings for each factor. The null hypothesis was rejected for 17 of 22 factors (77%), indicating a statistically significant difference between actual practice and ideal ratings. An examination of ImpLOA percentage ratings across the 17 factors with significant differences notes that for each factor, the differential is in favour of the ideal condition. Conversely, no significant difference is indicated across ideal and actual practice condition ratings for five of the 22 Educational Programming Factors (23%).  This distinction in favour of ideal importance ratings is also evident in consensus measurement across each set of ratings. On average, there is greater consensus among panelists’ ideal ratings (mean IQR = 1.00) compared to ratings in the “actual practice” scenario (mean IQR = 1.77). Across 19 of 22 Educational Programming factors, there was greater consensus in the ideal scenario compared with the “actual practice” scenario. Three factors have marginally higher IQRs in the ideal scenario, indicating less consensus among panelists - EDU2 (The total      105 number of students who are currently receiving service from a TSVI in the LEA), EDU10 (Input from advocacy groups regarding the level of service for individual students with visual impairment in the LEA) and EDU14 (Information on the current visual functioning of individual students from medical reports). Personnel Factors  Following the educational programming factors in the Round One survey, panelists were asked to provide “ideal” and “actual practice” ratings for personnel-level factors. Table 4.2 displays the complete results for Personnel factors (PERS1 – PERS14) in Round One. Table 4.2 Round One Results for Personnel Factors  Round One Round One  Personnel Factors Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. p<.05 PERS1. 
The professional development needs of TSVIs in the LEA (i.e., conference/travel costs, release time).  56.4 3 88.3 1 .005** PERS2. The time required for TSVIs to travel between school sites.  82.5 1 96.9 0 .002** PERS3. The availability of a TSVI to serve students in more than one capacity in the LEA (dually-certified TSVI/O&M specialist vs. TSVI only).  42.5 1 48.4 2 .460 PERS4. The total number of TSVIs currently employed by the LEA as permanent staff.  69.1 2 73.5 1 .376 PERS5. The number of years of experience of individual TSVIs currently employed by the LEA. 35.9 2 33.3 2 .252      106  Round One Round One  Personnel Factors Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. p<.05  PERS6. Data from performance reviews of current TSVIs in the LEA.  38.7 2 76.7 1 .009** PERS7. The availability of qualified Orientation and Mobility (O&M) specialists in the LEA.  77.5 1 90.9 0 .074 PERS8. The number of qualified intervenors for students who are deafblind currently employed by the LEA.     48.2 2 71.4 1 .008** PERS9. The availability of braille transcribers in the Local Education Authority (LEA) to produce materials in alternate formats (e.g., braille, tactile graphics, text in electronic format).  80.5 1 88.5 1 .013** PERS10. The availability of qualified paraprofessionals to support individual students with visual impairment for the entire school day (i.e., one-to-one assignment to the student).  51.2 2 56.3 1 .144 PERS11. The availability of state/provincial centers to provide material resource support to the LEA.  80.5 1 90.9 1 .008** PERS12. The TSVI service needs of neighboring LEAs, in the case of multiple LEAs sharing a TSVI's time.  61.3  2 76.0 2 .010**      107  Round One Round One  Personnel Factors Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. p<.05 PERS13. The capacity of the LEA to sponsor current LEA teachers to train to be TSVIs.  57.1 2 86.6 1 .001** PERS14. The geographic proximity of the LEA to the closest university program training new TSVIs.  39.0 3 38.2 3 .046**   The mean ImpLOA percentage rating of ideal condition ratings was higher (72.56%; SD = 20.69%) than the mean ImpLOA percentage rating for actual practice ratings (58.60%; SD = 16.94%). The average differential between ideal and actual practice conditions per personnel-level factor is 14.45% (SD = 11.80%), with the overall difference in favour of ideal ImpLOA percentage ratings. Three factors obtained an ideal ImpLOA percentage rating of 90% or greater while no factors in the actual practice condition achieved this threshold. Another three factors fell within the 80-89% range for ImpLOA percentage ratings with the same number of actual practice ratings falling within that range. Four factors achieved ideal ImpLOA percentage ratings between 70-79% compared to one in the actual practice condition. Finally, another four factors from the ideal condition had an ImpLOA percentage rating of 69% or less versus 10 in the actual practice condition. Greater ImpLOA percentage ratings for actual practice ratings were calculated for two items: PERS5 (The number of years of experience of individual TSVIs currently employed by the LEA); and PERS14 (The geographic proximity of the LEA to the closest university program training new TSVIs). However, it is important to note that in both cases, the differential was minimal (2.6% and 0.8%, respectively). 
The greatest differential between conditions was for PERS1 (The professional development needs of TSVIs in the LEA;      108 31.9%), indicating that this factor is one that panelists believe deserves greater attention in the process of workload determination for itinerant TSVIs.   In terms of consensus, ideal ratings had a lower mean IQR (1.21) and median IQR (1) than ratings in the actual practice scenario (M = 1.79, Mdn = 2) for personnel-level factors in Round One. This indicates greater consensus for ideal ratings over actual practice ratings. Nine of 14 (64%) personnel-level factors saw a statistically significant difference between ideal and actual practice conditions.  There were five (36%) factors for which the null hypothesis was accepted, indicating no statistically significant difference between ideal and actual practice ratings.    Policy Factors  The final section in the Round One survey contained items that referred to legislative and policy-level considerations in the determination of workloads for itinerant TSVIs in inclusive settings. This was the smallest thematic set of factors, with nine initial factors appearing in Round One. Table 4.3 displays the complete results for policy-level factors (POL1 – POL9) in Round One. Table 4.3  Round One Results for Policy Factors  Policy Factors Round One Round One  Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. **p<.05  POL1. The overall budget for special education services in the LEA.  88.2 1 75.8 1 .158 POL2. Federal/State/Provincial per-student funding formulae.  61.3 2 69.2 2 .423      109  Policy Factors Round One Round One  Actual Percentage Agreement  IQR – Actual Ideal Percentage Agreement  IQR – Ideal Difference Sig. **p<.05  POL3. The total number of students qualifying for special education services in the LEA.   58.1 2 44.4 2 .351 POL4. TSVI-to-student ratio stipulated in state/provincial legislation or special education policy document.  16.0 1 40.0 2 .019** POL5. Resources available through a state/provincial deafblind project/program.    72.7 2 71.4 3 .713 POL6. Annual registration data available from state/provincial-level material resource centers.  58.8 3 65.5 3 .262 POL7. Position statements from professional organizations in the field of visual impairment.  42.9 2 71.9 1 .001** POL8. Educational service guidelines published by national/state/provincial associations of special education administrators/directors.  32.2 2 56.0 2 .009** POL9. National statements of standards for the education of students with visual impairment published by stakeholder groups.  60.0 3 86.7 1 .010**       110 The overall mean ImpLOA percentage of ideal condition ratings was higher (64.54%; SD = 15.09%) than the mean ImpLOA percentage for actual practice ratings (54.67%; SD = 21.47%). The mean differential between ImpLOA percentage for conditional ratings was 16.17% (SD = 9.97%). No ideal or actual practice ratings fell into the 90% or greater ImpLOA percentage rating range. One ideal rating and one actual practice rating were in the 80-89% ImpLOA percentage rating range. Three ideal ratings and one actual practice rating were in the 70-79% range. Five ideal ratings and seven actual practice ratings achieved ratings of 69% or below. 
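The cluster-level summary figures reported in this chapter (mean ImpLOA percentages and the average ideal-versus-actual differential) follow directly from the per-factor percentages in the tables. As a worked check, the short sketch below recomputes the policy-factor summary from the Table 4.3 values. The use of Python is illustrative only; the differential is taken here as the absolute ideal-minus-actual difference per factor, which appears to reproduce the reported 16.17% (SD = 9.97%), and small discrepancies in other figures can arise from rounding of the tabled percentages.

```python
# Minimal sketch: recomputing the policy-factor cluster summary from the
# per-factor ImpLOA percentages reported in Table 4.3 (POL1-POL9).
import numpy as np

actual = np.array([88.2, 61.3, 58.1, 16.0, 72.7, 58.8, 42.9, 32.2, 60.0])
ideal = np.array([75.8, 69.2, 44.4, 40.0, 71.4, 65.5, 71.9, 56.0, 86.7])

differential = np.abs(ideal - actual)  # per-factor ideal vs. actual gap

print(f"Ideal:  M = {ideal.mean():.2f}%, SD = {ideal.std(ddof=1):.2f}%")
print(f"Actual: M = {actual.mean():.2f}%, SD = {actual.std(ddof=1):.2f}%")
print(f"Differential: M = {differential.mean():.2f}%, SD = {differential.std(ddof=1):.2f}%")
```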
Only two policy-level factors – POL1 (The overall budget for special education services in the LEA) and POL9 (National statements of standards for the education of students with visual impairment published by stakeholder groups) – achieved ideal ratings that surpassed the 75% ImpLOA significance threshold for the study after Round One.

Panelists' consensus around policy-level factors was low in both ideal (mean IQR = 1.89, Mdn = 2) and actual practice ratings (mean IQR = 2, Mdn = 2). In this instance, low consensus is indicated by mean and median IQRs in both conditions that exceed the consensus threshold required in the current study (i.e., IQR ≤ 1). Statistically significant differences between ideal and actual practice ratings were found for four of nine policy-level factors (44.4%). The null hypothesis was accepted for the remaining five policy-level factors – the highest proportion of factors with no significant difference between ideal and actual practice ratings among the thematic groupings in Round One.

Nominated Factors

A unique feature of the Round One survey was the ability for panelists to nominate factors that impact workload determination for TSVIs and that had not appeared in the preceding section. Following each set of initial factors within a thematic cluster, panelists were asked to enter up to 12 additional factors. In total, 241 additional factors were nominated by the panel. In order to collapse these nominated factors into a manageable set to include in the Round Two survey, panelists' qualitative nominations were imported into NVivo 10.0. The researcher then removed nominations that duplicated initial factors from elsewhere in the Round One survey. By clustering nominated factors with a high degree of conceptual overlap (e.g., “Service needs of young children starting Kindergarten” and “Supports required by preschoolers starting school”), the researcher sorted the data into major concepts. Finally, these major concepts were translated into a total of 22 nominated factors: nine educational programming factors, eight personnel-level factors, and five policy-level factors.

Round Two Data Analysis

Second ratings were recorded for each of the Round One factors in Round Two. In addition to these initial factors, panelists rated each of the 22 nominated factors, for a total of 67 factors in the Round Two survey. Results are listed in the tables that follow. Initial and nominated factors are reported separately: panelists provided only ideal condition ratings for initial factors, whereas nominated factors, receiving their first evaluation in this round, were rated under both the ideal and actual practice conditions, as in Round One. Results for nominated factors are reported in the sections following the Round Two results for initial items.

Round Two Data Analysis – Initial Factors

Educational Programming Factors. Table 4.4 displays results from panelists' ratings in Round Two. Round One ratings are displayed for comparison, along with an indicator of the stability of ratings between Round One and Round Two, computed per panelist with Wilcoxon signed-rank tests (see the sketch below).
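Before turning to Table 4.4, it may help to see the mechanics behind the stability indicator and the round-by-round decisions that follow from it. The sketch below is illustrative only: the rating vectors are invented, the function name evaluate_factor is hypothetical, the early-round referral band described in Chapter Three is omitted for brevity, and the thresholds used (an ImpLOA percentage rating of 75% or greater, an IQR of 1 or less, and a non-significant Wilcoxon result for stability) are simply those stated in the study's inclusion criteria.

from scipy.stats import wilcoxon

def evaluate_factor(prev_ratings, curr_ratings, agreement_pct, iqr_value,
                    threshold=75.0, alpha=0.05):
    # Simplified per-factor evaluation after a survey round (illustrative).
    # prev_ratings / curr_ratings: the same panelists' ratings of one factor in
    # consecutive rounds. A non-significant Wilcoxon signed-rank result is read
    # as stability; IQR <= 1 as consensus; agreement_pct is the factor's ImpLOA
    # percentage rating in the current round.
    if list(prev_ratings) == list(curr_ratings):
        stable = True  # identical ratings are trivially stable
    else:
        stable = wilcoxon(prev_ratings, curr_ratings).pvalue > alpha
    consensus = iqr_value <= 1

    if not stable:
        return "refer to next round"   # unstable factors receive another rating
    if consensus and agreement_pct >= threshold:
        return "confirm"
    return "exclude"                   # stable, but below the threshold or lacking consensus

# Hypothetical ratings of one factor by the same ten panelists in two consecutive rounds.
previous_round = [4, 5, 4, 4, 3, 5, 4, 4, 5, 4]
current_round = [4, 5, 5, 4, 4, 5, 4, 4, 5, 4]
print(evaluate_factor(previous_round, current_round, agreement_pct=88.2, iqr_value=1))  # "confirm"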
Table 4.4  Round Two Results for Initial Educational Programming Factors  Educational Programming Factors Round One Round Two  Percentage Agreement  IQR Percentage Agreement  IQR Stability Sig. **p<0.05 EDU1. The total number of new students entering the LEA who qualify for service from a TSVI.  79.5 1 85.3 1 .231 EDU2. The total number of students who are currently receiving service from a TSVI in the LEA.  74.3 2 72.8 2 .358 EDU3. The number of students who use braille as his or her primary literacy medium in the LEA.  91.1 0 87.9 1 .836 EDU4. The number of students who use print as his or her primary literacy medium in the LEA.  82.4 1 67.7 1 .323 EDU5. The number of students with deafblindness in the LEA.  81.9 1 91.2 1 .572 EDU6. The number of students with visual impairment and additional disabilities in the LEA.   88.2 1 87.9 1 .499 EDU7. The amount of preparation time required by TSVIs in the LEA.  91.1 0 97.1 1 .465 EDU8. The amount of time needed for TSVIs to complete indirect service tasks (e.g., report writing, team meetings, 82.3 1 91.2 1 .870      113  Educational Programming Factors Round One Round Two  Percentage Agreement  IQR Percentage Agreement  IQR Stability Sig. **p<0.05 liaising with community-based organizations).  EDU9. The amount of time needed for TSVIs to complete grant proposals for curriculum expansion, including the acquisition of new teaching materials/technology.  35.7 1 6.0 1 .052 EDU10. Input from advocacy groups regarding the level of service for individual students with visual impairment in the LEA.  40.7 2 15.6 1 .018** EDU11. Input from parents regarding the level of service for individual students with visual impairment in the LEA.  87.9 1 82.4 1 .068 EDU12. Results of a formal caseload analysis process conducted at LEA-level.  93.4 1 87.5 1 .244 EDU13. Results of specialized assessments of student functioning conducted at LEA-level (e.g., Functional Vision Assessment, Learning Media Assessment).  94.0 0 100.0 1 .458 EDU14. Information on the current visual functioning of individual students from medical reports.  88.2 2 84.8 1 .249 EDU15. Information on the prognosis for the visual conditions of individual students in the LEA (e.g., progressive vision loss).  100.0 1 97.0 0 1.000 EDU16. Information on the core academic needs (e.g., Mathematics, Science) of individual students in the LEA. 90.6 1 84.8 1 .099      114  Educational Programming Factors Round One Round Two  Percentage Agreement  IQR Percentage Agreement  IQR Stability Sig. **p<0.05    EDU17. Information on the disability-specific (i.e., Expanded Core Curriculum) needs of individual students in the LEA.  100.0 0 97.1 1 1.000 EDU18. The availability of assistive technology for students accessing learning materials through vision (e.g., ZoomText, MAgic).  84.9 1 91.2 1 .388 EDU19. The availability of assistive technology for student accessing learning materials through non-visual modalities (e.g., braille notetaker, text-to-speech software).  87.9 0 91.1 1 .107 EDU20. The availability of opportunities for non-academic instruction (i.e., Expanded Core Curriculum) in the home provided by community-based organizations.   70.6 2 75.8 1 .521 EDU21. The availability of opportunities for individual students in the LEA to attend camps and short-term programming provided by community-based organizations.  76.5 1 79.4 1 .166 EDU22. The availability of short-term placement opportunities for individual students in the LEA at a specialized school or center for students with visual impairment.  
70.0 2 62.6 2 .491   In Round Two, eight factors achieved an ImpLOA percentage rating of 90% or greater, seven factors between 80-89%, three between 70-79%, and four had an ImpLOA percentage rating of 69% or less. The overall trend in the change in ImpLOA percentage ratings between      115 Rounds One and Two was negative. There were 13 educational programming factors whose importance ratings decreased between the rounds. The average differential between rounds for these factors is -8.34% (SD = -9.23%). Three factors – EDU4, EDU9, EDU10 – with differentials of -14.7%, -29.7%, and -25.1% respectively, accounted for a significant portion of the downward shift in Round Two ratings for educational programming factors. Nine factors saw increases in ImpLOA percentage ratings in Round Two, with an average differential of +5.96% (SD = +2.17%).   Consensus among panelists was largely unchanged between Rounds One and Two. There was no change in the IQR of panelists’ responses for 13 factors. Consensus increased across five factors in Round Two – EDU3, EDU7, EDU13, EDU17, and EDU19. For each factor, IQR decreased from one to zero, indicating very high consensus. Four factors saw an increase in IQR between Rounds One and Two. Three factors – EDU10, EDU14, and EDU20 – moved from an IQR£1 to an IQR≥1 and as a result no longer met consensus criteria following Round Two.   Panelists’ ratings for educational programming factors were mostly stable between Delphi rounds. Wilcoxon signed-rank tests were conducted for each factor by examining the consistency in ratings, per panelist, across Round One and Two. All educational programming factors were stable with the exception of EDU10 (Input from advocacy groups regarding the level of service for individual students with visual impairment in the LEA). The ImpLOA percentage rating for this factor decreased significantly in Round Two (-25.1% from Round One).  Analyses conducted after Round Two provided the first opportunity to evaluate factors against the inclusion/exclusion criteria of the study. From the set of 22 initial educational programming factors, 12 factors (EDU1, EDU3, EDU5, EDU6, EDU7, EDU8, EDU12, EDU13,      116 EDU16, EDU17, EDU18, EDU19) met criteria for inclusion in the final set of confirmed factors. Each of these factors was stable between Rounds One and Two, demonstrated high consensus among panelists (i.e., IQR£1), and achieved a Round Two ImpLOA percentage rating of 75% or greater. As outlined in Chapter Three, initial factors falling within 10% of the 75% ImpLOA percentage point (i.e., 85% - 65%) were included in the Round Three survey.     Personnel Factors. Results for personnel factors from Round Two are displayed in Table 4.5. Round One ratings are displayed for comparison, along with an indicator of the stability of ratings between Round One and Round Two.  Table 4.5 Round Two Results for Initial Personnel Factors Personnel Factors Round One Round Two  Percentage Agreement  IQR Percentage Agreement  IQR Stability Sig. **p<0.05 PERS1. The professional development needs of TSVIs in the LEA (i.e., conference/travel costs, release time).  88.3 1 94.1 0 .171 PERS2. The time required for TSVIs to travel between school sites.  96.9 0 96.9 1 1.00 PERS3. The availability of a TSVI to serve students in more than one capacity in the LEA (dually-certified TSVI/O&M specialist vs. TSVI only).  48.4 2 43.8 2 .543 PERS4. The total number of TSVIs currently employed by the LEA as permanent staff.  73.5 1 75.8 2 .844 PERS5. 
The number of years of experience of individual TSVIs currently employed by the LEA.  33.3 2 48.5 1 .048** PERS6. Data from performance reviews of current TSVIs in the LEA. 76.7 1 72.7 2 .503      117 Personnel Factors Round One Round Two  Percentage Agreement  IQR Percentage Agreement  IQR Stability Sig. **p<0.05  PERS7. The availability of qualified Orientation and Mobility (O&M) specialists in the LEA.  90.9 0 87.9 1 .595 PERS8. The number of qualified intervenors for students who are deafblind currently employed by the LEA.     71.4 1 75.9 2 .857 PERS9. The availability of braille transcribers in the Local Education Authority (LEA) to produce materials in alternate formats (e.g., braille, tactile graphics, text in electronic format).  88.5 1 87.9 1 .336 PERS10. The availability of qualified paraprofessionals to support individual students with visual impairment for the entire school day (i.e., one-to-one assignment to the student).  56.3 1 50.1 2 .164 PERS11. The availability of state/provincial centers to provide material resource support to the LEA.  90.9 1 87.5 0 .509 PERS12. The TSVI service needs of neighboring LEAs, in the case of multiple LEAs sharing a TSVI's time.  76.0 2 69.0 2 .396 PERS13. The capacity of the LEA to sponsor current LEA teachers to train to be TSVIs.  86.6 1 71.9 2 .003* PERS14. The geographic proximity of the LEA to the closest university program training new TSVIs.  38.2 3 24.2 3 .082   In Round Two, two factors obtained an ImpLOA percentage rating between 90-100%, three factors saw an ImpLOA percentage ratings between 80-89%, four between 70-79%, and      118 five factors achieved an ImpLOA percentage rating of 69% or lower. Nine of 14 ImpLOA percentages ratings decreased from Round One to Round Two. The average differential for these factors is +6.39% (SD = 4.88%). Four of 14 ImpLOA percentage ratings increased from Round One to Round Two. The average differential for these factors between rounds is 6.95% (SD = 5.69%). The ImpLOA percentage rating for one factor (PERS2) did not change between rounds.   The IQR for 10 of 14 personnel-level factors changed in Round Two, denoting shifts in consensus among panelists. The IQR for four factors (PERS4, PERS6, PERS10, PERS13) increased from one to two, indicating that these factors no longer met consensus criteria. Three factors saw greater panelist consensus in Round Two (PERS1, PERS5, PERS11). While there was change in consensus indicators, 12 of 14 ratings were stable between Rounds One and Two. Only ratings for PERS5 (The number of years of experience of individual TSVIs currently employed by the LEA) and PERS13 (The capacity of the LEA to sponsor current LEA teachers to train to be TSVIs) were not stable over the first two rounds of the study. Of  personnel factors that were stable, five achieved the consensus criterion and an ImpLOA percentage rating of 85% or greater. As a result, these five factors (i.e., PERS1, PERS2, PERS7, PERS9, & PERS11) were referred to the final set of confirmed factors following Round Two.  Policy Factors. Results for policy-level factors from Round Two are displayed in Table 4.6. Round One ratings are displayed for comparison, along with an indicator of the stability of ratings between Round One and Round Two.           119 Table 4.6 Round Two Results for Initial Policy Factors Policy Factors Round One Round Two  Percentage Agreement  IQR Percentage Agreement  IQR Stability Sig. **p<0.05 POL1. The overall budget for special education services in the LEA.  75.8 1 69.7 2 .492 POL2. 
Federal/State/Provincial per-student funding formulae.  69.2 2 57.3  1 .232 POL3. The total number of students qualifying for special education services in the LEA.   44.4 2 28.1 0 .170 POL4. TSVI-to-student ratio stipulated in state/provincial legislation or special education policy document.  40.0 2 25.1 1 .983 POL5. Resources available through a state/provincial deafblind project/program.    71.4 3 75.7 2 .936 POL6. Annual registration data available from state/provincial-level material resource centers. 65.5 3 66.6 1 .788 POL7. Position statements from professional organizations in the field of visual impairment.  71.9 1 57.6 2 .106 POL8. Educational service guidelines published by national/state/provincial associations of special education administrators/directors.  56.0 2 58.1 2 .339 POL9. National statements of standards for the education of students with visual impairment published by stakeholder groups.  86.7 1 87.9 0 .623       120 In Round Two, no policy-level factors achieved an ImpLOA percentage rating between 100-90%. One factor (POL9) had a rating between 80-89%, while two factors had ratings between 70-79%. Six factors had ImpLOA percentage ratings of 69% or lower. The overall trend was for lower ImpLOA percentage ratings in Round Two for policy factors. Among the five factors where ImpLOA percentage ratings were lower, the average differential between Round One and Round Two ratings was -12.7% (SD = -4.02%). By contrast, there was a smaller average differential between Round One and Round Two ratings among factors that saw an increase in ImpLOA percentage rating (+2.18%, SD = 1.49%).  There was a significant overall shift toward greater consensus in Round Two for policy-level factors. IQR decreased for six of nine total factors, with four of those factors achieving the consensus criterion after this round (POL2, POL3, POL4, POL6). After Round Two, five of nine policy-level factors had achieved the consensus criterion. Finally, all ratings in Round Two were stable when compared with Round One ratings. The results of Wilcoxon signed-rank tests indicated no statistically significant differences between panelists’ responses across rounds for all factors. Thus, the null hypothesis is accepted and as a result, panelists’ ratings for initial policy-level factors were determined to be stable.  Round Two Analysis – Nominated Factors  Round Two was the first opportunity for panelists to rate the importance of 22 nominated factors. The proportion of nominated factors across thematic categories mirrored the proportion of Round One items across categories. There were nine nominated educational programming factors, eight nominated personnel-level factors, and five nominated policy-level factors. Since this was the first rating for these factors, panelists were asked to provide two ratings per factor: 1) a rating that corresponded to panelists’ perceptions of the level of importance of that item in      121 the actual practice of determining itinerant TSVI workloads; and 2) a rating that corresponded to panelists’ perceptions of the ideal level of importance of that factor. Round Two results for nominated factors are displayed in the sections that follow. The numbering system initial factor labels is continued for nominated factors for ease of identification.  Educational Programming Factors. Nine educational programming factors resulted from qualitative analyses following Round One. 
Table 4.7 displays ImpLOA percentage ratings for ideal and actual practice conditions as well as consensus and stability indicators in Round Two. Table 4.7  Round Two Results for Nominated Educational Programming Factors Educational Programming – Nominated Factors Round Two Round Two  Actual Percentage Agreement  IQR - Actual Ideal Percentage Agreement  IQR - Ideal Difference Sig. **p<.05  EDU23. The early intervention service needs of the preschool population in the LEA (i.e., birth to five years).  69.7 2 97.1 0 .000** EDU24. The total amount of materials production (i.e., braille, large print, e-text, tactile materials) time required by TSVIs in the LEA.  65.7 2 94.2 1 .004** EDU25. Consultation time for students with vision loss that do not meet certification criteria for visual impairment in the LEA or state/province.   10.3 2 27.3 3 .030** EDU26. The age distribution of students with visual impairments enrolled in the LEA.  34.4 2 40.0 2 .130 EDU27. The total level of support required by students with visual impairments in the LEA who are 75.7 1 94.1 1 .002**      122 Educational Programming – Nominated Factors Round Two Round Two  Actual Percentage Agreement  IQR - Actual Ideal Percentage Agreement  IQR - Ideal Difference Sig. **p<.05  expected to transition in the next year (e.g., from secondary school to post-secondary options).   EDU28. The level of support required by students with visual impairments in the LEA who will be writing state/province-wide standardized assessments (i.e., high-stakes testing) in that academic year.   54.6 3 50.0 3 .928 EDU29. Time for the TSVI to provide learning opportunities off-site (i.e., off of school grounds) to support Expanded Core Curriculum skill development (e.g., trip to local grocery store).   42.4 2 76.5 1 .000** EDU30. Flexibility in the itinerant TSVI's schedule to accommodate unique student schedules (e.g., student has regular medical appointments and is periodically absent from school).   50.0 2 67.7 1 .010** EDU31. The time/opportunity to collect adequate data to inform educational programming (e.g., progress monitoring, specialized assessments).    48.4 2 94.1 1 .000**   As was evident in Round One among educational programming factors, there was a clear distinction between actual practice and ideal condition ratings for nominated factors. The mean ImpLOA percentage rating for ideal condition ratings for nominated factors was 71.22% compared to 50.13% for actual practice condition ratings. No actual practice condition ratings achieved an ImpLOA percentage rating between 90-100% while four ideal condition ratings fell      123 in this range. No ratings were in the 80-89% range for either condition. Two actual practice condition ratings were in the 70-79% range compared to one ideal condition rating. Finally, seven of nine (77.7%) of actual practice condition ratings had an ImpLOA percentage rating of less than 69% versus only four ideal condition ratings. The mean differential between actual practice and ideal ImpLOA percentage ratings was 22.11% (SD = 13.26) in favour of ideal ratings. Of factors where ideal ratings were greater than actual practice ratings, the mean differential was 24.3% (SD = 12.32%). The only exception to this trend was EDU28, where the actual practice condition rating exceeded the ideal condition rating by 4.6%. A high number of statistically significant differences between actual practice and ideal ratings was anticipated by large differentials evident in the descriptive data. 
When Wilcoxon signed-rank tests were applied to panelists’ conditional ratings, seven of nine factors had statistically significant differences between actual practice and ideal condition ratings.   The opposite trend was true in terms of consensus among panelists on nominated educational programming factors. Overall, actual practice condition ratings showed greater consensus (mean IQR = 1.44) than ideal condition ratings (mean IQR = 2). EDU28 (The level of support required by students with visual impairments in the LEA who will be writing state/province-wide standardized assessments) was notably controversial with both actual practice and ideal condition ratings achieving very low consensus (IQR = 3) across panelists. Least controversial was EDU23 (The early intervention service needs of the preschool population in the LEA [i.e., birth to five years]) with high (IQR = 1) and very high (IQR = 0) consensus in actual practice and ideal condition ratings, respectively.       124 Personnel Factors. Eight nominated personnel-level factors were included in the Round Two survey. Table 4.8 displays ImpLOA percentage ratings for ideal and actual practice conditions as well as consensus and stability indicators. Table 4.8  Round Two Results for Nominated Personnel Factors Personnel – Nominated Factors Round Two Round Two  Actual Percentage Agreement IQR - Actual Ideal Percentage Agreement IQR - Ideal Difference Sig. **p<.05 PERS15. Time/opportunity for more experienced TSVIs to mentor early career/novice TSVIs in the LEA.   43.8 2 93.9 1 .000** PERS16. The time required for TSVIs to devote to leadership roles at the LEA-, state/province-, national-level (e.g., committee work, event coordination).   19.4 2 54.6 1 .000** PERS17. The time required for opportunities for collaboration between TSVIs and other vision professionals in the LEA (e.g., O&M Specialists, Low Vision Therapists, other TSVIs).   50.0 2 87.1 1 .000** PERS18. The time required for opportunities for collaboration between TSVIs and other specialists in the LEA (e.g., Occupational Therapists, Speech-Language Pathologists).   33.4 2 84.8 1 .000** PERS19. Long-term absences/leaves or retirement among TSVIs in the LEA.  50.0 2 66.7 2 .005**      125 Personnel – Nominated Factors Round Two Round Two  Actual Percentage Agreement IQR - Actual Ideal Percentage Agreement IQR - Ideal Difference Sig. **p<.05 PERS20. The individual skill sets or specialized expertise of TSVIs in the LEA (e.g., TSVI with advanced knowledge of assistive technology; TSVI with expertise in early literacy for students who read braille).   43.8 2 84.9 1 .000** PERS21. The level of support required by paraprofessionals working with students with visual impairments in the LEA.  37.9 2 63.3 1 .017** PERS22. The level of support from the TSVI required by students' school-based teams in the LEA (e.g., in-service training, consultation).  40 2 83.9 1 .000**   There was a notable difference between actual practice and ideal condition ratings among nominated personnel-level factors in Round Two. One ideal condition rating achieved an ImpLOA percentage ratings between 90-100%, four between 80-89%, none between 70-79%, and three below 69%. All actual practice condition ratings had an ImpLOA percentage rating of less than 69%. The mean ImpLOA percentage rating for the actual practice condition was 39.79% (SD = 10.0%) while the mean ImpLOA percentage rating for the ideal condition was 77.40% (SD = 13.90%). 
The mean differential between conditional ratings was 37.61% (SD = 11.89%), with a range between differentials of 16.7% and 51.4%. Ideal condition ratings exceeded actual practice ratings across all nominated personnel-level factors. As was the case with nominated educational programming factors, a high number of statistically significant differences between ratings per factor were anticipated. Unsurprisingly, all factors showed statistically significant differences between actual practice and ideal condition ratings.        126  There was less consensus among panelists for actual practice ratings (mean IQR = 2) than for ideal ratings (mean IQR = 1.13). In Round Two, all but one nominated factor met consensus criteria (IQR£1) for ideal condition ratings. When compared with ideal condition ratings for nominated educational programming factors, nominated personnel-level factors were less controversial in Round Two.  Policy Factors. Five policy-level factors resulted from qualitative analyses of panelists’ nominations. Table 4.9 displays ImpLOA percentage ratings for ideal and actual practice conditions as well as consensus and stability indicators in Round Two.  Table 4.9 Round Two Results for Nominated Policy Factors Policy – Nominated Factors Round Two Round Two  Actual Percentage Agreement IQR - Actual Ideal Percentage Agreement IQR - Ideal Difference Sig. **p<.05 POL10. The findings of research studies of expert opinion on service levels for students with visual impairments (i.e., Delphi studies).   28.1 2 78.8 1 .000** POL11. Special education administrator's degree of familiarity with specialized programming considerations for students with visual impairments (e.g., role and responsibilities of the TSVI).   56.3 3 90.9 1 .000** POL12. Technical assistance and guidance from a state/provincial Department/Ministry of Education-level consultant in visual impairment.  53.3 2 84.4 1 .001** POL13. Language in Collective Agreements and labor relations/union considerations.  39.1 2 54.1 2 .426      127 POL14. Special education policies and procedures in place in general at the LEA level (e.g., district policy on inclusion).  65.5 1 68.8 2 .665   In a trend seen across other thematic categories of nominated factors, there is a significant overall difference between actual practice and ideal condition ratings favouring ideal ratings. The mean ImpLOA percentage rating for actual practice ratings was 48.46% (SD = 14.81%) compared with a mean of 75.40% (SD = 14.41%) for ideal condition ratings. One ideal condition rating achieved an ImpLOA percentage rating between 90-100% and one in each of the 80-89% and 70-79% ranges. Two ideal condition ratings had ImpLOA percentage ratings of 69% or less. All actual practice ratings fell below the 69% threshold. The mean differential between ImpLOA percentage ratings was 26.94% (SD = 18.32%) with a range from 3.3% to 50.7%. Three of five nominated policy-level factors had statistically significant differences between actual practice and ideal ratings.   Panelists’ ratings showed greater consensus in the ideal condition (mean IQR = 1.4) versus the actual practice condition (IQR = 2). The most pronounced difference in consensus per factor was noted for POL11 (Special education administrator's degree of familiarity with specialized programming considerations for students with visual impairments). Actual practice ratings showed low consensus (IQR = 3) among panelists while ideal practice ratings demonstrated high consensus (IQR = 1). 
These data suggest that panelists disagreed on the degree to which special education administrators are familiar with programming considerations for students with visual impairments when making workload decisions for TSVIs. However, it is clear from the consensus (IQR = 1) and level of agreement (ImpLOA percentage rating = 90.9%) for the ideal condition that, in general, panelists believe that special education administrators should be aware of these considerations and factor them into workload determinations for itinerant TSVIs.

Round Three Data Analysis

After applying inclusion and exclusion criteria following Round Two, 14 initial factors were included in Round Three: six educational programming factors, five personnel-level factors, and three policy-level factors. All nominated factors were included in Round Three, as second ideal condition ratings were needed to establish the stability of those ratings. As with the Round Two analyses, results for initial and nominated factors are reported in separate sections.

Round Three Analysis – Initial Factors

Educational Programming Factors. Six initial educational programming factors were rated in Round Three. Table 4.10 displays ImpLOA percentage ratings as well as consensus and stability indicators in Round Three.

Table 4.10  Round Three Results for Initial Educational Programming Factors
(Each row lists the Percentage Agreement and IQR for Rounds One, Two, and Three, followed by the stability significance value; **p<.05.)
EDU2. The total number of students who are currently receiving service from a TSVI in the LEA. 74.3 (2); 72.8 (2); 82.3 (1); .093
EDU4. The number of students who use print as his or her primary literacy medium in the LEA. 82.4 (1); 67.7 (1); 60.6 (1); .614
EDU5. The number of students with deafblindness in the LEA. 81.9 (1); 81.9 (1); 87.6 (0); .439
EDU11. Input from parents regarding the level of service for individual students with visual impairment in the LEA. 87.9 (1); 82.4 (1); 88.2 (1); .268
EDU20. The availability of opportunities for non-academic instruction (i.e., Expanded Core Curriculum) in the home provided by community-based organizations. 70.6 (2); 75.8 (1); 97.0 (1); .003**
EDU21. The availability of opportunities for individual students in the LEA to attend camps and short-term programming provided by community-based organizations. 76.5 (1); 79.4 (1); 85.3 (1); .171

In Round Three, one factor (EDU20) had an ImpLOA percentage rating between 90-100%, four between 80-89%, none between 70-79%, and one with an ImpLOA percentage rating less than 69%. ImpLOA percentage ratings increased for five of the six initial educational programming factors in Round Three. EDU4 (The number of students who use print as his or her primary literacy medium in the LEA) was the only factor whose ImpLOA percentage rating decreased between Rounds Two and Three and the only factor to fall below the 75% ImpLOA percentage threshold in Round Three.

All factors met criteria for high consensus in Round Three. Consensus remained stable for four factors, and increased for two factors (EDU2, EDU5). Five of the six factor ratings were stable between Rounds Two and Three.
EDU20 (The availability of opportunities for non-academic instruction in the home provided by community-based organizations) saw a marked increase in ImpLOA percentage ratings in Round Three (+21.2%) and did not meet stability criteria. The null hypothesis was rejected and, as a result, EDU20 was referred to Round Four. As the only factor to achieve consensus and stability criteria but with an ImpLOA percentage rating below the 75% threshold, EDU4 was excluded from the final set of confirmed factors and was not referred to Round Four. EDU2, EDU5, EDU11, and EDU21 all had ImpLOA percentage ratings above the 75% threshold, demonstrated high consensus, and were stable in Round Three. Therefore, all four are included in the final set of confirmed factors.

Personnel Factors. Five initial personnel factors were rated in Round Three. Table 4.11 displays ImpLOA percentage ratings as well as consensus and stability indicators.

Table 4.11  Round Three Results for Initial Personnel Factors
(Each row lists the Percentage Agreement and IQR for Rounds One, Two, and Three, followed by the stability significance value; **p<.05.)
PERS4. The total number of TSVIs currently employed by the LEA as permanent staff. 73.5 (1); 75.8 (2); 82.3 (1); .803
PERS6. Data from performance reviews of current TSVIs in the LEA. 76.7 (1); 72.7 (2); 66.6 (1); .132
PERS8. The number of qualified intervenors for students who are deafblind currently employed by the LEA. 71.4 (1); 75.9 (2); 69.0 (2); .499
PERS12. The TSVI service needs of neighboring LEAs, in the case of multiple LEAs sharing a TSVI's time. 76.0 (2); 69.0 (2); 87.4 (1); .095
PERS13. The capacity of the LEA to sponsor current LEA teachers to train to be TSVIs. 86.6 (1); 71.9 (2); 74.2 (2); .572

No initial personnel factors achieved an ImpLOA percentage rating within the 90-100% range. Two factors had ratings between 80-89%, one between 70-79%, and two had an ImpLOA percentage rating equal to or less than 69%. Three of the five factors saw ratings increase between Rounds Two and Three. The ImpLOA percentage ratings for PERS6 (Data from performance reviews of current TSVIs in the LEA) and PERS8 (The number of qualified intervenors for students who are deafblind currently employed by the LEA) were lower in Round Three, and both fell below the 75% ImpLOA percentage threshold.

The consensus criterion was achieved for three of the five initial personnel-level factors in Round Three. The IQRs for PERS4, PERS6, and PERS12 indicate that panelists moved toward greater consensus after lower consensus on these factors in Round Two. Panelists' ratings did not achieve consensus on PERS8 and PERS13. However, both factors had an ImpLOA percentage rating below the 75% ImpLOA percentage threshold and were stable between Rounds Two and Three. As a result of this failure to achieve the ImpLOA percentage threshold, both factors are excluded from the final set of confirmed factors and neither is referred to Round Four. PERS6, which met the consensus and stability criteria, likewise fell below the 75% threshold and is excluded. PERS4 and PERS12 were stable between rounds, and when also evaluated against the ImpLOA percentage rating and consensus criteria, both are eligible for inclusion in the final set of confirmed factors.

Policy Factors. Three initial policy factors were rated in Round Three. Table 4.12 displays ImpLOA percentage ratings as well as consensus and stability indicators. Table 4.12  Round Three Results for Initial Policy Factors Policy Factors Round One Round Two Round Three Stability Sig.
**p<.05 Percentage Agreement IQR Percentage Agreement IQR Percentage Agreement IQR POL1. The overall budget for special education services in the LEA.  75.8 1 69.7 2 81.8 1 .254 POL5. Resources available through a state/provincial deafblind project/program.   71.4 3 75.7 2 82.4 1 .741 POL6. Annual registration data available from state/provincial-level material resource centers. 65.5 3 66.6 1 66.7 1 .577      133 While no ImpLOA percentage ratings fell in the 90-100% range in Round Three, two of three factors had ImpLOA percentage ratings in the 80-89% range. Lastly, POL6 achieved an ImpLOA percentage rating of less than 69%. All three policy-level factors demonstrated high consensus among panelists in Round Three and were stable between Rounds Two and Three. POL1 and POL5 achieved ImpLOA percentage, consensus, and stability criteria for inclusion in the final set of confirmed factors. POL6 achieved both consensus and stability criteria, but fell below the 75% ImpLOA percentage threshold for inclusion. As a result, POL6 is excluded from the final set of confirmed factors. No initial policy-level factors are referred to Round Four.  Round Three Analysis – Nominated Factors  All nominated factors from Round Two appeared in Round Three. Panelists rated the importance of each of 22 nominated factors.   Educational Programming Factors. Nine nominated educational programming factors received second ratings in Round Three. Table 4.13 displays ImpLOA percentage ratings as well as consensus and stability indicators. Table 4.13  Round Three Results for Nominated Educational Programming Factors Educational Programming – Nominated Factors Round Two Round Three Stability Sig. **p<.05 Percentage Agreement IQR Percentage Agreement IQR EDU23. The early intervention service needs of the preschool population in the LEA (i.e., birth to five years).   97.1 0 94.2 0 .276 EDU24. The total amount of materials production (i.e., braille, large print, e-text, tactile materials) time required by TSVIs in the LEA.  94.2 1 97.1 0 .197 EDU25. Consultation time for students with vision loss that do not meet 27.3 3 8.8 1 .206      134 Educational Programming – Nominated Factors Round Two Round Three Stability Sig. **p<.05 Percentage Agreement IQR Percentage Agreement IQR certification criteria for visual impairment in the LEA or state/province.   EDU26. The age distribution of students with visual impairments enrolled in the LEA.  40.0 2 39.4 1 .712 EDU27. The total level of support required by students with visual impairments in the LEA who are expected to transition in the next year (e.g., from secondary school to post-secondary options).   94.1 1 88.3 1 .357 EDU28. The level of support required by students with visual impairments in the LEA who will be writing state/province-wide standardized assessments (i.e., high-stakes testing) in that academic year.   50.0 3 53.0 3 .950 EDU29. Time for the TSVI to provide learning opportunities off-site (i.e., off of school grounds) to support Expanded Core Curriculum skill development (e.g., trip to local grocery store).   76.5 1 91.1 1 .202 EDU30. Flexibility in the itinerant TSVI's schedule to accommodate unique student schedules (e.g., student has regular medical appointments and is periodically absent from school).   67.7 1 81.3 1 .248 EDU31. The time/opportunity to collect adequate data to inform educational programming (e.g., progress monitoring, specialized assessments).    
94.1 1 100.0 0 .039**       135  In Round Three, four of nine nominated educational programming factors had an ImpLOA percentage rating between 90-100%, two between 80-89%, none between 70-79%, and three factors had ImpLOA percentage ratings below 69%. ImpLOA percentage ratings decreased for four of nine factors in Round Three (M = -6.95%, SD = 7.99%) while five factors saw increases in ImpLOA percentage ratings (M = +8.0%, SD = 5.71%) when compared with Round Two results. The average differential between Round Two and Round Three ratings was 7.53% (SD = 6.37%), with a notable decrease for EDU25 (-18.5%) and increases for EDU29 (+14.6%) and EDU30 (+13.6%). Despite these shifts in ImpLOA percentage ratings for some nominated educational programming factors, all factors were stable between Rounds Two and Three with the exception of EDU31 (The time/opportunity to collect adequate data to inform educational programming).  Consensus was very high for most nominated educational programming factors in Round Three (mean IQR = 0.89). Panelists’ ratings achieved greater consensus on four of nine factors when compared with indicators from Round Two. All other factors saw no change in IQR between rounds. Based on Round Three results, ImpLOA percentage ratings for five factors (EDU23, EDU24, EDU27, EDU29, EDU30) surpassed the 75% ImpLOA percentage threshold, as well as consensus and stability criteria. Therefore, these factors were included in the final set of confirmed factors. Three factors (EDU25, EDU26, EDU28) failed to meet the 75% ImpLOA percentage threshold while achieving both consensus and stability criteria. These factors were excluded from the final set of confirmed factors. EDU31, while exceeding the 75% ImpLOA percentage threshold and meeting consensus criteria, failed to meet the stability criterion in Round Three. EDU31 was referred to Round Four.       136  Personnel Factors. Eight nominated personnel-level factors received second ratings in Round Three. Table 4.14 displays ImpLOA percentage ratings as well as consensus and stability indicators. Table 4.14  Round Three Results for Nominated Personnel Factors Personnel – Nominated Factors Round Two Round Three Stability Sig. **p<.05 Percentage Agreement IQR Percentage Agreement IQR PERS15. Time/opportunity for more experienced TSVIs to mentor early career/novice TSVIs in the LEA.   93.9 1 93.9 1 1.000 PERS16. The time required for TSVIs to devote to leadership roles at the LEA-, state/province-, national-level (e.g., committee work, event coordination).   54.6 1 35.3 2 .067 PERS17. The time required for opportunities for collaboration between TSVIs and other vision professionals in the LEA (e.g., O&M Specialists, Low Vision Therapists, other TSVIs).   87.1 1 94.0 1 .578 PERS18. The time required for opportunities for collaboration between TSVIs and other specialists in the LEA (e.g., Occupational Therapists, Speech-Language Pathologists).   84.8 1 88.2 1 .837 PERS19. Long-term absences/leaves or retirement among TSVIs in the LEA.  66.7 2 81.9 0 .360 PERS20. The individual skill sets or specialized expertise of TSVIs in the LEA (e.g., TSVI with advanced knowledge of assistive technology; 84.9 1 91.2 1 .417      137 Personnel – Nominated Factors Round Two Round Three Stability Sig. **p<.05 Percentage Agreement IQR Percentage Agreement IQR TSVI with expertise in early literacy for students who read braille).   PERS21. The level of support required by paraprofessionals working with students with visual impairments in the LEA.  
63.3 1 82.4 1 .054 PERS22. The level of support from the TSVI required by students' school-based teams in the LEA (e.g., in-service training, consultation).  83.9 1 97.0 1 .169   Four of eight nominated personnel-level factors had ImpLOA percentage ratings between 90-100% in Round Three while three factors had ImpLOA percentage ratings in the 80-89% range. One factor (PERS16) was rated below the 69% ImpLOA percentage mark. Six of eight factors saw increases in ImpLOA percentage ratings between Rounds Two and Three (M = +9.14%, SD = 6.86%). One factor did not change and one factor decreased (PERS16) between rounds. The average differential for nominated personnel-level factors between Rounds Two and Three was 10.41% (SD = 7.28%).   For most factors, consensus remained consistent between Rounds Two and Three. Evaluating Round Three results for nominated personnel-level factors against inclusion criteria for the study, six of eight factors (PERS15, PERS17, PERS18, PERS19, PERS20, PERS22) were added to the final set of confirmed factors. One factor (PERS16) was excluded from the final set of confirmed factors since the ImpLOA percentage rating fell below the 75% ImpLOA percentage threshold, coupled with low consensus in Round Three. Finally, PERS21 (The level of support required by paraprofessionals working with students with visual impairments in the      138 LEA) is referred to Round Four resulting from a stability indicator approaching significance (p = 0.054). Given the narrow margin for rejecting the null hypothesis, it was not prudent to consider the stability criterion satisfied for this factor after Round Three. Policy Factors. Five nominated policy-level factors received second ratings in Round Three. Table 4.15 displays ImpLOA percentage ratings as well as consensus and stability indicators. Table 4.15  Round Three Results for Nominated Policy Factors Policy – Nominated Factors Round Two Round Three Stability Sig. **p<.05 Percentage Agreement IQR Percentage Agreement IQR POL10. The findings of research studies of expert opinion on service levels for students with visual impairments (i.e., Delphi studies).   78.8 1 91.2 1 .424 POL11. Special education administrator's degree of familiarity with specialized programming considerations for students with visual impairments (e.g., role and responsibilities of the TSVI).   90.9 1 91.1 1 .539 POL12. Technical assistance and guidance from a state/provincial Department/Ministry of Education-level consultant in visual impairment.  84.4 1 84.9 1 .276 POL13. Language in Collective Agreements and labor relations/union considerations.  54.1 2 60.0 1 .411 POL14. Special education policies and procedures in place in general at the LEA level (e.g., district policy on inclusion).  68.8 2 82.4 1 .087       139  Two of five nominated policy-level factors had ImpLOA percentage ratings between 90-100% in Round Three. Two factors saw ImpLOA percentage ratings between 80-89% and one factor (POL13) had a rating below 69%. ImpLOA percentage ratings for all nominated policy-level factors increased in Round Three. The average differential was +6.52% (SD = 6.35%). There were notable increases for POL10 (+12.4%) and POL14 (+13.6%). All nominated policy-level factors were stable in Round Three when compared with rankings from Round Two.   All nominated policy-level factors met the consensus criterion for the study (IQR £ 1) in Round Three. 
Panelists’ ratings demonstrated greater consensus for POL13 and POL14 compared with Round Two ratings while consensus indicators remained the same for POL10, POL11, and POL12. By evaluating Round Three results for nominated policy-level factors against inclusion criteria for the study, four of five factors (POL10, POL11, POL12, POL14) were subsequently added to the final set of confirmed factors. While satisfying both consensus and stability criteria, POL13 (Language in Collective Agreements and labor relations/union considerations) was excluded as a result of a Round Three ImpLOA percentage rating falling below the 75% ImpLOA percentage threshold.  Round Four Data Analysis  Following Round Three analyses, three factors met ImpLOA percentage ratings and consensus criteria but not stability criteria for inclusion in the final set of confirmed factors. A short survey containing these three items was sent to the panel via an emailed hyperlink. Given the limited number of survey items in Round Four, results are not reported by thematic cluster and are instead aggregated in the table below.        140 Table 4.16 Round Four Results for All Factors  Round One Round Two Round Three Round Four Stability Sig. **p<.05 Percentage Agreement IQR Percentage Agreement IQR Percentage Agreement IQR Percentage Agreement IQR EDU20.   70.6 2 75.8 1 97.0 1 86.7 1 .332 EDU31.     94.1 1 100.0 0 96.8 1 .439 PERS21.     63.3 1 82.4 1 83.9 0 .971  One initial educational programming factor appeared in Round Four, along with a nominated educational programming factor and a nominated personnel-level factor. EDU31 had an ImpLOA percentage rating between 90-100% while the other Round Four factors had ImpLOA percentage ratings in the 80-90% range. All factors also met consensus and stability criteria and as a result, were added to the final set of confirmed factors. In addition to finalizing the set of confirmed factors, results from Round Four indicated the termination of the data collection phase of the current study.   Additional Data Analysis  Following the conclusion of Round Four data analysis, additional analyses were required in order to fully address the research questions (i.e., R1, R2, R3). These analyses were not required in the development of iterative surveys. As a result, these results are reported outside of those required to progress through the application of the Delphi approach. The sections that follow outline the differences between panelists’ conditional ratings (R1 and R3) provided at the initial rating, as well as examine the final set of confirmed factors (R2) in its entirety.        141 Conditional Ratings of Initial and Nominated Factors Mean differentials of ImpLOA percentage ratings between conditions are significant across all educational programming factors, personnel-level factors, and policy-factors. Greater discrepancies were evident between conditional ratings for nominated items when compared with mean differentials in ImpLOA percentage ratings between conditions for initial factors. Wilcoxon signed-rank tests were conducted on each factor to test for statistically significant differences. In all cases, mean ImpLOA percentages for ideal condition ratings exceeded actual practice condition ratings. Notable factors (i.e., confirmed factors in the upper quartile [75th percentile and above]) are listed in Table 4.17. Table 4.17  Factors with the Greatest Differentials Between ImpLOA Conditional Ratings  Factor [Initial/Nominated] Percentage Differential (%)  Difference Sig. 
**p<.05 PERS18. The time required for opportunities for collaboration between TSVIs and other specialists in the LEA (e.g., Occupational Therapists, Speech-Language Pathologists). [Nominated]  +51.4 .000** POL10. The findings of research studies of expert opinion on service levels for students with visual impairments (i.e., Delphi studies). [Nominated]  +50.7 .000** PERS15. Time/opportunity for more experienced TSVIs to mentor early career/novice TSVIs in the LEA. [Nominated]  +50.1 .000** EDU31. The time/opportunity to collect adequate data to inform educational programming (e.g., progress monitoring, specialized assessments).  [Nominated]  +45.7 .000** PERS22. The level of support from the TSVI required by students' school-based teams in the LEA (e.g., in-service training, consultation). [Nominated]  +43.9 .000** PERS20. The individual skill sets or specialized expertise of TSVIs in the LEA (e.g., TSVI with advanced knowledge of assistive +41.1 .000**      142  Factor [Initial/Nominated] Percentage Differential (%)  Difference Sig. **p<.05 technology; TSVI with expertise in early literacy for students who read braille). [Nominated]  EDU20. The availability of opportunities for non-academic instruction (i.e., Expanded Core Curriculum) in the home provided by community-based organizations. [Initial]  +39.0 .000** PERS6. Data from performance reviews of current TSVIs in the LEA. [Initial]  +38.0 .009** EDU8. The amount of time needed for TSVIs to complete indirect service tasks (e.g., report writing, team meetings, liaising with community-based organizations). [Initial]  +37.3 .001** PERS17. The time required for opportunities for collaboration between TSVIs and other vision professionals in the LEA (e.g., O&M Specialists, Low Vision Therapists, other TSVIs). [Nominated]  +37.1 .000** EDU12. Results of a formal caseload analysis process conducted at LEA-level. [Initial]  +36.3 .001** EDU17. Information on the disability-specific (i.e., Expanded Core Curriculum) needs of individual students in the LEA. [Initial]  +35.2 .000** POL11. Special education administrator's degree of familiarity with specialized programming considerations for students with visual impairments (e.g., role and responsibilities of the TSVI). [Nominated]  +34.6 .000** EDU29. Time for the TSVI to provide learning opportunities off-site (i.e., off of school grounds) to support Expanded Core Curriculum skill development (e.g., trip to local grocery store). [Initial]  +34.1 .000** PERS1. The professional development needs of TSVIs in the LEA (i.e., conference/travel costs, release time). [Initial]  +31.9 .005** POL12. Technical assistance and guidance from a state/provincial Department/Ministry of Education-level consultant in visual impairment.  +31.1 .001**      143   In examining the differentials for conditional ratings across individual factors, actual practice ratings exceeded ideal ratings for a total of two educational programming factors, eight personnel-level factors, and three policy-level factors. Therefore, panelists’ average ratings of importance indicate that the current perceived importance of 13 factors in actual practice is higher than what it should ideally be in practice. Wilcoxon signed-rank tests were conducted to determine if there were any statistically significant differences between conditional ImpLOA percentage ratings. No significant differences were detected for factors where the actual practice rating exceeded the ideal condition rating.  
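Returning to Table 4.17, the notable factors listed there are simply the confirmed factors whose ideal-minus-actual differentials fall in the upper quartile (at or above the 75th percentile). The brief sketch below shows that selection step. The named differential values are taken from Table 4.17; the two smallest entries, labelled F1 and F2, are invented placeholders included only so the quartile cut-off has something to separate.

import pandas as pd

# Ideal-minus-actual differentials in ImpLOA percentage ratings (percentage points).
# Named factors and values are from Table 4.17; F1 and F2 are invented placeholders
# standing in for factors with smaller differentials.
differentials = pd.Series({
    "PERS18": 51.4, "POL10": 50.7, "PERS15": 50.1, "EDU31": 45.7,
    "PERS22": 43.9, "PERS20": 41.1, "EDU20": 39.0,
    "F1": 12.0, "F2": 5.5,
})

cutoff = differentials.quantile(0.75)               # 75th percentile of the differentials
notable = differentials[differentials >= cutoff]    # upper-quartile ("notable") factors
print(f"cut-off = {cutoff:.1f}")
print(notable.sort_values(ascending=False))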
Where ideal condition ratings were greater than actual practice condition ratings, 54 of 67 total factors had conditional ratings that favoured the ideal condition. The results of Wilcoxon signed-rank tests comparing conditional ratings indicate that for these 54 factors, 48 differentials are statistically significant. Therefore, the panel indicated that for 48 of 67 total items (71.64%) in the current study, there is a significant discrepancy between the importance of those factors in current practice versus the ideal importance of that factor to the practice of workload determination for itinerant TSVIs. The magnitude of the differential between conditional ratings provides an indication of panelists’ perceptions of a departure from best practice when determining TSVI workloads. For example, panelists noted a significant discrepancy between the importance of considering the unique skill set of the TSVI (i.e., PERS20) when determining workloads and their perceptions of how significantly this factors into the current practice of workload determination for itinerant TSVIs. Factors in the final set of confirmed factors with large, statistically significant differentials between conditional ImpLOA percentage ratings      144 should be prioritized and highlighted as essential considerations in the process of TSVI workload determination. Final Set of Confirmed Factors Table 4.18 displays the complete set of confirmed factors following Round Four, listed from highest to lowest by final ImpLOA percentage rating. Each of these 45 factors met the percentage agreement, consensus, and stability criteria for inclusion determined at the outset of the Delphi process. From the initial set of factors, 28 of 45 (62.22%) ultimately met criteria for inclusion. This is compared with 17 of 22 nominated factors (77.27%) that met the same criteria for inclusion in the final set of confirmed factors. The higher percentage of nominated factors that met inclusion criteria is not surprising considering that these survey items were generated by the panel. These were factors that were salient to panelists, standing out as notable absences from the initial set of factors and warranting nomination.  Table 4.18 Complete Listing of the Set of Confirmed Factors by Final ImpLOA Percentage Rating Factor  Final Round ImpLOA Percentage Rating (%) EDU13. Results of specialized assessments of student functioning conducted at LEA-level (e.g., Functional Vision Assessment, Learning Media Assessment).  100.0 EDU7. The amount of preparation time required by TSVIs in the LEA  97.1 EDU17. Information on the disability-specific (i.e., Expanded Core Curriculum) needs of individual students in the LEA.  97.1 EDU24. The total amount of materials production (i.e., braille, large print, e-text, tactile materials) time required by TSVIs in the LEA.  97.1 EDU15. Information on the prognosis for the visual conditions of individual students in the LEA (e.g., progressive vision loss).  97.0      145 Factor  Final Round ImpLOA Percentage Rating (%) PERS22. The level of support from the TSVI required by students' school-based teams in the LEA (e.g., in-service training, consultation).  97.0 PERS2. The time required for TSVIs to travel between school sites.  96.9 EDU31. The time/opportunity to collect adequate data to inform educational programming (e.g., progress monitoring, specialized assessments).    96.8 EDU23. The early intervention service needs of the preschool population in the LEA (i.e., birth to five years).   94.2 PERS1. 
The professional development needs of TSVIs in the LEA (i.e., conference/travel costs, release time).  94.1 PERS17. The time required for opportunities for collaboration between TSVIs and other vision professionals in the LEA (e.g., O&M Specialists, Low Vision Therapists, other TSVIs).   94.0 PERS15. Time/opportunity for more experienced TSVIs to mentor early career/novice TSVIs in the LEA.   93.9 EDU8. The amount of time needed for TSVIs to complete indirect service tasks (e.g., report writing, team meetings, liaising with community-based organizations).  91.2 EDU18. The availability of assistive technology for students accessing learning materials through vision (e.g., ZoomText, MAgic).  91.2 PERS20. The individual skill sets or specialized expertise of TSVIs in the LEA (e.g., TSVI with advanced knowledge of assistive technology; TSVI with expertise in early literacy for students who read braille).   91.2 POL10. The findings of research studies of expert opinion on service levels for students with visual impairments (i.e., Delphi studies).   91.2 EDU19. The availability of assistive technology for student accessing learning materials through non-visual modalities (e.g., braille notetaker, text-to-speech software).  91.1      146 Factor  Final Round ImpLOA Percentage Rating (%) EDU29. Time for the TSVI to provide learning opportunities off-site (i.e., off of school grounds) to support Expanded Core Curriculum skill development (e.g., trip to local grocery store).  91.1 POL11. Special education administrator's degree of familiarity with specialized programming considerations for students with visual impairments (e.g., role and responsibilities of the TSVI).  91.1 EDU27. The total level of support required by students with visual impairments in the LEA who are expected to transition in the next year (e.g., from secondary school to post-secondary options).   88.3 EDU11. Input from parents regarding the level of service for individual students with visual impairment in the LEA.  88.2 PERS18. The time required for opportunities for collaboration between TSVIs and other specialists in the LEA (e.g., Occupational Therapists, Speech-Language Pathologists).   88.2 EDU3. The number of students who use braille as his or her primary literacy medium in the LEA.  87.9 EDU6. The number of students with visual impairment and additional disabilities in the LEA.   87.9 PERS7. The availability of qualified Orientation and Mobility (O&M) specialists in the LEA.  87.9 PERS9. The availability of braille transcribers in the Local Education Authority (LEA) to produce materials in alternate formats (e.g., braille, tactile graphics, text in electronic format). 87.9 POL9. National statements of standards for the education of students with visual impairment published by stakeholder groups.  87.9 EDU5. The number of students with deafblindness in the LEA. 87.6  EDU12. Results of a formal caseload analysis process conducted at LEA-level.  87.5 PERS11. The availability of state/provincial centers to provide material resource support to the LEA.  87.5 PERS12. The TSVI service needs of neighboring LEAs, in the case of multiple LEAs sharing a TSVI's time.  87.4      147 Factor  Final Round ImpLOA Percentage Rating (%) EDU20. The availability of opportunities for non-academic instruction (i.e., Expanded Core Curriculum) in the home provided by community-based organizations.   86.7 EDU1. The total number of new students entering the LEA who qualify for service from a TSVI.  85.3 EDU21. 
EDU21. The availability of opportunities for individual students in the LEA to attend camps and short-term programming provided by community-based organizations.  85.3
POL12. Technical assistance and guidance from a state/provincial Department/Ministry of Education-level consultant in visual impairment.  84.9
EDU14. Information on the current visual functioning of individual students from medical reports.  84.8
EDU16. Information on the core academic needs (e.g., Mathematics, Science) of individual students in the LEA.  84.8
PERS21. The level of support required by paraprofessionals working with students with visual impairments in the LEA.  83.9
POL5. Resources available through a state/provincial deafblind project/program.  82.4
POL14. Special education policies and procedures in place in general at the LEA level (e.g., district policy on inclusion).  82.4
EDU2. The total number of students who are currently receiving service from a TSVI in the LEA.  82.3
PERS4. The total number of TSVIs currently employed by the LEA as permanent staff.  82.3
PERS19. Long-term absences/leaves or retirement among TSVIs in the LEA.  81.9
POL1. The overall budget for special education services in the LEA.  81.8
EDU30. Flexibility in the itinerant TSVI's schedule to accommodate unique student schedules (e.g., student has regular medical appointments and is periodically absent from school).  81.3

Twenty-four of the 31 (77.42%) total educational programming factors included in the study met inclusion criteria. This is compared with 14 of 22 (63.64%) total personnel-level factors and 7 of 14 (50.00%) total policy-level factors that met inclusion criteria following Round Four. The mean ImpLOA percentage rating of confirmed factors within each thematic cluster followed the same trend as the proportion of confirmed factors. The mean final ImpLOA percentage rating for confirmed educational programming factors was 90.64% (n = 24, SD = 5.61%). Confirmed personnel-level factors had a mean ImpLOA percentage rating of 89.47% (n = 14, SD = 5.19%), and confirmed policy-level factors had a mean ImpLOA percentage rating of 85.96% (n = 7, SD = 4.11%).

The central purpose of the study was realized by arriving at the final set of confirmed factors. Over a series of iterative survey rounds, expert panelists identified a set of factors that should be taken into consideration by special education administrators when determining workloads for itinerant TSVIs. Panelists provided two initial ratings for every factor: their perception of the importance of that factor to the actual practice of workload determination (i.e., R1) and a rating that reflected their perception of the ideal importance of that factor (i.e., R3). Panelists also nominated educational programming, personnel, and policy-level factors of importance to the process of TSVI workload determination that did not appear in the initial set of factors (i.e., R2a, R2b, and R2c, respectively). Upon arriving at a final set of confirmed factors, the Delphi study phase was complete. Chapter Five examines the set of confirmed factors in greater detail through a practice lens and draws implications for the process of workload determination for itinerant TSVIs tailored for special education leadership.

CHAPTER FIVE
DISCUSSION

The continued realization of the principles of inclusive education has resulted in more students with visual impairments being educated in general education settings than ever before.
The shift toward more inclusive settings requires corresponding shifts in modes of service delivery for these learners. As a result, the majority of students with visual impairments in North America receive support from itinerant TSVIs in general education settings. Meeting the specialized educational programming needs of students with visual impairments through itinerant service delivery presents several challenges to special education administrators when determining workloads for itinerant TSVIs. Given the low incidence of visual impairment among children and youth, expert-driven guidance on factors to consider when determining workloads is required to address the dearth of administrative experience with specialized programming for these learners. A study using the Delphi approach, consisting of four iterative survey rounds, was used to answer three main research questions:  1. How do experts in special education administration and visual impairment rate the level of importance of factors that influence actual workload determinations for itinerant TSVIs? 2. What factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs? a. What educational programming factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs?      150 b. What policy-level factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs? c. What personnel factors do experts in special education administration and visual impairment believe should be considered in workload determinations for itinerant TSVIs? 3. How do experts in special education administration and visual impairment rate the level of importance of factors they believe should influence workload determinations for itinerant TSVIs? A panel of experts, nominated by recognized leaders in the field of visual impairment, provided level of importance ratings for each of a set of 45 initial factors. Factors were drawn from professional and research literature and were grouped according to three thematic clusters - educational programming factors, personnel-level factors, and policy-level factors.  Each factor received two ratings – one based on the panelist’s perception of the importance of that factor to the actual practice of workload determination and another based on the panelist’s perception of how important that factor should ideally be to the process of workload determination. Twenty-two factors that did not appear in the initial set of factors were derived from panelists’ nominations. Each initial and nominated factor received two ratings in its first survey appearance. Subsequent ratings were focused only on the panelists’ perception of the ideal importance of that factor. After four Delphi survey rounds, a final set of 45 factors was confirmed by panelists as important factors to consider when determining workloads for TSVIs. In order to provide richer context to the final set of confirmed factors, qualitative data from the controlled feedback is referenced in the sections that follow. While not included in any analyses      151 required under the Delphi approach, the qualitative entries provided by panelists offer greater insight into the meaning of a given factor to the process of workload determination. 
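The quantitative criteria summarized above (ImpLOA percentage agreement, consensus, and the comparison of actual and ideal conditional ratings) can be made concrete with a brief illustrative sketch. The sketch below is not a reproduction of the study's analysis code: the 5-point response scale (with 4 = "Important" and 5 = "Very Important"), the panelist ratings, and the use of SciPy's Wilcoxon signed-rank implementation are assumptions made only for the purpose of illustration.

```python
# Illustrative sketch only (not the study's analysis code): summarizing and
# comparing the two conditional ratings for a single factor. The 5-point
# scale and the panelist ratings below are hypothetical.
import numpy as np
from scipy import stats

def imploa_percentage(ratings):
    """Percentage of ratings that are 'Important' (4) or 'Very Important' (5)."""
    ratings = np.asarray(ratings)
    return 100.0 * np.mean(ratings >= 4)

def consensus_iqr(ratings):
    """Interquartile range as a simple consensus indicator (smaller = stronger)."""
    q75, q25 = np.percentile(ratings, [75, 25])
    return q75 - q25

# Hypothetical paired ratings from the same panelists for one factor.
actual_practice = [3, 2, 4, 3, 3, 4, 2, 3, 4, 3, 2, 4, 3, 3, 4, 3, 2, 3]  # R1 (actual)
ideal_condition = [5, 4, 5, 4, 5, 5, 4, 4, 5, 4, 4, 5, 5, 4, 5, 4, 4, 5]  # R3 (ideal)

print(f"Actual-practice ImpLOA: {imploa_percentage(actual_practice):.1f}%")
print(f"Ideal-condition ImpLOA: {imploa_percentage(ideal_condition):.1f}%")
print(f"Ideal-condition IQR:    {consensus_iqr(ideal_condition):.1f}")

# Paired, two-sided comparison of the conditional ratings, analogous to the
# Wilcoxon signed-rank tests reported for the actual-versus-ideal differentials.
statistic, p_value = stats.wilcoxon(actual_practice, ideal_condition)
print(f"Wilcoxon signed-rank: W = {statistic:.1f}, p = {p_value:.4f}")
```

The study's inclusion decisions also required stability of ratings across successive rounds, which this sketch does not model.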
In the sections that follow, the final set of confirmed factors will be examined in greater detail and translated into implications for the process of workload determination for itinerant TSVIs.  Summary of Findings  A focal point of the research questions of the current study was to examine any discrepancy between two conditional ratings provided by panelists. At the first appearance of each factor, panelists provided two ratings: 1) a rating of the panelist’s perception of the importance of that factor in the actual (i.e., current) practice of workload determination for itinerant TSVI; and 2) a rating of the panelist’s perception of how important that factor should ideally be to the practice of workload determination. The implications of this disparity between conditional ratings are discussed throughout the sections that follow.  The ultimate product of the Delphi process was the final set of confirmed factors. Within the final set of confirmed factors, educational programming factors had the highest proportion of confirmed educational programming factors to total educational programming factors. Personnel-level factors had the next highest proportion of confirmed personnel-level factors to total personnel-level factors. Policy-level factors had the lowest proportion of  confirmed factors to total. In order to increase the relevance of the final set of confirmed factors to administrators in applied settings, the discussion sections that follow draw connections between confirmed factors and posit implications for the process of TSVI workload determination. Factors related by content (e.g., level of TSVI qualifications) are discussed within each of the three thematic clusters of the study. Relationships between factors from different thematic clusters are also      152 explored, as over the course of the iterative process of the study it became evident that there were interrelations between factors that transcended thematic boundaries. Curricular Access  A predominant research goal in the literature devoted to educational programming for students with visual impairments is to demonstrate improved curricular access as an outcome of a given intervention (Douglas, McLinden, McCall, Pavey, Ware, & Farrell, 2010). The rationale for this research emphasis is intuitive, given the unique perceptual constraints that visual impairments impose on students’ ability to access learning materials in regular print. The need to ensure that students with visual impairments receive appropriate programming, materials, and devices to achieve the same level of access to learning materials is inherent to the promise of an inclusive education for these learners (Douglas et al., 2011). A number of initial and nominated factors in the current study were related to the provision of learning materials in accessible formats and supports for gaining equal access to the content of these materials.  Literacy Media Two educational programming factors, EDU3 and EDU4, prompted panelists to rate the relative importance of students’ literacy medium to the determination of TSVI workloads. EDU3 (The number of students who use braille as his or her primary literacy medium in the LEA) met criteria as a confirmed factor following Round Two. That panelists reached consensus so readily on this factor is unsurprising given the unique instructional needs of students who read braille and the concomitant implications for high service frequency and intensity required from the itinerant TSVI (Koenig & Holbrook, 2000a). 
In addition to the number of students who read braille in the LEA, panelists were also asked to rate the importance of these students' assistive technology needs. Innovation and product development in assistive technology supporting braille reading and writing continues to progress at a swift pace (D'Andrea & Siu, 2015). Therefore, there is a need for the itinerant TSVI to maintain a current working knowledge of assistive technology for students accessing learning materials through non-visual modalities. This need was recognized by panelists via EDU19 (The availability of assistive technology for student accessing learning materials through non-visual modalities) and PERS1 (The professional development needs of TSVIs in the LEA). EDU19 and PERS1 were also confirmed following Round Two.

By comparison, the number of students with visual impairments using print as their primary literacy medium was excluded from the final set of confirmed factors after three rounds of ratings by panelists. Qualitative responses from panelists after Rounds One and Two provide some rationale for the exclusion of this factor. Several panelists stated that in the case of students with visual impairments who use print, it is not their literacy medium, per se, that should be a significant factor in determining TSVI workloads. Instead, these students' programming needs in areas of the Expanded Core Curriculum, specifically the skills, strategies, and tools required to use print effectively as a literacy medium, should factor into TSVI workload determination. These qualitative responses align with quantitative data from EDU18 (The availability of assistive technology for students accessing learning materials through vision). EDU18 met criteria for inclusion in the final set of confirmed factors after Round Two. Interestingly, EDU18 (i.e., assistive technology for print access) and EDU19 (i.e., assistive technology for non-visual access) had identical consensus indicators (IQR = 1) and nearly identical ImpLOA percentages in Round Two (91.2% and 91.1%, respectively). Panelists' qualitative responses emphasized that all students with visual impairments, regardless of literacy medium, require direct instruction in the use of assistive technology in order to remain competitive in increasingly digitized learning environments, and that this requirement should factor into workload determinations for itinerant TSVIs.

Availability of Materials in Alternate Format

The current study also included factors related to the availability of materials in alternate formats for students with visual impairments. Personnel-level factors prompted panelists to rate the importance of the availability of qualified professionals to produce materials in alternate formats in the LEA. Panelists also rated the importance of the availability of state/provincial centers to provide material support to LEAs, since many LEAs in the United States and Canada rely on the centralized production of materials in alternate formats (e.g., braille, large print, electronic text; Wall & Corn, 2002; Zuvela, 2009). PERS9 (The availability of braille transcribers in the Local Education Authority [LEA] to produce materials in alternate formats) met criteria for inclusion in the final set of confirmed factors following Round Two.
Several panelists' qualitative entries underscored the importance of the LEA employing a sufficient workforce of braille transcribers so that TSVIs are not required to divert a significant proportion of their workload away from direct instruction to meet students' material needs. Panelists also commented on the inability or unwillingness of LEA administration to maintain an adequate transcriber workforce to meet the day-to-day alternate format requirements of students, specifically those who read braille. In this instance, panelists felt that the responsibility would fall largely to the TSVI, which may or may not be adequately factored into workload determination. As one panelist commented: "If a teacher will be the person preparing materials then for some students a huge amount of time will need to be designated for that task. I think this time is often considered but underestimated." TSVI responsibility for alternate format production was reflected in EDU24 (The total amount of materials production [i.e., braille, large print, e-text, tactile materials] time required by TSVIs in the LEA), a nominated educational programming factor. In Round Three, panelists achieved very high consensus (IQR = 0) and a very high ImpLOA rating (97.1%) for this factor. In Round Two, a statistically significant difference existed between actual practice and ideal condition ImpLOA ratings (i.e., 65.7% versus 94.2%). This significant discrepancy between conditional ratings would seem to validate the assertion that the TSVI's responsibility for materials preparation is not adequately considered in the process of workload determination. Therefore, while panelists emphasized the importance of considering the LEA's braille transcription capacity in the determination of TSVI workloads, it should also be recognized that the task of material preparation can often fall to the TSVI. According to panelists' ratings, special education administrators should consider the LEA's transcription capacity in the process of workload determination with the understanding that if students' within-LEA material production requirements cannot be met by existing capacity, the responsibility will likely fall to the TSVI. This should, in turn, factor into TSVI workload determination.

In their comments regarding PERS9 (The availability of braille transcribers in the Local Education Authority [LEA]), many panelists commented on the role of a state or provincial resource center in producing materials in alternate formats. PERS11 (The availability of state/provincial centers to provide material resource support to the LEA) achieved inclusion criteria for the final set of confirmed factors following Round Two. Commenting from their perspective as administrators, many panelists expressed how fortunate they and the TSVIs in their LEAs were to have a state/provincial resource center working on alternate format procurement or production, especially of textbooks and other large, complex learning materials. One panelist summarized the connection between the support of a state/provincial material resource center and the workload of itinerant TSVIs: "Administrators and teachers need a central resource for materials. This helps reduce the workload of teachers so that they can focus on actual teaching, not finding materials."

The promise of an inclusive educational program requires that students with visual impairments have the same level of access to learning materials as their sighted peers.
Recognizing this goal, panelists emphasized the importance of supporting students’ acquisition of the knowledge, skills, devices, and alternate format materials they need to access curricular content and the corresponding factors that should enter into TSVI workload determination to enable this level of support.  Assessment The use of data-based decision making in educational leadership is increasingly necessary to ensure accountability in the legislative and policy contexts of special education (Bakken, O’Brien, & Shelden, 2010). A number of factors reviewed by panelists in the current study relate directly to data sources that may enter into the process of workload determination for itinerant TSVIs. In general, panelists’ ratings and qualitative responses emphasized the importance of data on students’ visual functioning and on the results of specialized assessment and progress monitoring. The following sections outline panelists’ ratings of factors related to assessment and sources of assessment data.  Clinical and Functional Data Clinical data obtained from ophthalmological reports are an essential source of information on students’ visual functioning upon which many important decisions (e.g., qualification, programming) are based (Lusk & Schwartz, 2016). Two factors in the current study prompted panelists to rate the importance of these data to the process of workload      157 determination. EDU14 (Information on the current visual functioning of individual students from medical reports) achieved inclusion criteria as a confirmed factor following Round Two. Many panelists described clinical data as “critical” to the development of educational programming tailored to the student’s unique needs. The importance of clinical data was rated especially high in the case of the prognosis for a student’s visual condition. EDU15 (Information on the prognosis for the visual conditions of individual students in the LEA) was one of the most uncontroversial initial educational programming factors in the study (IQR = 0), and achieved inclusion criteria following Round Two. Many panelists’ qualitative entries focused on the importance of prognosis data for future planning: “Information about the prognosis of a student's visual conditions assists greatly in planning for the future and meeting the psychosocial needs of the student.” Therefore, information on the prognosis of students’ visual conditions should be considered when determining workloads to forecast how students’ educational programming needs may shift across the academic year, and, thus, which adaptive skills and knowledge will need to be taught in advance. According to panelists’ ratings, this shift in programming priorities should have significant implications for TSVI workload determination.        Despite panelists’ early consensus on the importance of clinical data, qualitative responses also indicated an important caveat. Across the various jurisdictions represented by the panelists, the data in the medical report serves to qualify the learner as a student with a visual impairment. However, panelists were careful to note that clinical data is only one source among many that can inform the design of educational programming for the learner. One panelist noted that “[data from the clinical report] is only one piece of information; it is useful when paired with functional evaluations as part of overall assessment of student need.” The importance of functional data was evident in quantitative ratings. 
EDU13 (Results of specialized assessments of      158 student functioning conducted at LEA-level [e.g., Functional Vision Assessment, Learning Media Assessment]) was one of only two factors in the current study to achieve an ImpLOA percentage of 100% in its final rating. Since ImpLOA is a composite variable of “Very Important” and “Important” percentage ratings, an ImpLOA of 100% indicates that every panelist indicating a valid response perceived EDU13 as an important factor to consider when determining workloads for itinerant TSVIs. Panelists highlighted the importance of basing service delivery decisions on more objective functional assessment and expressed dissatisfaction with alternative approaches based on anecdotal data and professionals’ perceptions of student functioning. This perspective was summarized by one panelist commenting on the importance of functional assessment data:  Functional assessments related to students' visual functioning (e.g., FVAs, LMAs) are essential in determining the type and amount of support they should receive from a TSVI. Although subjective factors are also taken into account, sometimes too much weight is given to the TSVI's gut instinct and personal feelings toward a particular student, and not enough is given to objective assessment data.   Data Collection In addition to the importance of data from clinical or functional assessments, panelists also emphasized the importance of data from ongoing progress monitoring to inform the process of workload determination. Recognizing that models of service delivery are increasingly likely to follow a Response-to-Intervention (RTI) framework, progress monitoring is an increasingly salient tool for instructional decision making (Vaughn & Swanson, 2015). As a result, progress monitoring and its implications for educational decision-making is a topic of interest to both practitioners and researchers alike. In a study of special education administrators’ perceptions of their professional development needs, “monitoring student progress and measuring student outcomes” was rated as the third most important priority for professional learning for      159 administrators (Thompson & O’Brien, 2007). According to panelists’ ratings in the current study, the importance of progress monitoring extends also to the process of workload determination. EDU31 (The time/opportunity to collect adequate data to inform educational programming) is a nominated factor that achieved inclusion criteria following Round Four. It should be noted that this factor refers specifically to the time and opportunity to collect ongoing assessment data, including that resulting from specialized assessments. Following panelists’ rating of initial educational programming factors, several panelists elaborated on the importance of considering assessment data and time to conduct ongoing assessment to the process of workload determination. According to one panelist: “Assessment is the key to appropriate services, so having time set aside in the TVI's caseload to conduct progress monitoring and specialized assessments is absolutely important when determining teacher workloads.” Caseload Analysis As described in Chapter Two, caseload analysis is a data-driven approach based on severity rating scales applied to characteristics of the student and learning environment to arrive at an estimate of an appropriate level of itinerant TSVI service delivery (Wall Emerson & Anderson, 2014). 
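Because caseload analysis is referenced throughout the remaining discussion, a brief sketch of the severity-rating arithmetic that such tools typically rely on may be helpful. The sketch below is hypothetical: the rated characteristics, point values, and hours bands are invented for illustration and do not reproduce any published instrument. Its final function also hints at the workload analysis approach advocated in this study, in which indirect and consultative time is counted alongside direct service.

```python
# Hypothetical sketch of severity-rating arithmetic; not a published caseload
# analysis instrument. Characteristic names, point values, and the mapping to
# service hours are invented for illustration only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class StudentRating:
    name: str
    # Each characteristic rated 1 (low need) to 4 (high need).
    ratings: Dict[str, int] = field(default_factory=dict)

    def severity_score(self) -> int:
        return sum(self.ratings.values())

def recommended_weekly_hours(score: int) -> float:
    """Map a severity score to direct-service hours using invented bands."""
    if score >= 12:
        return 5.0   # e.g., near-daily instruction (braille reader, extensive ECC needs)
    if score >= 8:
        return 2.0   # e.g., two sessions per week
    if score >= 5:
        return 1.0   # e.g., weekly direct service
    return 0.5       # e.g., consultative monitoring

def weekly_workload_hours(students: List[StudentRating],
                          materials_production: float,
                          travel: float,
                          collaboration: float) -> float:
    """Caseload analysis stops at the per-student sum; a workload analysis
    also counts indirect and consultative time."""
    direct = sum(recommended_weekly_hours(s.severity_score()) for s in students)
    return direct + materials_production + travel + collaboration

students = [
    StudentRating("Student A", {"literacy_medium": 4, "ecc_needs": 4, "assistive_tech": 4}),
    StudentRating("Student B", {"literacy_medium": 1, "ecc_needs": 2, "assistive_tech": 2}),
]
total = weekly_workload_hours(students, materials_production=4.0, travel=3.5, collaboration=2.0)
print(f"Estimated weekly workload: {total:.1f} hours")
```

In applied use, the weights in published tools are grounded in assessed student need; the sketch is intended only to show how indirect and consultative demands enter the total under a workload approach.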
As professionals with backgrounds in low-incidence service delivery, panelists were largely aware of the process of caseload analysis, and many reported in their qualitative comments having applied a caseload analysis tool in their own practice. Panelists' experiences with caseload analysis translated into importance ratings: EDU12 (Results of a formal caseload analysis process conducted at LEA-level) met inclusion criteria for the final set of confirmed factors following Round Two. The early confirmation of EDU12 is further evidence of the overall perceived importance of assessment data to the process of workload determination, as rated by panelists.

Confirmed factors relating to assessment underscore panelists' belief in the importance of assessment data to inform the process of workload determination in inclusive settings. In addition to assessment results, time and opportunity to gather data through regular progress monitoring and specialized assessment should be considered when determining the workloads of itinerant TSVIs.

The Expanded Core Curriculum

As outlined in Chapter Two, the Expanded Core Curriculum (ECC) is composed of nine disability-specific curricular areas for students with visual impairments (Hatlen, 2009). Skills and knowledge in the areas of the ECC require direct, systematic instruction, as this content is less likely to be acquired incidentally through observation, as is more typical in the case of sighted peers (Lohmeier, Hatlen, & Blankenship, 2009). An initial educational programming factor, EDU17 (Information on the disability-specific [i.e., Expanded Core Curriculum] needs of individual students in the LEA), prompted panelists to rate the importance of data on students' ECC programming needs to the process of workload determination. EDU17 received some of the highest importance ratings in the current study, with ImpLOA percentage ratings of 100% and 97% in Rounds One and Two, respectively. EDU17 achieved criteria as a confirmed factor after Round Two with very high consensus among panelists (IQR = 0). Several panelists referred to students' ECC programming needs as "central" and "vital" to the process of workload determination. Some panelists went further, stating their belief that the primary function of the itinerant TSVI is to provide direct instruction in the ECC and that, as such, data on students' programming needs in the ECC should be paramount when determining workloads. As one panelist remarked: "Assessed student needs in all areas of the ECC should be the primary factor on which a TVI's workload is determined. Although the practice varies by LEA, only in those districts with well-informed administrators is this factor really considered when assigning workloads to TVIs."

Orientation and Mobility

It is beyond the scope of the current study to examine the relative importance of factors related to each of the component areas of the ECC. However, orientation and mobility (O&M) warranted consideration since O&M service delivery may be the responsibility of another professional (i.e., an O&M specialist) if the TSVI does not have qualifications in this area (Topor, Holbrook, & Koenig, 2000). PERS7 (The availability of qualified Orientation and Mobility [O&M] specialists in the LEA) achieved criteria for inclusion in the final set of confirmed factors following Round Two. Most panelists emphasized the critical importance of O&M services for students with visual impairments.
However, despite the inclusion of this factor in the final set of confirmed factors, several panelists cautioned that while overlap does exist, O&M specialist workload and TSVI workload require separate consideration by administrators when determining TSVI workloads. Interestingly, PERS3 (The availability of a TSVI to serve students in more than one capacity in the LEA [dually-certified TSVI/O&M specialist vs. TSVI only]) did not meet criteria for inclusion. Panelists stated that while it may be beneficial for smaller or more rural districts to have dually-certified professionals for logistical reasons, there is a possibility that these professionals may be expected to take on additional workload as a consequence of their additional qualification, an outcome that has been documented in surveys of dually-certified professionals (see Griffin-Shirley, Pogrund, & Grimett, 2011). For this reason, panelists felt that dual certification should not be an important consideration for TSVI workload determination.       162 ECC in the Community There is growing recognition in both professional and research literature that in order to ensure adequate opportunity for skill development in the areas of the ECC, learning opportunities in addition to those offered via itinerant service delivery should be available (Pogrund, Darst, & Boland, 2013). Several items in the current study addressed the importance of additional learning opportunities in the areas of the ECC available through community partner organizations to workload determination. Panelists rated the availability of ECC instruction in the home provided by community partner organizations as an important factor in the process of TSVI workload determination. This was the most closely debated factor in the current study, requiring ratings in each of the four survey rounds. Qualitative entries from Round Four indicate differences in how panelists interpreted the importance of this factor. Some saw home-based ECC instruction from community partner organizations as reducing the workload of the itinerant TSVI, as responsibility for ECC instruction is shared between home and school. Other panelists interpreted this factor as having implications for increased workload for the TSVI in terms of the time required for collaboration and coordination with the community partner organization. In addition to home-based ECC instruction, panelists rated opportunities for ECC instruction through camps and short-term programming offered by community partner organizations as an important consideration in the determination of itinerant TSVI workload. Panelists noted that while service delivery within the LEA or school district should assume full responsibility for ECC goals outlined in the IEP, programming offered through intensive, short-term learning opportunities can promote skill development and decrease the need for the itinerant TSVI to repeat instruction or reinforce the same skills and move onto other areas of need. However, it should be noted that, like home-based ECC instruction, short-term programming was      163 emphasized as an opportunity to enhance, but not replace, ECC-oriented instruction at school. Therefore, while it is important that special education administrators be aware of the opportunities that exist for ECC skills development outside of the school setting, according to panelists’ ratings, the availability of these opportunities alone do not warrant adding to or subtracting instructional time from TSVI workloads.  
Developmental Profile of the Learner  Over the last several decades, the developmental profile of students with visual impairments has become more heterogeneous (Hatton et al., 2007). An increasing number of students with visual impairments present with disabling conditions in addition to vision loss (Hatton, Ivy, & Boyer, 2013). The current study included several items related to the presence of additional disabilities and the impact that these more complex developmental profiles might have on workload determination for TSVIs. The number of students with visual impairments and additional disabilities met criteria as a confirmed factor following Round Two (i.e., EDU6). Many panelists cautioned against a perceived default assumption on the part of special education administrators that these students are most appropriately served on a consultative basis by the TSVI. Panelists emphasized the significant proportion of students with visual impairments and additional disabilities who make up typical itinerant TSVI caseloads, as well as emphasizing the need for TSVI service level to be determined by the assessed needs of the learner rather than by the presence of additional disabilities alone. Within factors considering service delivery for students with visual impairments and additional disabilities, factors relating to service delivery for students with deafblindness were considered separately.       164 Students with Deafblindness  Several factors in the current study considered the service needs of students with deafblindness. The number of students with deafblindness in the school district or LEA (i.e., EDU5) met criteria as a confirmed factor in Round Three. Panelists’ qualitative responses emphasized the unique impacts of dual sensory loss on learning and development and the important role that the TSVI can play on the educational team serving these learners. Panelists also recognized the importance of considering the programming needs of students with deafblindness in workload determination with ratings of POL5 (Resources available through a state/provincial deafblind project/program). Following Round Three, POL5 met criteria as a confirmed factor. Panelists’ qualitative responses indicated an important caveat to this rating. Unlike other factors that were rated to have a direct impact on TSVI workload, several panelists indicated that the availability of resources from a state/provincial deafblind project or program may have a more indirect impact on TSVI workload determination. As noted by one participant: “To the extent that the resources available from a state/provincial deafblind project/program can assist in the determination of student needs, this resource has the potential to impact [TSVI]’s workloads.” Therefore, estimates on the programming requirements of students with deafblindness from specialists at the state/provincial deaf blind project/program should inform TSVI workload determination in the district/LEA. The availability of these resources alone, while important to service delivery for students with deafblindness, was not wholly endorsed by panelists as a significant consideration for TSVI workload determination. Early Intervention  Given that early intervention services for young children with visual impairments are mandated under Part C of the IDEA in the United States, it is unsurprising that after not      165 appearing in the initial set of factors, workload considerations for early intervention appeared as a nominated factor in Round Two. 
There is no similarly legislated federal mandate in Canada. It is important to note, however, that early intervention service needs were rated as highly significant regardless of the nationality of the panelist. Panelists’ qualitative responses emphasized the disproportionately significant impact of high quality early intervention on later developmental outcomes for students with visual impairments as compared to intervention at later stages. Despite panelists’ broad recognition of the critical importance of early intervention services delivered by a qualified TSVI, there was a significant discrepancy between actual and ideal condition ratings for this item, implying that current processes for determining TSVI workloads may not adequately take account of the service needs of young learners with visual impairments. According to panelists’ ratings and qualitative responses, it is essential that administrators consider the early intervention programming needs of young children with visual impairments to ensure that critical periods for growth and development are adequately supported.   Consultative Service  The previous sections reported on factors of TSVI service delivery related to the provision of direct instruction to students with visual impairments in inclusive settings. In addition to direct service, TSVIs also provide consultative service to the student’s educational team to ensure that the sum of intervention and instruction is accessible and meaningful for the learner. The sections that follow detail panelists’ responses to items related to consultative service delivery in inclusive settings.  Paraprofessionals The assignment of paraprofessional support to students with visual impairments is a complex process dependent on a variety of programming and administrative considerations      166 (MacCuspie, 2002). Concern over the over-prescription of paraprofessional service and its impact on academic and socio-emotional outcomes for students with visual impairments has been noted in the research and professional literature (e.g., MacCuspie, 2002; Whitburn, 2013). PERS10 (The availability of qualified paraprofessionals to support individual students with visual impairment for the entire school day) probed panelists’ perceptions of the impact of full assignment of paraprofessional time (i.e., full-day) to the process of workload determination. Quantitative ratings reflected the concern outlined in the literature as this factor did not achieve criteria for inclusion in the final set of confirmed factors. Qualitative data emphasize the lack of qualification among paraprofessionals to provide instruction in general, and specifically, the lack of qualifications to provide specialized instruction to meet the programming needs of students with visual impairments. Panelists noted that qualified paraprofessional support can be effective when the student’s educational team has determined that there is a targeted need (e.g., to address personal support needs). However, this effectiveness is realized only when decisions are made based on the assessed needs of the learner, and all members of the educational team are clear on the scope of their respective roles. 
Basing the assignment of paraprofessional service time on factors external to student needs may invite several adverse outcomes, as described by one panelist: The paraprofessional does "for" the student instead of stepping back and providing assistance only when needed […] the paraprofessional tends to take over everything to do with the student and become the main point of contact for everyone (parents, principals, TSVIs)[and] the student's relationships with other students become stunted because of the continued presence of an adult.  Recognizing, however, that paraprofessional support can be effectively applied when based on the needs of the learner, PERS21 (The level of support required by paraprofessionals working with students with visual impairments in the LEA) appeared as a nominated factor in Round Two      167 and met criteria as a confirmed factor following Round Four. In Round Three, several panelists recognized that given the commonplace practice of assigning paraprofessionals to work directly with students with visual impairments, TSVI workloads should reflect time and opportunity to support paraprofessionals. As noted by one panelist: “[i]f we are using paraprofessionals we must provide them with training and ongoing support.” In Round Four, panelists continued to emphasize the importance of time and opportunity within the workload of the TSVI to provide support for paraprofessionals:  All [paraprofessionals] must be supported in order to implement the instructional strategies recommended by teachers. This level of support can vary according to the student's needs and the [paraprofessional]'s ability to understand and implement the needed strategies.  Bottom line, if a [paraprofessional] is hired for support - the [paraprofessional] needs the support he or she needs to do the job.   Low importance ratings, combined with detailed qualitative responses warning of the potential for adverse student outcomes resulting from the over-prescription of paraprofessional service delivery indicate that the sole presence of a paraprofessional should not have an impact on workload determinations for itinerant TSVIs. However, the training needs of paraprofessionals assigned to work with students served by the TSVI should enter into the process of TSVI workload determination. The Educational Team  In addition to the paraprofessional, students with visual impairments may be served by a number of other professionals (Topor, Holbrook, & Koenig, 2000). Initial factors in the Round One survey prompted panelists to provide ratings of the importance of the number and/or availability of other vision professionals (e.g., O&M specialists, braille transcribers) on TSVI workload determination. However, the initial set of factors in Round One did not account for the time and opportunity required for the itinerant TSVI to collaborate with and support the members      168 of students’ educational teams. 
Three nominated factors related to the frequency and intensity of TSVI consultation and collaboration with students' educational teams emerged from Round One:

• PERS17 (The time required for opportunities for collaboration between TSVIs and other vision professionals in the LEA [e.g., O&M Specialists, Low Vision Therapists, other TSVIs])
• PERS18 (The time required for opportunities for collaboration between TSVIs and other specialists in the LEA [e.g., Occupational Therapists, Speech-Language Pathologists])
• PERS22 (The level of support from the TSVI required by students' school-based teams in the LEA [e.g., in-service training, consultation])

Following Round Three data analyses, each of these factors met criteria for inclusion in the final set of confirmed factors. Panelists achieved strong consensus and a very high ImpLOA percentage rating (97%) for PERS22 in Round Three. Panelists recognized the central role of the school-based team in the itinerant model of service delivery and the need for the TSVI to work to build capacity at the school level: "It will take a team to make it happen and the classroom teachers will need to know what to do...i.e., not wait for "his teacher" to get here." Panelists also acknowledged the importance of the TSVI's efforts to build capacity among members of students' educational teams through collaboration. According to panelists, TSVIs should have adequate time and opportunity factored into their workloads to collaborate with other specialists working on behalf of their assigned students, as well as time and opportunity to collaborate with other vision professionals in the school district or LEA. Here, panelists' qualitative responses emphasized the need for a sense of community among TSVIs as well as between TSVIs and the broader community of specialists supporting students in the school district or LEA. By factoring the need for collaboration into TSVIs' workloads, administrators may also be working to limit TSVI burnout and attrition, as unmanageable workloads that do not allow the time and opportunity for the TSVI to form meaningful relationships with colleagues are associated with low job satisfaction among TSVIs (Seitz, 1994). However, according to panelists, time and opportunity to collaborate with other specialists in the district/LEA are seldom factored into TSVI workloads in current practice: actual and ideal ImpLOA percentage ratings saw the greatest differential of any factor in the current study (51.4%, p < .001). Therefore, in supporting TSVI efforts to build capacity within the district/LEA through collaboration, administrators should consider not only the benefits of such collaboration to the quality of students' educational programming, but also the potential of collaboration as a protective factor against TSVI burnout and attrition.

TSVI Succession and Mentorship

Given the acute shortage of qualified TSVIs in North America, the issue of TSVI succession was reflected in both initial and nominated factors. Several initial and nominated personnel-level factors in the current study were devoted to the importance of the sustainability of the TSVI workforce in the LEA or school district. According to panelists' ratings, long-term absences or retirement among TSVIs in the LEA/school district should be considered when determining workloads for itinerant TSVIs. While succession is an important workload consideration according to panelists, the means by which the TSVI workforce is maintained are not clear.
Panelists could not reach consensus as to whether the capacity of the district or LEA to sponsor teachers to train as TSVIs should be factored into workload determination for existing TSVIs (PERS13). PERS13 did not ultimately meet criteria for inclusion in the final set of confirmed factors. Qualitative data reflect the lack of consensus among panelists. Some panelists referred to the lack of consideration given to planning for vacancies on the part of district administration, while others praised local and state/provincial initiatives targeted to address the chronic shortage of specialist teachers.

Far less ambiguous, however, were ratings for PERS15 (Time/opportunity for more experienced TSVIs to mentor early career/novice TSVIs in the LEA). PERS15, a nominated factor, obtained consistently high consensus and ImpLOA percentage ratings across Rounds Two and Three, ultimately meeting criteria for inclusion in the final set of confirmed factors after Round Three. Panelists emphasized the importance of mentorship in the area of low-incidence service delivery, given that TSVIs often work in isolation from other TSVIs and that support offered by administration is not likely to be informed by knowledge of the multifarious impact of visual impairment on growth and development. Several panelists indicated that they explicitly consider mentorship requirements for new TSVIs when determining TSVI workloads, while other panelists emphasized the role of general state- or provincial-level initiatives to provide induction support to new teachers in special education. Despite differences in panelists' experiences of how mentorship is delivered, there was a strong recognition that the time and opportunity for offering and receiving mentorship support should be factored into TSVI workloads.

Administrative Resources

Policy-level factors were least likely to meet criteria as confirmed factors when compared to the other thematic clusters of factors. A total of seven policy-level factors met criteria for inclusion in the final set of confirmed factors. Two policy-level factors of general importance were confirmed:

• POL1 (The overall budget for special education services in the LEA)
• POL14 (Special education policies and procedures in place in general at the LEA level)

Qualitative responses indicated that ratings for both of these confirmed factors came with significant caveats. In the case of the overall budget for special education services, panelists indicated that administrators should have a detailed understanding of the budget available for special education services when determining workloads for itinerant TSVIs. However, several panelists noted that the extent to which budget influences workload determination should be determined by the assessed programming needs of learners and not by exosystem-level pressures such as the state/provincial or federal political climate. For general special education policies and procedures at the LEA or school district level, panelists emphasized the importance of considering the larger policy environment in which service delivery for students with visual impairments is situated.
However, panelists cautioned that general policies and procedures designed to apply broadly to all exceptional learners may not encompass or adequately recognize the unique programming needs of students with visual impairments: “[programming for students with visual impairments] should be included in general special education policies, but in many cases, our students have additional unique needs.”  Thus, while general policy-level factors were identified as important to the process of TSVI workload determination, the relative importance of these factors should be interpreted with caution.  Conversely, three nominated policy-level factors more specific to the process of TSVI workload determination achieved higher importance ratings with strong consensus:  • POL10 (The findings of research studies of expert opinion on service levels for students with visual impairments) • POL11 (Special education administrator's degree of familiarity with specialized programming considerations for students with visual impairments)      172 • POL12 (Technical assistance and guidance from a state/provincial Department/Ministry of Education-level consultant in visual impairment) Each of these highly significant policy-level factors relates to the information and resources available to special education administrators to inform the process of workload determination. Panelists emphasized the combined importance of evidence-based resources and input from knowledgeable professionals. When asked to comment on ratings for POL10, panelists emphasized the importance of sharing the results of studies of expert opinion to service levels for students with visual impairments (e.g., Corn & Koenig, 2002; Koenig & Holbrook, 2000a) with special education administrators who do not have a background in low incidence service delivery. Several panelists highlighted the foundational importance of research evidence to decision making in special education. On the importance of expert-driven research evidence, one panelist commented: “We need a starting point sometimes and a framework to speak knowledgeably about service delivery with those outside our field.” Several panelists elaborated on this need for a starting point by suggesting that future research should examine the relationship between the manageability of TSVI workloads and student outcomes. Unlike the research reviewed in Chapter Two from the field of speech-language pathology, there is no extant research that directly examines the relationship between TSVI workload and academic outcomes for students with visual impairments. Whether commenting on the utility of existing research or the need for further research into the impacts of TSVI service delivery on student outcomes, panelists’ ratings and qualitative entries reinforced the importance of evidence-based resources to inform the process of workload determination.  Closely related to the need for research evidence on service delivery to inform the process of workload determination is the need for special education administrators to be well informed as to      173 the unique programming needs of students with visual impairments. When asked to rate the importance of special education administrators who are well-informed with respect to programming for students with visual impairments (i.e., POL11), panelists rated this factor as highly important to the process of workload determination. Qualitative responses underscore the importance of well-informed special education administrators at the LEA or school district level. 
According to one panelist:  This is an extremely important issue. Leadership in the right direction with understanding makes all of the difference in a quality vs. mediocre program. If you do not understand the unique needs of students who are [blind or visually impaired] then it is hard to allocate appropriate funding, staffing or time to meet the needs of the students.   According to Müller (2006), one of the barriers to high quality programming for students with visual impairments in inclusive settings is a lack of expertise on the part of LEA or school-district level administrators in evaluating the work of TSVIs or identifying features of high quality programming. Expert panelists in the current study echoed this concern, suggesting that TSVIs and professionals working at administrative levels with expertise in visual impairment should act as a resource to LEA or school district special education administrators to inform the process of workload determination. In some jurisdictions, there may be a state- or provincial-level consultant in visual impairment working for a state department of education or provincial ministry of education. Panelists provided high ratings for the importance of input from a consultant working at this level. Qualitative responses underscored the importance of the state- or provincial-level consultant as a source for current information on best practice. According to one panelist, “[t]his guidance keeps us (TSVIs and administrators) knowledgeable about what is happening in general and special education. The ability of this person to synthesize all this information and provide the most relevant information to the field is invaluable!” Several      174 panelists noted that input from these consultants is of particular importance to special education administration in smaller, more rural LEAs or school districts where there is less likely to be a team of vision professionals in place to advise on best practices and indicators of quality programming for students with visual impairments.  An Ecological Framework for TSVI Workload Determination In recognizing that there are multiple determinants of service levels for students with visual impairments in inclusive settings, the workload of an itinerant TSVI is not equivalent to the sum of the frequency and intensity of service required to meet the assessed needs of individual students. Special education administrators must consider an array of factors, each functioning at one or more levels of an educational system. As Bays and Crockett (2007) noted, “leadership for special education is […] influenced by micro-and macro-political dimensions including student and teacher demographics, varied instructional settings, shared leadership responsibility, and the impact of legislation, policies, and reform movements” (p. 145). The final set of confirmed factors contained factors from each of the three thematic categories. Each thematic category can theoretically represent one of the systems within an ecological approach. An ecological approach, originally developed by Bronfenbrenner (1976), places the student and his or her unique educational programming requirements at the centre of the conceptual framework. The student is situated in the microsystem, where “events occurring within specific settings affect children's behavior and development" (Odom & Diamond, 1998, p.4).  
In the context of the current study, the theoretical value of the ecological systems framework stems from its ability to conceptualize the importance of factors that are both proximal and distal to the student, recognizing that special education administrators are more likely to exert influence on the educational systems in which students are situated, as opposed to more direct impacts on student      175 outcomes typically associated with the work of school-based professionals (Boscardin, 2007). An ecological framework for TSVI workload determination situates the special education administrator as operating largely in the exosystem, which "consists of events or individual actions occurring in settings in which microsystem participants do not participate, but which have an influence on events or actions in the microsystem" (Odom & Diamond, 1998, p.5). Following the logic of the ecological systems approach, impacts of TSVI workload determination at the policy-level are mediated by processes more proximal to the student (e.g., TSVI job stress, burnout) and as a result, are considered distal processes characteristic of the exosystem. Therefore, by situating the process of TSVI workload determination in an ecological framework, the relative importance of factors that are both distal (i.e., policy) and proximal (i.e., educational programming, personnel) can be hypothesized and further studied. See Figure 5.1 for an illustration of the hypothesized relationships between factors operating both proximally and distally to the student with a visual impairment based on the distribution of confirmed factors across the three thematically-oriented clusters of factors in the current study.   An ecological systems approach has been applied to several other topics of relevance to special education leadership. For example, Ruppar, Allcock, and Gonsier-Gerdin (2016) used an ecological approach to organize the various factors that have an impact on decisions about access to the general education curriculum for exceptional students. Brunsting, Sreckovic, and Lane (2014) applied an ecological systems approach to a synthesis of evidence from studies examining burnout among special education teachers. More recently, McLinden et al. (2016) used an ecological systems approach to examine the role of the itinerant teacher in providing access to the curriculum for students with visual impairments in inclusive settings. According to McLinden et al., the strength of an ecological systems approach applied to service delivery for      176 students with visual impairments in inclusive settings is that it “includes a focus on the characteristics of the individual learner as well as acknowledging the complexity and multi-dimensional nature of the influences on development” (p. 193). Given the multitude of factors that impact workload determination for itinerant TSVIs, an ecological systems approach seems an intuitive fit in that it provides a unifying framework for factors that are both proximal and distal to the student in inclusive settings.   Figure 5.1 An Ecological Framework for TSVI Workload Determination Summary of Study Implications  A broad set of factors were rated according to their relative importance to the process of workload determination for itinerant TSVIs by an expert panel. 
Factors that met a priori criteria for inclusion in the final set of confirmed factors represented various workload considerations, ranging from those at the educational programming level (e.g., curricular access) to those at the policy level (e.g., statements from stakeholder groups). The preceding sections posited connections between confirmed factors, synthesizing quantitative and qualitative results into larger thematic domains (e.g., "Assessment" and "Administrative Resources") and drawing implications for the process of workload determination within each.

Recommendations

The results of the current study, and their implications for the process of workload determination for itinerant TSVIs, suggest several recommendations for future research and professional development. The "actual practice" condition in Rounds One and Two for the initial rating of factors provided only a superficial examination of the current practice of workload determination at the LEA or school district level. Subsequent research should examine this process in greater detail through qualitative study of TSVI workload determination from the perspectives of various stakeholders (e.g., special education administrators, TSVIs, parents/caregivers). Such research would offer a more comprehensive understanding of how the workload determination process is applied and would provide greater context for the set of confirmed factors resulting from the current study. More detailed qualitative study of the relationships between factors and the corresponding administrative decision-making processes will also be useful in examining the validity of the ecological framework for workload determination hypothesized in the previous section. The current study was not intended to examine interrelationships between factors through an ecological lens; as a result, subsequent research will be required to validate the ecological framework.

In addition to inspiring future research into workload determination for itinerant TSVIs, the current study should inform the development of two resources for special education administrators: 1) updated service guidelines for students with visual impairments in inclusive settings; and 2) expanded caseload analysis processes that reflect a workload analysis approach.

Existing service guidelines for students with visual impairments written for an administrative audience (e.g., Pugh & Erin, 1999) are nearly two decades old and need to be updated to reflect current best practice and the changing roles and responsibilities of the itinerant TSVI. Professional organizations in the field of special education for students with visual impairments (e.g., the Association for Education and Rehabilitation of the Blind and Visually Impaired, the Division on Visual Impairments and Deafblindness of the Council for Exceptional Children) need to work collaboratively with professional organizations in the field of special education administration (e.g., the National Association of State Directors of Special Education, the Council of Administrators of Special Education of the Council for Exceptional Children) to update guidelines for special education administrators. The results of the current study can form the basis for the expert-driven recommendations on determining manageable workloads for itinerant TSVIs featured in these service guidelines.
Current caseload analysis tools emphasize the educational programming needs of the individual students who make up an itinerant TSVI's caseload. However, as outlined in Chapter Two, the scope of the itinerant TSVI's professional practice is not solely focused on meeting the educational programming needs of his or her assigned students. For example, existing caseload analysis tools may not adequately account for the time required for the itinerant TSVI to provide mentor support to a less experienced TSVI in the LEA or school district. Existing caseload analysis tools need to be expanded to encompass and quantify the full scope of the itinerant TSVI's professional practice, consistent with a workload analysis approach (see Chapter Two). The findings of the current study should be used to create and pilot a workload analysis tool to inform the process of workload determination for itinerant TSVIs in inclusive settings.

Finally, the findings of the current study have implications for personnel preparation programs training TSVIs for service in the field, as well as for credential programs in educational leadership. Of the administrative-level variables under study, the majority of those that met criteria for inclusion in the final set of confirmed factors were related to the information and resources available to special education administrators to inform the process of workload determination. Personnel preparation programs in Canada and the United States should consider enhancing their course offerings to ensure that topics such as workload analysis and state/provincial special education legislation and policies regarding service delivery receive adequate coverage in TSVIs' initial training. The results of the current study help frame the process of workload determination, with the teacher sharing the results of this expert review as a means of focusing administrators' attention on key considerations (e.g., how many students with visual impairments in the LEA or school district will be transitioning to the post-secondary sector in the coming year?). As the subject-area specialist in an LEA or school district, the TSVI may be relied upon both to provide key information regarding the workload requirements of students and to speak to best practices in workload determination. As a result, these teachers should be prepared to inform the process of workload determination from the point they enter the field.

With respect to programs certifying school administrators, it is important that prospective candidates for educational leadership have a basic understanding of the professional role of the itinerant TSVI. In addition, these candidates should understand that, given the unique educational programming needs of students with visual impairments, a number of specialized considerations arise in the context of workload determination for itinerant TSVIs. Course content may include presentations from TSVIs or administrators with experience overseeing the programs of students with sensory impairments. Candidates may also be familiarized with resources geared to administrators, such as the NASDSE service guidelines cited in previous sections (i.e., Pugh & Erin, 1999).

Limitations of the Study

The current study has several limitations, both methodological and administrative. Several methodological limitations are inherent to the Delphi approach. First, given the purposive sampling employed in the Delphi approach, the generalizability of the results is unknown.
Second, it is inherently difficult to check the accuracy and reliability of a Delphi study given how dependent study implementation is on the administrative and analytical skill of the researcher. Third, studies employing the Delphi approach typically require a significant time commitment from panelists, especially when more than two survey rounds are required (Worrell, Di Gangi, & Bush, 2013). In these scenarios, panelist fatigue can lead to attrition. In the current study, the interval between rounds was determined largely by panelists' availability and the academic calendar. For example, the span of four months between Rounds Three and Four resulted from panelists' limited availability over the months of the summer break in the K-12 sector, and implementation of the Round Four survey was further delayed in consideration of the demands of the first several weeks of the school year. It became evident during Round One that panelists had very demanding schedules, as evidenced by the number of reminders needed by some panelists (up to three per round) and by extensions to deadlines for survey completion. Despite these efforts to minimize the loss of panelists, attrition was a significant issue between Rounds One and Two, with the loss of eight panelists. While this loss did not disproportionately deplete one panelist category over the others, a 19% contraction in the overall size of the panel between Rounds One and Two is noteworthy.

In addition to the methodological limitations of the study, there were some administrative limitations that extended beyond those typical of the Delphi approach. The online tool used to create and host each survey posed some challenges to the administration of the study. The advantage of using FluidSurveys.com is that content is guaranteed to be hosted on servers located in Canada. However, panelists with low vision reported that the interface was difficult to use in coordination with screen magnification software. The researcher sought to increase the accessibility of the interface by increasing legibility through a large (18-point) sans serif font for all survey sections, high contrast between text and background, and a limit of three survey items per page. No further accessibility concerns were reported after the Round One survey. The accessibility of web-based survey tools should be an important consideration in the design phase of any study.

Finally, there was an important limitation in the interpretation of the results of the current study. Survey items asked panelists to rate the importance of a given factor to the process of workload determination for itinerant TSVIs; they did not explicitly prompt panelists to explain how that factor should be weighed in determining TSVI workloads. Items with sufficient corresponding qualitative data allowed the researcher to comment on participants' explanations for their importance ratings, which in some instances indicated whether the panelist believed that the factor should have an additive or subtractive impact on workload. More specific survey items that prompted panelists to provide a quantitative rating along with an indication of the direction (i.e., additive or subtractive) of the ideal effect of that factor on TSVI workloads would have enabled a more straightforward translation of the results of the current study into guidelines for special education administrators.
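Because the survey rounds described above hinge on screening each rated item for consensus and stability, a brief illustration may help readers unfamiliar with Delphi analysis see how such screening typically works. The following is a minimal sketch in Python under assumed, hypothetical thresholds (consensus as an interquartile range of no more than 1 on the importance scale; stability as a median shift of no more than 0.5 between rounds); it does not reproduce the criteria, data, or software used in the current study.

"""Illustrative Delphi screening for consensus and stability.

All thresholds and ratings below are hypothetical; they do not
reproduce the criteria or data reported in this study.
"""
import statistics


def interquartile_range(ratings):
    """Spread of the middle 50% of Likert ratings for one survey item."""
    q1, _, q3 = statistics.quantiles(ratings, n=4, method="inclusive")
    return q3 - q1


def retire_item(current_round, previous_round,
                max_iqr=1.0, max_median_shift=0.5):
    """Return True if an item shows both consensus (small IQR) and
    stability (little change in the median between rounds), and so
    would not be re-rated in the next survey round."""
    has_consensus = interquartile_range(current_round) <= max_iqr
    is_stable = abs(statistics.median(current_round)
                    - statistics.median(previous_round)) <= max_median_shift
    return has_consensus and is_stable


# Hypothetical importance ratings (1 = not important, 5 = very important)
# from the same panelists for one factor across two consecutive rounds.
round_two = [4, 5, 4, 3, 4, 5, 4, 4]
round_three = [4, 4, 4, 4, 5, 4, 4, 4]
print(retire_item(round_three, round_two))  # True under these thresholds

In practice, a screening rule of this kind would be applied alongside the qualitative comments panelists provide, and the thresholds themselves should be justified against the Delphi methodology literature on consensus and stability measurement (e.g., Holey et al., 2007; Von der Gracht, 2012).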
Conclusion

With most students with visual impairments placed in general education classrooms in their community schools and served by itinerant TSVIs, it is important to consider the factors that influence specialist service levels in inclusive settings. While the responsibility for designing and implementing specialized programming rests with the TSVI, the special education administrator is generally responsible for overseeing the quality of programming and for managing the TSVI workforce in the LEA or school district. A review of the literature noted a dearth of evidence-based data to support the process of workload determination for itinerant TSVIs. This review also identified several factors that may impact the manageability of TSVI workloads in inclusive settings. These factors were assembled into three thematic clusters (i.e., Educational Programming, Personnel, and Policy-Level factors) within an ecological framework, enabling examination of the impact of factors both proximal and distal to the student. A panel of experts was assembled to rate the relative importance of these factors to the process of TSVI workload determination. Panelists were experts in itinerant service delivery for students with visual impairments with high-level professional roles at the LEA/district, state/provincial, or national level. Both initial and nominated factors were rated over four iterative survey rounds using the Delphi approach, a robust methodology for generating expert-driven consensus statements around a specialized topic or area of inquiry. Based on the results of the Delphi survey rounds and the resulting list of confirmed factors, implications for the process of workload determination were developed. It is expected that these implications for practice will be used to inform future research into service delivery for students with visual impairments, as well as provide the impetus for the creation of new evidence-driven resources to ensure that special education administrators have the tools necessary to make informed workload determinations for itinerant TSVIs. It is anticipated, in turn, that better informed workload determinations for itinerant TSVIs will translate into educational programming that is more responsive to the unique learning needs of students with visual impairments.

REFERENCES

Adler, M., & Ziglio, E. (1996). Gazing into the oracle: The Delphi Method and its application to social policy and public health. London, UK: Jessica Kingsley Publishers.
Agran, M., Hong, S., & Blankenship, K. (2007). Promoting the self-determination of students with visual impairments: Reducing the gap between knowledge and practice. Journal of Visual Impairment and Blindness, 101, 453-464.
Ajuwon, P. M., & Oyinlade, A. O. (2008). Educational placement of children who are blind or have low vision in residential and public schools: A national study of parents' perspectives. Journal of Visual Impairment and Blindness, 102, 325-339.
Algozzine, B., Hendrickson, J., Gable, A., & White, R. (1993). Caseloads of teachers of students with behavioral disorders. Behavioral Disorders, 18, 103-109.
Alonso, L. (1990). Educational reform: The responsibilities of mainstream administrators. Journal of Visual Impairment and Blindness, 84, 347-349.
Allinder, R. M. (1994). The relationship between efficacy and the instructional practices of special education teachers and consultants. Teacher Education and Special Education, 17, 86-95.
doi: 10.1177/088840649401700203 Ambrose-Zaken, G., & Bozeman, L. (2010). Profile of personnel preparation programs in visual impairment and their faculty. Journal of Visual Impairment and Blindness, 104, 148-169. American Foundation for the Blind. (1954). The Pine Brook Report: National work session on educating the blind with the sighted. New York, NY: American Foundation for the Blind. American Printing House for the Blind. (2015). Annual Report 2015: Distribution of Eligible Students Based on the Federal Quota Census of January 6, 2014. Retrieved October 3, 2016 from http://www.aph.org/federal-quota/distribution-2015/      185 American Speech-Language-Hearing Association. (2000). National Data Report 1999-2000; National Outcomes Measurement System. Rockville, MD: Author. American Speech-Language-Hearing Association. (2002). Omnibus survey caseload report: SLP. Rockville, MD: Author. American Speech-Language-Hearing Association. (2002). A workload analysis approach for establishing speech-language caseload standards in the school: position statement [Position Statement]. Retrieved August 17, 2015 from http://www.asha.org/policy/PS2002-00122.htm. Arick, J. R., & Krug, D. A. (1993). Special education administrators in the United States: Perceptions on policy and personnel issues. The Journal of Special Education, 27, 348-364. doi: 10.1177/002246699302700306 Aylesworth, F. A. (1938). The young myope. The Canadian Medical Association Journal, 39, 374-375.  Ban, J. R., & Masoodi, B. A. (1980). School administrators and the visually handicapped. NASSP Bulletin, 64, 64-69. Bakken, J. P., O'Brian, M., & Shelden, D. L. (2006). Changing roles and responsibilities of special education administrators. In F. E. Obiakor, A. F. Rotatori, & S. Burkhardt (ed.). Current Perspectives in Special Education Administration (Advances in Special Education, Volume 1, pp.1 - 157). Emerald Group Publishing Limited. Bays, D. A.., & Crockett, J. B. (2007). Investigating instructional leadership for special education. Exceptionality, 15, 143–161. doi:10.1080/09362830701503495      186 BC Ministry of Education (2016). Special Education Services: A Manual of Policies, Procedures, and Guidelines. Retrieved September 15, 2016 from http://www.bced.gov.bc.ca/specialed/special_ed_policy_manual.pdf Benson, B. N. (2001). Supervision of itinerant teachers: Perspectives from itinerant teachers and those who supervise them. (Doctoral Dissertation). Retrieved January 7, 2014 from https://shareok.org/bitstream/handle/11244/286/3004879.PDF?sequence=1 Bettini, E. A., Cheyney, K., Wang, J., & Leko, C. (2014). Job design: An administrator’s guide to supporting and retaining special educators. Intervention in School and Clinic. doi:10.1177/1053451214532346 Billingsley, B. S. (2004). Special education teacher retention and attrition: A critical analysis of the research literature. The Journal of Special Education, 38, 39-55. doi: 10.1177/00224669040380010401 Billingsley, B., Carlson, E., & Klein, S. (2004). The working conditions and induction support of early career special educators. Exceptional Children, 70, 333-347. Bina, M. J. (1982). Morale of teachers of the visually handicapped: Implications for administrators. Journal of Visual Impairment & Blindness, 76, 121-128. Bina, M. J. (1987). Meeting the needs of rural teachers of visually impaired students. Journal of Visual Impairment and Blindness, 81, 204-209. Boscardin, M. L. (2007). What is Special about Special Education Administration? Considerations for School Leadership. 
Exceptionality, 15(3), 189–200. doi:10.1080/09362830701503537 Bowen, S.K., & Ferrell, K. A. (2003). Assessment in low-incidence disabilities: The day-to-day realities. Rural Special Education Quarterly, 22, 10-19.       187 Bozeman, L., & Zebehazy, K. T. (2014). Personnel preparation in visual impairment. In P. T. Sindelar, E. D. Mccray, M. T. Brownell, & B. Lignugaris/Kraft (eds.). Handbook of research on special education teacher preparation (pp. 353- 368). New York, NY: Routledge. Bronfenbrenner, U. (1976). The experimental ecology of education. Educational Researcher, 5, 5-15.  Brown, J. E., & Beamish, W. (2012). The changing role and practice of teachers of students with visual impairments: Practitioners' views from Australia. Journal of Visual Impairment & Blindness, 106, 81-92. Brown, L. C., & Glaser, S. (2014). Teaching the Expanded Core Curriculum in general education settings. In C. B. Allman & S. Lewis (eds.). ECC Essentials: Teaching the Expanded Core Curriculum to Students with Visual Impairments. New York, NY: AFB Press.  Brown, C. M, Packer, T. L, & Passmore, A. (2013). Adequacy of the regular early education classroom environment for students with visual impairment. Journal of Special Education, 46, 223–232. doi: 10.1177/0022466910397374 Bruininks, R. H., Wolman, C., & Thurlow, M. L. (1990). Considerations in designing survey studies and follow-up systems for special education service programs. Remedial and Special Education, 11(2), 7–17. doi:10.1177/074193259001100204 Brunsting, N. C., Sreckovic, M. A., & Lane, K. L. (2014). Special education teacher burnout: A synthesis of research from 1979 to 2013. Education and Treatment of Children, 37, 681-711. Bullard C. (2003). The itinerant teacher’s handbook. Hillsboro, OR: Butte.      188 Carlson, E., Brauen, M., Klein, S., Schroll, K., & Willig, S. (2002). Study of Personnel Needs in Special Education (SPENSE). Washington, DC: Office of Special Education Programs. Retrieved January 21, 2014 from http://education.ufl.edu/spense/files/2013/06/Key-Findings-_Final_.pdf Cirrin, F., Bird, A., Biehl, L., Disney, S., Estomin, E., Rudebusch, J., Schraeder, T., & Whitmire, K. (2003). Speech-language caseloads in the schools: A workload analysis approach to setting caseload standards. Seminars in Speech and Language, 24, 155-180. Clayton, M. (1997). Delphi: A technique to harness expert opinion for critical decision making tasks in education. Educational Psychology, 17, 373-386.  Corn, A. L. (2007). On the future of the field of education of students with visual impairments. Journal of Visual Impairment and Blindness, 101, 741-743. Corn, A. L., & Huebner, K. M. (Eds.). (1998). A report to the nation: The national agenda for the education of children and youths with visual impairments, including those with multiple disabilities. New York: AFB Press.  Corn, A. L., & Koenig, A. J. (2002). Literacy for students with low vision: A framework for delivering instruction. Journal of Visual Impairment and Blindness, 96, 303-321. Corn, A.L. & Spungin, S.J. (2003). Free and appropriate public education and the personnel crisis for students with visual impairments and blindness. (COPSSE Document No. IB-10). Gainesville, FL: University of Florida, Center on Personnel Studies in Special Education. Correa-Torres, S., & Howell, J. (2004). Facing the challenges of itinerant teaching: Perspectives and suggestions from the field. Journal of Visual Impairment & Blindness, 98, 420-433.  Crockett, J. B. (2002). 
Special education’s role in preparing responsive leaders for inclusive schools. Remedial and Special Education, 23, 157–167.
Crockett, J. B. (2012). Developing leaders for the realities of special education in the 21st century. In J. B. Crockett, B. S. Billingsley, & M. L. Boscardin (Eds.), Handbook of leadership and administration for special education (pp. 52-66). New York, NY: Routledge.
Crockett, J. B., Becker, M. K., & Quinn, D. (2009). Reviewing the knowledge base of special education leadership and administration from 1970-2009. Journal of Special Education Leadership, 22, 55-67.
Curry, S. A., & Hatlen, P. H. (1988). Meeting the unique educational needs of visually impaired pupils through appropriate placement. Journal of Visual Impairment and Blindness, 82, 417-424.
Dajani, J. S., Sincoff, M. Z., & Talley, W. K. (1979). Stability and agreement criteria for the termination of Delphi studies. Technological Forecasting and Social Change, 13, 83–90. doi:10.1016/0040-1625(79)90007-6
D’Andrea, F. M., & Siu, Y. T. (2015). Students with visual impairments: Considerations and effective practices for technology use. In D. L. Edyburn (Ed.), Efficacy of assistive technology interventions (Advances in Special Education Technology, Volume 1, pp. 111-138). Emerald Group Publishing Limited.
Davis, P., & Hopwood, V. (2002). Including children with a visual impairment in the mainstream primary school classroom. Journal of Research in Special Educational Needs, 2, doi: 10.1111/j.1471-3802.2002.00174.x
De Vet, E., Brug, J., De Nooijer, J., Dijkstra, A., & De Vries, N. K. (2005). Determinants of forward stage transitions: A Delphi study. Health Education Research, 20, 195-205. doi: 10.1093/her/cyg111
Dhuey, E., & Lipscomb, S. (2013). Funding special education by total district enrollment: Advantages, disadvantages, and policy considerations. Education Finance and Policy, 8, 316-331. doi: 10.1162/EDFP_a_00098
Diamond, I. R., Grant, R. C., Feldman, B. M., Pencharz, P. B., Ling, S. C., Moore, A. M., & Wales, P. W. (2014). Defining consensus: A systematic review recommends methodologic criteria for reporting of Delphi studies. Journal of Clinical Epidemiology, 67, 401-409. doi: 10.1016/j.jclinepi.2013.12.002
Di Paola, M. F., & Walther-Thomas, C. (2003). Principals and special education: The critical role of school leaders (COPSSE Document No. IB-7). Gainesville, FL: University of Florida, Center on Personnel Studies in Special Education.
Dignan, K. C. (2012). 2012 Summary of Need for VI Professionals. Austin, TX: Texas School for the Blind and Visually Impaired. Retrieved February 2, 2014 from http://www.tsbvi.edu/attachments/needs-2012.doc
Douglas, G., McLinden, M., McCall, S., Pavey, S., Ware, J., & Farrell, A. M. (2011). Access to print literacy for children and young people with visual impairment: Findings from a review of literature. European Journal of Special Needs Education, 26, 25-38.
Dworett, D., & Bennett, S. (2002). A view from the north: Special education in Canada. Teaching Exceptional Children, 34, 22-27.
Edgar, D. L., & Rosa-Lugo, L. I. (2007). The critical shortage of speech-language pathologists in the public school setting: Features of the work environment that affect recruitment and retention. Language, Speech, and Hearing Services in Schools, 38, 31-46.
Ecken, P., Gnatzy, T., & von der Gracht, H. A. (2011). Desirability bias in foresight: Consequences for decision quality based on Delphi results. Technological Forecasting and Social Change, 78(9), 1654–1670.
doi:10.1016/j.techfore.2011.05.006 Embich, J. L. (2001). The relationship of secondary special education teachers' roles and factors that lead to professional burnout. Teacher Education and Special Education, 24, 58-69. doi: 10.1177/088840640102400109 Erffmeyer, R. C., Erffmeyer, E. S., & Lane, I. M. (1986). The Delphi technique: An empirical evaluation of the optimal number of rounds. Group Organization Management, 11, 120-128. doi: 10.1177/105960118601100110 Erten, O., & Savage, R. S. (2011). Moving forward in inclusive education research. International Journal of Inclusive Education, 16, 221-233. doi: 10.1080/13603111003777496 Ferrell, K. A. (2007). Issues in the field of blindness and low vision. Greeley, CO: National Center on Low-Incidence Disabilities.  Fink, A., Kosecoff, J., Chassin, M., & Brook, R. H. (1984). Consensus methods: characteristics and guidelines for use. American Journal of Public Health, 74, 979-983. doi: 10.2105/AJPH.74.9.979 Fisher, D., & Frey, N. (2001). Access to the core curriculum: Critical ingredients for student success. Remedial and Special education, 22, 148-157. Fielder, C. R., & Van Haren, B. (2009). A comparison of special education administrators' and teachers' knowledge of and application of ethics and professional standards. Journal of Special Education, 43, 160-173. doi: 10.1177/0022466908319395 Fore, C., Martin, C., & Bender, W. N. (2002). Teacher burnout in special education: The causes and the recommended solutions. The High School Journal, 86, 36-44.      192 Foster, S., & Cue, K. (2009). Roles and responsibilities of itinerant specialist teachers of deaf and hard of hearing students. American Annals of the Deaf, 153, 435-449. Furman, G. C. (1988). The work of the special education director: A field study.  Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA. Retrieved December 12, 2013 from http://files.eric.ed.gov/fulltext/ED300943.pdf Garvar, A., & Schmelkin, L. P. (1989). A multidimensional scaling study of administrators' and teachers' perceptions of disabilities. Journal of Special Education, 22, 463-478. doi: 10.1177/002246698902200407 Gersten, R., Keating, T., Yovanoff, P., & Harniss, M. K. (2001). Working in special education: Factors that enhance special educators' intent to stay. Exceptional Children, 67, 549-567. Griffin-Shirley, N., Koenig, A. K., Layton, C. A., Davidson, R. C., Siew, L. K., Edmonds, A. R., & Robinson, M. C. (2004). A survey of teachers of students with visual impairments: Responsibilities, satisfactions, and needs. RE:view, 36, 7-20. Griffin-Shirley, N., & Matlock, D. (2004). Paraprofessionals speak out: A survey. RE: view, 36, 127-136. Griffin-Shirley, N., Pogrund, R., & Grimmet, E. (2011). View of dual-certified vision education professionals across the United States. Insight, 4, 15-21. Hallowell, M. R., & Gambatese, J. A. (2009). Qualitative research: Application of the Delphi method to CEM research. Journal of Construction Engineering and Management, 136, 99-107. doi: 10.1061/ASCECO.1943-7862.0000137 Harley, R. K. (1990). Future directions in training teachers of visually impaired children. Peabody Journal of Education, 67, 135-143. doi: 10.1080/01619569009538686      193 Harley, R. K., Garcia, M., & Williams, M. F. (1989). The educational placement of visually impaired children. Journal of Visual Impairment and Blindness, 83, 512-517.  Hasson, F., & Keeney, S. (2011). Enhancing rigour in the Delphi technique research. 
Technological Forecasting and Social Change, 78, 1695–1704. doi:10.1016/j.techfore.2011.04.005 Hasson, F., Keeney, S., & McKenna, H. (2000). Research guidelines for the Delphi survey technique. Journal of Advanced Nursing, 32, 1008-1015.  Hatlen, P. (1996). The core curriculum for blind and visually impaired students, including those with additional disabilities. RE:view, 28, 25-32. Hatlen, P. (2000). Historical perspectives. In M. C. Holbrook & A. J. Koenig (Eds.), Foundations of Education. Volume 1: History and Theory of Teaching Children and Youths with Visual Impairments (pp. 1-55). 2nd ed. New York, NY: AFB Press. Hatlen, P. (2009). The opportunity to be equal, the right to be different. Austin, TX: Texas School for the Blind and Visually Impaired. Hatton, D. (2001). Model registry of early childhood visual impairment: First-year results. Journal of Visual Impairment and Blindness, 95, 418-433. Hatton, D. D., Ivy, S. E., & Boyer, C. (2013). Severe visual impairments in infants and toddlers in the United States. Journal of Visual Impairment and Blindness, 107, 325-336. Hatton, D. D., Schwietz, E., Boyer, B., & Rychwalski, P. (2007). Babies Count: the national registry for children with visual impairments, birth to 3 years. Journal of American Association for Pediatric Ophthalmology and Strabismus, 11, 351-355. Haynes, N. M. (2003). Addressing students' social and emotional needs: the role of mental health teams in schools. Journal of Health and Social Policy, 16, 109-123.      194 Hazekamp, J., & Huebner, K. M. (1989). Program planning and evaluation for blind and visually impaired students: National guidelines for educational excellence. New York, NY: American Foundation for the Blind. Herzberg, T. S., & Stough, L. M. (2007). The production of brailled instructional materials in Texas public schools. Journal of Visual Impairment and Blindness, 101, 465-478.  Heumann, J. E. (1996). Policy guidance on educating blind and visually impaired students. RE: view, 28, 71-79. Hocutt, A. M. (1996). Effectiveness of special education: Is placement the critical factor?. The Future of Children, 6, 77-102. Holey, E. A., Feeley, J. L., Dixon, J., & Whittaker, V. J. (2007). An exploration of the use of simple statistics to measure consensus and stability in Delphi studies. BMC Medical Research Methodology, 7, 52. doi: 10.1186/1471-2288-7-52 Hsu, C. C., & Sandford, B. A. (2007). The Delphi technique: making sense of consensus. Practical Assessment, Research & Evaluation, 12, 1-8. Hume, D. (2011). Assistive technology use by Kentucky students with visual impairments. (Doctoral Dissertation). Retrieved February 15, 2014 from http://ir.library.louisville.edu/cgi/viewcontent.cgi?article=1650&context=etd Hung, H.-L., Altschuld, J. W., & Lee, Y.-F. (2008). Methodological and conceptual issues confronting a cross-country Delphi study of educational program evaluation. Evaluation and Program Planning, 31, 191–8. doi: 10.1016/j.evalprogplan.2008.02.005 Hutchins, T. L., Howard, M., Prelock, P. A., & Belin, G. (2010). Retention of school-based SLPs: Relationships among caseload size, workload satisfaction, job satisfaction, and best practice. Communication Disorders Quarterly, 31, 139-154.      195 Isaac, J. (2014). Special education administrators' perceptions of responsibilities and challenges (Doctoral dissertation). Retrieved June 27, 2016 from https://poar.twu.edu/bitstream/handle/11274/3654/Issacc2.pdf?sequence=1&isAllowed=y Iselin, S., & Lewis, S. (2002). 
A comparison of the independent living skills of primary students with visual impairments and their sighted peers: A pilot study. Journal of Visual Impairment and Blindness, 96, 335-344. Jackson, T. L. (2003). Caseload/class size in special education. Alexandria, VA: National Association of State Directors of Special Education. Retrieved October 20, 2013 from http://nasdse.org/DesktopModules/DNNspot-Store/ProductFiles/10_d2fad293-9994-4b81-b27b-a67b78a10104.pdf Jan, J. E., Heaven, R. K., Matsuba, C., Langley, M. B., Roman-Lantzy, C., & Anthony, T. L. (2013). Windows into the visual brain: New discoveries about the visual system, its functions, and implications for practitioners. Journal of Visual Impairment and Blindness, 107, 251-261. Johnstone, C., Thurlow, M., Altman, J., Timmons, J., & Kato, K. (2009). Assistive technology approaches for large-scale assessment: Perceptions of teachers of students with visual impairments. Exceptionality, 17, 66-75. doi: 10.1080/09362830902805756 Katz, L. A., Maag, A., Fallon, K. A., Blenkarn, K., & Smith, M. K. (2010). What makes a caseload (un) manageable? School-based speech-language pathologists speak. Language, Speech, and Hearing Services in Schools,41, 139-151. Kavale, K. A., & Forness, S. R. (2000). History, rhetoric, and reality analysis of the inclusion debate. Remedial and Special Education, 21, 279-296.      196 Keeney, S., Hasson, F., & McKenna, H. P. (2001). A critical review of the Delphi technique as a research methodology for nursing. International Journal of Nursing Studies, 38, 195-200. doi: 10.1016/S0020-7489(00)00044-4 Kirchner, C. & Diament, S. (1999). USABLE data report: Estimates of the number of visually impaired students, their teachers, and orientation and mobility specialists: Part 1. Journal of Visual Impairment and Blindness, 93, 600-606. Kluwin, T. N., Morris, C. S., & Clifford, J. (2004). A rapid ethnography of itinerant teachers of the deaf. American Annals of the Deaf, 149, 62-72. Koenig, A. J., & Farrenkopf, C. (1997). Essential experiences to undergird the early development of literacy. Journal of Visual Impairment and Blindness, 91, 14-24. Koenig, A. J., & Holbrook, M. C. (1995). Learning media assessment of students with visual impairments: A resource guide for teachers. Austin, TX: Texas School for the Blind and Visually Impaired. Koenig, A. J., & Holbrook, M. C. (2000a). Ensuring high-quality instruction for students in braille literacy programs. Journal of Visual Impairment and Blindness, 94, 677-694. Koenig, A. J. & Holbrook, M. C. (2000b). Professional practice. In M. C. Holbrook & A. J. Koenig (Eds.), Foundations of Education. Volume 1: History and Theory of Teaching Children and Youths with Visual Impairments (pp. 260-276). 2nd ed. New York, NY: AFB Press. Kornitzer, H. (1947). Problems for Research in the Education of Partially Seeing Children. The Journal of Educational Research, 592-597. Landeta, J. (2006). Current validity of the Delphi method in social sciences. Technological Forecasting and Social Change, 73, 467-482. doi: 10.1016/j.techfore.2005.09.002      197 Larrivee, B., & Cook, L. (1979). Mainstreaming: A study of the variables affecting teacher attitude. The Journal of Special Education, 13, 315-324. Lashley, C., & Boscardin, M. L. (2003). Special education administration at a crossroads. Journal of Special Education Leadership, 16, 63-75. Lashley, C., & Boscardin, M.L. (2003). 
Special education administration at a crossroads: Availability, licensure, and preparation of special education administrators. (COPSSE Document No. IB-8). FL: University of Florida, Center on Personnel Studies in Special Education. Lewis, S. & Allman, C. B. (2000). Educational programming. In M. C. Holbrook & A. J. Koenig (eds.), Foundations of Education. Volume 1: History and Theory of Teaching Children and Youths with Visual Impairments (pp. 219-259). 2nd ed. New York, NY: AFB Press. Lewis, S., & McKenzie, A. R. (2009). Knowledge and skills for teachers of students with visual impairments supervising the work of paraeducators. Journal of Visual Impairment and Blindness, 103, 481-494. Lindsay, G. (2007) Annual review: Educational psychology and the effectiveness of inclusive education/mainstreaming. British Journal of Educational Psychology, 77, 1–24. Lohmeier, K., Blankenship, K., & Hatlen, P. (2009). Expanded core curriculum: 12 years later. Journal of Visual Impairment and Blindness, 103, 103-112. Lowenfeld, B. (1973). The visually handicapped child in school. New York, NY: John Day. Lowenfeld, B. (1983). The education of the blind in public schools. In B. Lowenfeld (ed.). Berthold Lowenfeld on Blindness and Blind People. New York, NY: AFB Press.      198 Luckner, J. L., & Ayantoye, C. (2013). Itinerant teachers of students who are deaf or hard of hearing: Practices and preparation. Journal of Deaf Studies and Deaf Education, 18, 409-423. Ludlow, B. L., Conner, D., & Schechter, J. (2005). Low incidence disabilities and personnel preparation for Rural Areas: Current Status and Future Trends. Rural Special Education Quarterly, 24, 15-24. Lusk, K., & Schwartz, T. L. (2016). Management of Vision Impairment in Children. In E. Traboulsi & V. Utz (eds.). Practical Management of Pediatric Ocular Disorders and Strabismus (pp. 725-733). New York, NY: Springer. MacCuspie, P. A. (2002). Access to literacy instruction for students who are blind or visually impaired: A discussion paper. Toronto, ON: The Canadian National Institute for the Blind. Retrieved October 19, 2014 from http://www.pathstoliteracy.org/sites/pathstoliteracy.perkinsdev1.org/files/uploaded-files/access_to_literacy_instruction_for_students_0.doc Manset, G., & Semmel, M. I. (1997). Are inclusive programs for students with mild disabilities effective? A comparative review of model programs. Journal of Special Education, 31, 155-180. doi: 10.1177/002246699703100201 Mason, C. & Davidson, R. (2000). National plan for training personnel to serve children with blindness and low vision. Alexandria. VA: Council for Exceptional Children. Retrieved November 12, 2014 from http://files.eric.ed.gov/fulltext/ED439549.pdf Mason, C., McNerney, C., & Davidson, R. (2000). Shortages of personnel in the low incidence area of blindness: Working and planning together. Teaching Exceptional Children, 32, 91.       199 McBride, S. (2008). A cross-Canada review of selected issues in special education. Calgary, AB: Alberta Education Special Programs Branch. Retrieved November 18, 2013 from http://education.alberta.ca/media/938183/crosscanadareview_mcbride.pdf McCarty, B., Hazelkorn, M., & Boreson, L. (2003). Caseload concerns of “front line” professionals. Special Education Leadership, 16, 96-103. McCrea, L. D. (1996). A review of literature: Special education and class size. Lansing, MI: Michigan State Board of Education. Retrieved July 21, 2014 from http://files.eric.ed.gov/fulltext/ED407387.pdf McKenzie, A. R., & Lewis, S. (2008). 
The role and training of paraprofessionals who work with students who are visually impaired. Journal of Visual Impairment and Blindness, 102, 459-471. McLeskey, J., Tyler, N. C., & Flippin, S. S. (2004). The supply of and demand for special education teachers: A review of research regarding the chronic shortage of special education teachers. The Journal of Special Education, 38, 5-21. McLinden, M., Douglas, G., Cobb, R., Hewett, R., & Ravenscroft, J. (2016). ‘Access to learning’ and ‘learning to access’: Analysing the distinctive role of specialist teachers of children and young people with vision impairments in facilitating curriculum access through an ecological systems theory. British Journal of Visual Impairment, 34, 177-195. doi: 10.1177/0264619616643180 Mead D.M. & Mosely, L.G. (2001) The use of Delphi as a research approach. Nurse Researcher 8, 4–37.      200 Mervis, C. A., Boyle, C. A., & Yeargin-Allsopp, M. (2002). Prevalence and selected characteristics of childhood vision impairment. Developmental Medicine and Child Neurology, 44, 538-541. doi: 10.1111/j.1469-8749.2002.tb00326.x Michigan Department of Education (2013a). The Michigan Vision Services Severity Rating Scale. Retrieved November 10, 2014 from https://mdelio.org/sites/default/files/documents/BVI/SRS/VSSRS.pdf Michigan Department of Education (2013b). The Michigan Vision Services Severity Rating Scale for Students with Additional Needs. Retrieved November 10, 2014 from https://mdelio.org/sites/default/files/documents/BVI/SRS/VSSRS+.pdf Monk, D. H., Hussain, S. (2000). Structural influences on the internal allocation of school district resources: Evidence from New York State. Educational Evaluation and Policy Analysis, 22, 1-26. doi: 10.3102/01623737022001001 Moore, M. T. (1988). Patterns in special education service delivery and cost. Washington, DC: Office of Special Education and Rehabilitative Services. Retrieved August 19, 2013 from http://files.eric.ed.gov/fulltext/ED303027.pdf Morgan, C. (2003). A brief history of special education. Toronto, ON: Elementary Teachers Federation of Ontario. Retrieved March 28, 2014 from http://www.etfo.ca/SiteCollectionDocuments/Publication%20Documents/Voice%20-%20School%20Year%202002-3/Winter%202003/Brief_History_Special_Ed.pdf Mullen, P. M. (2003). Delphi: myths and reality. Journal of Health Organization and Management, 17, 37-52. Müller, E. (2006). Blindness and visual impairment: State infrastructures and programs. Alexandria, VA: National Association of State Directors of Special Education. Retrieved      201 October 7, 2013 from http://nasdse.org/DesktopModules/DNNspot-Store/ProductFiles/34_cb379544-6f2f-4e06-87cc-ad4107cfd498.pdf Murphy, J. L., Hatton, D., & Erickson, K. A. (2008). Exploring the early literacy practices of teachers of infants, toddlers, and preschoolers with visual impairments. Journal of Visual Impairment and Blindness, 102, 133. Okoli, C., & Pawlowski, S. D. (2004). The Delphi method as a research tool: An example, design considerations, and applications. Information and Management, 42, 15-29. doi: 10.1016/j.im.2003.11.002 National Coalition for Vision Health (2003). Canadian national standards for the education of children and youth who are blind or visually impaired, including those with additional disabilities. Toronto, ON: Author. Retrieved November 11, 2013 from http://www.apsea.ca/MainPage/Cdn_Nat_Stands.pdf Nichols, A. S., & Sosnowsky, F. L. (2002). Burnout among special education teachers in self-contained cross-categorical classrooms. 
Teacher Education and Special Education, 25, 71-86.  Odom, S. L., & Diamond, K. E. (1998). Inclusion of young children with special needs in early childhood education: The research base. Early Childhood Research Quarterly, 13, 3-25. Olmstead, J. E. (1995). Itinerant personnel: A survey of caseloads and working conditions. Journal of Visual Impairment and Blindness, 89, 546-548. Olmstead, J. E. (2005). Itinerant teaching: Tricks of the trade for teachers of students with visual impairments. New York, NY: American Foundation for the Blind.      202 Osborne, J., Collins, S., Ratcliffe, M., Millar, R., & Duschl, R. (2003). What “ideas-about-science” should be taught in school science? A Delphi study of the expert community. Journal of Research in Science Teaching, 40, 692-720. Palmer, D. J., Stough, L. M., Burdenski, Jr., T. K., & Gonzales, M. (2005). Identifying teacher expertise: An examination of researchers’ decision making. Educational Psychologist, 40, 13–25. doi:10.1207/s15326985ep4001_2 Parker, A T., & Nelson, C. N. (2016). Toward a comprehensive system of personnel development in deafblind education. American Annals of the Deaf, 161, 486-501. doi: 10.1353/aad.2016.0040 Pérez-Pereira, M., & Castro, J. (1997). Language acquisition and the compensation of visual deficit: New comparative data on a controversial topic. British Journal of Developmental Psychology, 15, 439-459. doi: 10.1111/j.2044-835X.1997.tb00740.x Pogrund, R. L., Darst, S., & Boland, T. (2013). Evaluation study of short-term programs at a residential school for students who are blind and visually impaired. Journal of Visual Impairment & Blindness, 107, 30-42. Pogrund, R. L., & Wibbenmeyer, K. A. (2008). Interpreting the meaning of the terms certified and highly qualified for teachers of students with visual impairments. Journal of Visual Impairment and Blindness, 102, 5-15. Pollard, C., & Pollard, R. (2004). Research priorities in educational technology: A Delphi study. Journal of Research on Technology in Education, 37, 145-160. Porter, A., McMaken, J., Hwang, J., & Yang, R. (2011). Common core standards: The new U.S. intended curriculum. Educational Researcher, 40, 103-116. doi: 10.3102/0013189X11405038      203 Powell, C. (2003). The Delphi technique: myths and realities. Journal of Advanced Nursing, 41, 376-382.  Praisner, C. L. (2003). Attitudes of elementary school principals toward the inclusion of students with disabilities. Exceptional Children, 69, 135-145. doi: 10.1177/001440290306900201 Project FORUM (2000). Special education issues in caseload/class size. Alexandria, VA: National Association of State Directors of Special Education. Available at www.nasdse.org/forum.htm  Pugh, G. S., & Erin, J. (Eds.). (1999). Blind and Visually Impaired Students: Educational Service Guidelines. Watertown, MA: Perkins School for the Blind.  Rahi, J. S., & Cable, N. (2003). Severe visual impairment and blindness in the UK. The Lancet, 362, 1359-1365. doi: 10.1016/S0140-6736(03)14631-4 Ray, P. K., & Sahu, S. (1990). Productivity management in India: a Delphi study. International Journal of Operations & Production Management, 10, 25-51.  Rayens, M. K., & Hahn, E. J. (2000). Building consensus using the policy Delphi method. Policy, Politics, & Nursing Practice, 1, 308-315. doi: 10.1177/152715440000100409 Rice, C. (2009). Prevalence of Autism Spectrum Disorders: Autism and Developmental Disabilities Monitoring Network, United States, 2006. Morbidity and Mortality Weekly Report. Surveillance Summaries. Volume 58, Number SS-10. 
Centers for Disease Control and Prevention. Riggio, M. & McLetchie, B. (Eds.) (2008). Deafblindness: Educational Service Guidelines. Watertown, MA: Perkins School for the Blind. Rosenblum, L. P. (2000). Perceptions of the impact of visual impairment on the lives of adolescents. Journal of Visual Impairment and Blindness, 94, 434-445.      204 Rowe, G., & Wright, G. (2001). Expert opinions in forecasting: the role of the Delphi technique. In J.S. Armstrong (Ed.). Principles of forecasting (pp. 125-144). Springer: New York NY. Rude, H., Jackson, L., Correa-Torres, S., Luckner, J., Muir, S., & Ferrell, K. A. (2005). Perceived needs of students with low-incidence disabilities in rural areas. Rural Special Education Quarterly, 24, 3-14. Rude, H. A., & Sasso, G. M. (1988). Colorado special education administrative competencies. Teacher Education and Special Education, 11, 139-143. Ruppar, A. L., Allcock, H., & Gonsier-Gerdin, J. (2016). Ecological factors affecting access to general education content and contexts for students with significant disabilities. Remedial and Special Education, 0741932516646856. Russ, S., Chiang, B., Rylance, B. J., & Bongers, J. (2001). Caseload in special education: An integration of research findings. Exceptional Children, 67, 161-172. Sapp, W., Blades, A., & Cernkovich, J. (2013). Caseloads based on students’ assessed needs (Position paper). Retrieved July 12, 2014 from https://aerbvi.org/wp-content/uploads/2016/01/IP-Div-16-Position-Paper-2.doc Sapp, W., & Hatlen, P. (2010). The expanded core curriculum: Where we have been, where we are going, and how we can get there. Journal of Visual Impairment and Blindness, 104, 338-348. Schmidt, R. C. (1997). Managing Delphi surveys using nonparametric statistical techniques. Decision Sciences, 28, 763-774. Scholl, G. T. (1968). The principal works with the visually impaired. Council for Exceptional Children, Washington, DC. Retrieved December 12, 2014 from http://files.eric.ed.gov/fulltext/ED025058.pdf      205 Seitz, J.A. (1994). Seeing through the isolation: A study of first-year teachers of the visually impaired. Journal of Visual Impairment and Blindness, 88, 299–309 Sharma, U., Forlin, C., Loreman, T., & Earle, C. (2006). Pre-service teachers' attitudes, concerns, and sentiments about inclusive education: An international comparison of novice pre-service teachers. International Journal of Special Education, 21, 80-93. Skulmoski, G., Hartman, F., & Krahn, J. (2007). The Delphi method for graduate research. Journal of Information Technology Education, 6. Retrieved October 12, 2014 from http://www.editlib.org/p/111405 Sebald, A. M., Jackson, L. L., Pearson, B., Birjulin, A. (n.d.). Colorado Department of Education Special Education AU Survey: Year Three. Greeley, CO: National Center on Low-Incidence Disabilities. Silberman, R. K., Ambrose-Zaken, G., Corn, A. L., & Trief, E. (2004). Profile of personnel preparation programs in visual impairment: A status report. Journal of Visual Impairment and Blindness, 98, 741-765. Singer, J. D. (1992). Are special educators’ career paths special? Results from a 13-year longitudinal study. Exceptional Children, 59, 262–279. Spungin, S. J., & Ferrell, K. (2007). The role and function of the teacher of students with visual impairments (Position paper). Alexandria, VA: Division on Visual Impairments, Council for Exceptional Children. Stanovich, K. E., West, R. F., & Toplak, M. E. (2013). Myside bias, rational thinking, and intelligence. 
Current Directions in Psychological Science, 22, 259-264. doi: 10.1177/0963721413480174
Statistics Canada (2006). Participation and activity limitation survey of 2006: A profile of education for children with disabilities in Canada. Ottawa, ON: Author. Retrieved March 21, 2013 from http://www.statcan.gc.ca/pub/89-628-x/89-0628-x2008004-eng.htm
Stempien, L. R., & Loeb, R. C. (2002). Differences in job satisfaction between general education and special education teachers: Implications for retention. Remedial and Special Education, 23, 258-267. doi: 10.1177/07419325020230050101
Stephens, T. L., & Fish, W. W. (2010). Motivational factors toward pursuing a career in special education. Education, 130(4), 581-594.
Stough, L. M., & Palmer, D. J. (2003). Special thinking in special settings: A qualitative study of expert special educators. Journal of Special Education, 36, 222. doi: 10.1177/002246690303600402
Smith, A., Geruschat, D., & Huebner, K. M. (2004). Policy to practice: Teachers’ and administrators’ views on curricular access by students with low vision. Journal of Visual Impairment and Blindness, 98, 612-628.
Smith, D. W., Kelley, P., Maushak, N. J., Griffin-Shirley, N., & Lan, W. Y. (2009). Assistive technology competencies for teachers of students with visual impairments. Journal of Visual Impairment and Blindness, 103, 457-469.
Sumbera, M. J., Pazey, B. L., & Lashley, C. (2014). How building principals made sense of Free and Appropriate Public Education in the Least Restrictive Environment. Leadership and Policy in Schools, 13, 297–333. doi:10.1080/15700763.2014.922995
Suter, J. C., & Giangreco, M. F. (2009). Numbers that count: Exploring special education and paraprofessional service delivery in inclusion-oriented schools. The Journal of Special Education, 43, 81-93.
Suvak, P. A. (1999). What do they really do? Activities of teachers of students with visual impairments. RE:view, 36, 22-31.
Thompson, J. R., & O’Brian, M. (2007). Many hats and a delicate balance: The lives and times of today’s special education directors. Journal of Special Education Leadership, 20, 33-43.
Toelle, N. M., & Blankenship, K. E. (2008). Program accountability for students who are visually impaired. Journal of Visual Impairment & Blindness, 102, 97-102.
Topor, I. L., Holbrook, M. C., & Koenig, A. J. (2000). Creating and nurturing effective education teams. In A. J. Koenig & M. C. Holbrook (Eds.), Foundations of education. Volume 2: Instructional strategies for teaching children and youths with visual impairments (2nd ed., pp. 3-26). New York: AFB Press.
Tyler, T. A., & Brunner, C. C. (2014). The case for increasing workplace decision-making: Proposing a model for special educator attrition research. Teacher Education and Special Education. doi: 10.1177/0888406414527118
U.S. Department of Education (2008). 30th Annual Report to Congress on the Implementation of the Individuals with Disabilities Education Act, 2008. Washington, DC. Retrieved March 10, 2013 from http://www2.ed.gov/about/reports/annual/osep/2008/parts-b-c/index.html
U.S. Department of Education. (2012). The condition of education 2012, NCES 2012045. Washington, DC: U.S. Government Printing Office.
U.S. Department of Education (2014).
36th Annual Report to Congress on the Implementation of the Individuals with Disabilities Education Act, 2014. Washington, DC. Retrieved May 17, 2016 from http://www2.ed.gov/about/reports/annual/osep/2014/parts-b-c/36th-idea-arc.pdf
Vannest, K. J., Hagan-Burke, S., Parker, R. I., & Soares, D. A. (2011). Special education teacher time use in four types of programs. The Journal of Educational Research, 104, 219-230. doi: 10.1080/00220671003709898
Vaughn, S., & Swanson, E. A. (2015). Special education research advances knowledge in education. Exceptional Children, 82, 11-24.
VISSIT. (2014). Visual Impairment Scale of Service Intensity of Texas. Retrieved June 15, 2015 from http://www.tsbvi.edu/vissit
Vittek, J. E., Floyd, K. K., & Hayes, S. B. (2013). Stakeholders’ perceptions of special education induction programs. Journal of Research Initiatives, 1, 4. Retrieved October 18, 2014 from http://digitalcommons.uncfsu.edu/jri/vol1/iss1/4
Voltz, D. L., & Collins, L. (2010). Preparing special education administrators for diverse, standards-based contexts: Beyond the Council for Exceptional Children and the Interstate School Leaders Licensure Consortium. Teacher Education and Special Education, 33, 70-82. doi: 10.1177/0888406409356676
Von der Gracht, H. A. (2012). Consensus measurement in Delphi studies. Technological Forecasting and Social Change, 79, 1525–1536. doi:10.1016/j.techfore.2012.04.013
Wall, R. (2002). Teachers’ exposure to people with visual impairments and the effect on attitudes toward inclusion. RE:view, 34, 111-119.
Wall, R., & Corn, A. L. (2002). Production of textbooks and instructional materials in the United States. Journal of Visual Impairment and Blindness, 96, 212-222.
Wall, R., & Corn, A. L. (2004). Students with visual impairments in Texas: Description and extrapolation of data. Journal of Visual Impairment and Blindness, 98, 341-350.
Wall Emerson, R. S., & Corn, A. L. (2006). Orientation and mobility content for children and youths: A Delphi approach pilot study. Journal of Visual Impairment and Blindness, 100, 331-342.
Wallace, H. M., Wrightstone, J. W., & Gall, E. (1954). Special classes for handicapped children. American Journal of Public Health and the Nations’ Health, 44, 1045-1058.
Whitburn, B. (2013). The dissection of paraprofessional support in inclusive education: ‘You’re in mainstream with a chaperone’. Australasian Journal of Special Education, 37, 147-161.
Wigle, S. B., & Wilcox, D. J. (2002). Special education directors and their competencies on CEC-identified skills. Education, 123, 276-288.
Wolffe, K. E., Sacks, S. Z., Corn, A. L., Erin, J. N., Huebner, K. M., & Lewis, S. (2002). Teachers of students with visual impairments: What are they teaching? Journal of Visual Impairment and Blindness, 96, 293-304.
Woltmann, J., & Camron, S. C. (2009). Use of workload analysis for caseload establishment in the recruitment and retention of school-based speech-language pathologists. Journal of Disability Policy Studies, 20, 178-183.
Worrell, J. L., Di Gangi, P. M., & Bush, A. A. (2013). Exploring the use of the Delphi method in accounting information systems research. International Journal of Accounting Information Systems, 14, 193-208. doi: 10.1016/j.accinf.2012.03.003
Zabel, R. H., & Zabel, M. K. (2001). Revisiting burnout among special education teachers: Do age, experience, and preparation still matter? Teacher Education and Special Education, 24(2), 128-139. doi: 10.1177/088840640102400207
Zaretsky, L., Moreau, L., & Faircloth, S.
(2008). Voices from the field: School leadership in special education. Alberta Journal of Educational Research, 54, 161-177.
Zambone, A. M., & Allman, C. (1988). Accessing services: State procedures for identifying and determining eligibility of young children with visual impairments. Topics in Early Childhood Special Education, 8, 75-85.
Zarghami, F., & Schnellert, G. (2004). Class size reduction: No silver bullet for special education students’ achievement. International Journal of Special Education, 19, 89-96.
Zigmond, N. (2003). Where should students with disabilities receive special education services? Is one place better than another? The Journal of Special Education, 37, 193-199. doi: 10.1177/00224669030370030901
Zigmond, N., & Baker, J. M. (1995). Concluding comments: Current and future practices in inclusive schooling. The Journal of Special Education, 29, 245-250. doi: 10.1177/002246699502900215
Zuvela, B. (2009). Educational services for children with vision loss in Canada. Calgary, AB: Alberta Education. Retrieved March 17, 2013 from http://visionalberta.ca/media/78649/educational%20services%20for%20students%20with%20vision%20loss%20in%20canada.doc

APPENDICES

Appendix A
Consent Form

Special Education Administrators and Workload Determination for Teachers of Students with Visual Impairment: A Delphi Study

Principal Investigator: Dr. Cay Holbrook
Department of Educational and Counseling Psychology, and Special Education
Faculty of Education
University of British Columbia

Co-Investigator: Adam Wilton
PhD Candidate
Department of Educational and Counseling Psychology, and Special Education
Faculty of Education
University of British Columbia

This research project is being conducted in fulfillment of the co-investigator's doctoral program.

Study Purpose
The purpose of the current study is to identify the administrative factors that impact workload determination for itinerant teachers of students with visual impairment (TSVIs). The issue of unmanageable caseloads for TSVIs has been well documented in peer-reviewed articles and professional writing in the field of education for students with visual impairment in the United States and Canada. Among the challenges associated with itinerant service delivery, unmanageable workload has often been cited by TSVIs, yet the administrative mechanisms that result in unmanageable workload are poorly understood. The current study asks experts to identify the factors (e.g., legislation, personnel shortage) that impact workload determination for TSVIs. A nomination panel was struck in late 2015 and you were identified as an expert in special education administration and visual impairment. Expert participants fall into one of the following three groups:

Special education administrators working in a local education authority (LEA) in the United States or Canada with three years or more experience in their current role, with at least five years previous experience as a teacher of students with visual impairment.

Administrators working at a state/provincial resource centre for students with visual impairment in the United States or Canada. Administrators must have three years or more experience in their current role, with at least five years previous experience as a teacher of students with visual impairment. This group includes administrators of outreach programs at state/provincial specialized schools for students with visual impairment.
212  Recognized expert in the area of service delivery for students with visual impairment in the United States or Canada, as identified through his or her record of scholarly or professional publications.  Study Procedures The current study seeks to capitalize on experts' knowledge and experience through the Delphi approach. The Delphi approach seeks to build consensus among a panel of experts through an iterative multistage process. Through an online survey interface (i.e., FluidSurveys.com), each participant will rate various factors according to his or her perceptions of the importance of that factor to the process of determining workloads for itinerant teachers of students with visual impairment. After each survey round, each participant will be provided with feedback in the form of aggregated results from the previous round, so that he/she can evaluate his/her perceptions against those of the group. Survey items that meet predetermined stability and convergence criteria will be not appear in the next iteration of the survey. Iterations will continue until no survey items meet these criteria. Following the last survey round, a concluding survey will be sent to participants. This survey will summarize the results of the Delphi survey process, and will ask participants to rate the relative importance of each survey item that met consensus/stability criteria.  Based on previous Delphi research, this study will be expected to run for approximately four rounds. There will be a feedback round following the end of the Delphi survey rounds. Participants will be given one month to complete each survey. With each survey expected to take between 45-60 minutes to complete, the anticipated time commitment will be a maximum of six hours over a span of five months. However, this may increase to five hours if a fifth round is required. It is not possible to predict the need for subsequent rounds at this time.  Study Results The results of the project will be made publicly available via https://circle.ubc.ca/ upon successful defense and submission of the co-investigator's dissertation. The results of the study will also be the topic of presentations given by the researchers at professional conferences. Dissemination via manuscripts submitted to peer-reviewed journals and professional publications is also anticipated.  Confidentiality Your confidentiality will be respected at every stage of this study. Your identity will remain hidden from other participants throughout the course of the study. The principal and co-investigator will be aware of the identity of each study participant, as direct communication between individual participants and the researcher is essential to the Delphi approach. Individual members of the nomination panel will be aware of the identities of the individuals they have nominated, but will not be made aware of whether or not those individuals are participants in the current study. The results of the study will be reported in aggregate, with no individual participants' results singled out.  Quotes from qualitative data entry fields may be included in the dissemination of the results of the study, but if any identifying information is included in the quote, it will be removed and      213 replaced with a generic label (e.g., "Idaho" – "State"). Survey questions will be limited to your professional experience and will not ask for any personal information. 
Questions regarding your profile as a professional (e.g., state/province/territory of residence, years of experience, credentials) are intended to characterize the overall sample and not to identify individual participants.  Potential Benefits of Participation One of the key advantages of the Delphi approach is that it allows individual expert participants to evaluate his or her perspective against aggregated data from the entire group of expert participants. This affords the expert an opportunity to thoughtfully critique his or her own assumptions/beliefs/ideas regarding workload determination for TSVIs.  More generally, participation in this research has the potential to benefit the field of education for students with visual impairment, as any subsequent professional guidelines resulting from this study will assist administrators without a background in visual impairment in making better informed decisions regarding TSVIs workloads in inclusive settings.  Recognition for Participation In order to recognize participants for their time and expertise, all participants will receive a $25 gift certificate to Amazon.com/.ca, depending on their location. Upon completion of the final survey round, participants will be asked to confirm their email address for receipt of the gift certificate via email. Gift cards sent via mail will also be available.  Contact for Information Regarding the Study Should you require any additional information, or to address any concerns prior to making an informed decision regarding consent to participate, you may contact Adam Wilton (co-investigator via email.  Contact for Complaints/Concerns about the Study If you have any concerns or complaints about your rights as a research participant and/or your experiences while participating in this study, contact the Research Participant Complaint Line in the UBC Office of Research Ethics.  As this is study is conducted entirely in an online environment, your signature is not required. By clicking "I AGREE" it will be assumed that you have read the information outlined above and consent to participate in the current study. By clicking "I DO NOT AGREE," it will be assumed that you do not consent to participate. You will be redirected away from the online survey, and will receive no further communication from the researchers.          214 Appendix B Round One Survey Email to Potential Panelists and Round One Survey Dear [Potential Participant],  My name is Adam Wilton, and I am a PhD candidate in Special Education at the University of British Columbia working under the supervision of Dr. Cay Holbrook. I am in the process of recruiting participants for my dissertation project, titled Special Education Administrators and Workload Determination for Teachers of Students with Visual Impairments: A Delphi Study. Late in 2015, I convened a nomination panel with expertise in the area of service delivery for students with visual impairment. I asked this panel to nominate individuals who are leaders in the field of visual impairment and educational administration, and who could expertly comment on issues related to special education administration and educational programming for students with visual impairment. I'm writing today because you were nominated by this panel, and I am asking if you would participate in the study described below. 
I am conducting a study using the Delphi approach to develop an expert-driven account of the key factors that administrators should consider when determining workloads for itinerant teachers of students with visual impairments. Ultimately, I hope to develop a series of guidelines for administrators who do not have any background in visual impairment, so that these administrators can make informed decisions regarding itinerant teacher workload. The Delphi approach seeks to build consensus among a panel of experts through an iterative multistage process. Through an online survey interface, each participant will rate various factors according to his or her perceptions of the importance of that factor to the process of determining workloads for itinerant teachers of students with visual impairment. After each survey round, each participant will be provided with feedback in the form of aggregated results from the previous round, so that he/she can evaluate his/her perceptions against those of the group. Survey items that meet predetermined stability and convergence criteria will be not appear in the next iteration of the survey. Survey rounds continue until no survey items meet these criteria. Based on previous Delphi research, this study will be expected to run for approximately four rounds. Participants will be given one month to complete each survey. With each survey expected to take between 45-60 minutes to complete, the anticipated time commitment will be six hours over a span of five months. It is not possible to predict the need for subsequent rounds at this time, however a minimum of four rounds is anticipated.  Thank you very much for your time and consideration. Please follow the hyperlink below if you are interested in participating in this study. The hyperlink directs to an online survey where you will be asked to consent to participate in the study. Basic information regarding your professional profile will also be collected to confirm your eligibility to participate. If you are interested in participating, please complete the online consent form and complete the round one survey document by February 25, 2016. Please feel free to contact me with any questions.   [Invite Link]       215  One last note: This link is uniquely tied to this survey and your email address. Please do not forward this message.  Best regards,  Adam Wilton PhD Candidate Department of Educational and Counseling Psychology, and Special Education Faculty of Education, The University of British Columbia  Round One Survey Tool       216       217      218        219        220         221        222        223        224        225       226             227        228        229        230        231        232        233        234        235        236        237        238        239          240        241        242        243       244        245       246            247  Appendix C Round Two Survey Email to Panelists and Round Two Survey Tool Dear [Panelist],  I hope this message finds you well. Thank you very much for completing the Round One survey of my dissertation study: Special Education Administrators and Workload Determination for Teachers of Students with Visual Impairments: A Delphi Study. It was very exciting to work through the data from Round One in preparation for Round Two.  Round Two is the most comprehensive survey tool of this study. 
The Round Two survey tool contains all personnel, policy, and educational programming factors from the previous round, paired with results from Round One (i.e., aggregated ratings and rationales). Round Two also contains factors that were nominated by participants. After Round Two, factors that meet consensus and stability criteria can be eliminated and will not appear in Round Three.  I received some feedback on the Round One survey tool. I have made some updates in Round Two in response to this feedback – you will find a listing on the first page of the Round Two survey.  [Invite Link]  I am fortunate to have tapped into such a diverse and well-informed pool of experts. I am very appreciative of your time and expertise. Thank you very much for your contribution to this study, and to the knowledge base surrounding the issue of teacher workloads.  Please be in touch if I can be of any assistance or provide any additional information.  Best regards,  Adam Wilton, PhD Candidate Department of Educational and Counseling Psychology, and Special Education Faculty of Education - University of British Columbia        248 Round Two Survey Tool          249        250         251         252        253        254        255        256         257        258        259       260       261       262       263       264       265       266       267       268       269       270       271         272        273        274        275        276        277       278        279       280       281        282        283        284       285       286       287        288        289        290        291        292        293          294 Appendix D Round Three Survey Email to Panelists and Round Three Survey Tool Dear [Panelist],  I hope this message finds you well. Thank you very much for completing the Round Two survey of my dissertation study: Special Education Administrators and Workload Determination for Teachers of Students with Visual Impairments: A Delphi Study. It was very exciting to see the data evolve as we move from Round Two to Round Three. The Round Three survey has approximately half (54%) of the number of items as the previous round. By applying the level of agreement, consensus, and stability criteria to the Round Two data, 30 total items were removed. A breakdown of these removed items can be found on the first page of the Round Three survey. I received some feedback on the Round Two survey tool and have made some adjustments to the Round Three tool. These include a larger default font setting as well as greater contrast between the various elements of each individual survey item. The link to the Round Three survey can be found below:  [Invite Link]  As the Round Three survey is considerably shorter than Round Two, I would ask that you complete Round Three by Monday, June 27, 2016. Just a reminder – typed qualitative responses are optional. Please let me know via email if you will not be able to complete the Round Three survey by this date.  Thank you very much for your time and expertise. We are rapidly approaching the end of the Delphi process.  
Best regards,  Adam Wilton, PhD Candidate Department of Educational and Counseling Psychology, and Special Education Faculty of Education - University of British Columbia            295 Round Three Survey Tool       296          297              298                299       300        301       302       303             304         305        306       307       308       309       310       311       312       313       314       315       316       317            318         319       320       321       322       323       324       325       326       327       328       329       330       331       332         333  Appendix E Round Four Survey Email to Panelists and Round Four Survey Tool Dear [Panelist],  Thank you very much for your participation as an expert panelist in my dissertation study -  Special Education Administrators and Workload Determination for Teachers of Students with Visual Impairments: A Delphi Study. I appreciate your time in completing the Round Three survey. Following Round Three, there were three items that were not stable. There was a statistically significant difference between ratings in Round Two and Round Three across panelists for these items. As a result, the study was not able to arrive at a conclusive final set of confirmed factors following Round Three. I am in touch today to ask you to complete this very short Round Four survey.   [Invite Link]  Since this survey has only three items, it should take less than 5 minutes to complete. I would ask you to complete the Round Four survey by Monday, November 27, 2016.  Thank you very much for your time and expertise. This has been a very interesting and enlightening process, and I appreciate your support. I would like to send you a small token of my appreciation by way of an Amazon gift card. If you have an email address associated with an Amazon account other than the email contact I have for you, please enter it on page one of the survey. Otherwise, I will send the gift card electronically to the email contact I have.  All my best, and sincerest thanks.  Adam Wilton, PhD Candidate Department of Educational and Counseling Psychology, and Special Education Faculty of Education - University of British Columbia            334 Round Four Survey Tool            335           336                       

Cite

Citation Scheme:

        

Citations by CSL (citeproc-js)

Usage Statistics

Share

Embed

Customize your widget with the following options, then copy and paste the code below into the HTML of your page to embed this item in your website.
                        
                            <div id="ubcOpenCollectionsWidgetDisplay">
                            <script id="ubcOpenCollectionsWidget"
                            src="{[{embed.src}]}"
                            data-item="{[{embed.item}]}"
                            data-collection="{[{embed.collection}]}"
                            data-metadata="{[{embed.showMetadata}]}"
                            data-width="{[{embed.width}]}"
                            data-media="{[{embed.selectedMedia}]}"
                            async >
                            </script>
                            </div>
                        
                    
IIIF logo Our image viewer uses the IIIF 2.0 standard. To load this item in other compatible viewers, use this url:
https://iiif.library.ubc.ca/presentation/dsp.24.1-0343976/manifest

Comment

Related Items