Open Collections: UBC Theses and Dissertations

The Role of School Personnel Perceptions in Predicting Sustained Implementation of School-Wide Positive Behavior Support. Mathews, Susanna, 2011.

THE ROLE OF SCHOOL PERSONNEL PERCEPTIONS IN PREDICTING SUSTAINED IMPLEMENTATION OF SCHOOL-WIDE POSITIVE BEHAVIOR SUPPORT

by

Susanna Mathews

B.A., The University of British Columbia, 2008

A THESIS SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS in THE FACULTY OF GRADUATE STUDIES (School Psychology)

THE UNIVERSITY OF BRITISH COLUMBIA (Vancouver)

August 2011

© Susanna Mathews, 2011

Abstract

The current study explored the extent to which implementation of critical features of School-wide Positive Behavior Support (SWPBS) predicted sustained implementation three years later. The respondents included school personnel from 261 schools across the United States who had been implementing SWPBS for at least three years. The PBIS Self-Assessment Survey (PBIS SAS) was administered to assess whether self-reported ratings of school personnel regarding fidelity of implementation in different SWPBS settings (school-wide, non-classroom, classroom, individual) predicted the fidelity of SWPBS practices in schools three years later, as measured through an independent evaluation using the Benchmarks of Quality (BoQ). Regression analyses indicated that self-reported fidelity of implementation of Classroom Systems significantly predicted sustained implementation. Within Classroom Systems, regular acknowledgement of expected behaviors, matching instruction and curriculum materials to student ability, and access to additional support were the strongest predictors of sustained implementation. Results are discussed in terms of critical factors for focusing SWPBS training and professional development to increase the likelihood of sustained implementation.

Preface

This study was conducted solely by the graduate student under the advisement of her research supervisor, Dr. Kent McIntosh. The graduate student was primarily responsible for design, data preparation, and analysis of an extant dataset.
Therefore, this thesis is representative of her work as the primary researcher and lead author. Data were extracted from an extant database in the Positive Behavioral Interventions and Supports Technical Assistance Center in the Educational and Community Supports area in the College of Education at the University of Oregon, from schools that had signed agreements with the center to allow their data to be available for research purposes. Ethics approval was obtained from the UBC Behavioral Research Ethics Board (BREB) before conducting this research. The UBC BREB certificate number is H08-01010.

Table of Contents

Abstract
Preface
Table of Contents
List of Tables
Acknowledgements
Chapter 1: Introduction
    1.1 School-Wide Positive Behavior Support
    1.2 Fidelity of Implementation as a Critical Mechanism Leading to Sustainability
    1.3 Implementer-level Variables Related to Fidelity and Sustainability
    1.4 Purpose of the Study
Chapter 2: Method
    2.1 Participants
    2.2 Measures
    2.3 Procedure
    2.4 Analyses
Chapter 3: Results
    3.1 Prior Implementation in SWPBS Systems Predicting Sustained Implementation
Chapter 4: Discussion
    4.1 Implementer-level Variables that Predicted Sustained SWPBS Implementation
    4.2 Limitations
    4.3 Future Research
    4.4 Implications for Practice
References
List of Tables

Table 2.1  School Demographic Information
Table 3.1  Descriptive Statistics for BoQ Total Score and PBIS SAS Subscale Scores
Table 3.2  Correlations between Study Variables
Table 3.3  Prediction of Sustained SWPBS Implementation in 2009-10 from Prior Implementation (All SWPBS Systems) in 2006-07
Table 3.4  Descriptive Statistics for PBIS SAS Prior Implementation (Classroom Systems) Critical Feature Scores in 2006-07
Table 3.5  Correlations between Classroom Systems Critical Features and Sustained Implementation
Table 3.6  Prediction of Sustained SWPBS Implementation in 2009-10 from Prior Implementation (Classroom Systems) Critical Features in 2006-07

Acknowledgements

I am extremely grateful to my research supervisor, Dr. Kent McIntosh, for his unwavering support, guidance, and confidence in me throughout my Master's program and with this project. I would also like to express my thanks to my committee members, Dr. Ruth Ervin, Dr. Cay Holbrook, and Dr. Sterett Mercer, for their invaluable feedback. In addition, Dr. Kent McIntosh, Dr. William McKee, and Dr. Laurie Ford have also been inspiring role models who have shaped both my academic and professional career. Of course, I would be nowhere without the support of my amazingly wonderful family and friends, who have brought richness and balance to my life in the form of unconditional love, joy, and understanding. Lastly, I am thankful to the employees in the many coffee shops who allowed me to use them as a second home and treated me to many delicious delights while completing this project.

Chapter 1: Introduction

Today's schools serve a multicultural and multilingual student population with a diverse set of skills, strengths, and challenges.
School personnel have prime opportunities to implement and sustain effective practices that promote the academic, social-emotional, and behavioral success of their students, as students spend more than 14,000 hours in the school environment from Kindergarten to Grade 12. However, the prevalence of moderate to severe violence in schools, coupled with a rising number of students suspended or expelled, suggests that systems of punitive consequences (e.g., detentions, suspensions) and zero tolerance policies do little to foster positive outcomes for students (Skiba & Peterson, 2000).

1.1  School-Wide Positive Behavior Support

As an alternative to exclusionary measures of discipline, School-Wide Positive Behavior Support (SWPBS; Sugai & Horner, 2009) is a preventive approach featuring research-validated strategies derived from principles of applied behavior analysis and person-centered planning to support social and learning outcomes, improve overall school climate, and minimize problem behavior. The SWPBS model includes three foundational aspects. First, effective research-based practices, data, and systems (e.g., reinforcement, discipline) are integrated to achieve valued student outcomes, support staff efforts, guide decision-making, and support student behavior. Second, SWPBS is implemented across the many settings in a school, such as the school-wide setting (i.e., applies to all students in all settings), non-classroom settings (e.g., applies to cafeteria, hallways), classroom settings, and individual student settings (i.e., applies to supporting individual students based on their specific needs). Third, when compared to narrowly designed programs that target a specific population or focus on specific aspects of behavior (e.g., bullying), SWPBS offers a continuum of support provided for all students across all school settings, matched to the level of support needed.
Sugai and Horner (2002) emphasized the importance of implementing SWPBS within a three-tiered model. The universal level exposes all students to common behavioral instruction and aims to prevent the development of chronic problem behavior in students (McIntosh, Filter, Bennett, Ryan, & Sugai, 2010). The secondary and tertiary levels provide increasingly intensive additional support to match the intensity of student needs if their behaviors are generally not responsive to universal instruction.

Schools are complex “host environments” that are shaped by different social, political, economic, pedagogical, legal, cultural, demographic, and historical forces (Kame’enui, Simmons, & Coyne, 2000). Consequently, SWPBS practices respond to the unique needs of a school to facilitate its effectiveness as a host environment, so that teachers can implement and sustain effective evidence-based practices (McIntosh et al., 2010).

Typical implementation of SWPBS includes addressing the school-wide, non-classroom, classroom, and individual settings (Sugai, Horner, & Todd, 2000). All school personnel are involved in establishing the school-wide framework of behavior support, which is then incorporated into classroom and non-classroom settings. However, although the other settings are an extension of school-wide settings, implementation is often dependent on individual school personnel (e.g., classroom teachers, bus drivers, cafeteria supervisors), whose regular interactions with students are intended to be consistent with school-wide expectations. Individual supports are then layered onto each setting for students requiring additional support across the school. Thus, establishing a school-wide framework of support provides a common foundation upon which the other settings can be established (Lewis & Sugai, 1999).
SWPBS has been adopted by over 14,000 schools in the last 15 years (Center on Positive Behavioral Interventions and Supports, 2011) and has been associated with valued outcomes such as (a) decreases in office discipline referrals, suspensions, and expulsions, (b) increases in classroom instructional time, positive student-teacher interactions, academic achievement, and social competency, and (c) improvement in social climate (e.g., Algozzine & Algozzine, 2007; Bohanon et al., 2006; Bradshaw, Mitchell, & Leaf, 2010; Luiselli, Putnam, Handler, & Feinberg, 2005; Muscott, Mann, & LeBrun, 2008; Nelson, 1996; Scott & Barrett, 2004).

1.2  Fidelity of Implementation as a Critical Mechanism Leading to Sustainability

Han and Weiss (2005) defined sustainability as the continued implementation of a practice with ongoing implementation fidelity to the core program principles, after supplemental resources used to support initial training and implementation are withdrawn. The continual re-implementation of different interventions every few years is costly (e.g., loss of money, training, time, school programming) and breeds negative attitudes and resistance in school personnel toward implementing and sustaining practices (Greenberg, Weissberg, & O'Brien, 2003). Thus, sustained implementation of effective, well-designed, multiyear, school-based preventive practices should be identified as a key goal at the onset of planning activities (Pluye, Potvin, Denis, Pelletier, & Mannoni, 2005). Because fidelity of implementation is the mechanism by which valued outcomes are obtained, fidelity becomes critical to sustainability: without it, the practice will no longer be capable of achieving those outcomes (McIntosh, Horner, & Sugai, 2009). However, quality implementation of SWPBS is a necessary but insufficient condition for sustaining the practice in the long term (Coffey & Horner, in press; Fixsen, Naoom, Blase, Friedman, & Wallace, 2005).
Previous sustainability research has identified systems-level factors responsible for program sustainability, including teaming at the school and district levels, systems of ongoing coaching to ensure high fidelity of implementation, and the use of data for decision-making at all levels of support (Barrett, Bradshaw, & Lewis-Palmer, 2008; McIntosh et al., 2010). Coffey and Horner (in press) investigated differences between schools that were sustainers and non-sustainers of SWPBS, as measured by an independent fidelity of implementation evaluation tool known as the School-wide Evaluation Tool (SET; Horner et al., 2004). They reported that sustainers and non-sustainers had statistically significant differences in implementation of the following areas: expectations defined, expectations taught, systems for rewarding behavior, systems for responding to behavioral violations, data-based decision-making, and effective teaming. They also noted that the use of a reward system had the largest effect size (Cohen’s d = .78) in differentiating schools identified as sustainers and non-sustainers of SWPBS.

1.3  Implementer-level Variables Related to Fidelity and Sustainability

In addition to these important systems-level factors, it is also important to consider factors at the implementer level (i.e., those implementing the practice with students on a daily basis). Han and Weiss (2005) noted that school personnel are critical to the sustainability of school-based programs because of the limited resources available to meet student needs. In addition, school personnel have regular interactions with the student population and ongoing opportunities to implement and sustain effective practices. Thus, school personnel perceptions and beliefs regarding SWPBS will ultimately determine the quality and sustainability of their efforts at the grassroots level.
When investigating the key teacher-specific variables that promote sustainable practices, Han and Weiss (2005) hypothesized that a sustainable teacher-implemented program is (a) acceptable, (b) effective, (c) feasible to implement on an ongoing basis with minimal but sufficient resources, and (d) flexible and adaptable.

1.3.1  Acceptability to Teachers

The priority for implementing a practice is contingent on whether a teacher accepts the core values and structure of a program (Han & Weiss, 2005). Kazdin (1981) defined treatment acceptability as judgments made by clients or laypersons regarding whether the selected intervention is appropriate and reasonable for treating the identified problem. For a practice to be seen as acceptable, it must be viewed as both meeting the needs of students and complementing the individual teacher’s style (Han & Weiss, 2005). Von Brock and Elliott (1987) noted the reciprocal relationship between perceived effectiveness and the acceptability of an intervention. Not surprisingly, interventions that are anticipated to be effective are perceived to be more acceptable. Practices that are time-efficient (Witt, Martens, & Elliott, 1984), address an identified gap in existing services (Han & Weiss, 2005), and meet the diverse needs of all students in the classroom (Gersten, Chard, & Baker, 2000) are also more likely to be deemed acceptable. Elliott, Witt, Galvin, and Peterson (1984) stated that the severity of student behavior also influences acceptability ratings, as complex and time-intensive interventions are more acceptable when student behavior is severe (e.g., destruction of property).
Elliott and colleagues found that positive interventions (e.g., praise, reinforcement, token economies) were rated as more acceptable than punitive practices (e.g., ignoring, time-out), though other research suggests that school personnel believe punitive practices (e.g., suspension) to be a logical response to problem behavior, regardless of empirical evidence showing their detrimental effects (Lohrmann, Forman, Martin, & Palmieri, 2008; Von Brock & Elliott, 1987). Rohrbach and colleagues (1993) found that teachers who exhibited greater openness and enthusiasm in training were more likely to implement with fidelity, even if they had less overall teaching experience.

However, even if a practice is deemed potentially effective, implementers may refuse to implement it with fidelity if they consider it unacceptable to their values or those of the recipients, regardless of whether the practice is empirically proven to improve student outcomes (Witt et al., 1984). For example, some educators may believe that they are already meeting the behavior needs of their students without needing to change their practice (McKevitt & Braaksma, 2008) or that positive reinforcement for appropriate behavior is unacceptable and possibly harmful, as students should be intrinsically motivated to act in socially appropriate and valued ways (Lohrmann et al., 2008). Thus, the compatibility of the program with teachers’ own beliefs regarding behavior change and classroom management techniques influences the perceived acceptability of the practice, which is a critical factor in determining their priority to implement it.

1.3.2  Effectiveness

Han and Weiss (2005) stated that the effectiveness of a practice is judged by the extent to which desired goals and outcomes are reached (e.g., positive student outcomes), and that potential for effectiveness depends on whether the practice is both evidence-based and implemented with a high degree of fidelity.
High acceptability of a practice is not sufficient to ensure quality of implementation (Noell et al., 2005), as school personnel must continue to implement the practice with fidelity and adherence to the core principles that influence its effectiveness. Consequently, providing school personnel with only the procedural skills needed to implement the practice is not adequate to ensure sustained fidelity of implementation. When investigating the sustained implementation of Peer-Assisted Learning Strategies (PALS) in mathematics, Baker and colleagues (2004) noted that it was critical that school personnel developed and possessed a conceptual understanding of the core principles of the practice to ensure its effectiveness.

A practice using a manualized curriculum, with little room for teacher individualization, may lead to a lack of interest in implementation. However, if the scope of the practice is too broad or abstract, low understanding of core principles may prevent its implementation with fidelity. If teachers are only taught the procedures of the practice without understanding why it works and how it differs from their original beliefs and underlying assumptions about how students learn, they may either modify surface features (e.g., activities, materials, classroom organization) or merely add the new practice onto existing efforts without changing classroom norms (Coburn, 2003). Gersten and Dimino (2001) advocated for building practice mastery by changing teachers’ observed practice while simultaneously developing conceptual understanding. High-fidelity implementation can then transform procedural understanding into conceptual understanding (Vaughn, Klingner, & Hughes, 2000). Teacher efficacy may also contribute to sustained implementation.
Berman and McLaughlin (1977) defined teacher efficacy as teachers’ perceptions of their abilities to teach students effectively, regardless of the number of years of teaching experience, student challenges, or lack of student motivation. Teachers who are confident in their teaching abilities appear to be most receptive to learning and implementing new practices (Guskey, 1988; Sparks, 1988), can set higher expectations for their students while simultaneously providing feedback and support (Baker et al., 2004), and are more likely to achieve valued outcomes and sustain the practice despite setbacks (Forman & Burke, 2008). Conversely, teachers with low efficacy may be more likely to experience professional burnout, feel emotionally tired and fatigued, display indifferent or even negative attitudes toward students, and may stop implementation prematurely or after administrative funding ends (Han & Weiss, 2005). However, as school personnel experience positive outcomes from consistent and quality SWPBS implementation, their sense of efficacy may increase (Bennett & McIntosh, in press; Ross & Horner, 2006).

In addition, the initial effectiveness of the practice and the visibility of its results have a positive impact on the beliefs and motivation of teachers to sustain implementation (Gersten & Dimino, 2001). Observing visible change in student behavior is often a powerful motivator in influencing teachers’ attitudes toward a practice (Guskey, 1986). Moreover, the effectiveness of the practice in meeting the needs of students and improving the school climate may predict sustained implementation more than its empirical validation or any perceived pressure to use a particular practice (Boardman, Argüelles, Vaughn, Hughes, & Klingner, 2005; Klingner, Arguelles, Hughes, & Vaughn, 2001). Han and Weiss (2005) noted that teachers must attribute positive student outcomes to their implementation rather than to extraneous factors.
The use of data-based problem solving provides a concrete and visible framework for systematically assessing the usefulness, effectiveness, and efficiency of a practice (McIntosh, Horner, et al., 2009). As a result, school personnel can see through graphs how their implementation is related to improved student outcomes.

1.3.3  Feasibility of Ongoing Implementation with Minimal but Sufficient Resources

For sustained implementation, there must be a balance between the effort required to implement the practice and its effectiveness in attaining desired outcomes (McIntosh, Horner, et al., 2009). A practice will only be sustained if it meets the “reality principle,” that teachers can feasibly integrate the practice into their existing job responsibilities and expectations (Gersten & Dimino, 2001). The typical school schedule does not leave much opportunity for planning, and the overall SWPBS process can be seen as time and resource intensive due to the length of time required for implementation (Bambara, Nonnemacher, & Kern, 2009). Because SWPBS is typically implemented by staff during school hours, the time spent implementing SWPBS decreases the time available for other school-related activities (Blonigen et al., 2008). School personnel may eventually rate the perceived time demand and effort for SWPBS implementation as lower as they become more skilled in implementation (Ervin, Schaughency, Matthews, Goodman, & McGlinchey, 2007). However, the practice may not be sustained if the overall costs associated with continued implementation interfere with other practices or exceed the capacity of the school, even if the outcomes are valued (McIntosh et al., 2010).

The lack of continued administrative support has been identified as a barrier to teacher implementation of new practices (Kincaid, Childs, Blase, & Wallace, 2007; Lohrmann et al., 2008).
Han and Weiss (2005) proposed that the influence of principal support operates through both instrumental and emotional channels. Continued administrative support includes allocating resources (e.g., time, incentives, training) to the practice, communicating expectations, and addressing competing practices that may decrease resources and shift priorities (McIntosh, Horner, et al., 2009). In addition, research indicates that teachers’ fidelity of implementation increases when school principals provide explicit encouragement and monitoring (Rohrbach et al., 1993). School personnel may not adopt the practice if they fear that the practice will not be allocated the resources needed to sustain itself (Boardman et al., 2005). A state or district leadership team that secures resources and political support, in addition to coordinating training, coaching, and evaluation, assures school staff that implementation will be supported (Sugai & Horner, 2006).

If initial external funding is not used to build capacity under existing resource constraints, continued implementation of the practice is threatened once that external funding ends (Adelman & Taylor, 2003). Financial reliance on outside resources (e.g., grants) and involvement of external consultants should decrease to facilitate the sustained implementation of a practice (Han & Weiss, 2005). If local expertise has not been cultivated, school personnel may lack the skills to maintain adequate fidelity of implementation, which affects the effectiveness of the practice and its subsequent sustained implementation (Han & Weiss, 2005).

1.3.4  Flexibility and Adaptability

For sustained implementation to occur, the practice must be integrated into the local school culture while retaining the intervention features critical to its effectiveness (Baker et al., 2004; Castro, Barrera, & Martinez, 2004).
As a result, consideration must be given to the balance between fidelity of implementation and contextual fit, the extent to which the practice meets the identified needs of a school while being aligned with the values, resources, and administrative support found within the host environment (Albin, Lucyshyn, Horner, & Flannery, 1996). Contextual fit is an important component of building consensus and buy-in among all stakeholders, which ultimately facilitates their commitment to implementation (Fixsen et al., 2005). Contextualized intervention plans are created based on school members’ perceptions regarding the needs of the school, which guides the selection of relevant evidence-based practices to fit student needs (Sugai, Horner, & McIntosh, 2008). In addition, after staff commitment to the practice is built, contextual adaptations take place in individual classrooms, where the practice is adapted to fit the culture of the classroom and the values of the teacher (Castro et al., 2004). Adaptations to the practice usually occur after initial training and consultation are completed, which may lead to decreased fidelity of implementation, thereby threatening both the effectiveness and the sustained implementation of the practice. School personnel also need a repertoire of strategies for adapting instruction without removing the critical components of the practice that make it effective (Klingner, Ahwee, Pilonieta, & Menendez, 2003; Vaughn et al., 2000). Initial and ongoing professional development allows the continued refinement of implementation, as school personnel can successfully contextualize the practice when responding to the demands of a changing host environment (Baker et al., 2004).

1.4  Purpose of the Study

The purpose of this study was to investigate the extent to which implementation of critical features of SWPBS predicted sustained implementation three years later.
This study evaluated self-reported ratings of school personnel regarding their fidelity of implementation in different SWPBS systems (school-wide, non-classroom, classroom, individual), as school personnel are central change agents and bottom-line determinants of sustained implementation in their role as core implementers with students (Bambara et al., 2009). The current study operationalized sustained implementation as the level of SWPBS implementation (as measured through validated fidelity of implementation measures) obtained after a minimum of three years of implementation, a commonly used length of time to indicate sustainability (Mihalic & Irwin, 2003). As part of the study, both systems-level and implementer-level practices were examined. Two research questions were addressed:

1. To what extent do school personnel ratings of fidelity of implementation of SWPBS systems predict sustained implementation?

2. Within significantly predictive systems, which critical features of implementation significantly predict sustained implementation?

Chapter 2: Method

2.1  Participants

The respondents included school personnel from 261 schools across the United States who had been implementing SWPBS for at least three years. Criteria for inclusion in this study specified that schools were administered the PBIS Self-Assessment Survey (PBIS SAS) in 2006-07 and the Benchmarks of Quality (BoQ) measure in 2009-10. School demographic data, including school size, urbanicity, geographic area, ethnicity, and Title I funding, were available for 44% of the sample and are reported in Table 2.1. Of the schools with demographic data available, most (72%) were located in Illinois. The majority of schools were small to mid-sized and located in suburban areas or small to medium-sized cities.
Table 2.1  School Demographic Information

Geography (n = 115): Illinois 72%, Oregon 15%, Iowa 4%, Michigan 4%, Colorado 2%, North Carolina 1%, South Carolina 1%
Urbanicity (n = 115): Rural 6%, Semi-Rural 11%, Suburban 48%, Small to Medium City 31%, Large City 2%
Enrollment (n = 113): ≤300 22%, 301-750 66%, 751-999 8%, ≥1000 4%
Ethnicity (n = 113): White 51%, Hispanic 28%, Black 16%, Asian 5%, American Indian 1%
Title I Status (n = 83): Title I 57%, Not Title I 43%

2.2  Measures

2.2.1  PBIS Self-Assessment Survey (PBIS SAS)

The PBIS SAS (Sugai, Horner, & Todd, 2003) is completed by school personnel to self-report their assessment of SWPBS settings in their school. It is a 43-item survey with four sections: School-wide Systems (across all settings in the school; 15 items), Non-classroom Systems (in settings such as cafeterias, playgrounds, and hallways; 9 items), Classroom Systems (in classrooms; 11 items), and Individual Systems (support for individual students with chronic problem behaviors; 8 items). Each item corresponds to a critical feature of SWPBS in a particular setting, and respondents self-report the implementation status of each critical feature (in place, partially in place, or not in place) to identify their perceptions regarding whether a specific critical feature is currently in place at their school. Individual responses are aggregated to provide school-level data in terms of the percent of features in place at the individual item and systems levels.

Psychometric data for the PBIS SAS are available from two studies. A study by Hagan-Burke and colleagues (2005) involving 37 schools reported strong internal consistency for the scales. Safran (2006) further investigated the measure's internal consistency and construct validity. Cronbach's alpha scores across three schools indicated strong internal consistency for the overall measure and moderate to strong internal consistency for the SWPBS setting subscales.
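The aggregation of individual PBIS SAS ratings into school-level percent-in-place scores can be sketched as follows. This is a minimal illustration only, not the actual PBIS SAS scoring software; the item identifiers and the rule that only "in place" ratings count toward the percentage are assumptions for the example.

```python
from collections import defaultdict

def percent_in_place(responses):
    """Aggregate individual ratings into school-level percent-in-place.

    responses: list of dicts, one per respondent, mapping an item id to a
    rating in {"in place", "partially in place", "not in place"}.
    Returns {item_id: percent of respondents rating the item "in place"}.
    """
    counts = defaultdict(lambda: [0, 0])  # item -> [n_in_place, n_total]
    for respondent in responses:
        for item, rating in respondent.items():
            counts[item][1] += 1
            if rating == "in place":
                counts[item][0] += 1
    return {item: 100.0 * hits / total
            for item, (hits, total) in counts.items()}

def subscale_score(item_percents, subscale_items):
    """Mean percent-in-place across the items of one SWPBS system."""
    return sum(item_percents[i] for i in subscale_items) / len(subscale_items)
```

For example, two respondents who both rate item "c1" as in place, but split on item "c2", yield 100% and 50% at the item level and a 75% Classroom Systems subscale score.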
Validity was assessed by evaluating the extent to which staff in one school used the results to guide action planning and future implementation. When the information was shared with the team members of one school, observations indicated that the results provided a valuable means of facilitating staff participation, generating discussion, and linking results to action planning. Results also indicated that each school displayed a different pattern of ratings, which suggests that the PBIS SAS was able to identify the unique strengths and weaknesses of SWPBS settings in each school.

In the current study, PBIS SAS subscale scores were used to measure prior implementation for each of the four SWPBS settings (i.e., School-wide, Non-classroom, Classroom, Individual) in 2006-07. Prior implementation was indicated by the percent of critical features self-reported as in place at each school. Sample reliability for the PBIS SAS subscale scores was strong, as indicated by high Cronbach's alpha coefficients (α range = .86 to .97).

2.2.2  Benchmarks of Quality (BoQ)

The BoQ (Kincaid, Childs, & George, 2005) is a 53-item independent evaluation of fidelity of SWPBS implementation based on critical features of SWPBS. The BoQ consists of 10 subscales: PBS Team (4 items), Faculty Commitment (3 items), Effective Discipline Procedures (7 items), Data Entry (5 items), Expectations and Rules (5 items), Reward System (8 items), Lesson Plans (6 items), Implementation Plans (7 items), Crisis Plans (3 items), and Evaluation (5 items). A coach who is external to the school (e.g., a district-level administrator) completes the BoQ using his or her knowledge of the school and the Scoring Guide, which provides operational definitions of the ratings for each item. Each item has a maximum value between 1 and 3 points, and the maximum possible BoQ total score is 100. A BoQ total score of 70 or more indicates adequate fidelity of SWPBS implementation.
Psychometric properties for the BoQ were examined by Cohen and colleagues (2007). They reported strong one-week test-retest reliability (r = .94). Two-week interrater reliability, comparing the ratings of school team members to those of other individuals who were familiar with the school (e.g., external coach, district coordinator), was also high (r = .87). The total score correlation of the BoQ with the School-wide Evaluation Tool (SET; Horner et al., 2004), a widely used research-based tool for evaluating SWPBS implementation, produced a Pearson product-moment correlation of .51, indicating moderate concurrent validity. High predictive validity for the BoQ was established, as schools with higher BoQ total scores (>70%) for two years of implementation had decreased rates of office discipline referrals.

The BoQ total score was used in the current study as a measure of sustained SWPBS fidelity of implementation. Sample reliability for the BoQ total score was strong, as indicated by a high Cronbach's alpha coefficient (α = .90).

2.3  Procedure

Data extracted for this study included PBIS SAS scores in 2006-07 and the BoQ total score in 2009-10 for 261 schools in the United States that had been implementing SWPBS for at least three years. Data used in this study were obtained from an extant database in the Positive Behavioral Interventions and Supports Technical Assistance Center in the Educational and Community Supports area in the College of Education at the University of Oregon. Data were only accessed from schools that had signed agreements with the center to allow their data to be available for research purposes.

2.4  Analyses

Two sets of regression analyses were conducted to investigate the extent to which self-reported prior implementation predicted sustained SWPBS implementation. In all analyses, the BoQ total score in 2009-10 served as the outcome variable, and various subscales from the PBIS SAS administered in 2006-07 served as predictor variables.
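As a rough sketch of this analytic setup (Mahalanobis-distance outlier screening, mean-centering, and ordinary least squares with subscale predictors and a total-score outcome), the following uses simulated data; the sample sizes, simulated values, and variable names are illustrative only and do not come from the study dataset:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated school-level data standing in for the four PBIS SAS subscale
# scores (predictors, 0-100%) and the BoQ total score (outcome).
n = 261
X = rng.uniform(0, 100, size=(n, 4))           # SW, NC, CR, IND subscales
y = 60 + 0.2 * X[:, 2] + rng.normal(0, 10, n)  # BoQ total score

# 1. Outlier screening: squared Mahalanobis distance of each case,
#    compared to the chi-square critical value (p < .001, df = 4).
mu = X.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
d2 = np.einsum('ij,jk,ik->i', X - mu, cov_inv, X - mu)
crit = stats.chi2.ppf(0.999, df=X.shape[1])
X, y = X[d2 <= crit], y[d2 <= crit]

# 2. Mean-center predictors (to ease interpretation under high
#    intercorrelations), then fit OLS with an intercept column.
Xc = X - X.mean(axis=0)
design = np.column_stack([np.ones(len(Xc)), Xc])
b, *_ = np.linalg.lstsq(design, y, rcond=None)

print("intercept (mean outcome):", round(b[0], 1))
print("unstandardized slopes:", np.round(b[1:], 2))
```

With mean-centered predictors, the intercept equals the mean outcome for the retained sample, which is why the reported constants in the result tables sit at the sample mean BoQ score.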
First, a multiple regression analysis was conducted to evaluate the extent to which prior implementation of each SWPBS setting (i.e., School-wide, Non-classroom, Classroom, Individual) predicted sustained implementation, as measured by the BoQ total score in 2009-10. Second, a multiple regression analysis was conducted to analyze specific critical features of any SWPBS setting identified in the first analysis as significantly predicting sustained implementation. Assumptions were tested, with no violations of independence, linearity, or homoscedasticity of residuals detected. Outliers were removed from the analysis if they exceeded the critical Mahalanobis distance allowed for the analysis (Tabachnick & Fidell, 2007; 4 predictors = 18.27; 5 predictors = 20.23; 11 predictors = 31.26). Evaluations of the trimmed mean and Cook's distance did not reveal any further problems with outliers. All predictor variables were mean-centered to enhance the interpretability of results and mitigate the effects of high predictor intercorrelations.

Chapter 3: Results

Descriptive statistics for the BoQ total score and the PBIS SAS prior implementation subscale scores for each of the SWPBS settings are presented in Table 3.1. Sustained implementation of SWPBS in 2009-10 was adequate for most schools, as indicated by the mean BoQ total score (M = 83%, SD = 11.40, Range = 38 to 100%). Table 3.2 displays the intercorrelations among the PBIS SAS prior implementation subscales for the different systems (School-wide, Non-classroom, Classroom, Individual) and their correlations with sustained implementation. There were strong, statistically significant correlations between all the prior implementation variables of each SWPBS setting. Prior implementation in each of the SWPBS settings had weak to moderate, statistically significant positive correlations with subsequent sustained implementation.
Table 3.1
Descriptive Statistics for BoQ Total Score and PBIS SAS Subscale Scores

Measure                                    Mean   SD     Range
BoQ (2009-10)                              83%    11.40  38 - 100%
PBIS SAS Prior Implementation (2006-07)
  School-wide                              61%    19.30  7 - 100%
  Non-classroom                            56%    20.28  8 - 100%
  Classroom                                62%    16.18  15 - 100%
  Individual                               42%    18.27  3 - 100%

Table 3.2
Correlations between Study Variables

       BoQ     SW      NC      CR      IND
BoQ    --
SW     .30***  --
NC     .33***  .93***  --
CR     .33***  .87***  .87***  --
IND    .29***  .84***  .80***  .81***  --

Note. *p < .05. **p < .01. ***p < .001. BoQ = BoQ total score (2009-10); SW = School-wide Systems, NC = Non-classroom Systems, CR = Classroom Systems, IND = Individual Systems (PBIS SAS prior implementation, 2006-07).

3.1  Prior Implementation in SWPBS Systems Predicting Sustained Implementation

Results of the multiple regression analysis (displayed in Table 3.3) indicate that the overall model involving self-reported fidelity of prior implementation in all SWPBS settings was statistically significant, F(4, 254) = 8.18, p < .001, and explained 10% of the variance in sustained implementation. When evaluating the individual predictors, only prior implementation in Classroom Systems was a statistically significant predictor, β = .28, p < .05.

Table 3.3
Prediction of Sustained SWPBS Implementation in 2009-10 from Prior Implementation (All SWPBS Systems) in 2006-07

Predictor        b     SE b   β
Constant         83    .68
School-wide      -.16  .11    -.26
Non-classroom    .14   .10    .25
Classroom        .19   .09    .28*
Individual       .05   .07    .08
R2 = .11***; Adjusted R2 = .10***

Note. *p < .05. **p < .01. ***p < .001.

For the second research question, the critical features of the Classroom Systems subscale were examined. Descriptive statistics for the PBIS SAS prior implementation (Classroom Systems) critical features in Table 3.4 indicate that defining (M = 83%) and teaching (M = 76%) expected behaviors were rated as strongly implemented. The other critical features had weaker implementation (M = 47 - 67%).
Table 3.5 indicates that there were statistically significant small to medium positive correlations between each Classroom Systems feature and the overall BoQ score. There were also strong, statistically significant intercorrelations among all Classroom Systems features.

Table 3.4
Descriptive Statistics for PBIS SAS Prior Implementation (Classroom Systems) Critical Feature Scores in 2006-07

Critical feature                                                   Mean  SD     Range
1. Expected student behavior and routines in classrooms
   are stated positively and defined clearly                       83%   15.38  21 - 100%
2. Problem behaviors are defined clearly                           67%   19.19  0 - 100%
3. Expected student behavior and routines in classrooms
   are taught directly                                             76%   19.71  0 - 100%
4. Expected student behaviors are acknowledged regularly
   (positively reinforced; >4 positives to 1 negative)             59%   22.77  0 - 100%
5. Problem behaviors receive consistent consequences               50%   21.23  0 - 100%
6. Procedures for expected and problem behaviors are
   consistent with school-wide procedures                          57%   23.70  0 - 100%
7. Classroom-based options exist to allow classroom
   instruction to continue when problem behavior occurs            52%   21.24  0 - 100%
8. Instruction and curriculum materials are matched to
   student ability (math, reading, language)                       61%   19.28  0 - 100%
9. Students experience a high rate of academic success (≥75%)      47%   22.54  0 - 100%
10. Teachers have regular opportunities for access to
    assistance and recommendations                                 48%   20.05  0 - 100%
11. Transitions between instructional and non-instructional
    activities are efficient and orderly                           55%   20.79  0 - 100%

Table 3.5
Correlations between Classroom Systems Critical Features and Sustained Implementation

      BoQ     1       2       3       4       5       6       7       8       9       10      11
BoQ   --
1     .25***  --
2     .28***  .73***  --
3     .24***  .71***  .57***  --
4     .31***  .63***  .67***  .74***  --
5     .26***  .59***  .68***  .68***  .73***  --
6     .28***  .67***  .77***  .75***  .76***  .83***  --
7     .29***  .57***  .70***  .65***  .73***  .76***  .82***  --
8     .23***  .48***  .54***  .50***  .56***  .55***  .51***  .53***  --
9     .23***  .40***  .47***  .55***  .66***  .59***  .57***  .58***  .64***  --
10    .21***  .47***  .55***  .58***  .62***  .57***  .63***  .61***  .53***  .55***  --
11    .30***  .62***  .57***  .77***  .66***  .69***  .73***  .67***  .56***  .57***  .66***  --

Note. *p < .05. **p < .01. ***p < .001. 1 = Expected student behavior/routines are stated positively and defined clearly; 2 = Problem behaviors are defined clearly; 3 = Expected student behaviors and routines in classrooms are taught directly; 4 = Expected student behaviors are acknowledged regularly (positively reinforced); 5 = Problem behaviors receive consistent consequences; 6 = Procedures for expected/problem behaviors are consistent with school-wide procedures; 7 = Classroom-based options exist to allow instruction to continue when problem behavior occurs; 8 = Instruction and curriculum materials are matched to student ability; 9 = Students experience high rates of academic success; 10 = Teachers have regular opportunities for access to assistance and recommendations; 11 = Transitions between instructional and non-instructional activities are efficient and orderly.

Table 3.6 indicates that the overall multiple regression model predicting sustained implementation from the critical features of the Classroom Systems subscale was statistically significant and explained 14% of the variance, F(11, 234) = 4.61, p < .001.
Significant predictors at the .05 level included regular acknowledgement of expected behaviors (β = .29), matching instruction and curriculum materials to student ability (β = .22), and teachers' opportunities for access to assistance and recommendations (β = -.21). The negative beta weight for access to assistance, combined with its positive correlation with the outcome, indicates net suppression (Cohen & Cohen, 1975).

Table 3.6
Prediction of Sustained SWPBS Implementation in 2009-10 from Prior Implementation (Classroom Systems) Critical Features in 2006-07

Predictor                            b     SE b  β
Constant                             83    .70
Expected Behaviors Defined           -.05  .10   -.07
Problem Behaviors Defined            .10   .09   .16
Expected Behaviors Taught            -.11  .09   -.16
Expected Behaviors Acknowledged      .16   .07   .29*
Consequences                         -.09  .07   -.16
Procedures                           -.01  .08   -.02
Options Exist                        .11   .07   .18
Instruction/Materials Match Ability  .14   .06   .22*
Academic Success                     -.03  .05   -.06
Access to Assistance                 -.13  .06   -.21*
Transitions Efficient                .12   .07   .21
R2 = .18***; Adjusted R2 = .14***

Note. *p < .05. **p < .01. ***p < .001.

Chapter 4: Discussion

The current study explored the extent to which implementation of critical SWPBS features predicted sustained implementation three years later. In this study, sustained implementation was defined as the continued implementation of a practice with ongoing implementation fidelity after three years. Ratings of school personnel from 261 schools were assessed to investigate the extent to which their implementation in different SWPBS settings predicted sustained SWPBS implementation three years later. Self-reported ratings were assessed using the PBIS Self-Assessment Survey (PBIS SAS). An independent evaluation of SWPBS practices in schools using the Benchmarks of Quality (BoQ) measure was administered three years later as an indicator of sustained SWPBS implementation.
Results of multiple regression analyses indicated that prior implementation in Classroom Systems was a statistically significant unique predictor of sustained SWPBS implementation. Within the individual features of the Classroom Systems subscale, regularly and positively acknowledging expected student behaviors and matching instruction and curriculum to student need were statistically significant unique predictors of sustained SWPBS implementation. Access to additional support was also a statistically significant predictor of sustained implementation.

4.1  Implementer-level Variables that Predicted Sustained SWPBS Implementation

4.1.1  School-wide versus Classroom Practices

The finding that the Classroom Systems subscale was a stronger predictor of school-wide implementation than the School-wide and Non-classroom Systems subscales may be somewhat surprising to general audiences but is supported by the implementer-specific theory proposed by Han and Weiss (2005). Students spend the vast majority of their school day in the classroom. As such, school-wide interventions will be most effective when fully implemented in the classroom setting. In their roles as core implementers of SWPBS, classroom teachers have regular and ongoing opportunities to implement SWPBS practices in their classrooms by creating environments that increase the likelihood of students learning both academic and behavior skills. Although SWPBS is a school-wide approach, implementation quality and sustained implementation may be contingent on the extent to which individual teachers implement SWPBS classroom practices with high fidelity.

However, the results do not suggest that implementation in the other settings is not critical. The strong correlations between each of the PBIS SAS subscales show that implementation in each area is strongly linked to implementation in the others. Thus, an overwhelming focus on one area to the detriment of others is not supported by these findings.
Creating a common school-wide framework of expectations, values, and systems of support is critical in developing the school into an effective host environment. Such a framework builds internal capacity within the school environment and school personnel to implement effective interventions in all SWPBS settings without reliance on supplemental funding or support, which subsequently increases the likelihood of sustained SWPBS implementation and the continued attainment of valued outcomes (e.g., student success, improved organizational health). For example, when exploring barriers to sustainability of Individual Systems, Bambara and colleagues (2009) found that the lack of a supportive school-wide culture (including conflicting beliefs and techniques regarding effective behavior management practices) made it difficult to sustain Individual SWPBS systems.

Teachers also face many challenges in responding to the multiple demands inherent in their responsibilities (e.g., curriculum development, paperwork, meetings), which may lead to a sense of burnout and emotional exhaustion (Han & Weiss, 2005). The ease, efficiency, and quality of implementation demonstrate the degree of individuals' procedural and conceptual understanding of the core values of SWPBS and their alignment with individual beliefs. Thus, although developing a common underlying framework of SWPBS values and expectations is critical, it may also be effective to focus on helping school personnel translate these core values into their everyday teaching practices, increasing both the quality of implementation and the prospects for sustained implementation.

4.1.2  Regular Positive Reinforcement

The results of the current study highlight the critical importance of regular acknowledgement of expected student behaviors as a key instructional variable that predicts sustained SWPBS implementation.
Notably, although defining and teaching expected behaviors were not significant unique predictors of sustained implementation, these two features had the highest implementation in the classroom. Positive acknowledgement of appropriate behavior, a central tenet of SWPBS practices (Sugai & Horner, 2006), is the next step in increasing the likelihood of the desired behavior in the future while also fostering positive student-teacher interactions. However, educators and parents hold diverse views on whether and how to acknowledge appropriate student behavior, as some individuals may believe that students should be intrinsically motivated to behave appropriately (Lohrmann et al., 2008) and fear that the use of an acknowledgement system could be detrimental to intrinsic motivation (Deci, Koestner, & Ryan, 2001). Despite consistent recommendations for increasing ratios of positive to negative statements in the classroom (e.g., four positive acknowledgements for each error correction; Sugai, 2002), research regularly shows higher ratios of negative to positive statements for social behavior (Harrop & Swinson, 2000; Jenson, Olympia, Farley, & Clark, 2004). Within a SWPBS approach, many schools use tangible rewards (e.g., tickets that can be exchanged for school supplies or social activities) as a systems-level intervention for increasing the implementer-level behavior of specific verbal praise. Consequently, using a school-wide acknowledgement system with a continuum of varied, age-appropriate strategies provides a consistent and effective mechanism for delivering high rates of positive reinforcement in the school environment (Sugai et al., 2008). Cameron, Banko, and Pierce (2001) indicated that the use of rewards was only associated with a decrease in intrinsic motivation when rewards were expected, provided only once, and not directly tied to level of performance.
Witzel and Mercer (2003) further noted that explicit and specific acknowledgement of the student's behavior may be more important than the reward itself. Hence, the approach of using a system of tangible rewards may only be effective to the extent that it promotes regular and specific acknowledgement of student behaviors. As seen in this and other research (Andreou & McIntosh, 2011; Coffey & Horner, in press), the focus on positive reinforcement and the visible improvement teachers see when using acknowledgement systems effectively appear to be critical mechanisms in sustaining the practice.

4.1.3  Matching Instruction and Materials to Student Ability

Matching instruction and curriculum materials to the needs of students was also a significant unique predictor of sustained implementation. The instructional match is defined by how closely each student's skills match the difficulty of the instructional material (Daly, 1996). For example, teachers using differentiated instruction are able to adapt their curriculum and materials to the varying levels of student skill in their classroom with the goal of maximizing the potential of each learner in a given subject area (Tomlinson, 2005). In addition, matching instructional demands to student skill levels has been shown to reduce problem behavior, even in the absence of additional behavior support (Filter & Horner, 2009; Lee, Sugai, & Horner, 1999; Preciado, Horner, & Baker, 2009). As a critical mechanism for sustained SWPBS practices, an appropriate match between instructional materials and student skills likely prevents student frustration and subsequent problem behavior while maximizing student success. Thus, attention to supporting teachers' academic instructional skills can be effective in preventing problem behavior, ensuring academic success, and creating a positive context in which SWPBS can be sustained.
When valued student outcomes (i.e., academic and behavior performance) and teacher outcomes (i.e., facing fewer instances of problem behavior) are maximized, the general priority for sustained implementation is enhanced, and the practice is seen as more acceptable, effective, and feasible (Han & Weiss, 2005).

4.1.4  Access to Assistance and Recommendations

Findings from the current study indicate that access to additional support was significantly positively correlated with sustained implementation but acted as a suppressor variable when controlling for the previous two instructional practices. In general, receiving additional support, in the form of coaching or peer networking, is associated with improved teaching (McLaughlin, 1994; Rose & Church, 1998). The positive bivariate correlation between access to assistance and sustained implementation is consistent with this research. However, when positive reinforcement and instructional matching were held constant in the final model, access to additional support was associated with lower levels of implementation. Although the negative relation may be a spurious artifact of the regression, results are interpretable if there is a plausible theoretical explanation for the relation (Messick & Van de Geer, 1981). The item in the measure notes only access to assistance and recommendations, not the focus or quality of the assistance provided. It is consistent with these results that additional support that did not improve the pivotal classroom teaching practices of positive reinforcement and instructional matching was potentially iatrogenic or a marker of a more chaotic host environment. For example, receiving ineffective coaching (either focused on less relevant teaching variables or coaching that did not result in improved instruction) may have been a barrier to subsequent sustained efforts.
In addition, some schools may have been receiving additional, mandated instructional support (e.g., as a persistently poor performing school or from the Office for Civil Rights), which may have indicated external challenges in classroom management or school culture that also acted as barriers. Thus, it may be that access to additional support was predictive of sustained SWPBS implementation when it focused on improving key instructional practices, whereas simply providing access to additional support that does not improve these instructional practices may be a barrier to sustained implementation.

4.2  Limitations

There are several limitations to the current study. One important limitation was the limited variability in BoQ total scores, as most schools had reached the criterion for adequate fidelity of implementation by 2009-10. Schools that abandoned SWPBS in the three-year period were not included in the sample, and results may have been different if non-sustaining schools had also been included. The first year of implementation for the schools in the sample is also unknown. As a result, the sample most likely includes both schools that began implementing in 2006-07 and schools that had been implementing for a number of years previously. Different variables affect fidelity of implementation at different time points. During initial implementation, efforts are focused on developing a school-wide set of practices and expectations to provide a framework for the other systems to be built upon. However, after school-wide expectations have been formalized, they are rarely changed across the years. Thus, the results of the current study may be most applicable for schools that have already been implementing and sustaining SWPBS for three or more years.
4.3  Future Research

Results of the current study indicated that school personnel ratings of SWPBS implementation in classrooms play a statistically significant role in predicting sustained SWPBS efforts, but it is also clear that other variables facilitate sustained implementation. Future research may further investigate the role of implementer-specific variables using a measure of contextual variables related to sustainability. For example, the School-wide Universal Behavior Sustainability Index – School Teams (SUBSIST; McIntosh, Doolittle, Vincent, Horner, & Ervin, 2009) is a research-based measure designed to evaluate factors affecting SWPBS sustainability.

The current study also assessed schools at two time points to determine whether implementation in 2006-07 predicted sustained SWPBS implementation in 2009-10. Future studies could track fidelity of implementation each year to identify increases or dips in implementation due to the facilitators or barriers to sustained implementation associated with each year. For example, although implementation quality may increase after years of implementation, a large staff turnover in one year may negatively impact fidelity for that particular year.

4.4  Implications for Practice

4.4.1  Development of School-wide and Classroom Practices

Classroom teachers are considered the primary implementers of school-based practices. Therefore, understanding their beliefs and practices is critical in predicting their efforts in implementing and sustaining a practice. A typical approach to SWPBS implementation includes implementing systems to define, teach, and acknowledge school-wide expectations in school-wide and non-classroom settings and then leaving it to classroom teachers to extend these common school-wide practices to their classrooms (McKevitt & Braaksma, 2008).
The logic behind this approach is that classroom teachers may be more likely to implement SWPBS in the classroom if they see positive effects in shared areas, where it is easier to build consensus. The danger in this approach is that classroom teachers may never implement the practices in their classrooms without collective support; or worse, they may believe that SWPBS practices are intended only for non-classroom areas and not the classroom. Training that focuses on increasing understanding of the key principles of the practice, in addition to information on the intervention's effectiveness, may improve teacher acceptability of the practice by increasing expectations, intentions, and motivation to implement it (Han & Weiss, 2005). Yet building teacher acceptance of SWPBS may not be enough, as school personnel may also need to be shown how to translate the core SWPBS components into their daily routines. These findings support the approach of creating an effective host environment by building consensus to implement SWPBS in classroom settings as soon as it is implemented in non-classroom settings. Access to additional support to address SWPBS implementation in the classroom may also promote full classroom implementation when associated with improved quality of teaching practices. However, this recommendation requires validation through further research.

4.4.2  Ongoing Data Collection and Visibility of Results

SWPBS emphasizes the importance of systematic documentation of the needs of school personnel, the fidelity of implementation, and its impact on valued outcomes (Sugai & Horner, 2006). Regular collection and interpretation of student performance data facilitate the development of quality instructional practices in the classroom. Accurate identification of students' different skill levels and of appropriate instructional strategies maximizes students' success in developing those skills.
Additionally, the visible attainment of valued outcomes often leads to changes in implementer beliefs and motivation to sustain a practice (Guskey, 1986). Linking changes in student behavior directly to implementer efforts would help build consensus to implement SWPBS in classroom settings. Comparing the results of self-reported fidelity of implementation and independent evaluations of SWPBS can also provide important information regarding both the level of implementation and school personnel understanding of core SWPBS principles. Completion of a yearly needs assessment, such as the PBIS SAS, allows for implementer-level ownership in action planning for sustained implementation. In addition, although fidelity of implementation in school-wide settings is commonly assessed, specific assessment of fidelity of implementation in classrooms may be beneficial. Some PBIS fidelity of implementation measures have recently been revised to include more classroom-specific items (Kincaid, Childs, & George, 2010; Sugai, Horner, Lewis-Palmer, & Rossetto Dickey, 2011). These measures may be more effective in measuring sustained efforts in both classroom and non-classroom areas.

References

Adelman, H. S., & Taylor, L. (2003). On sustainability of project innovations as systemic change. Journal of Educational and Psychological Consultation, 14, 1-25. doi: 10.1207/S1532768XJEPC1401_01

Albin, R. W., Lucyshyn, J. M., Horner, R. H., & Flannery, K. B. (1996). Contextual fit for behavioral support plans: A model for "goodness of fit." In L. K. Koegel, R. L. Koegel, & G. Dunlap (Eds.), Positive behavioral support: Including people with difficult behavior in the community (pp. 81-98). Baltimore, MD: Brookes.

Algozzine, K., & Algozzine, B. (2007). Classroom instructional ecology and school-wide positive behavior support. Journal of Applied School Psychology, 24, 29-47.

Andreou, T., & McIntosh, K. (2011).
Sustainability of school-wide positive behavior support: A qualitative study of critical incidents. Manuscript in preparation.

Baker, S., Gersten, R., Dimino, J. A., & Griffiths, R. (2004). The sustained use of research-based instructional practice: A case study of peer-assisted learning strategies in mathematics. Remedial and Special Education, 25, 5-24. doi: 10.1177/07419325040250010301

Bambara, L., Nonnemacher, S., & Kern, L. (2009). Sustaining school-based individualized positive behavior support. Journal of Positive Behavior Interventions, 11, 161-178. doi: 10.1177/1098300708330878

Barrett, S. B., Bradshaw, C., & Lewis-Palmer, T. L. (2008). Maryland statewide positive behavior interventions and supports initiative: Systems, evaluation, and next steps. Journal of Positive Behavior Interventions, 10, 105-114. doi: 10.1177/1098300707312541

Bennett, J. L., & McIntosh, K. (in press). Effects of School-wide Positive Behavior Support on teacher self-efficacy. Psychology in the Schools.

Berman, P., & McLaughlin, M. W. (1977). Federal programs supporting educational change, Vol. III: Implementing and sustaining innovations. Santa Monica, CA: The Rand Corporation.

Blonigen, B. A., Harbaugh, W. T., Singell, L. D., Horner, R. H., Irvin, L. K., & Smolkowski, K. S. (2008). Application of economic analysis to School-Wide Positive Behavior Support (SWPBS) programs. Journal of Positive Behavior Interventions, 10, 5-19. doi: 10.1177/1098300707311366

Boardman, A. G., Argüelles, M. E., Vaughn, S., Hughes, M. T., & Klingner, J. (2005). Special education teachers' views of research-based practices. The Journal of Special Education, 39, 168-180. doi: 10.1177/00224669050390030401

Bohanon, H., Fenning, P., Carney, K. L., Minnis-Kim, M. J., Anderson-Harriss, S., Moroz, K. B., . . . Sailor, W. (2006). Schoolwide application of positive behavior support in an urban high school: A case study. Journal of Positive Behavior Interventions, 8, 131-145.
doi: 10.1177/10983007060080030201 Bradshaw, C. P., Mitchell, M. M., & Leaf, P. J. (2010). Examining the effects of schoolwide positive behavioral interventions and supports on student outcomes: Results from a randomized controlled effectiveness trial in elementary schools. Journal of Positive Behavior Interventions, 12, 133-148. doi: 10.1177/1098300709334798 Cameron, J., Banko, K. M., & Pierce, W. D. (2001). Pervasive negative effects of rewards on intrinsic motivation: The myth continues. The Behavior Analyst, 24, 1-4.  34  Castro, F. G., Barrera, M., & Martinez, C. R. (2004). The cultural adaptation of prevention interventions: Resolving tensions between fidelity and fit. Prevention Science, 5, 41-45. doi: 10.1023/B:PREV.0000013980.12412.cd Center on Positive Behavioral Interventions and Supports. (2011). "Schools that are implementing SWPBIS." Retreived from www.pbis.org Coburn, C. E. (2003). Rethinking scale: Moving beyond numbers to deep and lasting change. Educational Researcher, 32(6), 3-12. doi: 10.3102/0013189X032006003 Coffey, J., & Horner, R. H. (in press). The sustainability of school-wide positive behavioral interventions and supports. Exceptional Children. Cohen, J., & Cohen, P. (1975). Applied multiple regression/correlation analysis for the behavioral sciences. New York: Wiley. Cohen, R., Kincaid, D., & Childs, K. E. (2007). Measuring school-wide positive behavior support implementation: Development and validation of the Benchmarks of Quality. Journal of Positive Behavior Interventions, 9, 203-213. doi: 10.1177/10983007070090040301 Daly, E. J. (1996). The effects of instructional match and content overlap on generalized reading performance. Journal of Applied Behavior Analysis, 29, 507 - 518. doi: 10.1901/jaba.1996.29-507 Deci, E. L., Koestner, R., & Ryan, R. M. (2001). Extrinsic rewards and intrinsic motivation in education: Reconsidered once again. Review of Educational Research, 71, 1-27. Elliott, S. N., Witt, J. C., Galvin, G. 
A., & Peterson, R. (1984). Acceptability of positive and reductive behavioral interventions: Factors that influence teachers' decisions. Journal of School Psychology, 22, 353-360. doi: 10.1016/0022-4405(84)90022-0 35  Ervin, R. A., Schaughency, E., Matthews, A., Goodman, S. D., & McGlinchey, M. T. (2007). Primary and secondary prevention of behavior difficulties: Developing a data-informed problem-solving model to guide decision making at a school-wide level. Psychology in the Schools, 44, 7-18. doi: 10.1002/pits.20201 Filter, K. J., & Horner, R. H. (2009). Function-based academic interventions for problem behavior. Education and Treatment of Children, 32, 1-19. Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: Synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). Forman, S. G., & Burke, C. R. (2008). Best Practices in Selecting and Implementing EvidenceBased School Interventions. In A. Thomas & J. P. Grimes (Eds.), Best practices in school psychology V. Bethesda, MD: National Association of School Psychologists. Gersten, R., Chard, D. J., & Baker, S. (2000). Factors enhancing sustained use of research-based instructional practices. Journal of Learning Disabilities, 33, 445-457. doi: 10.1177/002221940003300505 Gersten, R., & Dimino, J. (2001). The realities of translating research into classroom practice. Learning Disabilities Research & Practice, 16, 120-130. doi: 10.1111/0938-8982.00013 Greenberg, M. T., Weissburg, R. P., & O'Brien, M. E. (2003). Enhancing school-based prevention and youth development through coordinated social, emotional, and academic learning. American Psychologist, 58, 466-474. doi: 10.1037/0003-066X.58.6-7.466 Guskey, T. R. (1986). Staff development and the process of teacher change. Educational Researcher, 15, 5-12. 
doi: 10.3102/0013189X015005005 36  Guskey, T. R. (1988). Teacher efficacy, self-concept, and attitudes toward the implementation of instructional innovation. Teaching and Teacher Education, 4, 63-69. doi: 10.1016/0742051x(88)90025-x Hagan-Burke, S., Burke, M. D., Martin, E., Boon, R. T., Fore, C., III, & Kirkendoll, D. (2005). The internal consistency of the school-wide subscales of the effective behavioral support survey. Education & Treatment of Children, 28, 400-413. Han, S. S., & Weiss, B. (2005). Sustainability of teacher implementation of school-based mental health programs. Journal of Abnormal Child Psychology, 33, 665-679. doi: 10.1007/s10802-005-7646-2 Harrop, A., & Swinson, J. (2000). Natural rates of approval and disapproval in British infant, junior and secondary classrooms. British Journal of Educational Psychology, 70(4), 473483. doi: 10.1348/000709900158236 Horner, R. H., Todd, A. W., Lewis-Palmer, T., Irvin, L. K., Sugai, G., & Boland, J. B. (2004). The School-wide Evaluation Tool (SET): A research instrument for assessing schoolwide positive behavior support. Journal of Positive Behavior Interventions, 6, 3-12. doi: 10.1177/10983007040060010201 Jenson, W. R., Olympia, D., Farley, M., & Clark, E. (2004). Positive psychology and externalizing students in a sea of negativity. [Article]. Psychology in the Schools, 41(1), 67-79. Kame’enui, E. J., Simmons, D. C., & Coyne, M. D. (2000). Schools as host environments: Toward a schoolwide reading improvement model. Annals of Dyslexia, 50, 31-51.  37  Kazdin, A. E. (1981). Acceptability of child treatment techniques: The influence of treatment efficacy and adverse side effects. Behavior Therapy, 12, 493-506. doi: 10.1016/s00057894(81)80087-1 Kincaid, D., Childs, K., Blase, K. A., & Wallace, F. (2007). Identfiying barriers and facilitators in implementing schoolwide positive behavior support. Journal of Positive Behavior Interventions, 9, 174-184. 
doi: 10.1177/10983007070090030501 Kincaid, D., Childs, K., & George, H. (2005). School-wide benchmarks of quality. Unpublished instrument, University of South Florida. Kincaid, D., Childs, K., & George, H. (2010). School-wide benchmarks of quality (revised). Unpublished instrument, University of South Florida. Klingner, J. K., Ahwee, S., Pilonieta, P., & Menendez, R. (2003). Barriers and facilitators in scaling up research-based practices. Exceptional Children, 69, 411-429. Klingner, J. K., Arguelles, M. E., Hughes, M. T., & Vaughn, S. (2001). Examining the schoolwide "spread" of research-based practices. Learning Disability Quarterly, 24, 221234. Lee, Y., Sugai, G., & Horner, R. H. (1999). Using an instructional intervention to reduce problem and off-task behaviors. Journal of Positive Behavior Interventions, 1, 195-204. Lewis, T. J., & Sugai, G. (1999). Effective behavior support: A systems approach to proactive schoolwide management. Focus on Exceptional Children, 31, 1-24. Lohrmann, S., Forman, S., Martin, S., & Palmieri, M. (2008). Understanding school personnel's resistance to adopting schoolwide positive behavior support at a universal level of Intervention. Journal of Positive Behavior Interventions, 10, 256-269. doi: 10.1177/1098300708318963 10: 38  Luiselli, J. K., Putnam, R. F., Handler, M. W., & Feinberg, A. B. (2005). Whole-school positive behaviour support: Effects on student discipline problems and academic performance. Educational Psychology, 25, 183-198. doi: 10.1080/0144341042000301265 McIntosh, K., Doolittle, J., Vincent, C. G., Horner, R. H., & Ervin, R. A. (2009). School-wide universal behavior sustainability index: School teams. Vancouver, BC: University of British Columbia. McIntosh, K., Filter, K. J., Bennett, J. L., Ryan, C., & Sugai, G. (2010). Principles of sustainable prevention: Designing scale-up of school-wide positive behavior support to promote durable systems. Psychology in the Schools, 47, 5-21. 
doi: 10.1002/pits.20448 McIntosh, K., Horner, R. H., & Sugai, G. (2009). Sustainability of systems-level evidence-based practices in schools: Current knowledge and future directions. In W. Sailor, G. Dunlap, G. Sugai & R. H. Horner (Eds.), Handbook of positive behavior support (pp. 327-352). New York: Springer. McKevitt, B. C., & Braaksma, A. D. (2008). Best Practices in Developing a Positive Behavior Support System at the School Level In A. Thomas & J. P. Grimes (Eds.), Best practices in school psychology IV. Bethesda, MD: National Association of School Psychologists. McLaughlin, M. W. (1994). Strategic sites for teachers' professional development. In P. P. Grimmett & J. Neufeld (Eds.), Teachers' struggle for authentic development (pp. 31-51). New York: Teachers College Press. Messick, D. M., & Van de Geer, J. P. (1981). A reversal paradox. Psychological Bulletin, 90(3), 582-593. doi: 10.1037/0033-2909.90.3.582  39  Mihalic, S. F., & Irwin, K. (2003). Blueprints for violence prevention: From research to realworld settings--Factors influencing the successful replication of model programs. Youth Violence and Juvenile Justice, 1, 307-329. doi: 10.1177/1541204003255841 Muscott, H. S., Mann, E. L., & LeBrun, M. R. (2008). Positive behavioral interventions and supports in New Hampshire: Effects of large-scale implementation of schoolwide positive behavior support on student discipline and academic achievement. Journal of Positive Behavior Interventions, 10, 190-205. doi: 10.1177/1098300708316258 Nelson, J. R. (1996). Designing schools to meet the needs of students who exhibit disruptive behavior. Journal of Emotional and Behavioral Disorders, 4, 147-161. Noell, G. H., Witt, J. C., Slider, N. J., Connell, J. E., Gatti, S. L., Williams, K. L., . . . Duhon, G. J. (2005). Treatment implementation following behavioral consultation in schools: a comparison of three follow-up strategies. School Psychology Review, 34, 87-106. 
Pluye, P., Potvin, L., Denis, J.-L., Pelletier, J., & Mannoni, C. (2005). Program sustainability begins with the first events. Evaluation and Program Planning, 28, 123-137. doi: 10.1016/j.evalprogplan.2004.10.003 Preciado, J. A., Horner, R. H., & Baker, S. K. (2009). Using a function-based approach to decrease problem behavior and increase reading academic engagement for Latino English language learners. Journal of Special Education, 42, 227-240. Rohrbach, L. A., Graham, J. W., & Hansen, W. B. (1993). Diffusion of a school-based substance abuse prevention program: Predictors of program implementation. Preventive Medicine, 22, 237-260. doi: 10.1006/pmed.1993.1020  40  Rose, D. J., & Church, R. J. (1998). Learning to teach: The acquisition and maintenance of teaching skills. Journal of Behavioral Education, 8, 5-35. doi: 10.1023/A:1022860606825 Ross, S. W., & Horner, R. H. (2006). Teacher outcomes of school-wide positive behavior support. Teaching Exceptional Children Plus, 3(6), Retrieved from http://escholarship.bc.edu/education/tecplus/vol3/iss6/art6. Safran, S. P. (2006). Using the Effective Behavior Supports Survey to guide development of schoolwide positive behavior support. Journal of Positive Behavior Interventions, 8, 3-9. doi: 10.1177/10983007060080010201 Scott, T. M., & Barrett, S. B. (2004). Using staff and student time engaged in disciplinary procedures to evaluate the impact of school-wide PBS. Journal of Positive Behavior Interventions, 6, 21-27. doi: 10.1177/10983007040060010401 Skiba, R. J., & Peterson, R. L. (2000). School discipline at a crossroads: From zero tolerance to early response. Exceptional Children, 66, 335-347. Sparks, G. M. (1988). Teachers' attitudes toward change and subsequent improvements in classroom teaching. Journal of Educational Psychology, 80, 111-117. doi: 10.1037/00220663.80.1.111 Sugai, G. (2002). Designing School-Wide Systems for Student Success Retrieved 9/24/02, from http://www.pbis.org. 
Sugai, G., Horner, R., Lewis-Palmer, T., & Rossetto Dickey, C. (2011). Team Implementation Checklist, Version 3.1. Eugene, OR: Educational and Community Supports.  41  Sugai, G., & Horner, R. H. (2002). The evolution of discipline practices: School-wide positive behavior supports. Child and Family Behavior Therapy, 24, 23-50. doi: 10.1300/J019v24n01_03 Sugai, G., & Horner, R. H. (2006). A promising approach for expanding and sustaining the implementation of school-wide positive behavior support. School Psychology Review, 35, 245-259. Sugai, G., & Horner, R. H. (2009). Defining and describing schoolwide positive behavior support. In W. Sailor, G. Dunlap, G. Sugai & R. H. Horner (Eds.), Handbook of positive behavior support (pp. 307-326). New York: Springer. Sugai, G., Horner, R. H., & McIntosh, K. (2008). Best practices in developing a broad-scale system of support for school-wide positive behavior support. In A. Thomas & J. P. Grimes (Eds.), Best practices in school psychology V (Vol. 3, pp. 765-780). Bethesda, MD: National Association of School Psychologists. Sugai, G., Horner, R. H., & Todd, A. W. (2000). Effective Behavior Support Self-Assessment Survey 2.0. Eugene, OR: Educational and Community Supports. Available at http://www.pbis.org. Sugai, G., Horner, R. H., & Todd, A. W. (2003). Effective Behavior Support Self-Assessment Survey. Eugene, OR: Educational and Community Supports. Available at http://www.pbis.org. Tabachnick, B. G., & Fidell, L. S. (2007). Using multivariate statistics (5th ed.). Needham Heights, MA: Allyn & Bacon. Tomlinson, C. (2005). This Issue: Differentiated Instruction. Theory Into Practice, 44(3), 183184. doi: 10.1207/s15430421tip4403_1 42  Vaughn, S., Klingner, J., & Hughes, M. (2000). Sustainability of research-based practices. Exceptional Children, 66, 163-171. doi: 10.1177/074193259902000502 Von Brock, M. B., & Elliott, S. N. (1987). Influence of treatment effectiveness information on the acceptability of classroom interventions. 
Journal of School Psychology, 25, 131-144. doi: 10.1016/0022-4405(87)90022-7 Witt, J. C., Martens, B. K., & Elliott, S. N. (1984). Factors affecting teachers' judgments of the acceptability of behavioral interventions: Time involvement, behavior problem severity, and type of intervention. Behavior Therapy, 15(2), 204-209. doi: 10.1016/s00057894(84)80022-2 Witzel, B. S., & Mercer, C. D. (2003). Using rewards to teach students with disabilities. Remedial and Special Education, 24(2), 88-96. doi: 10.1177/07419325030240020401  43  
