
School-wide Positive Behavior Support
Evaluation Template

October, 2005

Rob Horner, George Sugai, and Teri Lewis-Palmer
University of Oregon

Purpose

This document is prepared for individuals who are implementing School-wide Positive Behavior Support (PBS) in Districts, Regions, or States. The purpose of the document is to provide a formal structure for evaluating whether School-wide PBS implementation efforts are (a) occurring as planned, (b) resulting in change in schools, and (c) producing improvement in student outcomes.

The organization of this template provides (a) an overview of the context within which School-wide PBS is being used, (b) a common set of evaluation questions, (c) evaluation instruments/procedures that address each question, and (d) samples of evaluation data summaries that can be used to build formal evaluation reports.

Context

School-wide positive behavior support (SW-PBS) includes a range of systemic and individualized strategies for achieving social and learning outcomes while preventing or reducing problem behavior for all students. School-wide PBS includes universal prevention of problem behavior through the active teaching and rewarding of appropriate social skills, consistent consequences for problem behavior, and on-going collection and use of data for decision-making. In addition, School-wide PBS includes an array of more intensive supports for those students with more severe behavior support needs. The goals within School-wide PBS are to prevent the development of problem behavior, to reduce on-going patterns of problem behavior, and to improve the academic performance of students through development of a positive, predictable and safe school culture.

School-wide PBS is being implemented today in over 4,300 schools throughout the United States. Each of these schools has invested in training on school-wide PBS strategies, has a team that is coordinating implementation, and is actively monitoring the impact of implementation on student outcomes.

As more schools, districts, states, and regions adopt School-wide PBS there will be an increasing need to formally evaluate whether these training and technical assistance efforts (a) result in change in the way schools address social behavior, and (b) result in change in the behavioral and academic outcomes for students.


Need for Evaluation

School-wide PBS will continue to be adopted across the U.S. only if careful, on-going evaluation of the process and outcomes remains a central theme. Evaluation outcomes will both document the impact of School-wide PBS and guide improvement in the strategies and implementation procedures. Evaluation may occur at different scales (one school, versus a district, versus a state or region), and at different levels of precision (local self-assessment, versus state outcome assessment, versus national research-quality analysis). The major goal of evaluation is always to provide accurate, timely, valid, and reliable information that is useful for decision-making. The stakeholders and the decisions being made will always shape the evaluation. We recognize that the type, amount, and level of information gathered for an evaluation will vary; it is very likely that no two evaluation efforts will be exactly the same. At the same time, there will be value in identifying common evaluation questions, data sources, and reporting formats that may be useful across evaluation efforts. This evaluation template is intended to benefit those building evaluation plans to assess school-wide PBS. Our hope is that the measures and procedures defined in the template will make it easier for others to design evaluation plans, and that over time a collective evaluation database may emerge that will benefit all those attempting to improve the social culture of schools.

In building an evaluation plan we recommend (a) beginning with the decisions that will be made by stakeholders, (b) organizing the core evaluation questions that will guide decision-making, (c) defining valid, reliable and efficient measures that address the evaluation questions, and (d) presenting information in an iterative, timely and consumable format.

Evaluation Decisions

Our experience suggests that most efforts to implement School-wide PBS begin with a “special” investment by the state, region or federal government in a demonstration effort designed to assess (a) if School-wide PBS can be implemented effectively in the local area, (b) if School-wide PBS results in valued outcomes for children, families and schools, and (c) if School-wide PBS is an approach that can be implemented in a cost-effective manner on a large scale.

The decisions that guide a formal evaluation will focus simultaneously on issues of (a) accountability and oversight (e.g., did the project conduct the activities proposed?), (b) the impact of the project (e.g., was there change in school practices, student behavior, or academic outcomes?), and (c) implications for further investment needed to take the effort to a practical scale.

An on-going challenge for any evaluation of School-wide PBS is that the individual behavior of children and adults functions as the target of intervention efforts, but the “whole school” is the unit of most evaluation analyses. In essence the goal of School-wide PBS is to create a “whole school” context in which individuals (both faculty and students) are more successful. Most evaluations will reflect this attention to individual behavior, with summaries that reflect the global impact on the whole school.

As evaluation plans are formed there are some common traps that are worth avoiding.

1. Evaluation plans are strongest when focused on real outcomes (change in school practices and student behavior)

a. It is possible for evaluation reports to focus only on counts of training events and participant satisfaction. These are necessary, but insufficient pieces of information.

2. Evaluation plans should examine student outcomes only when School-wide PBS practices have been implemented.

a. It is important to know first if training and technical assistance resulted in change in the behavior support practices used in schools.

b. An important “next” question is if those schools that implemented to criterion saw change in student outcomes. If schools did not implement School-wide PBS practices, we do not expect to see changes in student outcomes (a sketch of this two-step logic follows this list).

3. Evaluation plans often focus only on initial training of demonstration sites, and ignore the capacity development needed for large-scale implementation.

a. School-wide PBS efforts focus simultaneously on establishing demonstrations of effectiveness (individual schools), and the capacity to expand to a socially important scale. There often is the assumption that initiatives start with a small demonstration, and only after the demonstration is documented as viable and effective does work begin on large-scale capacity building. Within School-wide PBS there is an immediate emphasis on building the (a) coaching network, (b) local trainers, and (c) formal evaluation structure that will be keys to taking School-wide PBS to scale. Establishing a Leadership Team with broad vision and mandate is part of the first step toward implementation of School-wide PBS.
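
The two-step logic in trap 2 can be made concrete. The sketch below is a minimal Python illustration, not part of the original template: the school records are invented, and the 80% gate simply echoes the implementation criteria used elsewhere in this document.

```python
# Minimal sketch: examine student outcomes only for schools that implemented
# SW-PBS to criterion (here, SET total score >= 80%, echoing the template's
# target criteria). All school records below are hypothetical.

SET_CRITERION = 80  # percent implementation required before outcomes are examined

schools = [
    {"name": "School A", "set_total": 92, "odr_before": 0.61, "odr_after": 0.38},
    {"name": "School B", "set_total": 55, "odr_before": 0.70, "odr_after": 0.72},
    {"name": "School C", "set_total": 84, "odr_before": 0.49, "odr_after": 0.41},
]

# Question 1: did training change adult behavior (implementation fidelity)?
implemented = [s for s in schools if s["set_total"] >= SET_CRITERION]
print(f"{len(implemented)} of {len(schools)} schools met the implementation criterion")

# Question 2: for implementing schools only, did student outcomes change?
# (odr_before / odr_after are ODRs per 100 students per school day.)
for s in implemented:
    change = s["odr_after"] - s["odr_before"]
    print(f"{s['name']}: change in ODR/100/day = {change:+.2f}")
```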

Evaluation Questions

In different contexts different evaluation questions will be appropriate. In general, however, School-wide PBS will be implemented within the context of an initiative to change school discipline across a portion of schools in a geographic area (district, state, region). Efforts to provide evaluation of the School-wide PBS implementation effort often will address the following evaluation questions:

1. Who is receiving training and support?

a. What schools are receiving implementation support?

b. What proportion of schools in the target area is implementing school-wide PBS?

2. What training and technical assistance has been delivered as part of the implementation process?

a. What training events have been conducted?

b. Who participated in the training events?

c. What was the perceived value of the training events by participants?

3. Has the training and TA resulted in change in the behavior support practices used in schools?

a. Are the faculty in participating schools implementing universal school-wide PBS?

b. Are the faculty in participating schools implementing targeted and intensive individual positive behavior support?

4. If schools are using SW-PBS, is there an impact on student behavior?

a. Has there been a change in reported student problem behavior?

1. Office discipline referrals
2. Suspensions
3. Expulsions
4. Referrals to special education

b. Has there been change in student attendance?

c. Has there been change in student academic performance?

d. Has there been a change in perceived risk factors and protective factors that affect mental health outcomes?

5. Has the Training and Technical Assistance resulted in improved capacity for the state/district/region to sustain SW-PBS, and extend implementation to scale?

a. To what extent has the implementation effort resulted in improved capacity of the area to train others in school-wide PBS?

b. To what extent has the implementation effort resulted in improved capacity to coach teams in school-wide PBS procedures?

c. To what extent do local teams have evaluation systems in place that will allow them to monitor and improve school-wide PBS?

d. To what extent does the state or district Leadership Team have an evaluation system in place to guide broad scale implementation efforts?

6. Are faculty, staff, students, families, and community stakeholders satisfied?

a. Are faculty satisfied that implementation of school-wide PBS is worth the time and effort?

b. Are students satisfied that implementation of school-wide PBS is in their best interest?

7. Policy impact

a. Have changes in student behavior resulted in savings in student and administrator time allocated to problem behavior?

b. Have policies and resource allocation within the area (district, school, state) changed?

8. Implications

a. Given evaluation information, what recommendations exist for (1) expanding implementation, (2) allocating resources, (3) modifying the initiative or evaluation process?
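
Question 7a can be answered with simple arithmetic once a time cost per referral is assumed. For illustration only (the per-referral figure is an assumption, not a value from this template): if each office discipline referral is estimated to consume about 15 minutes of administrator time, a school that reduces referrals from 800 to 500 per year recovers 300 referrals × 15 minutes = 4,500 minutes, roughly 75 hours, of administrator time annually. The same logic applies to student time, using an estimate of instructional minutes lost per referral.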

Evaluation Measures/Instruments

Evaluation plans often incorporate an array of measures to address the core evaluation questions. Some measures are purely descriptive, or uniquely tied to the local context. Other measures may be more research-based, standardized, and experimentally rigorous. Issues of cost, time, and stakeholder needs will affect which measures are adopted. To accommodate variability in evaluation needs, a comprehensive model should offer multiple measures (some more rigorous, and some more efficient) for key questions. The list of measures provided below is not offered with the assumption that all measures would be used in every evaluation plan, but that each plan may find a sampling of these measures to be useful. We also recognize and encourage the use of additional, locally relevant measures.

Evaluation Question/Focus: Who is receiving training and technical support?

Measure: School Profile.

Data Collection Cycle: Completed when a school begins implementation; updated annually.

Metric and Use of Data: Name, address, contact person, enrollment, grades, student ethnicity distribution.

Evaluation Question/Focus: What training and TA has been delivered? Was the training and TA identified as useful by participants?

Measures: List of training events, persons participating, and training content; Training Evaluation Form.

Data Collection Cycle: Collected as part of each major School-wide PBS Team Training Event.

Metric and Use of Data: Documents the teams and individuals present, the content of training, and the participant perception of workshop usefulness.

Evaluation Question/Focus: Has the training and TA resulted in change in the behavior support practices used in schools? (Change in adult behavior.)

Measures: Team Implementation Checklist (TIC); EBS Self-Assessment Survey; Systems-wide Evaluation Tool (SET); Individual-Student Systems Evaluation Tool (I-SSET); School-wide Benchmarks of Quality (BoQ; Florida).

Data Collection Cycle: The TIC is collected at the first training event, and at least quarterly thereafter until the 80% criterion is met. The EBS Self-Assessment Survey is completed during initial training, and annually thereafter. The SET is completed annually as an external, direct observation measure of SW-PBS practice implementation. The I-SSET is administered with the SET annually by an external evaluator. The BoQ is completed by school teams, and assesses the same features as the SET, but based on team perception.

Metric and Use of Data: The TIC provides a % implementation of Universal Level SW-PBS practices, plus a sub-scale score for each of the SET subscales. The EBS Survey produces the % of staff indicating whether School-wide, Specific Setting, Classroom, and Individual Student systems are in place, and important for improvement. The SET produces a total % score and a % score for seven subscales related to Universal level SW-PBS practices. The I-SSET produces three scores: the % to which “foundation” practices are in place for individual student support; the % to which “Targeted” practices are in place; and the % to which “Individualized, Function-based Support” practices are in place. The BoQ produces a summary score for implementation, and sub-scale scores for SET factors.

Evaluation Question/Focus: If SW-PBS is implemented at criterion, is there improvement in the social and academic outcomes for students?

Measures: School-wide Information System (SWIS); School Safety Survey (SSS); Yale School Climate Survey (SCS); State Academic Achievement Scores (unique to each state).

Data Collection Cycle: SWIS data are collected and summarized continuously. The SSS typically is administered annually by an external observer at the same time as SET evaluations. The SCS is a direct survey of students and/or adults that is collected annually, or on a formal research/evaluation schedule. Academic achievement (literacy, math, etc. scores) is assessed annually.

Metric and Use of Data: SWIS data indicate the frequency and proportion of office discipline referrals, suspensions, and expulsions. The SSS produces a perceived Risk Factor score and a perceived Protective Factor score, and is one index of the overall “safety” of the school. The SCS produces a standardized score indexing the perceived quality of the social culture of the school. The typical academic outcome is the proportion of students within identified grades (e.g., 3, 5, 8, 10) who meet state standards.

Evaluation Question/Focus: Have the training and TA efforts resulted in improved local capacity to implement SW-PBS?

Measures: SW-PBS Registry; Leadership Team Self-Assessment Survey.

Data Collection Cycle: The Registry is completed when the initiative begins, and maintained as new people are identified. The Leadership Team Self-Assessment Survey is completed by the Leadership Team at least annually, and provides a summary score and sub-scale scores.

Metric and Use of Data: The Registry provides a listing of the Leadership Team, Coordinators, Local Trainers, the Coaching Network, the Evaluation Team, and Schools Implementing SW-PBS.

Evaluation Question/Focus: Are faculty, staff, students, families, and community stakeholders satisfied?

Measures: Faculty Impact Assessment; Student Impact Assessment.

Data Collection Cycle: Each is completed 3-6 weeks after teaching school-wide expectations.

Metric and Use of Data: The Faculty Impact Assessment provides a Likert-like rating of the perceived usefulness of SW-PBS practices. The Student Impact Assessment provides an index of whether students learned the school-wide expectations, and whether they find the SW-PBS process useful.

Evaluation Question/Focus: Do improvements sustain over time?

Measures: TIC, SET, BoQ, I-SSET; Leadership Team Self-Assessment; SWIS, SSS, standardized tests.

Data Collection Cycle: Annual assessment of the proportion of schools that meet criterion from one year to the next.

Metric and Use of Data: Provides a summary of the extent to which schools that reach criterion and produce valued gains sustain those achievements.

Evaluation Question/Focus: Policy and future implementation evaluation concerns.

Measure: Cost Analysis (unique to each initiative).

Data Collection Cycle: Collected annually.

Metric and Use of Data: Documents whether savings accrue as a function of investing in SW-PBS.
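
The SWIS rate metrics that appear in the data displays below can be computed from three counts: total referrals, enrollment, and school days. A minimal Python sketch; the function names and the example school are invented for illustration:

```python
# Compute office discipline referral (ODR) rates in the two forms used in
# SWIS-style summaries: per 100 students per year, and per 100 students per
# school day. Example numbers below are hypothetical.

def odr_per_100_per_year(total_odrs: int, enrollment: int) -> float:
    """ODRs per 100 enrolled students per school year."""
    return total_odrs / enrollment * 100

def odr_per_100_per_day(total_odrs: int, enrollment: int, school_days: int) -> float:
    """ODRs per 100 enrolled students per school day."""
    return odr_per_100_per_year(total_odrs, enrollment) / school_days

# A hypothetical elementary school: 450 students, 180-day year, 342 referrals.
print(round(odr_per_100_per_year(342, 450), 1))      # 76.0
print(round(odr_per_100_per_day(342, 450, 180), 2))  # 0.42
```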

Evaluation Report Outline

This section provides an overview of how a School-wide PBS Evaluation Report may be organized, and the evaluation data summaries that fit each section.

Purpose of Evaluation

The purpose section should indicate the intended audience (e.g., stakeholders) and the core decisions that are intended to be influenced by the evaluation report.


Description of School-wide PBS and History of Implementation

A section is recommended that provides the reader with a short overview of School-wide PBS, lists the major features of the approach, and offers a short history of School-wide PBS implementation in the District, Region, or State. This section may also be the place to indicate the funding sources that support implementation efforts.

Evaluation Questions

A short section is recommended that operationally defines a list of specific evaluation questions.

Evaluation Measures and Activities

An evaluation report typically covers a specific time period (e.g., six months, one year, a 3-year project). Indicate the timeframe of the overall project, and the specific timeframe covered in the report. Within this timeframe provide a table of the School-wide PBS implementation activities that were proposed, and those carried out.

List the specific measures used to collect data, and provide copies of the measures in appendices. Where appropriate consider including published articles defining the psychometric properties of research-quality instruments.

Evaluation Results

List the evaluation questions under consideration, the data source addressing each question, and a summary of the current data. Take care to both present the data in an objective and complete manner, and summarize the results to directly answer each evaluation question.

Examples of possible evaluation questions and data sources are provided below from Evaluation Reports prepared in Illinois (Dr. Lucille Eber), New Mexico (Cathy Jones and Carlos Romero), Iowa (Drs. Marion Panyon and Carl Smith), and Maryland (Susan Barrett, Jerry Bloom, and Milt McKenna).

Evaluation Question: Who is adopting School-wide PBS?

Data Source: School Profile; Registry of Teams; State List of Schools

Data Display: From Eber et al., Illinois Positive Behavior Interventions and Support Progress Report 03-04.

Figure 1: Number of Schools Adopting PBIS by academic year.


Table 1: Percentage of schools in Illinois adopting PBIS by region by implementation year. Percent of total Illinois schools implementing PBIS regionally as of June 2004 (number of schools in parentheses):

Region                Chicago     North        Central      South       Total
Total # of Schools    602         1874         1049         666         4149
In PBIS 9/99          0% (0)      0.7% (14)    0.6% (6)     0.5% (3)    0.6% (23)
In PBIS 9/00          0.4% (2)    1.7% (32)    4.2% (44)    6.3% (42)   3.0% (120)
In PBIS 9/01          1.0% (5)    3.6% (68)    7.1% (74)    7.1% (47)   4.7% (194)
In PBIS 9/02          2.7% (15)   7.1% (133)   9.4% (99)    8.4% (56)   7.3% (303)
In PBIS 6/03          3.8% (21)   10.7% (201)  10.9% (114)  8.7% (58)   9.5% (394)
In PBIS 6/04          4.3% (24)   12.1% (227)  12.6% (132)  9.2% (61)   10.7% (444)

Evaluation Question: Are schools implementing School-wide PBS?

Data Source: Team Implementation Checklist (Target Criterion = 80%)

Data Display: Iowa Elementary School Team Implementation Checklist Data


Data Source: Team Implementation Checklist (Target Criterion 80% Total)

Data Display: Individual School Report from Iowa 03-04 (Panyon & Smith, 2004)

Data Source: EBS Survey Data (Target Criterion 50% “In Place”)

Data Display: Illinois 02-03 District Evaluation (Eber et al., 2003)

[Figure: bar chart across Schools 1-10.]


Data Source: EBS Survey

Data Display: Illinois 02-03 Individual Schools Evaluation (Eber et al., 2003)

[Figure: bar chart across Schools 1-10.]


Data Source: Systems-wide Evaluation Tool (SET) (Target Criteria 80% Total plus 80% Teaching)

Data Display: Illinois District-Level Assessment 03-04 (Eber et al., 2004)

Data Source: Systems-wide Evaluation Tool (SET)

Data Display: Illinois Individual Schools Assessment 03-04 (Eber et al., 2004)

[Figure: bar chart across Schools 1-11.]


Summary Evaluation Questions from SET Scores:

1. What proportion of schools receiving training and TA have implemented SW-PBS to criterion?


2. What proportion of schools meeting SW-PBS criteria sustain across time?

a. Of 52 schools meeting the SET criteria in Illinois during 02-03, 85% sustained or improved performance in SET in 03-04.
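
Both summary questions reduce to simple filters over annual SET scores. A minimal Python sketch, assuming each school has a total score and a Teaching subscale score per year and applying the 80% Total plus 80% Teaching criterion cited above (the school records are hypothetical):

```python
# SET scores per school per year: (total %, Teaching subscale %).
# Criterion used in this template: total >= 80 and Teaching >= 80.
set_scores = {
    "School A": {"02-03": (88, 90), "03-04": (91, 85)},
    "School B": {"02-03": (62, 70), "03-04": (83, 88)},
    "School C": {"02-03": (84, 80), "03-04": (74, 60)},
}

def met_criterion(score: tuple) -> bool:
    total, teaching = score
    return total >= 80 and teaching >= 80

# Q1: proportion of schools at criterion in a given year.
at_criterion = [s for s, years in set_scores.items() if met_criterion(years["03-04"])]
print(f"{len(at_criterion)} of {len(set_scores)} schools at criterion in 03-04")

# Q2: of schools at criterion in 02-03, how many sustained in 03-04?
met_prior = [s for s, years in set_scores.items() if met_criterion(years["02-03"])]
sustained = [s for s in met_prior if met_criterion(set_scores[s]["03-04"])]
print(f"{len(sustained)} of {len(met_prior)} schools sustained into 03-04")
```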

Evaluation Question: Is Implementation of SW-PBS Improving Student Outcomes?

Data Source: SWIS

Data Display: New Mexico SWIS summary compared to national averages for schools initiating SW-PBS implementation 03-04 (Jones, Romero, & Howarth, 2004)

Elementary (N = 508 national schools): ODR/100 students/year Mean = 76, Median = 54.5; ODR/100 students/school day Mean = .43. New Mexico ODR/100/day Mean = .46 (N = 6 schools).

Middle/Jr. High (N = 153 national schools): ODR/100 students/year Mean = 199, Median = 132; ODR/100 students/school day Mean = .95. New Mexico ODR/100/day Mean = 1.32 (N = 5 schools).

High School (N = 29 national schools): ODR/100 students/year Mean = 171, Median = 151.9; ODR/100 students/school day Mean = .99. New Mexico ODR/100/day Mean = .82 (N = 4 schools).
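
The yearly and daily national columns are related by the length of the school year. For example, assuming a typical 180-day year (the report does not state the divisor), the elementary mean of 76 ODRs per 100 students per year corresponds to 76 / 180 ≈ 0.42 ODRs per 100 students per school day, close to the reported daily mean of .43. Small differences are expected because each column is averaged across schools separately.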

Data Source: SWIS


Data Display: Illinois Individual School Comparison of SET and ODR rates for one school across three years.

Data Source: SWIS

Data Display: Hawaii and Illinois SWIS summaries for schools meeting and not meeting SET criteria 03-04 (from Sugai et al., 2004)

Data Sources: SWIS


Data Display: Comparison of Triangle Summary from SWIS for Schools meeting and not meeting SET criteria in Central Illinois, 03-04 (Eber et al., 2004).
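
The “Triangle Summary” referenced here is the SWIS-style breakdown of students by yearly referral count, mirroring the three-tier prevention logic. A minimal Python sketch using the conventional 0-1 / 2-5 / 6+ ODR bands (the student data are hypothetical):

```python
from collections import Counter

# ODR counts per student for one hypothetical school year.
odrs_per_student = [0, 0, 1, 0, 3, 0, 2, 7, 0, 1, 0, 12, 4, 0, 1]

def tier(n_odrs: int) -> str:
    """Assign a student to a triangle band by yearly ODR count."""
    if n_odrs <= 1:
        return "universal (0-1 ODRs)"
    if n_odrs <= 5:
        return "targeted (2-5 ODRs)"
    return "intensive (6+ ODRs)"

counts = Counter(tier(n) for n in odrs_per_student)
total = len(odrs_per_student)
for band, count in counts.items():
    print(f"{band}: {count}/{total} students ({count / total:.0%})")
```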

Data Sources: SWIS and Team Checklist

Data Display: Elementary Schools In New Mexico at Different Stages of SW-PBS Implementation, 03-04 (Jones, Romero, & Howarth, 2004).

Data Sources: SWIS, and SET


Data Display: National Database on Out of School Suspension Rates per 100 Students with an ODR, for Schools that do and do not meet the SET Criteria, and for students with and without IEPs.

Evaluation Question: Is Implementation of SW-PBS associated with improved mental health outcomes (e.g., reduced risk factors and increased protective factors)?

Data Source: School Safety Survey

Data Display: Illinois summary of SSS scores for schools that met and did not meet SET Criteria.

[Figure: Mean SSS Protective Factor Score (scale 0-1) for Illinois schools 03-04, comparing schools that met vs. did not meet SET criteria; t = 7.21, df = 172, p < .0001.]


Evaluation Question: Is there improvement in Student Academic Outcomes when SW-PBS is implemented to criterion?

Data Sources: SET and Standardized Achievement Scores

Data Display: Illinois mean proportion 3rd Graders achieving state ISAT reading standards for 8 schools meeting and 23 schools not meeting SET Criteria 02-03 (Eber et al., 2004).

[Figure: Mean SSS Risk Factor Score (scale 0-1) for Illinois schools 03-04, comparing schools that met vs. did not meet SET criteria; t = -5.48, df = 134, p < .0001.]


Data Sources: SET and Standardized Achievement Scores

Data Display: Change in percentage of students meeting 3rd grade reading standards for Elementary schools in one Oregon school district for schools that had met or not met SET criteria for four consecutive years.


School-wide PBS Training Evaluation:

Date of Training___________________________

Disagree Agree

The training event was efficiently organized 1 2 3 4 5 6

The presenter(s) was knowledgeable 1 2 3 4 5 6


The presenter(s) was organized and effective 1 2 3 4 5 6

The training materials were well organized 1 2 3 4 5 6

The training materials were useful 1 2 3 4 5 6

The physical accommodations for the training were acceptable 1 2 3 4 5 6

The most helpful/useful features of the training were:

Features that would have improved the training were:

Other comments:


School-wide PBS
Faculty Evaluation Survey

The purpose of this survey is to gather information about the effectiveness of the school-wide positive behavior support training held within the last month, and to guide team efforts to organize future training in school-wide discipline.

Date:_____________________________

Disagree Agree

The training activities focused on important social messages for students. 1 2 3 4 5 6

The training activities were well organized. 1 2 3 4 5 6

There was adequate time to plan the training activities. 1 2 3 4 5 6

The students responded well to the training activities. 1 2 3 4 5 6

The school-wide training helped with classroom behavior of students. 1 2 3 4 5 6

We should do the school-wide training again next year. 1 2 3 4 5 6

The most valuable aspects of the training were:

Suggestions for improving training in the future:


School-wide PBS
Student Evaluation Survey

The purpose of this survey is to learn what you found useful about the training you received on school-wide expectations, and how the training can be improved for the future.

Date:___________________________ Your Grade:______________

Please list the behavioral expectations for your school:

How well do you understand what is expected of you at school?

Not Clear Very Clear

1 2 3 4 5 6

How well do you believe other students follow the behavioral expectations?

Not at all Almost Always
1 2 3 4 5 6

Do you believe it is worthwhile to provide orientation to the behavioral expectations for students coming to our school?

Not important Very important
1 2 3 4 5 6

What did you find most valuable about the training?

What would you recommend to improve the training?


Registry
Leadership Team, Trainers, Coaches

Leadership Team

Name Contact Information

Implementation Coordinator(s)

Name Contact Information

Implementation Trainers

Name Contact Information

Coaches

Name Contact Information

Evaluation Team

Name Contact Information

School Implementation Teams

Names Roles Contact Information