Impacting Outcomes for Students with the Most Challenging Behaviors through Schoolwide PBS: OSEP’s Model Demonstration Projects Lucille Eber, Statewide Director Illinois Statewide Technical Assistance Center PBIS Network Jennifer Doolittle, Project Officer Office of Special Education Programs
Transcript
Impacting Outcomes for Students with the Most Challenging Behaviors through Schoolwide PBS: OSEP’s Model Demonstration Projects
Lucille Eber, Statewide Director, Illinois Statewide Technical Assistance Center PBIS Network
Jennifer Doolittle, Project Officer, Office of Special Education Programs
Timeline of OSEP’s PBS Investments
• 1997: IDEA reauthorized
• 1998: OSEP funds the Center on Positive Behavioral Interventions and Supports (PBIS), which provides a framework for schoolwide behavior support practices
• 2002: OSEP funds the Center for Evidence-based Practice: Young Children with Challenging Behavior
• 2004: “PBS Blueprint for Practice” emphasizes developing state- and district-level infrastructures that can promote large-scale implementation and sustainability
• 2006: Tertiary Behavior Model Demonstration Sites
• 2008: Technical Assistance Center for Social-Emotional Intervention for Young Children (TACSEI); scaling up evidence-based behavior support practices at the preschool level; IDEA performance measures for social-emotional outcomes for birth to five
The Future
• Continue to scale up school-wide PBS
• Assist schools to identify and implement more tertiary-level, intensive behavior support strategies
OSEP’s Model Demonstration Projects (MDPs)
Model Demonstration Coordinating Center
• Creating cross-site evaluation plans, instruments, and procedures to assess the context, implementation, and efficacy of each model, and across models within a cohort
• Identifying key issues in translating research to practice, and analyzing and synthesizing data across MDPs and cohorts
• Applying multiple analytic methods to answer evaluation questions regarding both the efficacy of models and bridging the gap between research and practice
• Producing high-quality, useful, and accessible products that communicate findings to key audiences
Cohorts
Each year a new cohort of three to four MDPs is added; each cohort has a different focus:
• 2006: 1st cohort is focusing on progress monitoring for preschool through 4th grade in general and special education classrooms
• 2007: 2nd cohort is developing tertiary behavior interventions in elementary/middle schools for students with challenging behaviors
• 2008: 3rd cohort will develop, implement, and evaluate early childhood language interventions
Documenting model development, model implementation, and model outcomes for each cohort, and analyzing MDP experiences and results across cohorts, helps OSEP bridge the gap between identifying evidence-based practices and achieving their widespread use.
Tertiary Behavior MDPs
• Illinois PBIS Network and the University of Kansas
• The University of Oregon
• The University of Washington

Tertiary Behavior MDPs: Big Ideas
• School-wide intensive PBS approach guided by a three-tiered prevention model
• Individualized, function-based behavior support for children who exhibit the most challenging behaviors and have not been responsive to primary or secondary prevention efforts
• Emphasis on collecting and using data for decision-making
• Systematic strategies for professional development and TA that improve student behavior, to help schools promote student learning and other positive outcomes
• Cost-effective and efficient process for school districts to implement and sustain
• A response-to-intervention (RtI) logic model to show student progress
RtI for Behavior and Academics: Key Components
• Evidence-based instruction and supports for all students
• Universal screening to determine which students are not meeting benchmarks
• Targeted instruction and support that goes beyond what all students receive
• Progress monitoring
• Intensive and individualized instruction and support for students who are still not making progress (special education?)
• Decision points throughout
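The tiered decision logic above can be sketched in code. This is a minimal sketch only: the benchmark cutoff, growth threshold, and `Student` fields are hypothetical placeholders, not values from the presentation or from IL PBIS.

```python
# Sketch of RtI tiered decision rules. The benchmark cutoff (80.0) and
# the minimum-growth threshold (1.0) are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Student:
    screen_score: float      # universal screening result
    progress_slope: float    # growth under current supports

def assign_tier(s: Student, benchmark: float = 80.0,
                min_growth: float = 1.0) -> int:
    """Return 1 (universal), 2 (targeted), or 3 (intensive)."""
    if s.screen_score >= benchmark:
        return 1    # meeting benchmark: universal supports suffice
    if s.progress_slope >= min_growth:
        return 2    # below benchmark but responding: targeted support
    return 3        # not responding to targeted support: intensive/individualized

print(assign_tier(Student(92.0, 0.5)))   # meets benchmark -> 1
print(assign_tier(Student(70.0, 1.4)))   # responding to targeted -> 2
print(assign_tier(Student(65.0, 0.2)))   # needs intensive support -> 3
```

The key design point mirrors the slide: every decision is driven by data (screening plus progress monitoring), with decision points at each tier boundary.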
The Kansas-Illinois SW-PBS Tertiary Demonstration Center:
A Response to Intervention (RtI) Continuum of Support Model
CEC, Boston, April 4, 2008
Lucille Eber, Illinois PBIS Network
Wayne Sailor, University of Kansas
K-I Center Team Leaders
• Jamie Bezdek, University of Kansas
• Kimberli Breen, IL PBIS Network
• Jen Rose, Loyola University-IL PBIS Network
• Amy McCart, University of Kansas
Evaluation:
• Kelly Hyde (SIMEO)
• Holly Lewandowski (PoI and SWIS data)
Big Ideas for this Session
1. How the K-I Center is applying the RtI approach to both behavior and academics to ensure tertiary capacity
2. Implementation experiences and data from IL (Year One and Year Two)
3. What the K-I Center hopes to “deliver” in terms of knowledge, tools, etc.

Does building a school-wide system of PBIS increase schools’ ability to effectively educate students with more complex needs?
What systems, data, and practice structures are needed to ensure that positive behavior support is being applied at the needed dosage for ALL students?
(RtI) Approach
• Investment in prevention
• Universal screening
• Early intervention for students not at “benchmark”
• Multi-tiered, prevention-based intervention approach
• Progress monitoring
• Use of a problem-solving process at all 3 tiers
• Active use of data for decision-making at all 3 tiers
• Research-based practices expected at all 3 tiers
• Individualized interventions commensurate with assessed level of need
Continuum of Support for Secondary-Tertiary Level Systems
1. Group interventions (BEP, social or academic skills groups, tutor/homework clubs, etc.)
2. Group intervention with a unique feature for an individual student (e.g., BEP individualized into a Check & Connect; mentoring/tutoring)
3. Simple individualized function-based behavior support plan for a student, focused on one specific behavior (simple FBA/BIP for one behavior; curriculum adjustment; schedule or other environmental adjustments, etc.)
4. Complex function-based behavior support plan across settings (i.e., FBA/BIP for home and school and/or community)
5. Wraparound: a more complex and comprehensive plan addressing multiple behaviors and multiple life-domain issues across home, school, and community (e.g., basic needs, MH treatment, as well as behavior/academic interventions)
Illinois SW-PBS History
• The context for implementing the tertiary demo process…
IL PBIS Schools Over Nine Years: Trained & Partially or Fully Implementing
[Chart: number of schools by year]
• Year 1 (9/98): 23
• Year 2 (9/00): 120
• Year 3 (9/01): 184
• Year 4 (6/02): 303
• Year 5 (6/03): 394
• Year 6 (6/04): 444
• Year 7 (6/05): 520
• Year 8 (6/06): 587
• Year 9 (6/07): 654
The IL Context for Implementing Tertiary Demos:
IL PBIS Expansion History
• June 30, 2005: 444 schools in 143 districts; 92 new schools trained
• June 30, 2006: 520 schools in 155 districts; 97 new schools trained in FY06; 12 new districts
• June 30, 2007: 654 schools in 170 districts; 72 schools trained in FY07; 15 new districts
• January 2008: 744 schools in 196 districts; approximately 90 schools trained and approximately 26 new districts added in the 1st half of FY08
Illinois PBIS Schools
Mean Percentage of Students with Major ODRs, 2006-07, Statewide
• Partially implementing schools (n=58): 74% of students with 0-1 ODRs, 14% with 2-5 ODRs, 12% with 6+ ODRs
• Fully implementing schools (n=141): 83% of students with 0-1 ODRs, 12% with 2-5 ODRs, 5% with 6+ ODRs
The differences between fully and partially implementing schools were statistically significant at all three levels of ODRs.
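A tally like the one above can be computed directly from per-student major-ODR counts. This sketch assumes a simple list of ODR counts per student; the sample data is invented for illustration and is not from the Illinois dataset:

```python
# Bin students by major ODR count into the three bands used in the
# statewide comparison (0-1, 2-5, 6+) and report each band as a
# percentage of enrollment. The ODR counts below are made-up sample data.
from collections import Counter

def odr_band(n: int) -> str:
    if n <= 1:
        return "0-1"
    if n <= 5:
        return "2-5"
    return "6+"

def band_percentages(odr_counts: list[int]) -> dict[str, float]:
    tally = Counter(odr_band(n) for n in odr_counts)
    total = len(odr_counts)
    return {band: round(100 * tally.get(band, 0) / total, 1)
            for band in ("0-1", "2-5", "6+")}

sample = [0, 0, 1, 0, 2, 3, 0, 1, 7, 0]    # ten hypothetical students
print(band_percentages(sample))             # {'0-1': 70.0, '2-5': 20.0, '6+': 10.0}
```

Running this per school, then averaging across the fully and partially implementing groups, would reproduce the kind of group comparison shown on the slide.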
Shift in Responsibility for Individual Student Data Management at Tertiary Demo Sites
[Chart comparing roles: PBIS Network staff, school social worker, PBIS coach, other school personnel]
IL PBIS Tertiary Demos
“Andy”
Using Data to Keep the Team Moving
• Celebrate success of the current plan

“Andy”
Next Steps
• Team will use data to plan for academic supports
• Lower the intensity of some supports
• Transition from frequent adult feedback to self-monitoring, so that he builds self-confidence and becomes less dependent on adults
“Mary Ellen”
Home, School, Community Tool
SIMEO Educational Information Tool
[Chart: ratings from baseline through Time 4 on a 1 (never) to 4 (always) scale for: attends school; participates in extracurricular activities; appropriate behavior in unsupervised settings]
“AJ”
• 0 ODRs (from 3)
• 0 time-outs (from 22)
• Passing grades in all classes (from D’s & F’s)
• Parents report that “this is different”
• Improved partnerships with community service providers
• MH partner participating and providing effective strategies
“Henry”
Reason Referred to Tertiary Supports
“Henry”, an elementary school student, had:
• extremely poor attendance
• failing grades
• poor homework completion
• trouble with the law in the community, with a court-assigned probation officer and a mandated Department of Children and Family Services (DCFS) counselor
“Henry”
Engagement and Team Preparation
• Wraparound team initially included:
  – Henry
  – his mother
  – the school social worker
  – his primary classroom teacher
  – the school principal and bilingual liaison
  – the district SWPBS tertiary-tier coach
• Team met consistently to identify strengths and big needs and to develop a wraparound plan for Henry
“Henry”
Identifying Strengths and Big Needs
• Henry’s strengths identified by the team included:
  – a good relationship with his teacher
  – responsiveness to positive attention from adults he likes
  – leadership among his peers
  – effective self-advocacy
• Henry’s “big needs” as identified by the wraparound team:
  – Henry needs to feel as if he fits in with the other kids at school
  – Henry needs to feel successful at school
  – Henry needs to be invested in his education
“Henry”
Ongoing Plan Implementation and Refinement
• Henry was included in the “Check-and-Connect” intervention, which was being delivered to other students in the school
• In addition, Henry and his teacher talked about individual behavior goals listed on his daily point card
• He was put on “safety patrol”, in which he, as an older youth, was selected to be a positive role model, to help monitor and improve his behavior in the hallways
• Because Henry’s voice was important, the suggestion that he work with younger students in their classrooms was set aside due to Henry’s lack of interest
“Henry”
Progress Monitoring
Henry’s progress was monitored through:
• office discipline referrals
• attendance
• grades
• DIBELS scores
• Check-In/Check-Out behavior card points
• SIMEO tools
“Henry”
Initial Outcomes
From second quarter to third quarter, with wraparound in progress, Henry’s grades and attendance began to increase:
• Spelling: 15% to 40%
• Math: 15% to 48.5%
• Reading: 20% to 63%
• DIBELS: from 55 words per minute in the fall to 67 words per minute in the winter
• Attendance: 15% in 1st quarter, 60% in 2nd quarter, 75% in 3rd quarter
Henry’s Risk of Placement Data: Referral-Disposition Tool (SIMEO)
Henry’s Improved Behavior and Emotional Functioning at Home
Henry’s Improved Behavior and Emotional Functioning at School
Henry’s Improved Behavior and Emotional Functioning in the Community
Educational Environment Data (EE)
• A key item in the IL State Performance Plan reported to the federal government
• More districts to be “flagged” for monitoring
• Tertiary demo activities focus on IL SPP data points
Six-Year Comparison of Least Restrictive Environment, Sparta School District
[Chart: number of students in Monitor, Resource, and Self-contained placements, 1999-00 through 2004-05]
Changes in Least Restrictive Environment, Dewey Elementary School
[Chart comparing 2003-04 and 2004-05: students in SPED < 21% of the day (27 vs. 45), students in SPED 21-60% of the day (16 vs. 5), and ISAT scores (60% vs. 78%)]
Getting Started with Data-Based Decision-Making with EE Data
• The first step is accessing the data
• Next is discussing it with a range of stakeholders and determining its accuracy, or how to make it accurate
• Then identify possible tools/procedures to make a difference
This is similar to how we got started with ODR data:
• clean up the data (e.g., the ODR form)
• review data trends and ask questions
• decide what the data mean with those who “live” the data
• decide which data points to focus on
• design actions that seem likely to effect change
• monitor/revise the action plan
Why We Need MH Partnerships
• One in 5 youth have a MH “condition”
• About 70% of those get no treatment
• School is the “de facto” MH provider
• The JJ system is the next level of system default
• 1-2% are identified by schools as EBD
• Those identified have poor outcomes
• Suicide is the 4th leading cause of death among young adults
Why Do We Need to Go Beyond Use of ODRs?
• Use of “alternative” discipline responses, often without documentation
• Overuse of “special education” placement without an adequate dosage of interventions
• High rate of unidentified MH problems
• Youth get identified only after a “crisis”, which makes it harder and more “costly” to intervene
The Systematic Screening for Behavior Disorders (SSBD)
(Walker and Severson, 1992)
Background
• Developed as a school-wide (universal) screening tool for children in grades 1-6, similar to annual vision/hearing screenings
• Identifies behaviors that may impede academic and social functioning
• Leads to earlier intervention
• May reduce the need for formalized, lengthy “requests for assistance” by using data to identify youth
Implementation
• Between early September and the first of November, screenings were completed in 6 districts and 18 schools
• Initial results indicate that approximately 5%-10% of students enrolled in grades 1-6 were identified by the SSBD
• A middle school case example:
  – Approximately 320 students enrolled in sixth grade were screened using the SSBD
  – 38 sixth graders, or 11%, passed gate two
• Currently, school-based secondary teams are using SSBD data to implement low-intensity interventions (e.g., check-in/check-out)
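A multi-gate screen like the SSBD can be sketched as a simple funnel: only students nominated and ranked highly by teachers at gate one are rated at gate two, and only those exceeding a cutoff pass through. The rank limit, cutoff value, and data below are illustrative placeholders, not the instrument’s actual norms or procedures:

```python
# Sketch of a multi-gate screening funnel in the spirit of the SSBD:
# gate 1 = teacher nomination/ranking, gate 2 = rating-scale cutoff.
# top_n=3 and cutoff=12 are invented for illustration.
def gate_two_pass(students, top_n=3, cutoff=12):
    """students: list of (name, teacher_rank, rating_score) tuples."""
    # Gate 1: only the top-ranked students per classroom move forward.
    gate1 = [s for s in students if s[1] <= top_n]
    # Gate 2: of those, keep students whose rating exceeds the cutoff.
    return [name for (name, _, score) in gate1 if score > cutoff]

classroom = [
    ("A", 1, 15),   # nominated and exceeds cutoff -> passes gate two
    ("B", 2, 9),    # nominated, but below cutoff -> screened out
    ("C", 7, 20),   # high score but not ranked in the top 3 -> never rated
]
print(gate_two_pass(classroom))   # ['A']
```

The funnel is what makes universal screening affordable: the inexpensive gate is applied to everyone, and the costlier rating step only to the small nominated pool.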
Ensuring Capacity at All 3 Tiers
• Begin assessment and development of the secondary and tertiary tiers at start-up of the universal tier:
  – Assess resources and current practices (specialized services)
  – Review current outcomes of students with higher-level needs
  – Position personnel to guide changes in practice
  – Begin planning and training with select personnel
• All 3 tiers addressed at all district meetings and at every training
Requirements for IL Tertiary Demos
• District commitment
• Designated buildings/district staff
• External tertiary coach/coordinator
• Continuum of skill sets (training, guided learning, practice, coaching, consultation)
• Commitment to use of a data system

Data to examine:
• LRE
  – building and district level
  – by disability group
• Other “places” kids are “parked”
  – alternative settings
  – rooms within the building kids are sent to
• Sub-aggregate groups
  – Sp. Ed.
  – ethnicity
Ongoing Self-Assessment of Secondary/Tertiary Implementation
Building level:
• IL Phases of Implementation (PoI) Tool
• IL Secondary/Tertiary Intervention Tracking Tool
• Sp. Ed. referral data
• Suspensions/expulsions/placements (ongoing)
• Aggregate individual student data (IL SIMEO data)
• LRE data trends
• Subgroup data (academic, discipline, Sp. Ed. referral, LRE, etc.)
District level:
• Referral to Sp. Ed. data
• LRE data (aggregate and by building)
• IL Out-of-Home-School Tracking Tool (multiple sorts)
• Aggregate SIMEO data
• Aggregate PoI data