Being Data-Driven

Sheila A. Pires, Human Service Collaborative, [email protected]
Ashley Keenan, Parent Support Network of Rhode Island, [email protected]
Michelle Zabel, University of Maryland, Baltimore
Examples of How to Use Data

• Planning and Decision Support (Day-to-Day and Retrospectively)
• Utilization Management
• Quality Improvement
• Cost Monitoring
• Research
• Evaluation
• Social Marketing
• Accountability
• Education and Advocacy

Pires, S. (2005). Building systems of care. Washington, DC: Human Service Collaborative.
Utilization Management (UM) Concerns
Who is using services?
What services are being used?
How much service is being used?
What is the cost of the services being used?
What effect are the services having on those using them? (i.e., are clinical/functional outcomes improving? Are families and youth satisfied?)
Pires, S. (2001). Utilization management concerns. Washington, DC: Human Service Collaborative.
Continuous Quality Improvement: Utilizing Data to Drive Quality
Contra Costa County, CA
Internal Evaluators
University-based Evaluator
Evaluation Subcommittee (diverse partners, including families)
Pires, S. (2006). Primer Hands On for Child Welfare. From Caliber, Building the Infrastructure to Support Systems of Care.
• Developing activities to ensure CQI for:
  - Youth with multiple placements
  - Transition-aged youth
  - Multi-jurisdiction youth
  - Youth at risk for multiple placements
• Developing and tracking quality and outcome measures, e.g., reduction in the number of youth with 3 or more placements; linkage to needed resources upon emancipation
Example of Statewide Quality Improvement Initiative
Michigan: Uses data on child/family outcomes (CAFAS) to:
• Focus on quality statewide and by site
• Identify effective local programs and practices
• Identify types of youth served and practices associated with good outcomes (and practices associated with bad outcomes)
• Inform use of evidence-based practices (e.g., CBT for depression)
• Support providers with training informed by data
• Inform performance-based contracting
QI Initiative designed and implemented as a partnership among State, University and Family Organization
Hodges, K., & Wotring, J. (2005). State of Michigan.
Social Marketing
Using commercial marketing practices and techniques to promote social change

Examples: Marketing the system of care to legislatures might use cost/benefit data. Marketing the system of care to diverse families might use stories of other diverse families who have experienced the system as effective.
Social Marketing/Communications Activities and Resources
• On-call/on-site consultation
• Communication listserv
• Bimonthly conference calls
• Resource center
• Tip sheets
• Workshops
• Training academies
• Excellence in Community Communications and Outreach
Lazear, K. (2003). "Primer Hands On": A skill-building curriculum. Washington, DC. Quote: Warren Bennis, Leadership Institute, University of Southern California.
Example of Quantitative Outcomes - Milwaukee Wraparound
• Reduction in placement disruption rate from 65% to 30%
• School attendance for child welfare-involved children improved from 71% of days attended to 86% of days attended
• 60% reduction in recidivism rates for delinquent youth from one year prior to enrollment to one year post-enrollment
• Decrease in average daily RTC population from 375 to 50
• Reduction in psychiatric inpatient days from 5,000 to fewer than 200 per year
• Average monthly cost of $4,200 (compared to $7,200 for RTC, $6,000 for juvenile detention, and $18,000 for psychiatric hospitalization)

Milwaukee Wraparound. (2004). Milwaukee, WI.
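The cost comparison above is simple arithmetic; a minimal sketch in Python of the per-youth savings implied by each alternative setting, using only the figures reported on the slide:

```python
# Illustrative comparison of average monthly cost per youth, using the
# figures reported for Milwaukee Wraparound (2004).
WRAPAROUND_MONTHLY = 4_200  # average monthly cost of Wraparound
ALTERNATIVES = {
    "residential treatment (RTC)": 7_200,
    "juvenile detention": 6_000,
    "psychiatric hospitalization": 18_000,
}

for setting, monthly_cost in ALTERNATIVES.items():
    monthly_savings = monthly_cost - WRAPAROUND_MONTHLY
    annual_savings = monthly_savings * 12
    print(f"{setting}: saves ${monthly_savings:,}/month "
          f"(${annual_savings:,}/year) per youth")
```

The largest differential is against psychiatric hospitalization, where Wraparound's reported cost is less than a quarter of the alternative.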
Example of Qualitative Outcomes: Family/Caregiver Experience, Milwaukee Wraparound

• 64% reported Wrap Milwaukee empowered them to handle challenging situations in the future (n=188): Very Much So 64%, Somewhat 29%, Not At All 7%
• 72% felt there was an adequate crisis/safety plan in place (n=172): Very Much So 72%, Somewhat 13%, Not At All 15%
• 91% felt staff were sensitive to their cultural, ethnic and religious needs (n=189): Very Much So 91%, Somewhat 5%, Not At All 4%
• 91% felt they and their child were treated with respect (n=191): Very Much So 91%, Somewhat 5%, Not At All 4%

Pires, S. (2006). Primer Hands On – Child Welfare. Washington, DC: Human Service Collaborative.
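Breakdowns like these can be tabulated directly from raw survey responses. A minimal Python sketch; the response counts below are hypothetical, chosen only to reproduce the 64/29/7 split reported for the first item (n=188), not actual data:

```python
from collections import Counter

# Hypothetical responses to one caregiver survey item; the three categories
# mirror the reported scale (Very Much So / Somewhat / Not At All).
responses = ["Very Much So"] * 120 + ["Somewhat"] * 55 + ["Not At All"] * 13

counts = Counter(responses)
n = len(responses)  # 188 respondents
breakdown = {category: round(100 * count / n) for category, count in counts.items()}
print(f"n={n}", breakdown)  # rounds to 64% / 29% / 7%
```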
Information Management Systems
Importance of web-based, real-time data to support care managers, administrators, policymakers, families and youth
Synthesis, The Clinical Manager (TCM), Others
Information and Communications Technology
Information technology: use of electronic computers and software to store, process, and transmit information, e.g., electronic health records

Communications technology: electronic systems used for communication between individuals or groups who are not physically present at the same location, e.g., video conferencing, Twitter
Telehealth
Using communications technology to provide access to health/behavioral health assessment, diagnosis, intervention, consultation, supervision, education, and peer support across distance

Example: Kansas Center for Telemedicine and Telehealth at University of Kansas Medical Center using technology for:

• Child psychiatric consultation in remote areas of the state
• Individual and group therapy; care management
• Consultation to schools, group homes, and child care programs in inner-city communities
Youth Participation
• Make the process worthwhile for youth
• Needs to be a priority during all phases of planning
• Access to information in an engaging and developmentally appropriate way
• Young people need support to be involved
Presented by Michelle Zabel, MSS
Director, Maryland Child & Adolescent Innovations Institute, Mental Health Institute & Juvenile Justice Institute
Division of Child & Adolescent Psychiatry, School of Medicine, University of Maryland, Baltimore
Data-Driven Decision Making
Align Outcomes with Shared Results and Indicators and Performance Measures Already in Use across Systems
Overarching Long Term Outcomes for Populations of Focus
• Maryland’s Children’s Cabinet Results for Child Well-Being
• Maryland Child and Family Services Interagency Strategic Plan
Consistent Performance Measures
• Connect the data requirements across grants and contracts
  – 1915(c) Psychiatric Residential Treatment Facilities (PRTF) Waiver
  – SAMHSA-funded SOC grants: MD CARES and Rural CARES
  – Child Welfare's Place Matters Group Home Diversion
  – Other Out-of-Home Diversion using Care Coordination
Demystifying Data: Using Understandable Language and Structure to Collect and Analyze Service Data

Performance Accountability
• How much service was provided?
  – Number of customers served (by customer characteristic)
  – Number of activities (by type of activity)
• How well was the service provided?
  – Customer satisfaction, unit cost, percent of staff fully trained
  – Activity-specific measures: percent of clients completing activity, percent of actions meeting standard
• Is anyone better off? (What effect are the services having?)
  – Measurable changes in skills/knowledge, attitude/opinion, behavior, circumstance

Adapted from Friedman, M. (2005). Trying Hard is Not Good Enough. Victoria, BC: Trafford Publishing.
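Friedman's three questions map directly onto simple computations over a service log. A minimal sketch, assuming a hypothetical record layout; the field names are illustrative, not from any real data system:

```python
# Hypothetical service records: one row per client, with flags for activity
# completion (how well), satisfaction (how well), and improvement (better off).
records = [
    {"client": "A", "completed": True,  "satisfied": True,  "better_off": True},
    {"client": "B", "completed": True,  "satisfied": False, "better_off": True},
    {"client": "C", "completed": False, "satisfied": True,  "better_off": False},
    {"client": "D", "completed": True,  "satisfied": True,  "better_off": False},
]

# How much service was provided? (a count measure)
clients_served = len({r["client"] for r in records})

# How well was the service provided? (quality measures, as percentages)
pct_completed = 100 * sum(r["completed"] for r in records) / len(records)
pct_satisfied = 100 * sum(r["satisfied"] for r in records) / len(records)

# Is anyone better off? (an outcome measure)
pct_better_off = 100 * sum(r["better_off"] for r in records) / len(records)

print(f"Served: {clients_served}; completed: {pct_completed:.0f}%; "
      f"satisfied: {pct_satisfied:.0f}%; better off: {pct_better_off:.0f}%")
```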
Establishing a CORE Set of Data Elements Collected across Populations
• Align your data collection efforts where possible to avoid redundant data collection and reporting
• Federally-funded initiatives will require instruments and tools (performance measures) around which your cross-initiative evaluation plans can be built
• Local evaluations can allow for collection of additional measures of interest, such as:
  – Education: achievement, completion
  – Child Welfare: permanency, child safety
  – Juvenile Services: restrictiveness of placement, recidivism
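One way to operationalize a core-plus-local element set is to define the shared core once and attach each initiative's add-on measures to it. A minimal sketch; the element names and initiative keys here are illustrative assumptions, not from an actual data dictionary:

```python
# Core elements collected for every population, regardless of initiative.
CORE_ELEMENTS = {"youth_id", "age", "placement_type", "functioning_score"}

# Additional measures of interest collected only by specific initiatives.
LOCAL_ELEMENTS = {
    "education": {"achievement", "completion"},
    "child_welfare": {"permanency", "child_safety"},
    "juvenile_services": {"placement_restrictiveness", "recidivism"},
}

def required_fields(initiative: str) -> set:
    """Core elements plus the initiative's own local measures."""
    return CORE_ELEMENTS | LOCAL_ELEMENTS.get(initiative, set())

def missing_fields(record: dict, initiative: str) -> set:
    """Flag gaps so incomplete or redundant collection is caught early."""
    return required_fields(initiative) - record.keys()
```

Defining the core once and deriving each initiative's requirement from it keeps the shared elements aligned automatically when the core set changes.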
Making the Data Work for You: Proving Your Initiative's Effectiveness to Funders and the Larger Community

• Planning and day-to-day decision making is reasoned.
• Utilization management changes effectively redirect resources to where they are needed most.
• Quality improvement efforts can be focused on subgroups of concern with methods of proven effectiveness.
• Cost monitoring will show return on the dollar in terms of desired outcomes.
• Research and evaluation questions are answered in clear, measurable terms as well as with effective anecdotal evidence.
• Use data to strengthen social marketing efforts, to target training efforts, and to educate about and advocate for your children, youth and families across venues.
Examples of Statewide Quality Improvement Efforts in Maryland
• Focus on quality statewide and by site
• Identify local programs and practices
• Identify types of youth served and practices associated with good outcomes (and practices associated with bad outcomes)
• Inform use of evidence-based practices (e.g. CBT for depression)
• Support providers with training informed by data
• Inform performance-based contracting
NEW COMMUNITIES TRAINING
February 10, 2010
Being Data-Driven
Westchester Community Network
Myra Alfreds, Director, Children’s Mental Health Services
Westchester County, New York
From Data to Sustainability
• Data Collection/Exploration
• Social Marketing
• Cross-System Buy In
• Program Development
• Sustainability
Westchester Community Network, 2003
Information To Action: Westchester Model For Change

[Flow diagram:]
• Identification of Systems Issues: Family Ties, Case Management Programs, Network
• Tier 2 Meeting
• Form a Sub-Committee: Data Collection
• On-going Sub-Committee: Planning Group
• Hard Services/Programs
• Training
• Evaluation/Data
• Cost-Benefit analysis of savings from residential placement
• County Executive’s State of the County Address
• Single Point of Entry/Single Point of Return is developed across systems
Residential Placement and Psychiatric Hospitalization

[Line chart: percentage of youth in residential placement over time, for All Residential and Psych Hospital Only; y-axis 0%–50%; time points Baseline (n=253), 6 Months (n=178), 12 Months (n=144), 18 Months (n=104), 24 Months (n=72)]

Child Psychiatric Epidemiology Group: Columbia University – MSPH/NYSPI
Example #2
• 10 Kid Study
• Integrated County Planning
• Interdepartmental Agreement
• Sustainability
Example #3
• ER Study of Children Ages 8 and Under
• Cross-System Planning with New Partners
• Early Childhood Networks
• Advocacy and County Buy-In
• Model Programs and Services
• Development of Early Childhood SOC
• Foundation Support
• New Federal Grant (Project Launch)
• Other data/survey/planning efforts led to new partnerships and successful county-wide cross-system approaches that have been sustained:
  – Juveniles with Sexually Aggressive/Reactive Behaviors
  – Juveniles with Fire-Setting Behaviors
  – High-Risk Adoptions
  – Co-occurring Developmental Disabilities
Lessons Learned
• Need for a dedicated Researcher/Evaluator in County
• Exclusive focus on the National Evaluation did not lead to sustainability
• It did lead, however, to a partially funded position – our System of Care Analyst