Transcript
Page 1

IMPLEMENTING AND SUSTAINING EVIDENCE-BASED PROGRAMS WITH INTEGRITY: GOAL ALIGNMENT BETWEEN PROGRAMS AND SETTINGS

Brian R. Flay, D.Phil., Public Health, Oregon State University

Presented at Forum on Emphasizing Evidence-Based Programs for Children and Youth
Child Trends, Washington, DC, April 27, 2011

Page 2

WORKING WITHIN AN ORGANIZATION RATHER THAN CHANGING THE WHOLE ORGANIZATION
• I focus on the idea of inserting a program into a setting rather than reforming the setting
  – Many reading and math education programs
  – Many substance use prevention programs
    • Life Skills Training, ALERT
  – Many violence prevention programs
    • Second Step
  – Programs to prevent multiple problem behaviors
    • Aban Aya
  – Many social-emotional and character development (SACD or SECD) programs
    • Positive Action, PATHS

Page 3

WHAT MAGNITUDE OF EFFECTS CAN BE EXPECTED?
• Contrast efficacy and effectiveness trials (Flay, 1986)
• Meta-analyses suggest that the effects of programs evaluated under real-world conditions are generally smaller than the effects of the same program evaluated under controlled conditions
  – Lipsey (2011, personal communication): “We do regularly find smaller effect sizes for ‘routine practice’ programs than ‘research and demonstration’ programs, though have not always commented on that in our published papers.”
  – Lipsey (1999): ES for demonstration programs = .13; ES for practical programs = .07 (though wide variability)
  – Wilson, Lipsey & Derzon (2003): ES for demonstration programs = .25; ES for routine-practice programs = .10 (but very small N of routine-practice program evaluations)
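For readers unfamiliar with the ES metric, and assuming these figures are standardized mean differences (Cohen's d), the usual convention in this meta-analytic literature, the quantity being compared is

$$
d = \frac{\bar{X}_{\text{program}} - \bar{X}_{\text{control}}}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
$$

so, for example, the drop from ES = .25 to ES = .10 means routine-practice programs shift the average outcome by roughly a tenth of a standard deviation rather than a quarter.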

Page 4

WHY DOES THIS DIFFERENCE EXIST?

• Efficacy trials are often conducted by the same people who developed the program, who also maintain a high level of control over intervention delivery as well as the research design
  – Thus, they ensure a high level of implementation with integrity (but rarely help to sustain a program!)
• Effectiveness trials may be conducted by third parties, who have less control over program implementation and, possibly, lower motivation to ensure program integrity
  – They also often have to train implementers who may be less motivated to do the particular program well.
• Can this be changed?
  – Probably – by ensuring optimum implementation!

Page 5

OPTIMIZING IMPLEMENTATION AND SUSTAINABILITY

• Most of the necessary conditions are the same as for comprehensive setting change (Slavin, this forum; Slavin & Madden, 2007; Wandersman, this forum; Wandersman et al., 2008)
  – Support of the whole setting
  – Be true to the model
  – Investment and commitment
  – Ongoing professional development
• One condition is different
  – Alignment with overarching goals of the setting
    • E.g., the goals of Success For All are automatically aligned with the goals of schools

Page 6

SUPPORT OF THE WHOLE SETTING

• Readiness for change (Fixsen, this forum)
• Leadership (e.g., principals)
  – Need an ongoing champion as well as someone who monitors implementation amount and integrity
  – But must avoid creating a “cult” – that might help short-term compliance but not long-term commitment
• Implementers (e.g., teachers)
  – [Though they are not always the best judges of what programs to adopt!]
  – Needs to be a team effort – implementers need to agree to the adoption of a program, be encouraged to work with each other, problem-solve together, etc.

Page 7

SUPPORT OF THE WHOLE SETTING (CONTINUED)

• Recipients (e.g., students and their families)
  – For school programs, also need student and family buy-in
  – Need for quick and easily observable improvements/successes
    • Perceptions of ineffectiveness will bring it down
  – Make sure program goals are relevant to receivers
    • E.g., students don’t always see the value of academic achievement – they won’t be motivated to achieve if they see getting good scores as being mostly for the benefit of the teacher or the school

Page 8

INVESTMENT IS ESSENTIAL
• Settings that pay for program materials, training and ongoing support usually implement better
• Continuous investment is necessary!
  – Refresh consumable materials
  – Ongoing tech support and training
• Continuous purchase of support and training from program developers is also helpful
  – Homegrown or third-party materials or training are usually inferior and likely to be less effective
• Ongoing investment is a good indicator of ongoing commitment by decision-makers

Page 9

BE “TRUE TO THE MODEL”

• Integrity with the original theory and key concepts of a program must be maintained if new settings/places are to obtain the same kinds of results as found in prior evaluations
  – Program elements cannot be re-invented for each setting
• Adaptation for the culture or context can be useful, but the key elements cannot be changed or dropped
  – Castro et al., 2010; Castro, this forum
• The intensity, breadth, comprehensiveness, etc. of the original program have to be maintained (Elliott & Mihalic, 2004; Mihalic, this forum)
  – No short-cuts
  – Unless there is evidence
  – Or implementers must evaluate the altered version

Page 10

ONGOING PROFESSIONAL DEVELOPMENT

• Initial training is essential
  – Because of increased pressures/mandates, staff require extensive training, even for well-packaged, easy-to-implement programs
  – Reviews of the effectiveness of the train-the-trainer (TTT) model are mixed
    • “Little is known” (Herschell et al., 2010)
    • TTT is less effective (ES = .09) than other approaches (ES = .20) (Conn et al., 2011)
    • Expert training is more cost-effective than TTT (Olmstead et al., 2011)
• Ongoing technical support is also essential (Fixsen et al., 2005)
  – Staff seem to need a lot more hand-holding than they used to (again because of increased demands)
• Annual (re)training
  – Refresher training for ongoing staff and new training for new staff (staff turnover is a big issue)
  – Implementers are often starting from a low level of understanding and skill – they need ongoing training just like students need ongoing teaching
    • You don’t learn most things from one learning session – you need practice, feedback, retraining for re-alignment, etc.

Page 11

ALIGNMENT WITH THE OVERARCHING GOALS OF THE SETTING
• This condition underlies all of the above – to the extent that this is true, the others should follow.
• Must become legitimate at all levels of the system
  – E.g., link prevention program outcomes to academic achievement
• Claims must be supported by high-quality evaluation evidence
  – E.g., school-based or after-school social-emotional or prevention programs must be able to demonstrate that they also improve academics
• Other research suggesting the link is not adequate to motivate leadership or implementers – too big a leap of faith
  – This was a big failure of substance use prevention researchers and program developers and of the Safe and Drug Free Schools program

Page 12

ALIGNMENT WITH THE OVERARCHING GOALS OF THE SETTING (CONTINUED)
• Programs need to quickly produce “easy to see” and desired effects
  – E.g., improved classroom behavior, followed by improved learning, followed by improved test scores
• Also need to meet the broader overarching needs/goals of the setting
  – Address the (whole) child, the family & the community
    • Integrate developmental and prevention science, together with positive youth development, to address the “whole child” (Bhattacharyya et al., 2009)
• Align with cultural ideals (Castro et al., 2010, this forum)
  – E.g., in Hawai’i, the Pono Choices values of being right with yourself and above reproach can help avoid unsafe sex and improve academics

Page 13

CONCLUSIONS

• Inserting a program into a setting so that it is implemented with integrity and is sustained
  – Requires the same conditions as setting reform:
    • Support of the whole setting
    • Investment and commitment
    • Being true to the model
    • Ongoing professional development
  – Plus one additional condition:
    • Alignment with the overarching goals of the setting
      – Behavior, Character and Achievement
      – Consistent with community values and norms

Page 14

REFERENCES
• Bhattacharyya, O., Reeves, S., & Zwarenstein, M. (2009). What is implementation research? Research on Social Work Practice, 19(5), 491-502.
• Castro, F. G., Barrera Jr., M., & Holleran Steiker, L. K. (2010). Issues and challenges in the design of culturally adapted evidence-based interventions. Annual Review of Clinical Psychology, 6, 213-239.
• Conn, V. S., Hafdahl, A. R., & Mehr, D. R. (2011). Interventions to increase physical activity among healthy adults: Meta-analysis of outcomes. American Journal of Public Health, 101(4), 751.
• Elliott, D. S., & Mihalic, S. (2004). Issues in disseminating and replicating effective prevention programs. Prevention Science, 5(1), 47-53.
• Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231).
• Flay, B. R. (1986). Efficacy and effectiveness trials (and other phases of research) in the development of health promotion programs. Preventive Medicine, 15, 451-474.
• Herschell, A. D., Kolko, D. J., Baumann, B. L., & Davis, A. C. (2010). The role of therapist training in the implementation of psychosocial treatments: A review and critique with recommendations. Clinical Psychology Review, 30(4), 448-466.
• Lipsey, M. W. (1999). Can rehabilitative programs reduce the recidivism of juvenile offenders? Virginia Journal of Social Policy & the Law, 6(3), 611-641.
• Olmstead, T., Carroll, K. M., Canning-Ball, M., & Martino, S. (2011). Cost and cost-effectiveness of three strategies for training clinicians in motivational interviewing. Drug and Alcohol Dependence. doi: 10.1016/j.drugalcdep.2010.12.015
• Slavin, R. E., & Madden, N. A. (2007). Scaling up Success For All: The first 16 years. In S. B & S. McDonald (Eds.), Scale-up in education (pp. 201-228). Lanham, MD: Rowman & Littlefield.
• Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., . . . Saul, J. (2008). Bridging the gap between prevention research and practice: The interactive systems framework for dissemination and implementation. American Journal of Community Psychology, 41(3), 171-181.
• Wilson, J. W., Lipsey, M. W., & Derzon, J. H. (2003). The effects of school-based intervention programs on aggressive behavior: A meta-analysis. Journal of Consulting and Clinical Psychology, 71(1), 136-149. doi: 10.1037/0022-006X.71.1.136

Page 15

THANK YOU

• I thank the Office of the Assistant Secretary for Planning and Evaluation for sponsoring this Forum.

• Thanks to Child Trends for organizing and hosting it.

• I thank my wife, Carol G. Allred, very much for her help with this!
  – As the developer, seller and trainer for the Positive Action program (www.positiveaction.net), she has far more real-world experience with these issues than I do!