Program Evaluation
Ralph Gonzales, MD, MSPH
Professor of Medicine; Epidemiology & Biostatistics
20 May 2008
Transcript
Page 1

Program Evaluation

Ralph Gonzales, MD, MSPH
Professor of Medicine; Epidemiology & Biostatistics

20 May 2008

Page 2

Program Evaluation

Outcome Evaluation

Process Evaluation

Resource Evaluation

Relevant Perspectives

• Payor

• Organization

• Public Health/Gov’t

Page 3

Program Evaluation Can Help To…

• Measure an intervention's effectiveness on targeted process or outcome measures

• Determine the most efficient and effective strategy for implementing the intervention

• Verify the mechanisms through which you believe your intervention is working

• Guide/support replication in other settings

• Align program goals with delivery system or stakeholder goals

• Determine cost-effectiveness & priority

Page 4

Pediatrics 2007;120:481-88… A "typical" Outcome Evaluation

Intervention Design: "Quality improvement collaborative"
– Audit and feedback
– Guideline dissemination/implementation
– Buy-in/agreement to participate from CEO/CMO
– Multidisciplinary team formation
• 4 state-wide "learning meetings"
• Self-measurement activities
• Monthly progress reports
• One-on-one coaching calls

Outcomes
– 13 hospital-based newborn preventive health care services, based on a random sample of 30 medical records pre- and 30 medical records post-intervention

Page 5

Vermont QIC Results

“Process Evaluation”: all Vermont hospitals participated; all formed teams; all attended each learning session; all participated in self-measurement, submitted monthly progress reports and participated in coaching calls.

Page 6

Vermont QIC Conclusions

• "QIC led to significant improvements in assessments for breastfeeding adequacy, risk of hyperbilirubinemia, infant sleep position, and car safety seat fit."

• "Costs estimated at $29,000 per hospital (not including hospital staff time)… cost per infant = $41."

• "More research is needed to understand the predictors of a hospital's success in improvement work."

Page 7

What else would you like to know before trying this program in your state or hospital?

• What hospital infrastructure is needed in order to successfully participate?

• Will this program work outside Vermont?– Are there unique features of Vermont…?

• Who should pay for the program?

• Which program elements are essential?– Not enough variation to figure this out…

• What did each site actually DO?

Page 8

Outcome Evaluation-Picking the outcomes

• Start with the health/clinical outcome… work backwards to what processes or indicators are feasible, believable and compelling.

• Prioritize surrogate outcome measures
– For surrogate measures, consider sensitivity and specificity (or even validation; see the sketch below)

• Specify a benchmark (outcome level) that would define a successful program
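As a concrete illustration of checking a surrogate measure (not from the lecture), here is a minimal sketch of computing sensitivity and specificity against a gold standard; the data are hypothetical:

```python
# Hypothetical check of a binary surrogate outcome against a gold standard.
# Sensitivity: of the true positives, what fraction does the surrogate catch?
# Specificity: of the true negatives, what fraction does the surrogate clear?

gold      = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]  # gold-standard outcome (hypothetical)
surrogate = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]  # surrogate measure (hypothetical)

tp = sum(1 for g, s in zip(gold, surrogate) if g == 1 and s == 1)
fn = sum(1 for g, s in zip(gold, surrogate) if g == 1 and s == 0)
tn = sum(1 for g, s in zip(gold, surrogate) if g == 0 and s == 0)
fp = sum(1 for g, s in zip(gold, surrogate) if g == 0 and s == 1)

sensitivity = tp / (tp + fn)   # 4/5 = 0.80 here
specificity = tn / (tn + fp)   # 4/5 = 0.80 here
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```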

Page 9

Handbook of Practical Program Evaluation. JS Wholey, HP Hatry, KE Newcomer, eds. 1994.

TABLE OF CONTENTS

Approaches to Evaluation Design
1. Assessing feasibility and likely usefulness of evaluation
2. Designing and using process evaluation
3. Using qualitative approaches
4. Outcome monitoring
5. Constructing natural experiments
6. Convincing quasi-experiments: the interrupted time series and regression-discontinuity designs
7. Ethical and practical randomized field experiments
8. Synthesizing evaluation findings

Practical Data Collection Procedures
9. Use of ratings by trained observers
10. Designing and conducting surveys
11. The systematic use of expert judgment
12. Acting for the sake of research: the use of role-playing in evaluation
13. How to use focus groups
14. Managing field data collection from start to finish
15. Collecting data from agency records

Practical Data Analysis
16. Using statistics appropriately
17. Using regression models to estimate program effects
18. Benefit-cost analysis in program evaluation

Page 10

Program Evaluation Data Collection Tree

Evaluation targets: Outcome | Process | Resource

Quantitative methods: direct observation; surveys; chart review; lab & pharmacy data; payor administrative data

Qualitative methods: direct observation; focus groups; personal interviews; role play

Data sources: patients, staff, providers, managers, directors, …

Page 11

Chart Review Methods

Gilbert et al. Ann Emerg Med 1996;27:305-08.

Training: train with a set of practice charts; describe in methods

Case selection: use explicit inclusion and exclusion criteria

Definition of variables: define key variables precisely

Abstraction forms: use standardized abstraction forms

Data handling: describe and ensure uniform handling of data that are conflicting, ambiguous, missing, or unknown; use double data entry

Meetings: periodically review disputes and coding rules

Monitoring: monitor performance (eg, occasional repeat abstractions)

Blinding: blind abstractor to hypothesis, or at least group assignment

Interrater agreement: test with a 2nd rater… report kappa or ICC
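For that last item, a minimal sketch (not from the lecture) of Cohen's kappa for two abstractors coding the same charts; the codes are hypothetical:

```python
# Cohen's kappa for two chart abstractors coding the same records.
# kappa = (observed agreement - chance agreement) / (1 - chance agreement)
from collections import Counter

rater1 = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "yes"]   # hypothetical
rater2 = ["yes", "no",  "no", "no", "yes", "no", "yes", "yes", "no", "yes"]  # hypothetical
n = len(rater1)

p_observed = sum(a == b for a, b in zip(rater1, rater2)) / n

# Chance agreement: product of each rater's marginal proportions, summed over codes.
c1, c2 = Counter(rater1), Counter(rater2)
p_chance = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))

kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"observed={p_observed:.2f}, chance={p_chance:.2f}, kappa={kappa:.2f}")  # kappa = 0.60 here
```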

Page 12

Chart Review Methods

Gilbert et al. Ann Emerg Med 1996;27:305-08. (% = percent of published chart review studies reporting each method)

Training (18%): train with a set of practice charts; describe in methods

Case selection (98%): use explicit inclusion and exclusion criteria

Definition of variables (73%): define key variables precisely

Abstraction forms (11%): use standardized abstraction forms

Data handling (% not shown): uniform handling of conflicting, ambiguous, missing, or unknown data; double data entry

Meetings (% not shown): periodically review disputes and coding rules

Monitoring (4%): monitor performance (eg, occasional repeat abstractions)

Blinding (3%): blind abstractor to hypothesis, or at least group assignment

Interrater agreement (5%): test with a 2nd rater… report kappa or ICC

Page 13

Errors that can occur when transforming chart data to database

Gilbert et al. Ann Emerg Med 1996;27:305-08.

• Identification of clinical event
• Select patients to be studied
• Assemble charts
• Locate desired information
• Read note
• Code information
• Transfer data to computer database

Page 14

Another Quantitative Program Evaluation…

Page 15

Minimizing Antibiotic Resistance in Colorado (MARC)

Project Goals
• Decrease community levels of antibiotic resistance in Colorado
• Decrease antibiotic use for viral respiratory illnesses
– Increase clinician knowledge of appropriate treatment strategies for respiratory illnesses
• Clinical practice guidelines & office-based materials
– Decrease patient demand for antibiotics
• Increase public and patient awareness about antibiotic resistance and appropriate antibiotic use
• Improve public self-care and when-to-seek-care strategies for respiratory illnesses

Page 16

A Social Ecological Model for Changing Patients' Expectations for and Use of Antibiotics

Mediating Factors
• Intrapersonal level: awareness/knowledge; motivation (and past experiences); skills and self-efficacy
• Interpersonal level: changing social norms; new information in medical settings
• Community level: new information in mass media; new clinical practice guidelines

Level of exposure to intervention: none, SSCE, LSCE, or both

CLINICAL ENCOUNTER

Patient Outcomes: antibiotic use; self care; satisfaction; illness duration

Patient Modifiers: demography; socioeconomics; insurance type; health status; medical and treatment history; illness characteristics

Page 17

THE MARC PROJECT TIMELINE

[Timeline chart spanning 7/01 through 7/05 (Years 1-4): intervention activities (SSCE: develop, implement, reinforce; LSCE: develop, implement; MD-Ed: reinforce) above evaluation activities (DIS, AbRx, Pub-S, Pat-S, MD-S, PRSP, economic analysis).]

Figure Legend
• SSCE: small-scale community education: household and office-based educational materials
• LSCE: large-scale community education: television, radio, newspaper, and web site; development to include observation of best practices and community focus groups
• MD-Ed: clinician educational intervention: reinforcement via feedback of antibiotic prescription rates
• DIS: dissemination activities
• AbRx: antibiotic prescription rates for pharyngitis and bronchitis, derived from CMS Data Project MCOs and Medicaid
• Pub-S: public and parent survey from intervention and control communities (Data Project and Medicaid households)
• Pat-S: pharyngitis and bronchitis patient outcomes survey from intervention and control community office practices
• MD-S: clinician vignette study from intervention and control community office practices
• PRSP: active invasive PRSP surveillance by the Emerging Infections Program at the Colorado State Health Department
• Economic Analysis: conducted in Year 4 with data from AbRx and Pat-S results from Years 1-3

Page 18

MEDIA IMPRESSIONS

PAID MEDIA
  Billboards                 2,319,800
  Bus shelters              64,715,400
  Bus tails                 27,000,000
  Bus interiors              8,035,800
  Total out-of-home (OOH)  102,071,000
  Radio (NPR)                1,449,558
  TV (paid + bonus)            491,382
  Total broadcast            1,940,940
  Aggregate total PAID     104,011,940

EARNED MEDIA
  TV                           829,500
  Radio                         94,400
  Total broadcast              923,900
  Total print                3,505,400
  Aggregate total EARNED     4,429,300

Page 19

Figure 1A. In the past 3 months, have you seen or heard ads or news telling you about antibiotic resistance?

[Bar chart: percent responding "Yes" (0-60%) at Pre-Baseline Winter, Baseline Winter, and Mass Media Winter, for the Mass Media vs. Comparison communities. P=0.04]
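Figures like this report a P-value comparing the proportion answering "Yes" in the two communities. A minimal sketch of a two-sided two-proportion z-test, with hypothetical counts (this is not the study's actual data or analysis):

```python
# Two-proportion z-test comparing "yes" rates in two communities.
# Counts are hypothetical; the study's actual analysis is not shown on the slide.
import math

yes1, n1 = 275, 500   # mass media community (hypothetical)
yes2, n2 = 220, 500   # comparison community (hypothetical)

p1, p2 = yes1 / n1, yes2 / n2
p_pool = (yes1 + yes2) / (n1 + n2)          # pooled proportion under H0: p1 == p2
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se
p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability
print(f"p1={p1:.2f}, p2={p2:.2f}, z={z:.2f}, p={p_value:.4f}")
```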

Page 20

Figure 1B. In the past 3 months, have you seen materials in a medical provider's office about problems with overusing antibiotics?

[Bar chart: percent responding "Yes" (0-60%) at Pre-Baseline Winter, Baseline Winter, and Mass Media Winter, for the Mass Media vs. Comparison communities. P=0.03]

Page 21

Figure 2A. Antibiotic Prescriptions Filled by General Population in Retail Pharmacies

[Line chart: net antibiotic prescriptions per 1000 persons per month (-30 to +15) for the general population, by month across 2002-2003, with the campaign period marked. Values below zero mean the Mass Media community received fewer antibiotics than the control community; values above zero, more. P=0.30]
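The y-axis here (and in the next three slides) is the intervention community's monthly prescription rate minus the control community's. As a rough sketch of how such a "net" series can be summarized before vs. during the campaign, here is a simple difference-in-differences style comparison with hypothetical monthly rates (the study's actual statistical model is not shown on the slides):

```python
# Net monthly antibiotic prescriptions per 1000 persons:
# intervention community minus control community, before vs. during the campaign.
# All rates and the campaign start month are hypothetical.
from statistics import mean

intervention = [62, 58, 40, 35, 33, 30, 45, 55, 50, 38, 30, 28]  # Rx per 1000/month
control      = [60, 57, 42, 38, 36, 34, 44, 57, 56, 45, 38, 36]
campaign_start = 6  # index of first campaign month (hypothetical)

net = [i - c for i, c in zip(intervention, control)]
pre, during = net[:campaign_start], net[campaign_start:]

# A negative shift in the net series during the campaign suggests the
# intervention community filled fewer prescriptions relative to control.
print(f"mean net pre-campaign:     {mean(pre):+.1f}")
print(f"mean net during campaign:  {mean(during):+.1f}")
print(f"difference-in-differences: {mean(during) - mean(pre):+.1f}")
```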

Page 22

Figure 2B. Total Antibiotic Prescriptions Filled by Managed Care Organization Members

[Line chart: net antibiotic prescriptions per 1000 persons per month (-30 to +15) for total MCO members, by month across 2002-2003, with the campaign period marked. Values below zero mean the Mass Media community received fewer antibiotics than the control community; values above zero, more. P=0.02]

Page 23

Figure 2C. Antibiotic Prescriptions Filled by Pediatric Managed Care Organization Members

[Line chart: net antibiotic prescriptions per 1000 persons per month (-30 to +15) for pediatric MCO members, by month across 2002-2003, with the campaign period marked. Values below zero mean the Mass Media community received fewer antibiotics than the control community; values above zero, more. P=0.01]

Page 24

Figure 2D. Antibiotic Prescriptions Filled by Adult Managed Care Organization Members

[Line chart: net antibiotic prescriptions per 1000 persons per month (-30 to +15) for adult MCO members, by month across 2002-2003, with the campaign period marked. Values below zero mean the Mass Media community received fewer antibiotics than the control community; values above zero, more. P=0.09]

Page 25

Figure 3. Office Visits by Pediatric Managed Care Organization Members

[Line chart: net office visits per 1000 persons per month (-30 to +25) for pediatric MCO members, by month across 2002-2003, with the campaign period marked. Values below zero mean the Mass Media community had fewer office visits than the control community; values above zero, more. P=0.01]

Page 26

Net savings in pediatric and adult prescription and visit costs attributable to the mass media intervention

Control Community (costs per 1000 members): baseline year / intervention year / percent change
Mass Media Community (costs per 1000 members): baseline year / predicted* / actual / savings

Pediatric prescriptions: control $37,644 / $43,239 / +14.9%; mass media $34,257 / $39,348 / $33,810 / savings $5,538

Adult prescriptions: control $23,404 / $23,948 / +2.3%; mass media $25,527 / $26,121 / $24,180 / savings $1,941

Pediatric visits: control $4,710 / $5,212 / +10.6%; mass media $5,408 / $5,983 / $5,016 / savings $968

Adult visits: control $2,168 / $2,075 / -4.3%; mass media $2,133 / $2,042 / $2,116 / savings $(73)

*Predicted costs based on the percent change observed in the control community.
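A minimal sketch of the footnote's logic for one row of the table (pediatric prescriptions): the predicted cost applies the control community's percent change to the mass media community's baseline, and savings are predicted minus actual. Small rounding differences from the table are expected.

```python
# Reproduce the "Predicted" and "Savings" columns for pediatric prescriptions.
control_baseline, control_intervention = 37_644, 43_239   # $/1000 members
mm_baseline, mm_actual = 34_257, 33_810                   # mass media community

# Percent change observed in the control community (~+14.9%).
pct_change = control_intervention / control_baseline - 1

predicted = mm_baseline * (1 + pct_change)   # counterfactual mass media costs
savings = predicted - mm_actual              # avoided costs attributed to the campaign

print(f"control change: {pct_change:+.1%}")  # +14.9%
print(f"predicted: ${predicted:,.0f}")       # ~$39,349 (table: $39,348)
print(f"savings:   ${savings:,.0f}")         # ~$5,539 (table: $5,538)
```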

Page 27

“Darn It! If only we had done a qualitative process evaluation…”

Questions we could have addressed

• Confirm that the change in visit rates was due to the mass media campaign… and patient driven (vs. physician driven)

• Parent experience with and reactions to the mass media campaign

• Physician experience with patients, and whether they perceived any changes in patient/parent expectations as a result of the mass media campaign

Page 28

Process Evaluation-describes how a program works

Page 29

Process Evaluation-Goals

• Understand your results– Help explain heterogeneity in effects

• Allow replication

• Refine and improve (CQI)

Page 30

Process Evaluation-Objectives

• Describe the intervention that actually got delivered

• Assess level of exposure to the intervention

• Assess the experience of those exposed to the intervention
– Was the target audience engaged?
– Did processes occur en route to the proposed behavior change?
• i.e., did knowledge, attitudes, and proximal behaviors improve in a manner that led to an improved main outcome behavior?

• Are there system/organizational factors that modify the effectiveness of the intervention?

Page 31

1. Measure Intervention “Delivery”

• Extent of intervention implementation
– Interviews, questionnaires and surveys of implementers and participants
– Stealth observers

Page 32

2. Assessing Exposure to the Intervention

• Measure exposure periodically, continuously or retrospectively

• Use observation, self-report and/or existing data sources

Page 33

3. Describing Experience of Those Exposed to Intervention

• Use self-report through interviews, focus groups or surveys

• Assess their experience with the intervention and the perceived barriers/facilitators most closely associated with failure/success

• Link factors to conceptual framework

Page 34

Process Evaluation-Focus Group Guide

• What do patients, staff and clinicians consider to be strengths of the program?

• What do patients, staff and clinicians dislike about the program?

• What do patients, staff and clinicians recommend for improving the target behavior?

• Do personnel have adequate resources (money, equipment, facilities, training, time) to achieve program goals?

Page 35

RE-AIM

Glasgow R et al. Am J Prev Med 2006;30:67-73.

• Reach (participation rate + representativeness)
• Effectiveness (effect size… standardized effect size if comparing interventions; see the sketch after this list)
• Adoption/Acceptability (participation rate and representativeness of practices or providers)
• Implementation (consistency of intervention delivery across practices/providers)
• Maintenance (sustainability of effect size)
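For the Effectiveness dimension, standardized effect sizes put interventions measured on different scales on a common footing. A minimal sketch of Cohen's d with hypothetical group summaries (not from the lecture):

```python
# Cohen's d: difference in group means divided by the pooled standard deviation.
# Hypothetical summaries for an intervention vs. control comparison.
import math

mean_int, sd_int, n_int = 72.0, 10.0, 40   # intervention group (hypothetical)
mean_ctl, sd_ctl, n_ctl = 65.0, 12.0, 40   # control group (hypothetical)

pooled_sd = math.sqrt(
    ((n_int - 1) * sd_int**2 + (n_ctl - 1) * sd_ctl**2) / (n_int + n_ctl - 2)
)
d = (mean_int - mean_ctl) / pooled_sd
print(f"pooled SD = {pooled_sd:.2f}, Cohen's d = {d:.2f}")  # d = 0.63 here
```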

Page 36

Performance of DP and DHC on Individual RE-AIM Dimensions

Diabetes Priority (DP)
• Computer-assisted health behavior change program
• T2DM action plan
• Patients complete it in the office prior to the diabetes visit
• Takes about 20-30 min
• Implemented by clinic staff at 30 Colorado mixed-payer PC practices (52 MDs, 886 pts)
• MDs review & endorse the patient's action plan
• Clinic staff discuss the plan with the patient and schedule follow-up visits
• $222 per patient

Diabetes Health Connection (DHC)
• Computer-assisted health behavior change program
• T2DM action plan
• Focused solely on diet/exercise
• Separate visit with a Health Coach
• Takes 30-45 min
• Implemented by Intervention Team
• 42 MDs, 335 pts @ mixed payer & HMO
• $547 per patient

Page 37

Devising a Process Evaluation for the IMPAACT Trial

Break-out Groups
- 15 min group meeting
- 10 min group presentations
- 10 min wrap-up

Page 38

Fill in the Blanks: Process Evaluation (ED/Hospital)

Quantitative Approach
- Target group(s)
- Data collection methods
- Key questions to be addressed

Qualitative Approach
- Target group(s)
- Data collection methods
- Key questions to be addressed

Page 39

IMPAACT Trial

Clinical Practice Guidelines (national)

+

Performance Feedback (group)

+

Patient Education

Page 40

IMPAACT Multi-Dimensional Intervention Strategy

• Each ED randomized to intervention received the following:
1. Provider education (practice guidelines) delivered by local opinion leaders
– Opinion leaders attended a "train-the-trainer" session
2. Group audit and feedback
3. Patient education

• Sites provided individualized adaptation of components

Page 41

Patient Education

• Waiting Room Patient Education– Pamphlets/Cards– Informational Kiosk

• Examination Room Materials– Bronchitis Posters

Page 42

ABx Treatment of URIs/Bronchitis Decreased at Intervention Sites

Metlay et al. Ann Emerg Med 2007.

Page 43

Conceptual Framework

Things We Did
Physician education: seminars (b); guidelines (b); audit/feedback (b); opinion leader (b)
Patient education: waiting room brochures (a); kiosk (a); exam room posters (a; b)

Patient Factors: sociodemographics; knowledge; case mix; expectations & demands

Physician Factors: knowledge; attitudes; staff vs. moonlighter; type (attending, housestaff, midlevel); diagnostic uncertainty

System Factors: VA vs. EMNet; wait times; fast track/urgent care zones; access to primary care & follow-up; competing interventions; secular trends/activities; QI culture

→ Abx Rx Decision

Other Outcomes: CXR ordering; return visits; patient satisfaction

(The letters a, b, c label pathways in the diagram linking intervention components and mediating factors to the prescribing decision.)

Page 44

ARI Antibiotic Prescription Rates by Site and Responder Status

Site   Responder status             Year 1 (%)   Year 2 (%)   Year 3 (%)   Change (points)
VANM   Non-responder                    45           55           66            +21
EMNM   Non-responder (near goal)        39           33           33             -6
EMNY   Non-responder                    77           62           59            -18
VAIL   Responder                        75           61           51            -24
EMGA   Responder                        18           29            6            -12
EMIL   Responder                        51           29           22            -29
VANY   Responder                        77           59           39            -38

Percent change calculated from Year 1 to Year 3. A responder is a site that either met goal targets OR had a greater than 20% reduction in antibiotic prescription rates.
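A minimal sketch of the responder rule as stated in the footnote, reading "reduction" as percentage points from Year 1 to Year 3. The per-site goal targets are not given on the slide, so the met_goal flags below are assumptions (EMGA is flagged because the table calls it a responder despite only a 12-point drop):

```python
# Classify sites by the slide's responder rule:
# responder = met goal targets OR > 20-point reduction from Year 1 to Year 3.
sites = {
    # site: (year1_rate, year3_rate, met_goal)  -- met_goal flags are assumptions
    "VANM": (45, 66, False),
    "EMNM": (39, 33, False),
    "EMNY": (77, 59, False),
    "VAIL": (75, 51, False),
    "EMGA": (18, 6,  True),   # small absolute change, but assumed to have met its goal
    "EMIL": (51, 22, False),
    "VANY": (77, 39, False),
}

for site, (y1, y3, met_goal) in sites.items():
    change = y3 - y1  # signed change in percentage points, Year 1 -> Year 3
    responder = met_goal or (y1 - y3) > 20
    print(f"{site}: change {change:+d} pts -> {'Responder' if responder else 'Non-responder'}")
```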

Page 45

       Local Project Leader Report      Stealth Observer Report          OVERALL Rating*
Site   Kiosk   Poster   Overall         Kiosk   Poster   Overall
VANM     1       1        0               2       2        2            Fair
EMNM     1       2        1               0       0        0            Weak
EMNY     1       1        1               0       0        0            Weak
VAIL     1       1        1               2       1.5      2            Fair
EMGA     2       2        2               2       2        2            Excellent
EMIL     1       2        2               2       2        2            Excellent
VANY     1       1        1               2       2        2            Fair

Ratings: 0 = Poor, 1 = Fair, 2 = Excellent.
*Combines local opinion leader, stealth observer, and focus-group/interview data.

Page 46

ARI Antibiotic Prescription Rates and Overall Implementation Rating by Site

Site   Responder status             Year 1 (%)   Year 2 (%)   Year 3 (%)   Change (points)   Overall Implementation Rating
VANM   Non-responder                    45           55           66            +21           Fair
EMNM   Non-responder (near goal)        39           33           33             -6           Weak
EMNY   Non-responder                    77           62           59            -18           Weak
VAIL   Responder                        75           61           51            -24           Fair
EMGA   Responder                        18           29            6            -12           Excellent
EMIL   Responder                        51           29           22            -29           Excellent
VANY   Responder                        77           59           39            -38           Fair

Percent change calculated from Year 1 to Year 3. A responder is a site that either met goal targets OR had a greater than 20% reduction in antibiotic prescription rates.

TRIANGULATION

Page 47

Site Visits-Qualitative Process Evaluation

• One-on-One Interviews– Site PI– Chief Quality Officer– Nurse Manager

• Focus Groups– ED nursing staff– Noon conference presentation, Q&A

Page 48

Analyzing Transcripts: Likert Scale Ratings

Physician Champion:
1 = weak, unknown to participants, or not seen as a leader
5 = clear strong advocate; leader among peers; well respected by all

Patient Satisfaction:
1 = patient satisfaction does not appear to be valued and was not mentioned as a barrier to improving antibiotic use
5 = patient satisfaction is measured and reported at the provider level and appears to greatly influence decisions

QI Culture:
1 = no experience with QI efforts, or negative experience
5 = significant positive experience with QI; involves all members of the healthcare team in a collaborative model

Page 49

Sample Quotes: Physician Champion

PI: "They've been calling me the antibiotic [tyrant] around here for a while."

Nurse Manager: "He is passionate on this" and "He talks in the ED and the community."

PI: “Because I became so heavily associated with antibiotic use and the appropriate use of antibiotics it comes up a lot of times just when we interact. There have been a number of times when physicians have come up to me and said, ‘I thought about you the other day when I had a patient with a cold and I didn’t give them antibiotics.’”

Page 50

ARI Antibiotic Prescription Rates and Overall Qualitative Ratings by Site

Site   Responder status             Year 1 (%)   Year 2 (%)   Year 3 (%)   Change (points)   Overall QI   Patient Satisfaction Influence   Physician Champion
VANM   Non-responder                    45           55           66            +21             4.5               2                            2.5
EMNM   Non-responder (near goal)        39           33           33             -6             3.5               4                            3
EMNY   Non-responder                    77           62           59            -18             2                 4                            2
VAIL   Responder                        75           61           51            -24             4                 4.5                          4.5
EMGA   Responder                        18           29            6            -12             5                 5                            5
EMIL   Responder                        51           29           22            -29             3.5               4                            5
VANY   Responder                        77           59           39            -38             2                 2                            4

Percent change calculated from Year 1 to Year 3. A responder is a site that either met goal targets OR had a greater than 20% reduction in antibiotic prescription rates.

TRIANGULATION
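One way to make triangulation concrete is to rank-correlate a qualitative rating against the outcome change. An illustrative sketch (requires SciPy; the lecture does not present this analysis) using the physician champion ratings and percent changes from this table:

```python
# Spearman rank correlation between physician champion ratings and the change
# in antibiotic prescribing (more negative change = bigger improvement).
# Values are taken from the slide's table; the analysis itself is illustrative.
from scipy.stats import spearmanr

champion_rating = [2.5, 3, 2, 4.5, 5, 5, 4]       # VANM, EMNM, EMNY, VAIL, EMGA, EMIL, VANY
pct_change      = [21, -6, -18, -24, -12, -29, -38]

rho, p = spearmanr(champion_rating, pct_change)
# A negative rho would suggest stronger champions go with larger declines,
# though with only 7 sites the estimate is very imprecise.
print(f"rho = {rho:.2f}, p = {p:.3f}")
```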

Page 51

Main Outcome Surrogate Outcome

Barton

Belkora

Bryant

Davis

Eaton

Flaherman

Guy

Jose

Kim

Kaimal

Nguyen

Shin

Tsui

Velayos