Launching and Nurturing a Performance Management System G.S. (Jeb) Brown, Ph.D. Center for Clinical Informatics

Jan 11, 2016

Page 1

Launching and Nurturing a

Performance Management System

G.S. (Jeb) Brown, Ph.D.

Center for Clinical Informatics

Page 2

Performance… of what? who?

Suggestions for performance criteria….

1. Patient benefit is the reason the system exists – performance criteria must relate directly to patient benefit.

2. Patients themselves are the best source of information on patient benefit.

3. The goal of performance management is to understand what (or who) drives outcomes, and to use this information to improve outcomes for patients.

Page 3

Who benefits?

• Patients!!!

• Employers and other payers

• Clinicians, providers and behavioral healthcare organizations that can deliver high value services (as measured by outcome)

• The field as a whole…. Real world evidence of outcomes demonstrates the value of behavioral health services within the larger context of overall medical costs

Page 4

Who loses?

• Providers, facilities and clinicians that cannot demonstrate the effectiveness (value) of their services.

• Failure to measure performance protects the financial interest of the least effective providers.

Page 5

Treatments or clinicians?

• Current trends in measuring performance are focused on “Evidence-Based Practices” – identify the most effective treatments and encourage their use.

• This is a good strategy if most of the variance in outcomes is due to the treatment…. But what if it isn’t?

• In order to manage performance it is first necessary to understand the primary sources of variance – what really drives outcomes?

Page 6

Clinician effects

• Clinicians differ widely in their “effectiveness”, resulting in wide differences in outcomes.

– Results cannot be explained by theoretical orientation, treatment methods, years of training or experience.

• The effectiveness of all treatments, including medications, is mediated by clinician effects.

• Failure to measure and account for clinician effects in controlled studies or the real world is ……..

BAD SCIENCE!

Page 7

Primary barrier… the clinician

• Most clinicians believe that their outcomes are above average and their services are of high value, without the need to actually measure this.

• Many clinicians feel discomfort at the thought that their performance might be evaluated by their patients via self report outcome questionnaires.

• Many clinicians believe that a simple outcome questionnaire cannot provide useful information about their patients beyond what they obtain by forming their own clinical judgments.

Page 8

Secondary barriers

• Faith in treatments (therapy methods, drugs) to deliver consistent and predictable outcomes

• Belief that the cost of the services is so low (relative to overall medical costs) that meaningful performance management isn’t cost effective

• Belief that meaningful performance management isn’t necessary to retain existing business or acquire new customers.

• Lack of organizational commitment to place the patient first and/or desire to avoid conflict with clinicians

Page 9

Overview and agenda…

1. Information drawn from 5 performance management projects

1. Human Affairs International: 1996-1999

2. Brigham Young University Comprehensive Clinic: 1996 – present (Lambert & others)

3. PacifiCare Behavioral Health: 1999 – present

4. Resources for Living: 2001-present

5. Accountable Behavioral Health Care Alliance: 2002 - present

Page 10

Overview - continued

2. Putting together a performance management system

– Measures

– JET (Just Enough Technology)

– Software choices

3. Measurement and feedback methods

– Case mix adjustment

– Tracking trajectory of change

– Reporting outcomes

– Identifying high value clinicians

Page 11

Overview - continued

4. Goldilocks effect

– Cause: clinicians and patients exercising broad discretion in the method, intensity and duration of treatment.

– Result: Patients tend to receive treatment that is “just about right”; not too much and not too little of a treatment that seems to work for them.

– More is not always better.

– Impact on dose (cost) benefit analyses; implications for cost management

Page 12

Overview - continued

5. Clinician effects

– The impact of clinician effects on treatment outcomes is the most important new research finding to emerge in the last few years.

– Recently published analyses of data from controlled studies and large samples of patients receiving “treatment as usual” within the community provide compelling evidence that the clinician may be the single most important factor driving the outcome.

– Differences in clinician effectiveness are not due to training or years of experience.

Page 13

Overview - continued

6. Putting it all together, making it work

– 4 stages of development and implementation of an outcomes management program.

– Strategies for success & formulas for failure

– Outcomes informed care: the client comes first; one client at a time.

– Nurturing an outcomes informed organizational culture (here’s a hint – show the CFO the ROI)

Page 14

Human Affairs International (HAI)

• Outcome Questionnaires: Outcome Questionnaire-45 & Youth Outcome Questionnaire (OQ-45 & YOQ)

• Michael Lambert, PhD of Brigham Young University spent a six-month sabbatical working onsite at HAI to develop a clinical information system.

• Several hundred individual clinicians and over 20 multidisciplinary group practices collected data between 1996 and 1999.

• Magellan Health Services acquired HAI and discontinued the program.

Page 15

BYU Comprehensive Clinic

• Outcome measures: OQ-45 & YOQ

• Serves a university population.

• Lambert and colleagues have conducted numerous studies on the use of feedback to enhance outcomes – one client at a time.

Page 16

PacifiCare Behavioral Health (PBH)

• PBH (now a part of United Behavioral Health) manages behavioral health care for over 5,000,000 covered lives annually. Over 100 multidisciplinary clinics and 12,000 psychotherapists participating.

• Outcome measures: Life Status Questionnaire & Youth Life Status Questionnaire

• Measures are voluntarily completed by 80% of all clients.

• Other research consultants: Lambert & Burlingame (BYU); Wampold (U of Wisc – Madison); Ettner (UCLA); Doucette, (GWU)

Page 17

Resources for Living (RFL)

• Provides telephonic EAP services; data collected over the phone at time of service; clinicians receive real-time feedback on trajectory of improvement and working alliance (SIGNAL system)

• Outcome measures: Outcome Rating Scale (4 items); also utilizes the Session Rating Scale (4 items) to assess the working alliance

• Other research consultants: Miller and Duncan, Institute for the Study of Therapeutic Change

Page 18

Accountable Behavioral Healthcare Alliance (ABHA)

• Managed behavioral healthcare organization serving Oregon Health Plan members in a 5-county rural area

• Outcome measure: Oregon Change Index (4 items; based on the Outcome Rating Scale)

• Other research consultants: Miller, Institute for the Study of Therapeutic Change

Page 19

10 Guiding Principles

1. Measure to manage.

2. Management requires frequent feedback over time.

3. Keep it simple, make it matter.

4. Keep it brief, measure often.

5. Create benchmarks, compare results.

Page 20

10 Guiding Principles - continued

6. Minimize opportunity for feedback induced bias.

7. Provide the right information at the right time to the right person to make a difference.

8. Build in the flexibility so that the system evolves with the experience of the users.

9. Maintain central control of data and reporting

10. Establish and protect a core data set.

Page 21

Five Minute Rule

• If it takes more than five minutes to collect the data, you’re in trouble.

• To manage outcomes, you need to collect the right data to measure and model the variance in outcomes.

• More data = more variance explained, but with diminishing returns

• Find the sweet spot – variance per minute

• Clinicians may be willing to collect more than 5 minutes worth of data if there is clear benefit

• Be parsimonious!!!!!

Page 22

Measure often!

• Most of the change (better or worse) occurs in the first few weeks of treatment.

• Frequent measurement results in better detection of patients at risk for premature termination.

• PBH asks for data at the 1st, 3rd and 5th sessions, and every 5th session thereafter

• BYU, RFL and ABHA collect outcome measures at every session

Page 23

Selecting outcome measures

• Clinician completed scales are subject to feedback induced bias.

• Patient completed measures tend to show faster change in the near term and less change in the long term than clinician completed measures.

• Clinician perception of purposes of the measures can induce bias at the clinician level that is difficult/impossible to control for.

Page 24

In search of variance

• In order to improve outcomes, it is necessary to understand the sources of variance in outcomes.

• The ability to measure sources of variances is limited by the reliability and validity of the measures.

• More data = greater reliability/validity = more variance explained

• More data = more time, more cost, more hassle and probably lower compliance

Page 25

Variance per minute

• A little data goes a long way.

• A lot more data doesn’t provide proportionately more information.

• Fine tune the data set through item analysis and other methods to identify those measures (items) that provide the greatest psychometric information in the least amount of time.

• Optimize the variance per minute; find the organization’s “sweet spot”.

Page 26

Maximizing reliability

• Reliability refers to the consistency with which a set of items measures some variable of interest.

• Coefficient alpha is a measure of the internal consistency of the measure at one point in time.

• Test-retest reliability assesses the stability of scores over time.

• Items that correlate highly with one another increase reliability.

• More items = greater reliability.
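The coefficient alpha described above can be computed directly from an item-score matrix. A minimal sketch in Python with NumPy; the respondents and item scores below are invented illustration data:

```python
import numpy as np

def cronbach_alpha(items):
    """Coefficient alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy data: 5 respondents x 3 items that correlate with one another
scores = np.array([
    [2, 3, 2],
    [4, 4, 5],
    [1, 2, 1],
    [3, 3, 4],
    [5, 5, 5],
])
print(round(cronbach_alpha(scores), 2))  # prints 0.96
```

Because the three toy items track one another closely, even this very short “scale” reaches a high alpha – which is the slide’s point: well-chosen items buy reliability cheaply.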

Page 27

Item Response Theory

• Item Response Theory (IRT) uses different assumptions than classical test theory when optimizing items on a questionnaire

• Selects for items that provide information on change for patients with different levels of symptom severity

• Can be used to optimize test length – tends to result in shorter measures.

Page 28

Finding the item # sweet spot

• More items = greater reliability; but only up to a point.

• OQ-45 has 45 items and reliability of .93 (coefficient alpha).

• OQ-30 has reliability of .93.

• 10 well selected items from OQ-30 have a reliability of .9

• Outcome Rating Scale (4 items) has reported reliability of .8 to .9.

Page 29

Validity

• Face validity matters!!!

• Does the questionnaire seem to be asking about the right things?

• Are these the kinds of problems that people seeking mental health services commonly report?

• Are these items that we expect to see improve as the result of treatment?

• If the items make sense to the patients, it is probably a good set of items.

Page 30

Global factor

• Items inquiring about symptoms and problems patients most commonly seek help for tend to correlate with one another.

• Example: Items about sadness correlate with items about anxiety. Both correlate with items about relationships.

• Factor analyses of a variety of outcome measures reveal that most items load on a common factor (“global distress factor”).

Page 31

Concurrent validity

• Due to existence of a global factor, all patient completed outcome questionnaires tend to correlate highly with one another.

• A global measure with an adequate sampling of symptom items will correlate highly with disease specific measures such as the Beck Depression Inventory or the Zung Anxiety Scale.

Page 32

Multiple factors in children

• Child and adolescent measures may have a more complex factor structure than adult measures.

• Separate factors for “externalizing” and “internalizing” symptoms.

• Global factor still the most dominant factor in child/adolescent measures.

Page 33

JET: Just Enough Technology

• Outcomes management depends on information technology.

• Technology adds cost, complexity and risk of failure.

• Start modestly – use just enough technology to get the job done.

• Add complexity only as necessary.

• Beware of innovation induced paralysis.

Page 34

Capturing the data

• Computers, PDAs and other devices are cool, but….

They are expensive, someone still needs to enter the data, and if the patient is expected to enter the data, someone has to teach the patient to use the device.

• Advantages of paper and pencil

– Low cost

– No instructions needed

– Information immediately available to clinician

– Easily scanned for data capture

Page 35

Scanning solutions

• Teleform: High end fax-to-file solution for OCR and OMR; many advanced features; ideal for enterprise level use.

http://www.verity.com/

• Remark: Scan to file with OCR and OMR; less costly than Teleform.

http://www.principiaproducts.com/

• Data capture vendors.

http://www.scantron.com/

http://www.ceoimage.com/

Page 36

Building a system

• Sophisticated outcomes management systems can be created using off-the-shelf software.

• Example: PacifiCare ALERT system

– Teleform for data capture

– SAS for data warehousing and reporting

– Microsoft Office (Word, Excel, Access) for reporting.

– SAS commands and Visual Basic scripts used to automate processes, such as permitting SAS to output data to Excel for use in a mail merge process by Word to create reports for the clinicians.

Page 37

How much should it cost?

• Cost for routine data collection and sophisticated reporting at all levels of the organization should be less than 1% of the cost of care……if you use JET!

Page 38

Measuring change

• Outcomes are generally evaluated by comparing pre and post treatment test scores.

• Change score = Intake score – last score.

• “Intent to treat” method includes all cases with two or more assessments rather than only cases that “complete” treatment.

• Intent to treat method encourages clinicians to keep patients engaged in treatment.

Page 39

Standardizing change scores

• Change scores are often reported as “effect size”.

• Preferred statistic for research reports.

• Effect size is usually calculated by dividing the change score by the standard deviation of the outcome measure at intake.

• If adequate normative information is available on the outcome measure, there are advantages to using the standard deviation of the outcome measure in a non-treatment population.
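The change score and effect size calculations described above can be sketched in a few lines of Python; the intake scores, last scores, and the intake standard deviation of 20 are hypothetical numbers chosen for illustration:

```python
import numpy as np

def effect_size(intake, last, sd_intake):
    """Standardized change: (intake - last) / SD of intake scores.
    On a distress measure, positive values mean improvement."""
    return (np.asarray(intake, dtype=float) - np.asarray(last, dtype=float)) / sd_intake

intake = np.array([95.0, 80.0, 70.0])
last = np.array([75.0, 70.0, 72.0])
sd = 20.0  # hypothetical SD of the measure at intake

# Per-case standardized change: 1.0, 0.5, and -0.1 SD units
print(effect_size(intake, last, sd))
```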

Page 40

Benchmarking outcomes

• Measuring outcomes is of little use without some basis of comparison.

• Are the outcomes good? Compared to what?

• Clinicians and organizations differ in the kinds of cases they treat.

• Benchmarking outcome requires a method of accounting for differences in case mix.

Page 41

Regression: a fact of life

• With any repeated measurement, regression artifacts are a fact of life.

• Scores are correlated across time.

• A test score at one point in time is the single best predictor of a score at a subsequent point in time.

• Patients with high scores and low scores will tend to have scores closer to the mean on subsequent measurement.

Page 42

Regression implications

• Patients with high distress report greater overall change and greater change per session than low distress patients.

• Patients with scores in the normal (non-clinical) range tend to report little improvement or even show increased distress over time.

• Focusing treatment resources on patients with the most severe symptoms results in improved outcomes.

Page 43

Case mix

• Case mix variables are those variables present at the beginning of the treatment episode that are predictive of the outcome

• Intake score accounts for 18% of variance in change scores in PBH data

• Addition of age, sex and diagnosis to predictive model accounts for < 1% additional variance

Page 44

Benchmark Score

• Regression techniques used to model relationship between intake scores and patient variables (age, diagnosis) and the change measured in treatment.

• Benchmark Score: residualized change score (difference between predicted and actual effect size)

• Clinicians are ranked based on the mean Benchmark Scores for their cases.
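The Benchmark Score logic above can be sketched as follows: regress change on intake score, take each case’s residual (actual minus predicted change, so positive means better than case-mix-expected), and rank clinicians by their mean residual. All data here are simulated; this illustrates the general method, not PBH’s or ABHA’s actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated episodes: intake severity, observed change, treating clinician
intake = rng.normal(80, 15, size=200)
change = 0.4 * (intake - 60) + rng.normal(0, 8, size=200)  # sicker patients change more
clinician = rng.integers(0, 5, size=200)

# Case-mix model: predict change from intake score alone
slope, intercept = np.polyfit(intake, change, 1)
predicted = slope * intake + intercept

# Residualized change ("Benchmark Score"): actual minus expected
residual = change - predicted

# Rank the 5 clinicians by mean residual, best first
means = {c: residual[clinician == c].mean() for c in range(5)}
ranking = sorted(means, key=means.get, reverse=True)
print(ranking)
```

Adding further case-mix variables (age, diagnosis) simply means fitting a multiple regression in place of the single-predictor fit – which, per the PBH finding above, buys less than 1% additional variance.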

Page 45

Regression and case mix

[Figure: Diagnosis and Outcome – effect size (−2 to 3) plotted against LSQ intake score (20 to 120), with separate trend lines for Anxiety, Bi-Polar, Depression, Psychotic and Substance Abuse diagnoses.]

Page 46

At risk for poor outcome

• Patients with a poor initial response to treatment are at risk for a poor outcome due to the probability of unplanned treatment termination.

• A poor initial response to treatment is not a strong predictor of future response to treatment, so long as the patient remains in treatment.

Page 47

Predicting change

• The single best predictor of a future test score is the most recent test score.

• Regression analysis reveals that the relationship between intake scores and subsequent test scores is generally linear, with large variance between the predicted and actual scores (residualized scores).

• Predicted trajectory of change can be estimated using simple regression formulas to predict scores at each measurement point.

Page 48

Regression formulas

Predicted score = intake × slope + intercept (example intake score: 95)

Week      Slope     Intercept   Predicted score
Week 3    0.7694     8.24715    81.34
Week 6    0.709      9.90144    77.25
Week 9    0.6622    12.00917    74.92
Week 12   0.6167    14.72839    73.31
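Applied in code, the prediction is a one-liner per week. A short Python sketch using the slide’s slope and intercept values; the predicted scores match the slide’s to within rounding:

```python
# Week-by-week regression coefficients from the slide: (slope, intercept)
coeffs = {
    3:  (0.7694,  8.24715),
    6:  (0.709,   9.90144),
    9:  (0.6622, 12.00917),
    12: (0.6167, 14.72839),
}

intake = 95  # intake score from the slide's example

# predicted score = intake * slope + intercept
for week, (slope, intercept) in coeffs.items():
    print(f"Week {week}: {intake * slope + intercept:.2f}")
```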

Page 49

Trajectory of change graph

[Figure: LSQ score (45 to 105) plotted against weeks in treatment (1, 3, 6, 9), showing an individual patient’s scores against the projected change (50th percentile) and the 25th and 75th percentile bands, with a region flagged as at-risk for premature termination.]

Page 50

Past and future change

• Prior change is not highly predictive of future change.

• Odds of additional improvement remain good if the test score is in the clinical range and the patient remains engaged in treatment.

• Implication: Remain optimistic; prevent premature termination; keep patient engaged in treatment.

Page 51

Sample analysis

• Cases began treatment in the severe range, with no improvement or worse by week 6.

• The average case in the sample was 5 points worse at week 6.

• Approximately half of these cases had no data after week 6.

• Those that continued in treatment averaged 10 points of improvement after week 6!

Page 52

Remain optimistic!

[Figure: Average LSQ scores (62 to 82) at intake, weeks 6 through 12, and last score, comparing cases with unplanned termination to cases that continued treatment.]

Page 53

Goldilocks Effect

• Describes effects that are due to freedom of choice on the part of clinicians and patients with regard to method, intensity and duration of treatment.

• Present in data collected in naturalistic settings but not in controlled studies.

• Most research on treatment outcomes has been designed to eliminate these effects in order to investigate a particular treatment at a particular intensity and duration.

Page 54

Why Goldilocks?

• In the story of Goldilocks and the Three Bears, Goldilocks keeps trying different things (chairs, porridge and beds), each time seeking the one that is just right for her.

• Clinicians and patients continuously make choices about treatment method(s), frequency of sessions, and duration of treatment based on rate of improvement in prior sessions.

Page 55

Goldilocks & QI

• Little attention has been given to the possible benefits of encouraging the Goldilocks Effect.

• Many quality improvement initiatives encourage use of “empirically validated treatments” and adherence to various treatment protocols, thus making the implicit assumption that quality is improved by limiting the Goldilocks Effect.

Page 56

Hypothesized Mechanisms

• Patients seek treatment when level of distress is high.

• Utilization of services (intensity & duration) is a function of the patient’s level of distress and rate of improvement.

• The clinician/patient dyad makes decisions in an ongoing, dynamic manner with regard to treatment methods, intensity and duration.

Page 57

Goldilocks and utilization

• Length of treatment is as much a function of outcome as outcome is of length of treatment.

• Patients with rapid improvement have good outcomes while tending to utilize relatively few services.

• Patients with a slow rate of change tend to have worse outcomes and utilize more services.

• Result: More treatment appears to be associated with worse outcome.

Page 58

Time in treatment and outcome

[Figure: Effect size (0.00 to 0.70) by total time in treatment episode, in categories from < 1 month to > 6 months.]

Page 59

Total sessions and outcome

[Figure: Effect size (0.00 to 0.60) by total sessions in treatment episode, in categories: less than 5, 5, 6 to 10, 11 to 15, 16 to 20, 21 to 30, and 30+.]

Page 60

Intensity of services

• Measured by frequency of sessions.

• Goldilocks effect prediction: Patients with slow rate of change will tend to receive a higher frequency of sessions.

• Following slide confirms prediction.

Page 61

Trajectory of change for patients with severe symptoms; high intensity = ≥ 1 session per week

[Figure: LSQ score (20 to 80) at intake, weeks 4–6 and weeks 10–12, comparing high intensity to low-medium intensity cases, with 25th and 75th percentile bands and the clinical cutoff marked.]

Page 62

Goldilocks and Utilization Management

• The Goldilocks effect means the clinician/patient dyad tends to arrive at an appropriate length of treatment.

• The PBH ALERT system seeks a rational allocation of resources by encouraging utilization by those patients most likely to benefit.

• PBH implemented utilization on demand: more sessions are authorized each time an outcome questionnaire is submitted.

• No change in the overall average length of treatment.

Page 63

Reporting outcomes

• Case by case reporting to the clinician is helpful to prevent premature termination.

• Residualized change scores are used to control for differences in case mix.

• Residual score = predicted last score – actual last score.

• “Benchmark Score” (ABHA) or “Change Index Score” (PBH, RFL)

• Positive score means greater than average improvement.

Page 64

Evaluating outcomes

• Mean residual change score used to rank clinicians or clinics/group practices based on outcomes.

• “Severity adjusted change” calculated by adding a provider’s mean residual score to average change for all cases in the database.

• Larger sample sizes yield better estimates of outcome.

• Use of confidence intervals avoids over interpretation of results from small sample sizes.
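The confidence-interval point can be sketched as below: a normal-approximation 90% interval around each provider’s mean residualized change score. The residuals here are invented; both providers have the same mean, but only the larger sample supports a firm conclusion:

```python
import math

def mean_with_ci(residuals, z=1.645):
    """Mean residualized change score with a 90% confidence interval."""
    n = len(residuals)
    mean = sum(residuals) / n
    var = sum((r - mean) ** 2 for r in residuals) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                                   # standard error
    return mean, (mean - z * se, mean + z * se)

small = [0.6, -0.2, 0.8, 0.2, 0.1]             # provider with n = 5 cases
large = [0.3] * 40 + [0.0] * 20 + [0.45] * 40  # provider with n = 100 cases

for label, data in (("n=5  ", small), ("n=100", large)):
    mean, (lo, hi) = mean_with_ci(data)
    print(f"{label}: mean = {mean:.2f}, 90% CI = ({lo:.2f}, {hi:.2f})")
```

The small-sample interval comes out several times wider, which is exactly why ranking clinicians on a handful of cases over-interprets noise.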

Page 65

Sample: Comparing Results

[Figure: Severity adjusted effect size (0 to 0.8) by site, for sites 1 through 15, each with a 90% confidence band.]

Page 66

Sample Disaggregated Results

Adults                       Total   # cases with     Change (effect size)   Change Index
Severity at intake           cases   > 1 data point   actual    expected     (actual − expected)
Normal range                  343        176           -0.04     -0.23            0.19
Mildly distressed             138         86            0.25      0.17            0.08
Moderately distressed         224        131            0.66      0.40            0.26
Severely distressed           263        170            0.98      0.79            0.19
Combined Adult                968        563            0.48      0.29            0.19

Children & Adolescents
Normal range                   10          7            0.28     -0.23            0.51
Mildly distressed               3          1            0.50      0.19            0.31
Moderately distressed           8          3            0.98      0.57            0.41
Severely distressed            10          9            1.04      0.90            0.14
Combined Child/Adolescent      31         20            0.74      0.42            0.32

Aggregate Results for All Age Groups
Total number of cases: 999
Number of cases with > one data point: 583 (58%)
Actual change: 0.48; expected change: 0.29; Change Index: 0.19

Page 67

Therapist effects

• Bruce Wampold, Michael Lambert and others argue that researchers have ignored the individual therapist as a source of variance.

• Therapists vary widely in “effectiveness”.

• Not explained by therapy method, training, or years of experience.

• Even in controlled studies, therapist effects account for more variance in outcomes than treatment method.

Page 68

Therapist Effects - continued

• Recent research provides strong evidence that therapist/psychiatrist effects have a significant impact on the effectiveness of medications, particularly antidepressants

• Evidence suggests that the use of medications may increase, rather than decrease, the variance due to the therapist…

• Huh?

Page 69

The (almost) Bell Curve

Solo clinicians with sample sizes >= 20 (PBH data)

[Chart: histogram of the percentage of clinicians (y-axis, 0% to 25%) by effect size (x-axis, -0.2 to 0.9).]

Page 70

Honors for Outcomes

• Selection criteria:
  – Minimum of 10 cases with two Y/LSQ data points in the past 3 years
  – Average patient change must be reliably above average: 65% confidence that the provider's Change Index > 0
  – The Change Index is a case-mix adjusted measure that compares outcomes to PBH's large normative database

• Honors for Outcomes is updated quarterly
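The selection criteria above translate naturally into a qualification check. The sketch below is an assumption for illustration only; the function name, the per-case Change Index input, and the one-sided normal-approximation test are not PBH's published algorithm.

```python
import math
from statistics import NormalDist, mean, stdev

def qualifies_for_honors(change_indices, min_cases=10, confidence=0.65):
    """Check the two Honors criteria: enough eligible cases, and 65%
    confidence that the provider's mean Change Index exceeds zero.

    change_indices: one case-mix-adjusted Change Index per eligible case
    (a case counts only if it has two data points in the past 3 years).
    """
    n = len(change_indices)
    if n < min_cases:
        return False
    m = mean(change_indices)
    se = stdev(change_indices) / math.sqrt(n)
    # One-sided test: lower bound of the confidence interval must be > 0.
    z = NormalDist().inv_cdf(confidence)
    return m - z * se > 0

print(qualifies_for_honors(
    [0.4, 0.1, 0.5, 0.3, 0.2, 0.6, 0.4, 0.3, 0.5, 0.2]))
```

Note how the 65% threshold is deliberately lenient: with small per-provider samples, a stricter confidence level would disqualify nearly everyone.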

Page 71

Website

Page 72

Honors for Outcomes - Search

Page 73

Honors for Outcomes - Results

Page 74

Study Question 1

• Honors for Outcomes depends on the predictive validity of the Honors rating; prior performance predicts future performance.

• Question: Does a therapist's outcomes with adults predict outcomes with children and adolescents?

• Implication if yes: Therapists' effectiveness is likely to be global in nature rather than specific to age or diagnostic group.

Page 75

Study Question 2

• Question: Does a therapist's outcomes with adults predict outcomes with children and adolescents on medications?

• Implication if yes: The effectiveness of the therapist apparently mediates the effect of the medication(s).

Page 76

Study Method

• Use the Honors for Outcomes methodology to rank clinicians based on their outcomes with adult patients only.

• Therapists were included in the study if they treated at least one child/adolescent with psychotherapy only and one with psychotherapy plus medication (929 Honors, 1352 Non-Honors).

• Compare outcomes for children and adolescents treated by Honors clinicians to those treated by other clinicians.

Page 77

Result: Outcomes for adults predict outcomes for children

[Chart: effect size (y-axis, -0.2 to 1.2) by intake score band, 0-41 (mild symptoms) and 42-120 (moderate to severe symptoms), for four groups: Honors psychotherapy only, Honors psychotherapy and medication, Non-Honors psychotherapy only, Non-Honors psychotherapy and medication.]

Page 78

Results after adjusting for intake score, age, sex, diagnosis and prior treatment history

[Chart: residual effect size (y-axis, -0.25 to 0.25) by intake score band, 0-41 (mild symptoms) and 42-120 (moderate to severe symptoms), for four groups: Honors psychotherapy only, Honors psychotherapy and medication, Non-Honors psychotherapy only, Non-Honors psychotherapy and medication.]

Page 79

All diagnoses and medications

                                          Intake score below mean     Intake score at mean or above
                                          Delta   Residual    N       Delta   Residual    N
Honors-psychotherapy only                  1.9      2.3      430      12.5      2.3      286
Honors-psychotherapy and medication       -1.3     -1.3       79      15.3      2.7      134
Non-Honors-psychotherapy only             -0.9     -0.5      565       8.3     -2.2      449
Non-Honors-psychotherapy and medication   -1.2     -2.7      102      10.3     -1.5      186

Page 80

Children diagnosed with depression and treated with psychotherapy alone or in combination with an antidepressant

[Chart: residual effect size (y-axis, -0.3 to 0.4) by intake score band, 0-41 and 42-120, for four groups: Honors psychotherapy only, Honors psychotherapy and medication, Non-Honors psychotherapy only, Non-Honors psychotherapy and medication.]

Page 81

Depression & antidepressants

                                          Intake score below mean     Intake score at mean or above
                                          Delta   Residual    N       Delta   Residual    N
Honors-psychotherapy only                  2.6      3.5       77      15.4      4.6       84
Honors-psychotherapy and medication        0.11     0.3       28      15.5      2.9       41
Non-Honors-psychotherapy only             -1.7     -2.9       87       9.2     -2.0      123
Non-Honors-psychotherapy and medication   -1.7     -3.2       27      11.1     -0.9       53

Page 82

Clinician effects and feedback

• PBH ALERT system letters identify patients at risk for premature termination.

• The impact of ALERT letters appears to depend on the effectiveness of the clinician.

• The following graph presents outcomes for at-risk cases treated by clinicians with outcomes in the top quartile compared to the bottom quartile.

Page 83

Therapist rank and impact of ALERT letters

[Chart: mean LSQ score (y-axis, 30 to 60) at intake, at the ALERT letter (sessions 3-5), and at the last session (sessions 9-10), for top-quartile versus bottom-quartile clinicians.]

Page 84

Outcomes and cost

[Chart: average cost per episode (y-axis, $0 to $600) for five provider categories: no outcome data for provider, Honors groups, Honors solo clinicians, Non-Honors groups, Non-Honors solo clinicians.]

Page 85

Value Index

• Value Index = average effect size per $1000 of expenditure: (Effect Size / Cost of Care) x $1000

[Chart: Value Index (y-axis, 0 to 2) for four provider categories: Honors groups, Honors solo clinicians, Non-Honors groups, Non-Honors solo clinicians.]
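The Value Index formula is simple enough to state directly in code. The figures in the example are hypothetical, for illustration only, and are not taken from the PBH data.

```python
def value_index(effect_size, cost_of_care):
    """Value Index = average effect size per $1000 of expenditure:
    (Effect Size / Cost of Care) x $1000."""
    return (effect_size / cost_of_care) * 1000

# Hypothetical figures: an effect size of 0.6 at an average cost of
# $400 per episode yields a Value Index of about 1.5.
print(value_index(0.6, 400.0))
```

Because cost sits in the denominator, a provider who achieves equal outcomes at lower cost scores higher, which is the point of the index.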

Page 86

Case history # 1

• Resources for Living (RFL) began using the 4-item Outcome Rating Scale and Session Rating Scale in 2002

• Telephonic counseling

• Baseline data collected for 5 months

• Baseline data used to create trajectory of change graphs

• Real time feedback provided to counselors via the SIGNAL System

Page 87

RFL Signal System

Page 88

RFL Signal System: results

[Chart: effect size (y-axis, 0.00 to 0.90) by quarter, 2nd quarter 2002 through 3rd quarter 2004, with the baseline period marked at the start, followed by the training and feedback period.]

Page 89

Case history # 2

• Accountable Behavioral Health Care Alliance (ABHA) switched from the OQ-30 used by PBH to a version of the 4-item ORS used by RFL and others.

• The questionnaire was modified to become the Oregon Change Index (OCI) and has been used consistently from 2004 to the present.

• Administered at every session in outpatient and day treatment settings.

• OCIs are collected at over 80% of all sessions.

Page 90

OCI Feedback

• Baseline data were collected throughout 2004 and early 2005.

• In mid-2005 ABHA initiated regular weekly feedback at the clinician and supervisor level.

• An Excel-based Active Case Report contains data on all cases seen within the last 6 weeks.

• The report is updated and emailed to clinicians at the start of each week.

Page 91

OCI Active Case Report

[Screenshot: Excel Active Case Report for one clinician (ID 35, Deschutes County). Header statistics: mean intake score 18.4, mean recent score 23.7, mean change 5.3, mean benchmark 2.9, case count 91. Columns: Client ID, Age group, Clinician at intake, Most recent clinician, Intake date, Intake OCI, Most recent date, Most recent OCI, OCI count, OCI change, Score status (ranging from "Significantly worse" through "No change" to "Somewhat improved"), Benchmark score. Rows are sortable by Clinician ID, Benchmark Score, Intake OCI, Most recent OCI, Intake date, or Most recent date. To view the change graph for a specific client, click the row number beside that client and then click "View Client Graph".]
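The report's Score Status column can be mimicked with a small classifier. The cutoff of 20 OCI points for "significantly" worse or improved is an assumed threshold chosen to be consistent with the sample rows, not a documented ABHA rule.

```python
def score_status(oci_change, reliable_change=20):
    """Label an OCI change score, mirroring the report's status column.

    reliable_change is an assumed cutoff for illustration; the report
    shows 'Significantly worse' only for very large negative changes.
    """
    if oci_change <= -reliable_change:
        return "Significantly worse"
    if oci_change < 0:
        return "Somewhat worse"
    if oci_change == 0:
        return "No change"
    if oci_change < reliable_change:
        return "Somewhat improved"
    return "Significantly improved"

for change in (-33.0, -7.0, 0.0, 4.0):
    print(change, score_status(change))
```

Flagging status this way lets the weekly report sort the most deteriorated cases to the top, which is what makes it useful as a clinician-level early-warning tool.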

Page 92

Trajectory of Change Graph

The first 20 sessions are graphed. If there are more than 20 sessions, the final point is the most recent session.

[Chart: one client's OCI scores (y-axis, 0 to 40) across sessions from 3/2/2005 to 6/27/2005, plotted against the clinical cutoff, the expected change trajectory, and the 10th, 25th and 75th percentile trajectories.]

Page 93

Outcomes trending upwards

ABHA Outcomes by Year: Clients with scores in the clinical range at intake

[Chart: effect size (y-axis, 0 to 0.8) by year, 2004 through 2006, trending upward.]

Page 94

Implications for clinicians

• Good news: The clinician matters!

• All treatments (including medications!?) are only as effective as the clinicians delivering the treatment.

• Clinicians have an ethical responsibility to assess and improve their personal effectiveness… they cannot rely on the treatments alone to be curative.

Page 95

Implications for administrators & policy makers

• Exclusive focus on the effectiveness of treatments rather than clinicians limits the potential to improve outcomes.

• Administrators and policy makers have an obligation to consumers to assure that they have access to effective clinicians.

• Failure to monitor outcomes at the clinician level places consumers at risk.

Page 96

Performance Management: Four Stages of Development

1. Preparation

2. Implementation

3. Performance feedback

4. Managing outcomes

Page 97

Stage one: Preparation

Goal: Put things in motion; avoid fatal errors. (see Formulas for Failure)

• Identification of stakeholders and change agents.

• Articulation of vision, mission and purpose.

Why are we doing this?

• Choice of measures

• Development of case mix model

• Prototyping of reports and decision support tools

• Training materials and education of providers.

Page 98

Stage two: Implementation

Goal: Get something up and running.

• Pilot the system with a subset of willing high-volume providers and clinics

• Refine reports and decision support tools based on feedback from users

• Monitor and provide feedback on data quality and compliance with data collection protocols.

• Validate and refine case mix adjustment model.

Page 99

Stage 3: Performance feedback

Goal: Get clinicians used to receiving performance feedback.

• Provide performance feedback on a continuous basis.

• Make direct comparisons across sites or providers; identify top performers.

• Institute remedial measures as necessary to improve data quality.

• Disseminate results; respond to concerns re data quality, validity of methods, etc.

Page 100

Stage 4: Managing outcomes

Goal: Measurably improve outcomes!

• Continued data analysis to explore opportunities for quality improvement

• Provide information on pathways to improve outcomes

• Provide additional support in form of consultation, data analysis, reporting and decision tools as needed

• Reward top performers with recognition, incentives, increased referrals, etc.

Page 101

Strategies for success

• Put the patient first: patient welfare trumps clinician comfort.

• Show the business case: return on investment, rational allocation of resources, marketing and sales advantages.

• Create a clear mandate to measure outcomes and a "drop dead date" for implementation.

• Keep it simple; don't be afraid to fix it later.

• Give recognition and support to early adopters and risk takers.

Page 102

Formulas for failure

• Too complicated: Too many measures, too much time, too hard to explain.

• IT paralysis: Too much technology, too much complexity, too much dependence on expertise not under your control (outside vendors, IT staff).

• Design by committee: Too many cooks in the kitchen; too many people with too many agendas.

• Clinician referendum: Expectation that outcomes initiative is dependent upon clinician “acceptance”.

Page 103

http://[email protected]

1821 Meadowmoor Rd.
Salt Lake City, UT 84117

Voice 801-541-9720

Page 104

Suggested readings

Ahn H, Wampold BE. Where oh where are the specific ingredients? A meta-analysis of component studies in counseling and psychotherapy. Journal of Counseling Psychology; 2001: 48, 251-257.
Blatt SJ, Sanislow CA, Zuroff DC, Pilkonis PA. Characteristics of effective therapists: Further analyses of data from the National Institute of Mental Health Treatment of Depression Collaborative Research Program. Journal of Consulting and Clinical Psychology; 1996: 64, 1276-1284.
Brown GS, Burlingame GM, Lambert MJ, et al. Pushing the quality envelope: A new outcomes management system. Psychiatric Services; 2001: 52(7), 925-934.
Brown GS, Herman R, Jones ER, Wu J. Improving substance abuse assessments in a managed care environment. Joint Commission Journal on Quality and Safety; 2004: 30(8), 448-454.
Brown GS, Jones ER, Betts W, Wu J. Improving suicide risk assessment in a managed-care environment. Crisis; 2003: 24(2), 49-55.
Brown GS, Jones ER, Lambert MJ, Minami T. Identifying highly effective psychotherapists in a managed care environment. American Journal of Managed Care; 2005: 11(8), 513-520.
Brown GS, Jones ER. Implementation of a feedback system in a managed care environment: What are patients teaching us? Clinical Psychology/In Session; 2005: 61(2), 187-198.
Burlingame GM, Jasper BW, Peterson G, et al. Administration and Scoring Manual for the YLSQ. Wilmington, DE: American Professional Credentialing Services; 2001.
Crits-Christoph P, Mintz J. Implications of therapist effects for the design and analysis of comparative studies of psychotherapies. Journal of Consulting and Clinical Psychology; 1991: 59, 20-26.
Crits-Christoph P, Baranackie K, Kurcias JS, et al. Meta-analysis of therapist effects in psychotherapy outcome studies. Psychotherapy Research; 1991: 1, 81-91.
Elkin I. A major dilemma in psychotherapy outcome research: Disentangling therapists from therapies. Clinical Psychology: Science and Practice; 1999: 6, 10-32.

Page 105

Suggested readings (continued)

Hannan C, Lambert MJ, Harmon C, Nielsen SL, Smart DW, Shimokawa K, Sutton SW. A lab test and algorithms for identifying clients at risk for treatment failure. Journal of Clinical Psychology/In Session; 2005: 61(2), 155-164.
Harmon C, Hawkins EJ, Lambert MJ, Slade K, Whipple JL. Improving outcomes for poorly responding clients: the use of clinical support tools and feedback to clients. 61(2), 175-186.
Huppert JD, Bufka LF, Barlow DH, Gorman JM, Shear MK, Woods SW. Therapists, therapist variables, and cognitive-behavioral therapy outcomes in a multicenter trial for panic disorder. Journal of Consulting and Clinical Psychology; 2001: 69, 747-755.
Kim DM, Wampold BE, Bolt DM. Therapist effects and treatment effects in psychotherapy: Analysis of the National Institute of Mental Health Treatment of Depression Collaborative Research Program. Psychotherapy Research; 2006: 12(2), 161-172.
Lambert MJ, Whipple J, Smart DW, Vermeersch DA, Nielsen SL, Hawkins EJ. The effects of providing therapists with feedback on patient progress during psychotherapy: Are outcomes enhanced? Psychotherapy Research; 2001: 11, 49-68.
Lambert MJ, Harmon C, Slade K, Whipple JL, Hawkins EL. Providing feedback to psychotherapists on their patients' progress: clinical results and practice suggestions. Journal of Clinical Psychology/In Session; 2005: 61(2), 165-174.
Lambert MJ, Hatfield DR, Vermeersch DA, et al. Administration and scoring manual for the LSQ (Life Status Questionnaire). East Setauket, NY: American Professional Credentialing Services; 2001.
Lambert MJ, Whipple JL, Hawkins EJ, et al. Is it time for clinicians to routinely track patient outcome? A meta-analysis. Clinical Psychology: Science & Practice; 2003: 10, 288-301.
Lambert MJ. Emerging methods for providing clinicians with timely feedback on treatment effectiveness. Journal of Clinical Psychology/In Session; 2005: 61(2), 141-144.

Page 106

Suggested readings (continued)

Luborsky L, Crits-Christoph P, McLellan T, et al. Do therapists vary much in their success? Findings from four outcome studies. American Journal of Orthopsychiatry; 1986: 56, 501-512.
Luborsky L, Rosenthal R, Diguer L, et al. The dodo bird verdict is alive and well--mostly. Clinical Psychology: Science & Practice; 2002: 9(1), 2-12.
Matsumoto K, Jones E, Brown GS. Using clinical informatics to improve outcomes: A new approach to managing behavioral healthcare. Journal of Information Technology in Health Care; 2003: 1(2), 135-150.
Okiishi J, Lambert MJ, Nielsen SL, Ogles BM. Waiting for supershrink: An empirical analysis of therapist effects. Clinical Psychology and Psychotherapy; 2003: 10, 361-373.
Porter ME, Teisberg EO. Redefining competition in health care. Harvard Business Review; 2004: 65-76.
Shapiro DA, Shapiro D. Meta-analysis of comparative therapy outcome studies: A replication and refinement. Journal of Consulting and Clinical Psychology; 1982: 92, 581-604.
Vermeersch DA, Lambert MJ, Burlingame GM. Outcome Questionnaire: Item sensitivity to change. Journal of Personality Assessment; 2002: 74, 242-261.
Wampold BE, Brown GS. Estimating therapist variability: A naturalistic study of outcomes in private practice. Journal of Consulting and Clinical Psychology; 2005: 75(5), 914-923.
Wampold BE, Mondin GW, Moody M, et al. A meta-analysis of outcome studies comparing bona fide psychotherapies: Empirically, "all must have prizes." Psychological Bulletin; 1997: 122, 203-215.

Page 107

About the presenter

G.S. (Jeb) Brown is a licensed psychologist with a Ph.D. from Duke University. He served as the Executive Director of the Center for Family Development from 1982 to 1987. He then joined United Behavioral Systems (a United Health Care subsidiary) as the Executive Director for Utah, a position he held for almost six years. In 1993 he accepted a position as the Corporate Clinical Director for Human Affairs International (HAI), at that time one of the largest managed behavioral healthcare companies in the country.

In 1998 he left HAI to found the Center for Clinical Informatics, a consulting firm specializing in helping large organizations implement outcomes management systems. Client organizations include PacifiCare Behavioral Health/ United Behavioral Health, Department of Mental Health for the District of Columbia, Accountable Behavioral Health Care Alliance, Resources for Living and assorted treatment programs and centers throughout the world.

Dr. Brown continues to work as a part-time psychotherapist at a behavioral health clinic in Salt Lake City, Utah. He does measure his outcomes.