“Scoring an Oral Simulation Exam” Elizabeth A. Witt, Ph.D., American Board of Emergency Medicine. Presented at the 2005 CLEAR Annual Conference, September 15-17, Phoenix, Arizona.


“Scoring an Oral Simulation Exam”

Elizabeth A. Witt, Ph.D.
American Board of Emergency Medicine

Presented at the 2005 CLEAR Annual Conference, September 15-17, Phoenix, Arizona


ABEM Certification Process

• Complete residency in Emergency Medicine

• Pass written certification examination

• Pass oral certification examination, a series of simulated patient encounters


Purpose of ABEM’s Oral Certification Examination

• Assess clinical performance

• Test the application of knowledge of Emergency Medicine


Structure of ABEM’s Oral Certification Examination

• 7 simulations based on actual clinical cases
– 1 field test simulation
– 4 single patient encounters
– 2 multiple patient encounters

• One-on-one; 7 different examiners

• Examiner introduces each case and may play the role of patient, nurse, consultant, etc.


Scoring

• Done by the examiner during and immediately after the session


Examiner Qualifications

• ABEM diplomate for at least 5 years

• Residency trained in EM (ACGME- or RCPSC-approved)

• Actively involved in the practice of clinical Emergency Medicine

• Nominated in writing by current examiner, director, or senior director


Examiner Qualifications, cont.

• Evaluated and recommended by ABEM’s Test Administration Committee

• Appointed by the Board of Directors

• Distinguished for high quality patient care, teaching, research, or leadership


Examiner Training Before Exam

• Focus on standardizing the delivery and scoring of each case

• Demonstrations

• Training video

• Scoring practice with feedback

• Case presentation practice with feedback and coaching

• The “what-ifs”


Examiner Training During Exam

• Observe a real simulation first

• Written materials support each case

• Observed by experienced examiner early

• End of 1st day – group discussion and individual feedback, coaching

• Scoring sheets and notes reviewed by chief examiners

• Ongoing discussions, feedback, mentoring


Standardized Ratings

• 8 performance criteria: rated on a scale of 1 to 8

• Critical actions: Yes/No

• Dangerous action


Performance Criteria

• Data Acquisition
• Problem Solving
• Patient Management
• Resource Utilization
• Health Care Provided (Outcome)
• Interpersonal Relations
• Comprehension of Pathophysiology
• Clinical Competence (Overall)


Final Score and Pass/Fail

• Only performance criteria ratings are used

• Two ways to pass:

1. Grand mean of all performance criteria scores > 5.75

2. Case score standard: the case score is the mean of the performance criteria ratings for that case.

The highest and lowest case scores are averaged.

If that high-low average AND all of the remaining case scores are at least 5.0, the candidate passes.
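The two standards can be sketched in Python. This is an illustration built from the slides only; the data layout, the function name, and the treatment of exactly 5.0 as passing (implied by the "5+ is acceptable" slide later in the deck) are assumptions, not ABEM's actual implementation:

```python
def passes_oral_exam(case_ratings, grand_mean_cut=5.75, case_cut=5.0):
    """Apply the two passing standards described above.

    case_ratings: one list of criterion ratings (scale 1-8) per scored case.
    """
    # Standard 1: grand mean of every individual rating exceeds 5.75.
    all_ratings = [r for case in case_ratings for r in case]
    if sum(all_ratings) / len(all_ratings) > grand_mean_cut:
        return True

    # Standard 2: average the highest and lowest case scores; pass if that
    # average and every remaining case score are at least 5.0.
    case_scores = sorted(sum(c) / len(c) for c in case_ratings)
    hi_lo_avg = (case_scores[0] + case_scores[-1]) / 2
    return hi_lo_avg >= case_cut and all(s >= case_cut for s in case_scores[1:-1])


# Case scores of 5, 6, 5, 5, 5, 4: the high and low average to 5,
# the remaining scores are all 5, so the case-score standard is met.
print(passes_oral_exam([[5], [6], [5], [5], [5], [4]]))  # True
```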


Example

1. Grand Mean Standard

Sum of performance criteria ratings = 375

Number of ratings = (8 × 4) + (18 × 2) = 68

Grand mean = 375 / 68 = 5.51

Is 5.51 > 5.75? No → Fail
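The grand-mean arithmetic can be checked directly (a sketch; the rating counts come from the example, with 4 single-patient cases rated on 8 criteria and 2 multiple-patient cases contributing 18 ratings each):

```python
# Number of ratings: (8 criteria x 4 single-patient cases) plus
# (18 ratings x 2 multiple-patient cases), per the example.
num_ratings = (8 * 4) + (18 * 2)   # 68
grand_mean = 375 / num_ratings     # sum of ratings / number of ratings
print(round(grand_mean, 2))        # 5.51
print(grand_mean > 5.75)           # False -> Fail
```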


Example

2. Case Score Average (High-Low) Standard

Mean of performance ratings for each case = 4.75, 5.23, 5.42, 5.75, 5.83, 6.08

High-Low Mean = (4.75 + 6.08) / 2 = 5.42

Are the high-low mean and the remaining case scores (5.23, 5.42, 5.75, 5.83) all at least 5.00?

Yes → Pass
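The same check in Python, using the case scores from the example (an illustration, not ABEM's code):

```python
# Case-score (high-low) standard applied to the example's six case scores.
case_scores = [4.75, 5.23, 5.42, 5.75, 5.83, 6.08]

hi_lo = (min(case_scores) + max(case_scores)) / 2   # (4.75 + 6.08) / 2 = 5.415, i.e. 5.42 rounded
remaining = sorted(case_scores)[1:-1]               # the four middle case scores

passed = hi_lo >= 5.00 and all(s >= 5.00 for s in remaining)
print(passed)   # True -> Pass
```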


Why Two Ways to Pass?

• A standard-setting study using Bayesian procedures produced the 5.75 cut score

• BUT a rating of 5 or higher is "acceptable performance"

• AND any single rating is subject to measurement error, flukes, etc.

• So case scores of 5, 6, 5, 5, 5, 4 = Pass (the high and low scores average to 5, and every remaining score is 5)


A High Quality Exam

• Interrater Reliability
– 97% agreement on Critical Actions
– 95% agreement on "Acceptable/Unacceptable"
– 94% of all performance criteria ratings within 1 point

• Discriminant Validity: scores distinguish among groups of differing training
1. Residency-trained physicians
2. Physicians not trained via residency
3. Residents beginning 2nd year of residency
4. 4th-year medical students

• Correlation with written multiple-choice exam = .77

• Predictive Validity: the oral exam predicts performance better than the written exam does
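As a sketch, a "within 1 point" agreement figure like the one reported above is computed over paired ratings of the same performances. The data here are hypothetical, for illustration only, not ABEM's:

```python
# Hypothetical paired ratings (scale 1-8) from two examiners scoring
# the same performances; illustrative data only.
examiner_a = [6, 5, 7, 4, 6, 5, 8, 6]
examiner_b = [6, 6, 7, 5, 4, 5, 7, 6]

# Count rating pairs that differ by at most one point.
within_one = sum(abs(a - b) <= 1 for a, b in zip(examiner_a, examiner_b))
agreement = within_one / len(examiner_a)
print(agreement)  # 0.875, i.e. 7 of 8 pairs within 1 point
```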


Benefits of Scoring Procedure

• Stable pass rates

• High involvement of EM community

• Checks and balances

• Standardized, yet flexible assessment


Speaker Contact Information

Elizabeth A. Witt, Ph.D.

American Board of Emergency Medicine

East Lansing, MI

517-332-4800

ewitt@ABEM.org

www.ABEM.org
