Page 1:

Tools and Tips for Learner Assessment and Evaluation in the Emergency Department

Heather Patterson, PGY-4

April 28, 2010

Page 2:

Objectives

1. What should we be assessing?

2. What is the best method of assessment?

3. What factors influence assessment?

4. What are the tools available to evaluate learners?

5. Tips for delivering feedback.

Page 3:

Objectives

1. What should we be assessing?
   • Brief review of CanMEDS

2. What is the best method of assessment?

3. What factors influence assessment?

4. What are the tools available to evaluate learners?

5. Tips for delivering feedback.

Page 4:

What should we assess?


Page 5:

Objectives

1. What should we be assessing?

2. What is the best method of assessment in the ED?
   • Direct Observation

3. What factors influence assessment?

4. What are the tools available to evaluate learners?

5. Tips for delivering feedback.

Page 6:

Direct Observation

• Why bother?
   – Sherbino et al 2008
   – Hobgood et al 2008
   – Cydulka 1996

• What counts?

Page 7:

Direct Observation

• Challenges
   – Hawthorne effect
   – ED flow and patient care
   – Teaching responsibilities

Page 8:

Direct Observation

• Formalized direct observation program
   – Pittsburgh EM residency program
      • Dorfsman et al 2009

• How did they evaluate resident performance?
   – Standardized Direct Observation Tool (SDOT)
      • Shayne et al 2002 and 2006, LaMantia et al 2002
   – Reliable? Valid? (see the note below)
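
• Note (added for context; not necessarily the statistic reported in the studies above): a common way to quantify a tool's inter-rater reliability is Cohen's kappa, which corrects the observed agreement between two raters for chance agreement: κ = (p_o − p_e) / (1 − p_e). For example, if two staff agree on 85% of checklist items and chance agreement is 60%, then κ = (0.85 − 0.60) / (1 − 0.60) ≈ 0.63, conventionally read as substantial agreement.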

Page 9:

Direct Observation

• Take home:
   – Best method for the assessment of true behaviour

– It may be worthwhile to do some “behind the curtain” assessments to minimize the Hawthorne effect

– Can be used to guide feedback and to give more representative evaluations

– Opportunity exists for development of reliable and valid checklist tools to assess resident performance in the ED

Page 10:

Objectives

1. What should we be assessing?

2. What is the best method of assessment?

3. What factors influence assessment?
   • Pitfalls of learner assessment

4. What are the tools available to evaluate learners?

5. Tips for delivering feedback.

Page 11:

Evaluation vs Feedback

• Evaluation:
   – Formal assessment of how the learner has performed.

Page 12:

Evaluation vs Feedback

• Feedback:
   – Designed to make a learner aware and accepting of strengths and weaknesses, and to help guide future learning.

Page 13:

Pitfalls of assessment

• Hawk vs. Dove
   – Know your tendencies for how you evaluate
   – Acknowledge your subjective expectations for a particular domain of assessment
      • Cydulka et al 1996

A practical guide for medical teachers. Dent 2005

Page 14:

Pitfalls of assessment

• Halo vs. millstone effect
   – Well documented and accepted as a source of bias in learner evaluation

A practical guide for medical teachers. Dent 2005

Page 15:

Pitfalls of assessment

• Leniency bias
   – Bandiera et al 2008

Page 16:

Pitfalls of assessment

• Leniency bias and range restriction
   – Jouriles et al 2002
      • The lowest score was never given, despite previously identified problems

Page 17:

Pitfalls of assessment

• Possible reasons for leniency bias and range restriction
   – Dudek et al 2005
      • Lack of documentation of specific events
      • Lack of knowledge about what to document
      • Anticipation of an appeal process
      • Lack of remediation options
   – Jouriles et al 2002
      • Avoidance of negative interactions
      • Fear of a negative teaching evaluation
      • Worry about the time commitment needed to justify the evaluation
      • Worry about time requirements and potential responsibility for remediation
   – Gray et al 1996
      • Weaknesses inherent to the ITER as an evaluation tool
      • Lack of training on proper use of the ITER or other assessment tools used

Page 18:

Pitfalls of assessment

• Take home points:
   – Be aware of your pre-existing perceptions about the learner

– Be aware of your biases

– Don’t be afraid to give a representative evaluation

Page 19:

Objectives

1. What should we be assessing?

2. What is the best method of assessment?

3. What factors influence assessment?

4. What are the tools available to evaluate learners?
   • ITER
   • Encounter Cards
   • 360-degree feedback
   • Checklists

5. Tips for Delivering Feedback

Page 20:

ITER/Global Rating Forms

Page 21:

ITER/Global Rating Forms

• Pros:
   – Ease of administration
   – Allows for longitudinal assessments
      • Sherbino et al 2008

• Cons:
   – Bias introduced into evaluation
      • Recall
      • Halo/millstone
      • Leniency and range restriction
   – Sherbino et al 2008
   – A practical guide for medical teachers, Dent 2005
   – Gray et al 1996

Page 22:

ITER/Global Rating Forms

• Cons (cont):
   – Poor reliability
   – Poor discrimination between constructs or behaviours
      • Donnon et al (not yet published)
      • Silber et al 2004

• Take home:
   – Residents:
      • Deliver ITERs earlier to minimize recall bias
      • Tell staff you are sending them
   – Staff:
      • Be as objective as possible and include written comments
      • Be aware of bias

Page 23:

Daily Encounter Cards

Page 24:

Daily Encounter Cards

• Pros
   – Less recall bias
   – Can be structured to facilitate evaluation of the CanMEDS roles
      • Bandiera et al 2008

• Cons
   – Leniency bias
   – Recall bias
   – Needs further reliability and validity assessment
      • Kim et al 2005
      • Paukert et al 2002
      • Brennan et al 1997

Page 25:

Multisource Feedback (MSF)

• Pros
   – Possibly a more representative assessment of teamwork, leadership, communication, collaboration, and professionalism
      • Sherbino et al 2008
   – Possible stimulus for positive change
      • Lockyer 2003

• Cons
   – No “true” MSF research in post-graduate medical education
      • Rodgers et al 2002
   – Numbers required to achieve reliability (see the note below)
      • Wood et al 2006
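
• Note (a generic psychometric illustration, not necessarily the analysis in Wood et al 2006): the Spearman-Brown formula estimates how reliability grows with the number of raters, R_k = k·r / (1 + (k − 1)·r), where r is the reliability of a single rater and k is the number of raters. If a single rater achieves r = 0.30, then with k = 8 raters R_8 = (8 × 0.30) / (1 + 7 × 0.30) ≈ 0.77, which is why MSF typically needs many respondents before its scores become dependable.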

Page 26:

Multisource Feedback (MSF)

• Take home:
   – Input from allied health professionals, colleagues, and patients may contribute to a more complete assessment of resident competencies if done appropriately
   – Caution: introduction of bias, and questionable reliability if based on only a few comments

Page 27:

Checklists

Page 28:

Checklists

• Pros
   – No recall bias, +/- reduced leniency bias
   – Over 55 published tools for use during direct observation of clinical behaviour
      • Kogan et al 2009

• Cons
   – Evaluates specific behaviours, NOT global performance
      • ACGME toolbox of assessment methods 2000
   – Extensive process to develop a reliable, valid tool
      • Cooper et al 2010
   – Requires direct observation without interference
      • Dorfsman et al 2009
      • Shayne et al 2006

Page 29:

Checklists

• Take home points:
   – Good for assessment of specific behaviours, e.g. leadership
   – Extensive process to develop a tool
   – Significant research potential in this area

Page 30:

Objectives

1. What should we be assessing?

2. What is the best method of assessment?

3. What factors influence assessment?

4. What are the tools available to evaluate learners?

5. Tips for Delivering Feedback

Page 31:

Types of Feedback

• Brief

• Formal

• Major

Page 32:

Tips for Effective Feedback

• Timing and location

• Feedback on your performance

• Learner self-assessment

Page 33:

Tips for Effective Feedback

• Feedback content

Page 34:

Take Home Messages

• Direct observation represents the highest fidelity measurement of true behaviour

• Feedback and evaluation are different processes and have different goals

• Be aware of your biases and the limitations of the evaluation tools
   – Hawk vs Dove
   – Halo vs Millstone effect
   – Recall bias
   – Leniency and range restriction

• Feedback should be specific and identify modifiable behaviours

Page 35:

References

(1) Dorfsman ML, Wolfson AB. Direct observation of residents in the emergency department: a structured educational program. Acad Emerg Med 2009 Apr;16(4):343-351.
(2) Sherbino J, Bandiera G, Frank JR. Assessing competence in emergency medicine trainees: an overview of effective methodologies. CJEM 2008 Jul;10(4):365-371.
(3) Hobgood CD, Riviello RJ, Jouriles N, Hamilton G. Assessment of communication and interpersonal skills competencies. Acad Emerg Med 2002;9(11):1257-1269.
(4) Jouriles NJ, Emerman CL, Cydulka RK. Direct observation for assessing emergency medicine core competencies: interpersonal skills. Acad Emerg Med 2002 Nov;9(11):1338-1341.
(5) Kogan JR, Holmboe ES, Hauer KE. Tools for direct observation and assessment of clinical skills of medical trainees: a systematic review. JAMA 2009 Sep 23;302(12):1316-1326.
(6) Andersen PO, Jensen MK, Lippert A, Ostergaard D, Klausen TW. Development of a formative assessment tool for measurement of performance in multi-professional resuscitation teams. Resuscitation 2010 Mar 24.
(7) Kim J, Neilipovitz D, Cardinal P, Chiu M. A comparison of global rating scale and checklist scores in the validation of an evaluation tool to assess performance in the resuscitation of critically ill patients during simulated emergencies (abbreviated as "CRM simulator study IB"). Simul Healthc 2009 Spring;4(1):6-16.
(8) Norcini JJ, Blank LL, Duffy FD, Fortna GS. The mini-CEX: a method for assessing clinical skills. Ann Intern Med 2003 Mar 18;138(6):476-481.
(9) Cooper S, Cant R, Porter J, Sellick K, Somers G, Kinsman L, et al. Rating medical emergency teamwork performance: development of the Team Emergency Assessment Measure (TEAM). Resuscitation 2010 Apr;81(4):446-452.
(10) Fink A, Kosecoff J, Chassin M, Brook RH. Consensus methods: characteristics and guidelines for use. Am J Public Health 1984 Sep;74(9):979-983.
(11) Morgan PJ, Lam-McCulloch J, Herold-McIlroy J, Tarshis J. Simulation performance checklist generation using the Delphi technique. Can J Anaesth 2007 Dec;54(12):992-997.
(12) Lockyer J, Singhal N, Fidler H, Weiner G, Aziz K, Curran V. The development and testing of a performance checklist to assess neonatal resuscitation megacode skill. Pediatrics 2006 Dec;118(6):e1739-44.
(13) Ringsted C, Ostergaard D, Ravn L, Pedersen JA, Berlac PA, van der Vleuten CP. A feasibility study comparing checklists and global rating forms to assess resident performance in clinical skills. Med Teach 2003 Nov;25(6):654-658.
(14) Friedman Z, Katznelson R, Devito I, Siddiqui M, Chan V. Objective assessment of manual skills and proficiency in performing epidural anesthesia--video-assisted validation. Reg Anesth Pain Med 2006 Jul-Aug;31(4):304-310.
(15) Morgan PJ, Cleave-Hogg D, Guest CB. A comparison of global ratings and checklist scores from an undergraduate assessment using an anesthesia simulator. Acad Med 2001 Oct;76(10):1053-1055.
(16) Morgan PJ, Cleave-Hogg D, DeSousa S, Tarshis J. High-fidelity patient simulation: validation of performance checklists. Br J Anaesth 2004 Mar;92(3):388-392.
(17) Wright MC, Phillips-Bute BG, Petrusa ER, Griffin KL, Hobbs GW, Taekman JM. Assessing teamwork in medical education and practice: relating behavioural teamwork ratings and clinical performance. Med Teach 2009 Jan;31(1):30-38.
(18) Jefferies A, Simmons B, Tabak D, McIlroy JH, Lee KS, Roukema H, et al. Using an objective structured clinical examination (OSCE) to assess multiple physician competencies in postgraduate training. Med Teach 2007 Mar;29(2-3):183-191.
(19) Cydulka RK, Emerman CL, Jouriles NJ. Evaluation of resident performance and intensive bedside teaching during direct observation. Acad Emerg Med 1996;3(4):345-351.
(20) Shayne P, Heilpern K, Ander D, Palmer-Smith V, Emory University Department of Emergency Medicine Education Committee. Protected clinical teaching time and a bedside clinical evaluation instrument in an emergency medicine training program. Acad Emerg Med 2002 Nov;9(11):1342-1349.
(21) Shayne P, Gallahue F, Rinnert S, Anderson CL, Hern G, Katz E, et al. Reliability of a core competency checklist assessment in the emergency department: the Standardized Direct Observation Assessment Tool. Acad Emerg Med 2006 Jul;13(7):727-732.
(22) LaMantia J, Panacek EA. Core Competencies Conference: executive summary. Acad Emerg Med 2002;9(11):1213-1215.
(23) Bandiera G, Lendrum D. Daily encounter cards facilitate competency-based feedback while leniency bias persists. CJEM 2008 Jan;10(1):44-50.
(24) Paukert JL, Richards ML, Olney C. An encounter card system for increasing feedback to students. Am J Surg 2002 Mar;183(3):300-304.
(25) Kim S, Kogan JR, Bellini LM, Shea JA. A randomized-controlled study of encounter cards to improve oral case presentation skills of medical students. J Gen Intern Med 2005 Aug;20(8):743-747.
(26) Brennan BG, Norman GR. Use of encounter cards for evaluation of residents in obstetrics. Acad Med 1997 Oct;72(10 Suppl 1):S43-4.
(27) Dudek NL, Marks MB, Regehr G. Failure to fail: the perspectives of clinical supervisors. Acad Med 2005 Oct;80(10 Suppl):S84-7.
(28) Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach 2007 Sep;29(7):642-647.
(29) Zibrowski EM, Singh SI, Goldszmidt MA, Watling CJ, Kenyon CF, Schulz V, et al. The sum of the parts detracts from the intended whole: competencies and in-training assessments. Med Educ 2009 Aug;43(8):741-748.
(30) Epstein RM. Assessment in medical education. N Engl J Med 2007 Jan 25;356(4):387-396.
(31) Gray JD. Global rating scales in residency education. Acad Med 1996 Jan;71(1 Suppl):S55-63.

Page 36:

Quinn!