Medical Student Assessment in 2015
Dr Mark Gurnell
Clinical Sub-Dean & Assessment Lead
The Deakin Centre, Addenbrooke's Clinical Skills Unit
University of Cambridge, Institute of Metabolic Science
Assessment
Aims of the session
• define assessment (and distinguish it from evaluation and feedback)
• review basic concepts in assessment
• consider different methods of assessment ('horses for courses')
• discuss new developments and challenges at a national level
• (try it out in the workshop!)
Assessment
• What is assessment; how does it differ from evaluation and feedback?
• What are the different types of assessment?
• Why do we need different methods of assessment?
Assessment
"The use of a range of methods to determine the attainment of learners."
"The processes and instruments applied to measure the learner's achievements – normally after he/she has worked through a module/programme of study."
i.e. assessment is about testing the learners:
• usually judged against specified criteria
• essentially a judgment on performance
Wall D. In: Understanding Medical Education (ed. T Swanwick), 1st edition, Wiley-Blackwell, 2010.
Evaluation
"A measurement of how a particular educational input, course or programme has performed."
"A systematic approach to the collection, analysis and interpretation of information about any aspect of the conceptualisation, design, implementation and utility of an educational programme."
i.e. evaluation concerns the trainers/modules/courses/etc.
NB: in the US, 'evaluation' may equate to 'assessment'!
Wall D. In: Understanding Medical Education (ed. T Swanwick), 1st edition, Wiley-Blackwell, 2010.
Feedback
"[The] mechanism through which learners are offered insight into their actions and the consequences thereof"
• an essential part of the learning process
• allows the learner to identify areas of strength and areas for potential improvement
• can be formal or informal
• part of the dialogue between learner and teacher
i.e. feedback alerts learners to the gap between their current knowledge/skills and the desired goal.
Aligning assessment with learning
• The assessment method needs to be aligned:
− to the learning outcomes
− with the educational level
[Figure: constructive alignment — the intended learning outcomes drive both the learning & teaching activities (designed to meet the learning outcomes) and the assessment methods (designed to assess the learning outcomes)]
Do not underestimate the power of assessment to drive learning!
Biggs 1999. Constructive alignment. HEA.
Assessment concepts / 'jargon' (1)
• Summative assessment
• Formative assessment
• Reliability
• Validity
• Educational impact
• Cost-effectiveness
• Acceptability
• Blueprinting
Summative assessment
• Measures achievement of learning goals at the end of a programme of study
• Formal
• Used to determine progression
• "High stakes" when there are implications for graduation or progression to the next stage of training
• Typically associated with little or no feedback
Formative assessment
• Specifically designed to provide information on progress
• Ideally:
− on-going and frequent
− informal
− non-judgemental
• Enables learners to identify strengths and weaknesses
• May take many formats
• Feedback is central to all forms of formative assessment
Reliability
• Reflects the reproducibility of the assessment tool, and therefore the accuracy with which a score is measured
• i.e. the consistency of marks obtained by the same individuals when re-examined with the same test on different occasions, or with different sets of equivalent test items
• Higher in certain exam formats (e.g. single best answer); lower in others (e.g. OSCE)
• Quantitative; represented by the statistic Cronbach's alpha (see the sketch below)
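For illustration only — using hypothetical marks and the standard formula alpha = k/(k − 1) × (1 − Σ item variances / variance of totals) — a minimal Python sketch of how Cronbach's alpha can be computed from a candidates-by-items mark matrix:

import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (candidates x items) matrix of item marks."""
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of candidates' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical marks: 5 candidates x 4 items
marks = np.array([[3, 4, 3, 5],
                  [2, 2, 3, 3],
                  [4, 5, 4, 5],
                  [1, 2, 2, 2],
                  [3, 3, 4, 4]])
print(f"alpha = {cronbach_alpha(marks):.2f}")

Values of around 0.8 or higher are often quoted as the target for high-stakes exams.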
Factors affecting reliability
• Length of test (see the sketch below)
• Number of items
• Spread of scores
• Level of difficulty
• Objectivity of marking
• Number of examiners
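The effect of test length can be made concrete with the Spearman–Brown prophecy formula from classical test theory (not on the slide itself, but a standard result): lengthening a test n-fold with equivalent items raises reliability ρ to nρ / (1 + (n − 1)ρ). A minimal sketch:

def spearman_brown(rho: float, n: float) -> float:
    """Predicted reliability when test length is changed n-fold
    (Spearman-Brown prophecy formula)."""
    return n * rho / (1 + (n - 1) * rho)

# Doubling a test whose reliability is 0.70:
print(f"{spearman_brown(0.70, 2):.2f}")   # -> 0.82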
Validity
• Reflects the accuracy with which a test measures what it purports to measure
• i.e. are you measuring what you think you are measuring?
• Qualitative; "real world"
• Evaluates the authenticity of an assessment and its fitness for purpose
• Different 'subtypes':
− content validity: relation to curriculum content
− face validity: 'real life' nature of the assessment
− construct validity: the test discriminates well between candidates of different abilities
Reliability & validity – drivers for change…
• Historically, some modes of assessment exhibited particular failings with respect to reliability and validity:
• http://www.youtube.com/watch?v=2wob1_LrnU0
Other assessment concepts
• Blueprinting: ensure the assessment samples the curriculum
• Educational impact: is it worth measuring?
• Costs: can you afford to measure it?
• Weighting: what aspects are most valuable to you?
These combine in the utility equation:
U = Rw × Vw × Ew × Cw × A, where A = acceptability
If nobody will do it (A = 0), then the rest just doesn't matter! (see the sketch below)
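A minimal sketch (hypothetical scores on a 0–1 scale; for simplicity each weighting w is folded into its factor) showing why the multiplicative form lets acceptability veto everything else:

def utility(r, v, e, c, a):
    """Multiplicative utility model: weighted reliability (r), validity (v),
    educational impact (e), cost-effectiveness (c) and acceptability (a).
    Any factor at zero drives overall utility to zero."""
    return r * v * e * c * a

print(utility(0.9, 0.8, 0.7, 0.6, 1.0))   # a workable assessment: 0.3024
print(utility(0.9, 0.8, 0.7, 0.6, 0.0))   # nobody will sit it -> 0.0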
Knowledge, competence and performance: "Miller's Triangle"
[Figure: Miller's pyramid — Knows → Knows how → Shows how → Does — with professional authenticity increasing towards the apex]
Miller GE. The assessment of clinical skills/competence/performance. Academic Medicine (Supplement) 1990; 65: S63–S67.
Successive layers of the pyramid separate cognition from behaviour:
• Knows / Knows how: cognition
• Shows how / Does: behaviour
…and the corresponding assessment modalities:
• Knows / Knows how: written/computer-based
• Shows how / Does: performance/'hands on'
Matching assessment methods to each level:
• Facts: 'old style' MCQ, 'True/False' questions, SAQ/essay
• (Clinical) context: SBA, EMQ, SAQ
• Performance 'in vitro': OSCE, SCEE
• Performance 'in vivo': faculty observation, patient reports
Importance of Final MB Assessment
• To ensure minimum standards are achieved to allow safe progression:
− to award the medical degree to students who meet the University's standards (University interest)
− to distinguish between the competent and the insufficiently competent (public interest)
− to certify that graduates are suitable for provisional registration (GMC interest)
− to ensure graduates are fit to undertake FY posts (NHS interest)
UK medical graduates – uniformity?
• Graduates from UK medical schools obtain provisional GMC registration without having to pass a national licensing exam
• The GMC requires that 'There will be systems in place to set appropriate standards for assessment to decide whether students have achieved the curriculum outcomes'
GMC, Tomorrow's Doctors, London: GMC, 2009.
UK medical graduates – uniformity?
• UK medical schools design and implement their own programmes of assessment
• The external examiner system is used to provide reassurance that these standards are comparable
• Limited evidence suggests passing standards for clinical examinations may vary across schools
Boursicot, Roberts & Pell, 2007. Med Educ, 41:1024–31.
Chesser et al, 2009. Med Educ, 43:526–32.
Boursicot, Roberts & Pell, 2006. Adv Health Sci Educ Theory Pract, 11:173–83.
Passing standards in medical schools
Consequences of differing standards:
• False positives – a school has significantly lower passing standards:
− a small number of graduates are not sufficiently competent
− 'endangers' patient safety
• False negatives – a school has significantly higher passing standards:
− a small number of graduates are competent yet fail Final MB
− 'unfair' on students
National assessment initiatives
• Demonstration of comparable passing standards for the written component of Final MB (GMC requirement):
− Medical Schools Council Assessment Alliance 'Common Content' project
• Prescribing Safety Assessment (PSA):
− Medical Schools Council Assessment Alliance
− British Pharmacological Society
• National Licensing Examination (NLE):
− 'Passport to Practise'
MSCAA Common Content Project
• Creation of a national bank of high-quality single best answer (SBA) questions
• 'Common content' items provided to participating schools
• Standard setting undertaken at individual school level (illustrated below)
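The slides do not say which standard-setting method each school uses; as one common example, here is a minimal sketch of a (modified) Angoff calculation — entirely hypothetical data — in which judges estimate the probability that a borderline ('just-passing') candidate answers each SBA item correctly, and the pass mark is the sum of the per-item means:

import numpy as np

# Hypothetical Angoff judgements: judges x items, each entry the estimated
# probability that a borderline candidate answers the item correctly.
ratings = np.array([[0.6, 0.4, 0.8, 0.5, 0.7],   # judge 1
                    [0.5, 0.5, 0.7, 0.6, 0.6],   # judge 2
                    [0.7, 0.3, 0.9, 0.5, 0.8]])  # judge 3

pass_mark = ratings.mean(axis=0).sum()   # expected total score of a borderline candidate
print(f"pass mark = {pass_mark:.1f} out of {ratings.shape[1]}")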
National Licensing Examination
GMC, Council, London: Sept 2014.
Conclusions
• Delivery of a high-quality assessment programme is both time- and resource-intensive, and requires a clear understanding of the fundamentals of assessment
• At a time when NHS service pressures are increasing, external scrutiny and expectations are rising
• The move towards the use of mandatory 'common content' material and/or an NLE is gathering pace
……so watch this space!
Questions?
Medical Student Assessment in 2015