2. The Medical Education Assessment Advisory Committee
• Georges Bordage
• Craig Campbell
• Robert Galbraith
• Shiphra Ginsburg
• Eric Holmboe
• Glenn Regehr
6. “We demand ever more of the latter even as we bemoan its dominance, and yearn for the former while remaining wary of its inefficiencies and lack of uniformity.”
Humanistic expectations vs. knowledge demands
(Anderson, 2011)
10.
                                               % prior to    % after       p-value
                                               blueprint     blueprint
                                               publication   publication
Exam performance                               72.5          76.9          ns
Agreement that exam tested material taught     63.5          81.3          < 0.01
Agreement that evaluation methods
  reflected subject matter                     60.7          81.5          < 0.01
Agreement that exam was fair                   58.0          76.9          < 0.05
(McLaughlin et al., 2005)
11. General Theme
• Broad, competency-based assessment frameworks can promote performance improvement rather than simply measuring performance
• They create professional culture through conversation, understanding, and steering
15. Implicit Messages:
• Expertise is something that can be achieved
• The goal is to “become independent”
Risk:
• Assessments as hurdles
• Hesitation to disclose difficulties
• Reduced sense of obligation to offer support
(see MEAAC Report)
16. Goal Theory
Performance Orientation               Mastery Orientation
• Desire to perform well              • Desire to become proficient
• Satisfaction derived from grades    • Deeper engagement
• Greater anxiety                     • Greater perseverance
• Task avoidance                      • Stronger motivation
(see Teunissen and Bok, 2013)
26. 1. Overcoming Unintended Consequences
Gist:
• Reduce emphasis on exams as point-in-time hurdles that prove one’s competence
• Promote the notion that trainees are equally accountable for demonstrating their learning
27. 1. Overcoming Unintended Consequences
Strategy:
• Build quality improvement activities into assessment practices
• Use data from the licensing process to facilitate the formulation of learning plans
  – And further develop the system to enforce follow-through
28. 1. Overcoming Unintended Consequences
Examples:
• OSCE/CDM components that require candidates to follow up on an error made; ask for help; use clinical decision supports
• Feed back intra-candidate relative strengths and weaknesses and require generation of a learning plan
• Tailor subsequent assessments to identified weaknesses
29. 2. Turning Quality Assurance into Quality Improvement
Gist:
• Reduce the tension between high-stakes licensing assessment and genuine investment in improvement
30. 2. Turning Quality Assurance into Quality Improvement
Strategy:
• Further integrate assessment practices across the continuum of learning with deliberate attention paid to (and reward of) quality improvement
31. 2. Turning Quality Assurance into Quality Improvement
Examples:
• Create a formative test-tailoring platform for use by schools/individuals
• Support a national “Diagnostic OSCE” late in undergraduate training that can feed data to subsequent stages of training/practice/assessment
• Testing moments that require demonstration of response to data (e.g., an OSCE station in which candidates bring personal data)
32. 3. Ensuring Authenticity
Gist:
• Assessment that models the realities of actual practice increases credibility and engagement, and ensures that efforts towards gamesmanship are pedagogically valuable
33. 3. Ensuring Authenticity
Strategy:
• Portfolio-supported workplace-based assessment
• Increasing use of real-world supports and real-world uncertainties in current practices
34. 3. Ensuring Authenticity
Examples:
• Sequential OSCE stations; standardized patients (SPs) trained to offer contradictory information mid-station
• Post-encounter probes that require reflection on why the approach was appropriate and why alternative actions were ruled out
• Internet-enabled OSCEs
35. Completing the Puzzle
• Broaden the base of assessment
• Build a coherent and integrated system
• Emphasize the primacy of learning
• Harness the power of feedback
• Share accountability with the individual