Using MCDA for HTA, Opportunities, Challenges and Possible Ways Forward
1. Martina Garau and Nancy Devlin
2. Agenda
• Introduction
• Why do we need MCDA in Health Technology
Assessment (HTA)?
• Examples of applications of MCDA in HTA
• Critical issues:
– Do criteria and weights need to be ‘fixed’?
– Whose criteria?
– Whose preferences for weighting the criteria?
– How to factor in opportunity cost?
– How can uncertainty be addressed?
• Conclusions
This presentation is based on a book chapter (Garau and Devlin, forthcoming)
3. Introduction
• Many countries have developed/are developing collectively-
funded health care systems to ensure universal coverage and
access to health care for their populations
• HTA on individual or groups of technologies can be used to
allocate limited funds efficiently
• However, existing HTA processes vary in their remit and their
objectives
– QALY-maximisers subject to a budget constraint, with explicit or implicit
consideration of the opportunity cost of new technologies
– HTA systems that do not consider costs (e.g. the US) and focus on
relative/comparative effectiveness
4. Why do we need MCDA in HTA? (1)
• Health care systems face multiple objectives that might go beyond
improvements in population health
• HTA systems vary in how far they are explicit and consistent in considering
multiple elements of value
→ Policy initiatives suggest a need for approaches that can take multiple
criteria into account simultaneously and systematically (e.g. VBA in the UK;
GPS-Health in a global context – Norheim et al., 2014)
• Increasingly, a wide range of stakeholders, including patients and clinicians, have
been involved in HTA (e.g. PACE in Scotland)
→ But how do stakeholders’ views influence final decisions?
→ How can stakeholders’ views be taken into account systematically and
weighed up against other types of evidence?
6. Why do we need MCDA in HTA? (2)
• Weighing up complex information is cognitively demanding
– Literature shows that individuals are subject to “cognitive bias” (add
reference)
– Deliberative processes are influenced by group dynamics and factors
including chairing styles and dominant people
→ “the preferred options identified by MCDA are likely to out-perform the
use of intuitive judgement alone” (Devlin and Sussex, 2011)
7. Growing interest in MCDA in HTA
• NICE:
– ‘Structured decision making’ included for the first time in the 2013
methods review
– Exploration of its use in clinical guidelines
– Explicit criteria for new highly specialised technologies process
– Public health
• Some examples of one-off uses but no current systematic use:
– Israeli Health Basket Committee (Golan and Hansen, 2012)
– Italy (Radaelli et al., 2014)
– Thailand (Youngkong et al., 2012)
– Germany (Danner et al., 2011)
9. MCDA in Italian region Lombardia
• For the implementation of new health technologies, the
Lombardia region in Italy has introduced a system combining
elements of the EUnetHTA Core model (for the assessment)
and of an MCDA approach (EVIDEM) as a decision-making aid
• The MCDA framework includes 9 broad dimensions and 20
criteria, including disease-, treatment-, financial- and social-
related aspects
• The approach has been deemed successful and has been used for 26
technologies (Radaelli et al., 2014)
11. Thai pilot (2)
• Assessment and appraisal of selected interventions based on
– Value for money (incremental cost effectiveness ratio against a threshold)
– Budget impact
Source: Youngkong et al., 2012
12. IQWiG pilots
• Aims were:
– Identifying patient-relevant outcomes in depression and
hepatitis C
– Eliciting patient preferences on the selected outcomes using
two approaches (Analytic Hierarchy Process (AHP) and Discrete
Choice Experiment (DCE))
– Enabling aggregation of outcome-specific efficiency frontiers
based on obtained weights
• Both pilots concluded that MCDA approaches can be
used to support the HTA process to incorporate patient
preferences (Thokala et al., 2016)
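The AHP weighting step used in the IQWiG pilots can be illustrated with a minimal sketch. The outcome names, the pairwise comparison matrix and the row geometric mean weighting method are illustrative assumptions, not details taken from the pilots:

```python
import math

# Hypothetical 3x3 pairwise comparison matrix for three patient-relevant
# outcomes; pairwise[i][j] says how many times more important outcome i
# is judged to be than outcome j (values are illustrative only).
outcomes = ["response", "remission", "side_effects"]
pairwise = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def ahp_weights(matrix):
    """Derive criterion weights from an AHP pairwise comparison matrix
    using the row geometric mean method, normalised to sum to 1."""
    geo_means = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

weights = ahp_weights(pairwise)
for name, w in zip(outcomes, weights):
    print(f"{name}: {w:.3f}")
```

The resulting weights (here roughly 0.65, 0.23 and 0.12) could then feed the aggregation of outcome-specific efficiency frontiers mentioned in the pilot aims.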
13. Critical issues
• Do criteria and weights need to be ‘fixed’?
• Whose criteria?
• Whose preferences for weighting the criteria?
• How can the opportunity cost of new technologies
be incorporated?
• How can uncertainty be addressed?
14. Do criteria and weights need to be ‘fixed’?
1. Established in advance; the same across all
decisions
– Allows the same metric to be used to measure benefits forgone and added
– Ensures consistency between decisions
– Raises the issue of different scale ranges
2. Chosen on a case-by-case basis, varying across
technologies or disease areas
– Can hinder systematic consideration of all criteria and the
predictability of the decision-making process
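The scale-range issue can be made concrete: a fixed weight is only meaningful relative to the range of the criterion scale it was elicited for, so raw scores on different natural scales must be rescaled to a common range before fixed weights are applied. The criteria, ranges and numbers below are hypothetical:

```python
# Min-max rescaling of raw criterion scores onto a common 0-100 range,
# so that fixed weights elicited for that range can be applied.
def rescale(value, lo, hi):
    """Rescale a raw score onto 0-100 given its assumed scale range."""
    return 100 * (value - lo) / (hi - lo)

# Hypothetical raw performance of a new technology:
survival_score = rescale(6, 0, 24)   # survival gain in months, range 0-24
qol_score = rescale(0.3, 0, 1)       # utility gain, range 0-1

fixed_weights = {"survival": 0.7, "qol": 0.3}  # illustrative fixed weights
overall = (fixed_weights["survival"] * survival_score
           + fixed_weights["qol"] * qol_score)
print(round(overall, 1))
```

Fixing the weights while letting each decision use a different underlying scale range would silently change what the weights mean, which is the consistency problem this slide flags.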
15. Whose criteria?
1. Current HTA bodies’ criteria
– Do they have any legitimacy?
2. Members of an HTA committee on behalf of payers
or NHS budget-holders
– This can encourage alignment of objectives across
healthcare decision makers
3. Reflect views of the general public
– Reflecting tax-payers’/potential users’ view
16. Whose preferences for weighting the criteria?
1. Stakeholders as defined by the decision maker
– In line with extra-welfarist foundation of HTA
– Does variation in stakeholders across diseases require flexible
weights?
2. Members of an HTA committee
– Pragmatic approach which can avoid conducting large
preference-based studies
– Structure the deliberative process
3. Members of the general public
– Consistent with the approach taken to valuing QoL in QALYs
17. How to factor in cost and opportunity cost? (1)
• A separate criterion for cost (e.g. EVIDEM) or cost effectiveness,
contributing to the overall intervention value
– Need to avoid overlap with other criteria
• All (incremental) benefits, combined using an MCDA
aggregation approach (e.g. the Israeli Health Basket Committee in
Golan and Hansen, 2012), weighed against all (incremental)
costs
– Still need to identify the “hurdle for adoption”, e.g. a cost per
incremental point score
– Redefine the cost effectiveness threshold?
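The second option on this slide, combining all incremental benefits into a single MCDA score and weighing it against incremental cost, can be sketched as follows. All criterion names, weights, scores and the hurdle value are hypothetical:

```python
# Weighted-sum aggregation of incremental benefits, followed by a
# cost-per-incremental-point comparison against an adoption hurdle.
weights = {"health_gain": 0.5, "severity": 0.3, "innovation": 0.2}

# Incremental performance of the new technology vs its comparator,
# scored on a common 0-100 scale per criterion (illustrative values).
incremental_scores = {"health_gain": 40, "severity": 20, "innovation": 10}

incremental_value = sum(weights[c] * incremental_scores[c] for c in weights)

incremental_cost = 260_000          # hypothetical incremental cost
hurdle = 10_000                     # hypothetical max cost per point score

cost_per_point = incremental_cost / incremental_value
adopt = cost_per_point <= hurdle
print(f"value={incremental_value}, cost/point={cost_per_point:.0f}, adopt={adopt}")
```

The hurdle here plays the role of a redefined cost effectiveness threshold: a maximum acceptable cost per incremental point score rather than a cost per QALY.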
18. How to factor in cost and opportunity cost? (2)
• There is debate over the meaning of the cost effectiveness
threshold and the methods for measuring it
• Considering multiple attributes of value
(beyond health gains) can complicate estimation of the
threshold
– Need for a “cost per performance score” reflecting the benefits
forgone if the new technology is implemented
19. How can uncertainty be addressed?
• HTA bodies face a high degree of uncertainty
– The evidence base for new interventions can be limited (particularly near
launch)
• The acceptable level of uncertainty is a matter of judgment
– Currently, HTA bodies give committees large discretion to decide on the
appropriate level of acceptability
• A separate criterion for uncertainty?
– Measuring and valuing it can be challenging
• Use existing sensitivity analysis (SA) techniques?
– Leaves open the question of how SA results should affect decisions
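One way to use existing SA techniques within an MCDA is a simple one-way sensitivity analysis on the weights: perturb each weight, renormalise, and check whether the preferred option changes. The options, criteria and numbers below are illustrative:

```python
# One-way sensitivity analysis on MCDA weights: perturb each weight,
# renormalise, and check whether the preferred option changes.
base_weights = {"effectiveness": 0.6, "safety": 0.4}   # illustrative
scores = {                                             # illustrative 0-1 scores
    "drug_A": {"effectiveness": 0.9, "safety": 0.5},
    "drug_B": {"effectiveness": 0.6, "safety": 0.8},
}

def best_option(weights):
    totals = {opt: sum(weights[c] * s[c] for c in weights)
              for opt, s in scores.items()}
    return max(totals, key=totals.get)

baseline = best_option(base_weights)
stable = True
for criterion in base_weights:
    for delta in (-0.1, 0.1):
        w = dict(base_weights)
        w[criterion] += delta
        total = sum(w.values())
        w = {c: v / total for c, v in w.items()}  # renormalise to sum to 1
        if best_option(w) != baseline:
            stable = False
print(baseline, stable)
```

A ranking that is stable under plausible weight perturbations gives a committee some reassurance; an unstable one flags exactly where deliberation or better evidence is needed, which speaks to the open question of how SA results should affect decisions.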
20. MCDA pilots and critical issues
• Israel
– Criteria and weights ‘fixed’? Yes
– Whose criteria? Current HTA/priority-setting criteria (based on literature review)
– Whose preferences for weighting? Convenience sample
– Opportunity cost? Value-for-money chart (total net cost vs net benefit) and efficiency frontiers
– Uncertainty? Quality of evidence shown via different-size bubbles
• Italy
– Criteria and weights ‘fixed’? Yes
– Whose criteria? Mix of the EUnetHTA Core Model and the EVIDEM framework
– Whose preferences for weighting? Decision makers (i.e. committee members)
– Opportunity cost? Economic and Financial Impact cluster, including a ‘cost effectiveness’ criterion
– Uncertainty? Not explained
• Thailand
– Criteria and weights ‘fixed’? Yes
– Whose criteria? Current HTA bodies’ criteria in seven countries
– Whose preferences for weighting? Equal weight applied to all criteria (in the intervention nomination step)
– Opportunity cost? Value-for-money criterion (using a cost-per-QALY threshold) and a budget impact criterion
– Uncertainty? Not explained
• Germany
– Criteria and weights ‘fixed’? No (disease-specific)
– Whose criteria? Outcome measures relevant to patients reported in the literature
– Whose preferences for weighting? Patients
– Opportunity cost? Not considered
– Uncertainty? Sensitivity analysis performed
21. Opportunities, challenges and unresolved HTA issues
• Opportunities
– Established HTA systems can increase their accountability – “show the quality and rigor of its work to others” (Walker, 2016)
– Countries developing new HTA systems can avoid issues/limitations of existing systems
– Align objectives across NHS decision makers
• Challenges
– Balancing deliberation with more structured approaches – avoid asking committees “to rubber-stamp” decisions (Walker, 2016)
– Weighing benefits, in terms of improved decision making, against the cost of implementing any given approach – would it minimise “wrong” decisions?
– Reconciling divergent views of multiple stakeholders
• Unresolved HTA issues
– How is the budget constraint reflected in the process? What does the threshold mean?
– Whose values should be used to derive criteria and weights remains a normative question
– How to deal with uncertainty?
22. Conclusions
• It is difficult to draw ‘hard’ conclusions about how MCDA should be
implemented in HTA, given fundamental differences between health
care systems and HTA processes (‘one size does not fit all’)
• Considering cost and opportunity cost in a systematic way
remains a methodological challenge (not only from an MCDA
perspective)
• Use of MCDA in HTA has the potential to provide a
coherent/unifying framework for healthcare decision making
• Need to consider the balance between the additional costs of
implementing an MCDA approach and the additional benefits of an
improved decision-making process
• Even partial use of MCDA, e.g. a performance matrix, may still
improve decision-making processes
23. References (1)
• Danner, M., Hummel, J.M., Volz, F. et al. (2011). Integrating patients' views into health
technology assessment: Analytic hierarchy process (AHP) as a method to elicit patient
preferences. International Journal of Technology Assessment in Health Care. Oct;27(4):369-
75
• Devlin, N. and Sussex, J. (2011). Incorporating Multiple Criteria in HTA: Methods and
Processes. OHE Research. https://www.ohe.org/publications/incorporating-multiple-criteria-
hta-methods-and-processes
• Garau, M., Devlin, N., (forthcoming). Using MCDA as a decision aid in Health Technology
Appraisal for coverage decisions: opportunities, challenges and unresolved questions. In
“MCDA in health care decision making” published by Springer
• Golan, O., Hansen, P., Kaplan, G., Tal, O. (2011). Health technology prioritization: which
criteria for prioritizing new technologies and what are their relative weights? Health Policy
Oct;102(2-3):126-35
• Golan, O. and Hansen, P. (2012). Which health technologies should be funded? A
prioritization framework based explicitly on value for money. Israel Journal of Health Policy
Research 1:44
24. References (2)
• Norheim, O.F., Baltussen, R., Johri, M., et al. (2014). Guidance on priority setting in health care
(GPS-Health): the inclusion of equity criteria not captured by cost-effectiveness analysis. Cost
Effectiveness and Resource Allocation 12:18
• Radaelli, G., Lettieri, E., Masella, C. (2014). Implementation of EUnetHTA Core Model® in
Lombardia: the VTS framework. International Journal of Technology Assessment in Health
Care 30:1; 105-12
• Thokala, P., Devlin, N., Marsh, K., et al. (2016) Multiple Criteria Decision Analysis for Health
Care Decision Making - An Introduction: Report 1 of the ISPOR MCDA Emerging Good
Practices Task Force. Value in Health Jan;19(1):1-13
• Walker, A., (2016). Challenges in using MCDA for reimbursement decisions on new
medicines? Value in Health 19: 123-124
• Youngkong, S., Baltussen, R., Tantivess, S., et al. (2012). Multicriteria decision analysis for
including health interventions in the Universal Health Coverage Benefit package in Thailand.
Value in Health 15; 961-970