UCSF CER - Population Based Networks for Comparative Effectiveness Research (Symposium 2013)
1. Population-Based Networks for Comparative
Effectiveness Research:
Promises and Potholes
Tracy Lieu, MD, MPH
January 8, 2013
Kaiser Permanente Research
2. Kaiser Permanente is a resource for
comparative effectiveness research
3.3 million patients
7,000 physicians
21 hospitals
234 medical offices
Regional quality improvement programs
3. Our Division of Research has common
interests with UCSF
50+ research scientists in:
Cancer
Cardiovascular and metabolic
Health care delivery and policy
Infectious disease
Behavioral health and aging
Women’s and children’s health
4. Our research and funding are largely
public-domain
% of total funding in 2011 ($107M)
[Pie chart: funding sources — Federal, Central Research Committee awards, Pharma/biotech, KP Community Benefit, TPMG, Foundation]
5. Population-based research networks
can facilitate CER
Patients drawn from a defined group,
representative of the general population
Multiple geographic sites
Sites have:
– Computerized data on exposures and
outcomes
– Access to clinicians and patients
6. Networks have supported safety and
epidemiologic research
Vaccine Safety Datalink (CDC, 1991)
Mini-Sentinel (FDA, 2010)
Cancer Research Network (NCI)
Cardiovascular Research Network
(NHLBI, 2007)
Mental Health Research Network
(NIMH, 2009)
7. In 2010, AHRQ sponsored 11 new
networks for CER
Examples:
Population-Based Effectiveness in
Asthma and Lung Diseases (PEAL)
Surveillance, Prevention, and
Management of Diabetes Mellitus
(SUPREME-DM)
Scalable Partnering Network (SPAN)
8. New approaches have increased the
power of these networks
Distributed data network approaches
Example – asthma network for CER
Methodologic potholes and potential
solutions
Resources for using distributed data
networks for CER
9. Distributed data networks are versatile
Standard, multi-purpose, multi-institutional infrastructure
Can support both observational and
intervention studies
Local data holder control over access
and uses of data
Reduces the need to share or exchange
protected health information
10. Example: The Population-Based
Effectiveness in Asthma and Lung Diseases
(PEAL) Network
6 sites with diverse populations
Sponsored by AHRQ, 2010-2013
Purpose: Establish infrastructure and conduct
CER in asthma
Lay foundation for research in other lung
diseases and in other fields, e.g.
pharmacogenetics
11. Collaborators in the PEAL Network
HealthPartners
KP Northwest
Harvard Pilgrim Health Care
KP Northern California
Vanderbilt
KP Georgia
www.pealnetwork.org
12. PEAL Virtual Data Warehouse
[Diagram: Local databases at the six sites (HPHC, KPNC, KPSE, HPRF, VAND, KPNW) — some already in the standard VDW format, others with varying structure — feed the PEAL Data Warehouse. Population selection and data warehouse building use distributed programs plus site-specific translation programs, yielding PEAL databases with a common structure at each site.]
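The "site-specific translation programs" in the diagram boil down to mapping each site's local schema onto the common data dictionary. A minimal sketch with invented field names (not the actual VDW specification):

```python
# Hypothetical local-to-common field mappings for two sites.
# Keys are the common (PEAL) field names; values are each site's local names.
SITE_MAPPINGS = {
    "KPNC": {"pat_id": "mrn", "birth_date": "dob", "sex": "gender_cd"},
    "HPHC": {"pat_id": "member_id", "birth_date": "birth_dt", "sex": "sex"},
}

def translate(site, local_row):
    """Rename one site's local columns to the common structure."""
    mapping = SITE_MAPPINGS[site]
    return {common: local_row[local] for common, local in mapping.items()}

# A local KPNC-style row becomes a common-structure row
row = translate("KPNC", {"mrn": "12345", "dob": "2001-04-02", "gender_cd": "F"})
# row == {"pat_id": "12345", "birth_date": "2001-04-02", "sex": "F"}
```

Because every site emits the same column names after translation, a single study-specific analysis program can then run unchanged at all six sites.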
13. Comparative effectiveness research and other studies
[Diagram: Study-specific analysis programs, based on a common data dictionary, run against the PEAL databases with common structure at each site (HPHC, KPNC, KPSE, HPRF, VAND, KPNW); compatible de-identified datasets from each site are returned to the research team.]
14. Data confidentiality is a key hurdle for
data networks
Pooling individual-level data poses a risk
De-identification doesn’t always work
Distributed analysis gives stronger
protection -- only aggregated, count
data are shared
Example: Vaccine Safety Datalink
Project and Congressman Dan Burton
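The protection described above — sites share only aggregated counts, never patient-level rows — can be sketched as follows (hypothetical site data and field names, not the actual VSD or PEAL tooling):

```python
from collections import Counter

def local_counts(site_records, exposure_field, outcome_field):
    """Run at each site: tabulate (exposure, outcome) counts locally.
    Only these aggregates leave the site; no patient-level rows are shared."""
    counts = Counter()
    for rec in site_records:
        counts[(rec[exposure_field], rec[outcome_field])] += 1
    return counts

def pooled_counts(per_site_counts):
    """Run at the coordinating center: sum the sites' aggregate tables."""
    total = Counter()
    for c in per_site_counts:
        total.update(c)
    return total

# Hypothetical patient-level data held privately at two sites
site_a = [{"exposed": True, "event": False}, {"exposed": True, "event": True}]
site_b = [{"exposed": False, "event": False}, {"exposed": True, "event": False}]

shared = [local_counts(s, "exposed", "event") for s in (site_a, site_b)]
pooled = pooled_counts(shared)
```

The coordinating center sees only the pooled two-by-two counts, which is enough for many safety analyses while keeping protected health information behind each site's firewall.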
17. PEAL builds on standard datasets from the
HMO Research Network’s Virtual Data
Warehouse
Derived from the HMORN VDW: Demographics, Enrollment, Specialty, Dispensing, Geocode, Vitals, Death, and the utilization tables (Encounter, Diagnosis, Procedure)
New, from source data: Prescribing, Benefits & copayment
18. The PEAL Network has succeeded in
its basic purpose
Established understandings: governance, data
use, IRB
Created data dictionaries & datasets
Identified the study cohorts; descriptive analyses
Completed studies of controller medication
effectiveness and statins in asthma
Studies of adherence, methodology, cost-sharing,
and insurance benefit design underway
20. Example of potential confounding:
Outcomes after leukotriene inhibitors compared
with inhaled corticosteroids
Retrospective cohort analysis of >44,000
children with probable persistent asthma
70% filled an inhaled corticosteroid (ICS); 26%
filled a leukotriene inhibitor (and not an ICS)
Proportional hazards models
Adjusted for age, sex, insurer, asthma risk (prior
ED visits, hospitalizations, oral steroid bursts),
Charlson score, comorbidities, and adherence
as a time-varying covariate
21. Example of potential confounding:
Outcomes after leukotriene inhibitors compared
with inhaled corticosteroids
Preliminary findings – confidential:
In TennCare, users of leukotriene inhibitors were
less likely to experience an asthma-related
emergency department visit (HR 0.7, CI 0.5-0.8)
in the next 12 months
In HMO populations, users of leukotriene
inhibitors were less likely to have subsequent
oral steroid bursts (HR 0.6, CI 0.4 – 0.9)
Wu AC, under review
22. Retrospective cohort designs for CER
are prone to selection bias
(confounding by indication)
Patients who receive a newer treatment often
differ from patients who don’t
Or, better clinicians or better health care
systems may adopt better interventions sooner
Traditional multivariate regression often cannot
resolve this confounding
23. We’re testing analytic approaches to
reducing confounding
In the PEAL cohort analysis, we are comparing:
Propensity score weighting
High-dimensional propensity scores
Proportional hazards regression with time-dependent covariates
Marginal structural models
Adding patient-reported information to
computerized data
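As a toy illustration of the first approach, inverse-probability-of-treatment weighting reweights each group by the propensity score so that measured covariates balance. The sketch below shows only the weighting step; the propensity scores themselves (invented here) would come from a fitted logistic or high-dimensional model:

```python
def iptw_means(records):
    """Weighted mean outcome under treatment and under control.
    Each record: treated (0/1), outcome, ps = estimated P(treated | covariates)."""
    num_t = den_t = num_c = den_c = 0.0
    for r in records:
        if r["treated"]:
            w = 1.0 / r["ps"]            # inverse probability of treatment
            num_t += w * r["outcome"]
            den_t += w
        else:
            w = 1.0 / (1.0 - r["ps"])    # inverse probability of control
            num_c += w * r["outcome"]
            den_c += w
    return num_t / den_t, num_c / den_c

# Hypothetical cohort; in practice ps comes from a fitted propensity model
cohort = [
    {"treated": 1, "outcome": 1.0, "ps": 0.8},
    {"treated": 1, "outcome": 0.0, "ps": 0.5},
    {"treated": 0, "outcome": 1.0, "ps": 0.5},
    {"treated": 0, "outcome": 0.0, "ps": 0.2},
]
mean_t, mean_c = iptw_means(cohort)
risk_difference = mean_t - mean_c
```

Patients who received a treatment they were unlikely to get are up-weighted, mimicking the balance a randomized design would have produced — provided the propensity model captures the relevant confounders.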
24. Stronger designs may better reduce
confounding
Instrumental variable – find a covariate that is
associated with the exposure but affects the
outcome only through the exposure, and use it to
create “as-if randomized” groups – if you are lucky
Difference-in-difference – change in time
between intervention and comparison groups
Interrupted time series (regression discontinuity)
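Once group means are in hand, the difference-in-difference estimate is simple arithmetic: the pre/post change in the intervention group net of the change in the comparison group. A sketch with invented numbers:

```python
def diff_in_diff(treat_pre, treat_post, control_pre, control_post):
    """Change over time in the intervention group, minus the
    secular trend observed in the comparison group."""
    return (treat_post - treat_pre) - (control_post - control_pre)

# Hypothetical mean monthly ED visits per 1,000 patients
effect = diff_in_diff(treat_pre=12.0, treat_post=9.0,
                      control_pre=11.0, control_post=10.5)
# effect == -2.5: visits fell 2.5/1,000 more in the intervention group
```

The comparison group absorbs whatever time trend would have occurred anyway, which is what makes this design stronger than a simple before/after comparison.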
29. Interrupted time series analysis
Benzodiazepine (BZ) use and hip fractures in women in
Medicaid before and after NY policy restricting BZ use
[Figure: two panels comparing New York and New Jersey over ~31 months, with the policy date marked. Top panel (BZ use among female users before policy, %): a 60% decrease in BZ use in NY after the policy. Bottom panel (cumulative incidence of hip fracture per 100,000): no change in risk of hip fracture.]
Wagner AK Ann Intern Med 2007 (from Soumerai S)
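An interrupted time series like the one above is usually analyzed with segmented regression. A minimal pure-Python sketch (simulated data; real analyses also estimate slope changes and handle autocorrelation) recovers the level shift at the policy date by extrapolating the pre-policy trend:

```python
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def level_change(series, break_idx):
    """Fit the pre-policy trend, extrapolate it across the break,
    and compare with the observed level just after the policy."""
    pre_x = list(range(break_idx))
    slope, intercept = linfit(pre_x, series[:break_idx])
    expected = slope * break_idx + intercept   # counterfactual at the break
    return series[break_idx] - expected

# Simulated monthly BZ-use rates: flat at 40, dropping to 16 after the policy
series = [40.0, 40.0, 40.0, 40.0, 40.0, 16.0, 16.0, 16.0]
drop = level_change(series, break_idx=5)
# drop == -24.0: an abrupt level shift at the policy date
```

Because the pre-policy trend serves as the counterfactual, this design can detect an abrupt change (as in the BZ-use panel) and distinguish it from no change (as in the hip-fracture panel).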
30. Number of albuterol inhalers dispensed before and
after an increase in co-payment due to branding
changes – Preliminary data, confidential:
[Figure: inhalers dispensed per 1,000 children by month, January 2007 through November 2010, for cases (changed to brand cost-sharing) versus controls (kept generic cost-sharing), with the policy change marked.]
32. Population-based networks are useful
for:
• Observational comparative
effectiveness research (including
quasi-experimental designs)
• Interventional comparative
effectiveness research
• Delivery science / implementation
research
33. You can also use population-based
networks for:
• Epidemiology, including genetic
epidemiology
• Safety surveillance
• Identifying patients with specific
conditions, especially uncommon
ones, for all types of studies
34. Population-based research data may be
useful for clinical system needs
[Diagram: Research data warehouses & data marts sit behind a firewall from clinical and operational users; connections include collaborative research, direct access, reports, direct distribution, a report repository, and research staff.]
35. Electronic Data Methods (EDM) forum
is a national resource
• Facilitates learning across AHRQ projects
that build infrastructure for comparative
effectiveness research
• Led by AcademyHealth with AHRQ support
• Holds stakeholder symposia
• Organizes reports on specific topics, e.g.
building cohorts for research, de-identifying
data