OBSERVATIONAL MEDICAL OUTCOMES PARTNERSHIP
Informatics Opportunities for Exploring the Real-World Effects of Medical Products: Lessons
from the Observational Medical Outcomes Partnership
Patrick Ryan, on behalf of the OMOP Research Team
October 19, 2010
FDAAA’s impact on pharmacovigilance
• SEC 905 establishes SENTINEL network of distributed observational databases (administrative claims and electronic health records) to monitor the effects of medicines post-approval
Outstanding questions for active surveillance

Data
• Which types of data? Administrative claims, electronic health records
• Which sources? Healthcare providers, insurers, data aggregators

Methods
• What are appropriate analyses for hypothesis generating? For hypothesis strengthening?
• What is feasible, and how do the methods perform?

Technology
• What is the appropriate infrastructure: hardware, software, processes, policies?
• What are viable data access models: centralized or distributed?
• What are best practices for protecting data?

Governance
• How to maintain collaborations and engage the research community?
• What are the keys to a successful public-private partnership?
Observational Medical Outcomes Partnership
Established to inform the appropriate use of observational healthcare databases for active surveillance by:
• Conducting methodological research to empirically evaluate the performance of alternative methods on their ability to identify true drug safety issues
• Developing tools and capabilities for transforming, characterizing, and analyzing disparate data sources
• Establishing a shared resource so that the broader research community can collaboratively advance the science
Partnership Stakeholders
A public-private partnership between industry, FDA, and FNIH.

Stakeholder Groups
• FDA: Executive Board (chair), Advisory Boards, PI
• Industry: Executive and Advisory Boards, two PIs
• FNIH: Partnership and Project Management, Research Core Staffing
• Academic Centers & Healthcare Providers: Executive and Advisory Boards, three PIs, Distributed Research Partners, Methods Collaborators
• Database Owners: Executive Board, Advisory Board, PI
• Consumer and Patient Advocacy Organizations: Executive and Advisory Board
• US Veterans Administration: Distributed Research Partner
OMOP Extended Consortium

[Diagram: the OMOP Data Community, coordinated by the OMOP Research Core and a central Research Lab]
• Distributed Network: Humana HSRC, Partners HealthCare, Regenstrief INPC, SDI Health / i3 Drug Safety*, VA MedSAFE
• Centralized data: GE, Thomson Reuters (MSLR, MDCD, MDCR, CCAE)
OMOP research experiment workflow

OMOP Methods Library: Method 1, Method 2, Method 3, Method 4 (run against the Common Data Model for feasibility testing)

Drugs
• ACE inhibitors
• Amphotericin B
• Antibiotics
• Antiepileptics
• Benzodiazepines
• Beta blockers
• Bisphosphonates
• Tricyclic antidepressants
• Typical antipsychotics
• Warfarin

Health Outcomes of Interest
• Angioedema
• Aplastic anemia
• Acute liver injury
• Bleeding
• GI ulcer hospitalization
• Hip fracture
• Hospitalization
• Myocardial infarction
• Mortality after MI
• Renal failure

Non-specified conditions
• All outcomes in condition terminology
• 'Labeled events' as reference: Warnings, Precautions, Adverse Reactions, Postmarketing Experience
OMOP Project Plan Progression

[Diagram: data in the OMOP CDM (OMOP_DRUG_ERA, OMOP_CONDITION_ERA, HOI_OCCURRENCE) feeds characterization tools (OSCAR, NATHAN), DOI and HOI definitions (RICO), and analysis methods for feasibility assessment and for Phase 3 evaluation, executed across the OMOP Data Community (Humana, PHS, Regenstrief, SDI/i3, VA, GE, central Research Lab) under the OMOP Research Core.]

[Diagram: each method in the OMOP Methods Library (e.g. self-controlled case series, Bayesian logistic regression, observational screening, case-control estimation) is run against each source (GE, Humana, Regenstrief, VA/MedSAFE, Thomson MSLR); forest plots show per-source effect estimates (relative risk, odds ratio, or screening rate ratio on a 0.5 to 2.5 scale) alongside a meta-analysis summary.]
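The meta-analysis step in the forest-plot diagram can be illustrated with a simple fixed-effect (inverse-variance) pooling of per-source estimates. The source names come from this deck, but the relative risks and standard errors below are invented for illustration:

```python
import math

# Fixed-effect meta-analysis sketch: pool per-source log relative risks,
# weighting each source by the inverse variance of its estimate.
# Estimates and standard errors are hypothetical placeholders.
estimates = {  # source: (relative_risk, se_of_log_rr)
    "GE": (1.4, 0.30),
    "Humana": (1.8, 0.25),
    "Regenstrief": (1.2, 0.40),
}

weights = {s: 1.0 / se ** 2 for s, (_, se) in estimates.items()}
pooled_log_rr = sum(
    w * math.log(estimates[s][0]) for s, w in weights.items()
) / sum(weights.values())
pooled_rr = math.exp(pooled_log_rr)
```

The pooled estimate lands between the individual sources, pulled toward the more precisely estimated (lower standard error) ones.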
As we have been conducting the methodological research, all tools and processes we’ve developed along the way are made publicly available at: http://omop.fnih.org
Informatics tools developed to support a systematic analysis process
• Common Data Model: http://omop.fnih.org/CDMandTerminologies
• Standardized Terminology: http://omop.fnih.org/Vocabularies
• Data characteristics tools:
– OSCAR: http://omop.fnih.org/OSCAR
– NATHAN: http://omop.fnih.org/NATHAN
• Health Outcomes of Interest library: http://omop.fnih.org/HOI
• Methods library: http://omop.fnih.org/MethodsLibrary
• Simulated data: http://omop.fnih.org/OSIM
OMOP Analysis Process
[Diagram: Source 1, Source 2, and Source 3 are each transformed to the OMOP common data model; a single analysis method is then applied to each to produce OMOP analysis results.]
Establishing a common data model
• Developed with broad stakeholder input
• Designed to accommodate disparate types of data (claims and EHRs)
• Applied successfully across the OMOP data community
http://omop.fnih.org/CDMandTerminologies
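As a rough illustration of the idea (a sketch, not the official CDM specification), a simplified schema with person-level demographics plus the drug/condition era tables mentioned elsewhere in this deck might look like:

```python
import sqlite3

# Minimal, illustrative subset of a common-data-model schema.
# Column names are simplified assumptions for this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE PERSON (
    person_id      INTEGER PRIMARY KEY,
    year_of_birth  INTEGER,
    gender         TEXT
);
CREATE TABLE OMOP_DRUG_ERA (
    person_id       INTEGER REFERENCES PERSON(person_id),
    drug_concept_id INTEGER,   -- standardized (e.g. RxNorm-based) concept
    era_start_date  TEXT,
    era_end_date    TEXT
);
CREATE TABLE OMOP_CONDITION_ERA (
    person_id            INTEGER REFERENCES PERSON(person_id),
    condition_concept_id INTEGER,  -- standardized condition concept
    era_start_date       TEXT,
    era_end_date         TEXT
);
""")

# Load one toy record, as if from a source already mapped to standard codes.
conn.execute("INSERT INTO PERSON VALUES (1, 1950, 'F')")
conn.execute("INSERT INTO OMOP_DRUG_ERA VALUES (1, 1308216, '2009-01-01', '2009-03-01')")

# The same query now works regardless of which source the data came from.
n_exposed = conn.execute(
    "SELECT COUNT(DISTINCT person_id) FROM OMOP_DRUG_ERA"
).fetchone()[0]
```

The point of the model is exactly this: once every source is transformed into one structure, a single analysis program can run unchanged against all of them.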
Standardizing terminologies to accommodate disparate observational data sources
Standardizing drugs:
• Source codes (NDC, GPI, Multum) are mapped to RxNorm low-level drugs (Level 1) and ingredients (Level 2), then to NDF-RT classifications (Level 3) and top-level concepts (Level 4)
• Procedure codes (HCPCS*, CPT-4*, ICD-9-Proc*) are handled through derived mappings

Standardizing conditions:
• Source codes (ICD-9-CM, Read, Oxmis) are mapped to SNOMED-CT low-level concepts (Level 1), higher-level classifications (Level 2), and a top-level classification (Level 3)
• MedDRA provides a parallel hierarchy: Low-Level Terms (Level 1), Preferred Terms (Level 2), High Level Terms (Level 3), High Level Group Terms (Level 4), and System Organ Class (Level 5)
• Mappings combine existing mappings with de novo mappings
http://omop.fnih.org/Vocabularies
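The drug-standardization path above can be sketched as a lookup from source codes to standard ingredient-level concepts, so the same analysis runs against sources using different coding systems. The NDC codes and concept IDs below are made-up placeholders, not real vocabulary entries:

```python
# Toy terminology-standardization table: source NDC code -> RxNorm-style
# ingredient concept. All codes and IDs here are hypothetical placeholders.
NDC_TO_RXNORM_INGREDIENT = {
    "00093101001": ("lisinopril", 29046),
    "00071015523": ("atorvastatin", 83367),
}

def standardize_drug_code(ndc_code):
    """Return (ingredient_name, concept_id), or None if the code is unmapped."""
    return NDC_TO_RXNORM_INGREDIENT.get(ndc_code)

mapped = standardize_drug_code("00093101001")
unmapped = standardize_drug_code("00000000000")  # falls through -> None
```

Unmapped source codes are a real operational concern in this kind of pipeline, which is why the slide distinguishes existing mappings from de novo mapping work.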
Observational Source Characteristics Analysis Report (OSCAR)
• Provides a systematic approach for summarizing observational healthcare data stored in the OMOP common data model
• Creates a structured output dataset of summary statistics for each table and field in the CDM
– Categorical variables: one-, two-, and three-way stratified counts (e.g. number of persons with each condition, by gender)
– Continuous variables: distribution characteristics: min, mean, median, stdev, max, 25th/75th percentile (e.g. observation period length)
– OSCAR summaries from each source can be brought together for comparative analyses
• Uses
– Validation of the transformation from raw data to the OMOP common data model
– Comparisons between data sources
– Comparison of the overall database to specific subpopulations of interest (such as people exposed to a particular drug or people with a specific condition)
– Providing context for interpreting and analyzing findings of drug safety studies
http://omop.fnih.org/OSCAR
Characterization example: Drug prevalence
• Context: Typically we evaluate one drug at a time, against one database at a time
• In active surveillance, need to have ability to explore any medical product, across a network of disparate data sources
• Exploration: what is the prevalence of all medical products across all data sources?
– Crude prevalence: # of persons with at least one exposure / # of persons
– Standardized prevalence: adjusted by age and gender to the US Census
– Age-by-gender strata-specific prevalence
• Not attempting to get precise measure of exposure rates for a given product, but instead trying to understand general patterns across data community for all medical products
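The crude and standardized prevalence definitions above can be sketched directly. The strata counts and reference-population weights below are invented for illustration; the census dict plays the role of the US Census reference population:

```python
# (age_band, gender) -> (persons_exposed, persons_total), hypothetical counts.
strata = {
    ("0-39", "F"): (10, 1000),
    ("40+",  "F"): (90, 1000),
    ("0-39", "M"): (5,  500),
    ("40+",  "M"): (45, 500),
}
# Reference-population share of each stratum (shares sum to 1.0).
census = {("0-39", "F"): 0.30, ("40+", "F"): 0.25,
          ("0-39", "M"): 0.25, ("40+", "M"): 0.20}

# Crude prevalence: exposed persons over all persons.
crude = sum(e for e, _ in strata.values()) / sum(n for _, n in strata.values())

# Direct standardization: weight each stratum-specific rate by the
# stratum's share of the reference population.
standardized = sum((e / n) * census[k] for k, (e, n) in strata.items())
```

Here the standardized figure differs from the crude one because the source population is older (and hence more exposed) than the reference population, which is exactly the distortion standardization is meant to remove.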
Standardized drug prevalence (age-by-gender stratified annualized rates; standardized to US Census)
Standardized prevalence for select drugs
Source-specific drug prevalence, by year by age by gender
OMOP Methods Library
• Standardized procedures are being developed to analyze any drug and any condition
• All programs are being made publicly available to promote transparency and consistency in research
• Methods will be evaluated in OMOP research against specific test-case drugs and Health Outcomes of Interest
http://omop.fnih.org/MethodsLibrary
What methods are most appropriate for signal refinement?

Multiple alternative approaches have been identified that deserve empirical testing to measure performance.
http://omop.fnih.org/MethodsLibrary
Method name                                           Parameter combinations   Release date
Disproportionality analysis (DP)                      112                      15-Mar-10
Univariate self-controlled case series (USCCS)        64                       2-Apr-10
Observational screening (OS)                          162                      8-Apr-10
Multi-set case control estimation (MSCCE)             32                       16-Apr-10
Bayesian logistic regression (BLR)                    24                       21-Apr-10
Case-control surveillance (CCS)                       48                       2-May-10
IC Temporal Pattern Discovery (ICTPD)                 84                       23-May-10
Case-crossover (CCO)                                  48                       1-Jun-10
HSIU cohort method (HSIU)                             6                        8-Jun-10
Maximized Sequential Probability Ratio Test (MSPRT)   144                      25-Jul-10
High-dimensional propensity score (HDPS)              144                      6-Aug-10
Conditional sequential sampling procedure (CSSP)      144                      30-Aug-10
Statistical relational learning (SRL)
Incident user design (IUD-HOI)
Studying method performance for 'signal refinement'

• Apply a method with a specific set of parameter settings to a database for a drug-outcome pair that has a prior suspicion of being potentially related
• Example: Run method X on database A for ACE inhibitors and angioedema
• Challenges:
– How to put the resulting score in context?
– What if we modified one of the method's parameters?
– What if we applied the method to a different database?
– If we had applied the same method to other drug-outcome pairs, what types of scores would we expect?
– How many other true positives would get a RR > 1.8?
– How many false positives would be identified with a threshold of 1.8?
[Figure: relative risk estimates for Database A, Method X (hypothetical data)]
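The threshold questions above (how many true and false positives clear RR > 1.8?) can be sketched with invented scores and ground-truth labels:

```python
# Hypothetical method scores (relative-risk estimates) for drug-outcome
# pairs with assumed ground truth. All values here are invented.
scores = {
    # (drug, outcome): (estimate, is_true_positive)
    ("ACE inhibitors", "angioedema"):   (2.4, True),
    ("warfarin", "bleeding"):           (1.9, True),
    ("beta blockers", "hip fracture"):  (2.1, False),  # negative control
    ("antibiotics", "renal failure"):   (1.2, False),  # negative control
}
THRESHOLD = 1.8

true_pos = sum(1 for rr, truth in scores.values() if rr > THRESHOLD and truth)
false_pos = sum(1 for rr, truth in scores.values() if rr > THRESHOLD and not truth)
```

Running this over a benchmark of known positives and negative controls, rather than a single pair, is what turns a raw score into an interpretable operating characteristic.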
How do effect estimates vary by database?
• Each database may have unique source-population characteristics that can influence method behavior, including:
– Sample size
– Length and type of longitudinal data capture
– Population demographics, such as age and gender
– Disease severity, including comorbidities, concomitant medications, and health service utilization patterns
[Figure: relative risk estimates for Method X across Databases A, B, C, and D (hypothetical data)]
How do estimates vary by method parameter settings?
• Performance can be sensitive to various factors, including:
– Length of washout period to identify incident use
– Definition of time-at-risk
– Choice of comparator
– Number and types of covariates to include in propensity score modeling
– Statistical approach for adjustment: matching vs. stratification vs. multivariate modeling
• ‘Optimal’ settings may vary by database and/or the drug-outcome pair in question
[Figure: relative risk estimates for Database C, Method X across 17 parameter configurations (hypothetical data)]
How does method perform against other ‘benchmark’ true positives and negative controls?
• It is important to establish the operating characteristics of any method, applied across a network of databases, as part of 'validation before adoption' for signal refinement
Evaluate how estimates compare with other ‘true positive’ examples….
…but also need calibration against negative controls
[Figure: relative risk estimates for Database A, Method X against benchmark true positives and negative controls (hypothetical data)]
Receiver Operating Characteristic (ROC) curve

• ROC plots sensitivity (recall) vs. false positive rate (FPR)
• The area under the ROC curve (AUC) gives the probability that the method will score a randomly chosen true positive drug-outcome pair higher than a randomly chosen unrelated drug-outcome pair
• AUC = 1 is a perfect predictive model; AUC = 0.50 is random guessing (the diagonal line)
[Figure: ROC curve (hypothetical data)]
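The AUC can be computed directly from its probabilistic definition above, without plotting the curve: count the fraction of (true-positive, negative) score pairs where the true positive is ranked higher. The scores below are invented for illustration:

```python
from itertools import product

# Hypothetical method scores for benchmark pairs.
pos_scores = [2.4, 1.9, 1.6]   # estimates for known true positives
neg_scores = [1.2, 1.0, 2.1]   # estimates for negative controls

# AUC as a rank statistic: P(score of random positive > score of random
# negative), counting ties as one half.
pairs = list(product(pos_scores, neg_scores))
auc = sum(
    1.0 if p > n else 0.5 if p == n else 0.0 for p, n in pairs
) / len(pairs)
```

Here 7 of the 9 pairs rank the true positive higher, so AUC = 7/9, i.e. well above chance but short of a perfect classifier.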
Visualizing performance of alternative methods across a network of databases
Ongoing opportunities for clinical informatics

• Developing a robust, systematic process for active surveillance requires innovative informatics solutions
– Systems architecture and data access strategies for integrating networks of disparate data sources
– A common data model to standardize data structure and terminologies
– Standardized procedures to characterize data sources and patient populations of interest
– Analytical methods to identify and evaluate the effects of medical products
– Visualization tools to enable interactive exploration of analysis results and provide a consistent framework for effectively communicating with all stakeholders
• Further research and development should have broader applications beyond active surveillance to other healthcare opportunities
Contact information
Patrick Ryan, Research Investigator, ryan@omop.org
Thomas Scarnecchia, Executive Director, scarnecchia@omop.org
Emily Welebob, Senior Program Manager, Research, welebob@omop.org
OMOP website: http://omop.fnih.org