Advanced Decision Analysis: Markov Models

Source: web2.facs.org/ORC2014Flashdrive/MATERIALS/...

Feb 10, 2020

Transcript
Page 1:

Advanced Decision Analysis: Markov Models

Page 2:

Reasons to understand Markov models

• It’s the standard in academia
• Very powerful/flexible tool
  – simple decision trees are relatively inflexible:
    • limited time frame
    • hard to model recurring risks
  – Markov models allow you to model anything (well, almost anything)

Page 3:

Challenges of Markov models

• More difficult conceptually than simple decision tree models

• Trickier math – rates vs. probabilities

Page 4:

Overview

• Introduction to Markov models
  – Problems with simple decision trees
  – Markov models:
    • general structure
    • how they run
    • rates & probabilities
    • how they keep score (expected value)

Page 5:

Page 6:

Page 7:

Should patients with Bjork-Shiley valves undergo prophylactic replacement?

Risks of Outlet Strut Fracture (and its consequences)

vs. Risks with reoperation

Page 8:

Clinical Scenario

• 65-year-old man with prosthetic valve placed in 1986
  – otherwise healthy
  – predicted operative risk 3%
  – 29 mm, 70° Björk-Shiley valve
    • strut fracture risk 10.5% at 8 years
    • 63% case fatality

Page 9:

Decision Analysis: Four Basic Steps

1. structure competing strategies and clinical outcomes in a decision model
2. estimate probabilities for clinical outcomes
3. assign values to clinical outcomes (utility)
4. analysis
   • calculate expected values of strategies
   • assess stability of results (sensitivity analysis)

Page 10:

1. Structure model (simple tree)

choose
├─ watchful waiting
│  ├─ strut fracture
│  │  ├─ die
│  │  └─ survive/well
│  └─ no fracture/well
└─ prophylactic reop
   ├─ die
   └─ survive

Page 11:

2. Assign probabilities

choose
├─ watchful waiting
│  ├─ strut fracture (0.105)
│  │  ├─ die (0.63)
│  │  └─ survive/well (0.37)
│  └─ no fracture/well (0.895)
└─ prophylactic reop
   ├─ die (0.03)
   └─ survive (0.97)

Page 12:

3. Assign values (e.g., in QALYs)

choose
├─ watchful waiting
│  ├─ strut fracture (0.105)
│  │  ├─ die (0.63) → 0
│  │  └─ survive/well (0.37) → 17.9
│  └─ no fracture/well (0.895) → 17.9
└─ prophylactic reop
   ├─ die (0.03) → 0
   └─ survive (0.97) → 17.9

Page 13:

4(a). Baseline analysis

Watchful waiting (EV = 16.7 QALYs):
  Outcome                      P              U      U×P
  strut fract → die            0.105 × 0.63   0      0
  strut fract → survive        0.105 × 0.37   17.9   0.7
  no fracture                  0.895          17.9   16.0

Prophylactic reop (EV = 17.4 QALYs):
  die                          0.03           0      0
  survive                      0.97           17.9   17.4
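The baseline numbers above are easy to check in a few lines. A minimal sketch (not from the original slides; the function names are mine), using the slides' probabilities and a life expectancy of 17.9 QALYs:

```python
LIFE_EXP = 17.9  # QALYs while well (slide baseline)

def ev_watchful_waiting(p_fracture=0.105, p_die_given_fracture=0.63):
    # strut fracture -> die (0 QALYs) or survive/well; no fracture -> well
    ev_fracture = (1 - p_die_given_fracture) * LIFE_EXP
    return p_fracture * ev_fracture + (1 - p_fracture) * LIFE_EXP

def ev_prophylactic_reop(p_op_death=0.03):
    # die at operation (0 QALYs) or survive with full life expectancy
    return (1 - p_op_death) * LIFE_EXP

print(round(ev_watchful_waiting(), 1))   # 16.7 QALYs
print(round(ev_prophylactic_reop(), 1))  # 17.4 QALYs
```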

Page 14:

4(b). Sensitivity analysis

[Figure: one-way sensitivity analysis. Expected value (QALYs, 16.0 to 18.0) plotted against operative mortality risk (%, 0 to 10) for prophylactic reoperation vs. watchful waiting; the baseline and the threshold where the two lines cross are marked.]
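The threshold in that figure can be reproduced numerically by sweeping operative mortality until the preferred strategy flips. A sketch under the same baseline assumptions (scan granularity and function names are mine):

```python
LIFE_EXP = 17.9  # QALYs, slide baseline

def ev_ww(p_fx=0.105, p_die_fx=0.63):
    # watchful waiting: fracture-then-death yields 0 QALYs, otherwise full life expectancy
    return p_fx * (1 - p_die_fx) * LIFE_EXP + (1 - p_fx) * LIFE_EXP

def ev_reop(p_op_death):
    return (1 - p_op_death) * LIFE_EXP

# scan operative mortality from 0% to 10% in 0.1% steps
threshold = next(p / 1000 for p in range(0, 101)
                 if ev_reop(p / 1000) < ev_ww())
print(f"strategies flip at ~{threshold:.1%} operative mortality")
```

With the slide's inputs this lands at about 6.7%, above the 3% baseline, which is why prophylactic reoperation wins at baseline.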

Page 15:

Limitations of simple decision trees

• Don’t account for timing of events
  – roll the dice once, at time zero
• Problems with parameter estimates over time
  – probabilities: dealing with competing risks over time
  – values applied to future outcomes may be underestimated (all bad events are treated as happening immediately)

Page 16:

Problems with simple trees

choose
├─ watchful waiting (EV = 16.7)
│  ├─ strut fracture (0.105)
│  │  ├─ die (0.63) → 0
│  │  └─ survive/well (0.37) → 17.9
│  └─ no fracture/well (0.895) → 17.9
└─ prophylactic reop (EV = 17.4)
   ├─ die (0.03) → 0
   └─ survive (0.97) → 17.9

What if the person lived 20 good years before dying from strut fracture? EV = 0???

Page 17:

Markov models to the rescue …

• Definition:
  – an iterative model in which hypothetical patients make transitions between health states over time, accumulating QALYs along the way
• 2 main types:
  – cohort simulation (large population of identical patients)
  – Monte Carlo simulation (one patient at a time)
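To make the Monte Carlo flavor concrete: walk one hypothetical patient at a time through yearly cycles, then average many runs. A toy sketch, where the 5% annual death risk is an assumed round number for illustration, not a figure from the talk:

```python
import random

def simulate_one_patient(p_die=0.05, max_cycles=100, rng=None):
    """Walk one hypothetical patient through 1-year cycles until death."""
    rng = rng or random.Random()
    qalys = 0
    for _ in range(max_cycles):
        if rng.random() < p_die:
            break  # absorbed into "dead"
        qalys += 1  # credit one QALY per cycle survived
    return qalys

# averaging many individual walks recovers the cohort-level expected value
runs = [simulate_one_patient(rng=random.Random(seed)) for seed in range(10_000)]
print(sum(runs) / len(runs))  # ~19 years (mean of the geometric survival time)
```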

Page 18:

Decision analysis with Markov models

• Same four basic steps as simple trees
  – structuring the model
  – probabilities
  – assigning values to outcomes
  – baseline and sensitivity analysis

• But a little more complex …

Page 19:

Model structures viewed left to right

• Simple trees
  – specify alternatives (decision node)
  – chance events (chance nodes)
  – final health states (terminal nodes)

• Markov models
  – specify alternatives (decision node)
  – parse to health states (intermediate and final)
  – chance events move hypothetical patients between health states

Page 20:

Specifying alternatives

choose
├─ watchful waiting
└─ prophylactic reop

Page 21:

Health states

choose
├─ watchful waiting: well / well, postop / dead
└─ prophylactic reop: well, postop / dead

Page 22:

Whole model

choose
├─ watchful waiting (states: well / well, postop / dead)
│  from well: strut fracture → die (dead) or survive (well, postop); no fracture → well
│  from well, postop: die (dead) or survive (well, postop)
└─ prophylactic reop (states: well, postop / dead)
   from well, postop: die (dead) or survive (well, postop)

1. Alternatives  2. Health states  3. Cycle trees (each branch ends with a health-state assignment)

Page 23:

How the model “runs”

• Markov cohort simulation
  – hypothetically large cohort
  – starts in a distribution of health states at time zero
  – some members make transitions between health states with each cycle (cycle = “stage” in TreeAge)
  – keep cycling until everyone (or nearly everyone) is absorbed into the state “dead”
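The cohort run described above can be sketched in a few lines as a matrix iteration. The transition probabilities below are assumed round numbers for illustration only; they are not the model's actual inputs, which are derived later in the talk:

```python
import numpy as np

states = ["well", "well_postop", "dead"]
# rows: from-state, cols: to-state (probabilities per 1-year cycle; ASSUMED values)
P = np.array([
    [0.93, 0.01, 0.06],   # well: small chance of fracture surgery or death
    [0.00, 0.95, 0.05],   # well, postop
    [0.00, 0.00, 1.00],   # dead is absorbing
])

cohort = np.array([1000.0, 0.0, 0.0])  # start everyone in "well"
cum_reward = 0.0
for cycle in range(100):
    cum_reward += cohort[0] + cohort[1]  # 1 life-year per person alive this cycle
    cohort = cohort @ P                  # one cycle of transitions

print(cohort.round())     # nearly everyone absorbed into "dead"
print(cum_reward / 1000)  # expected life-years per person
```

The "folding back" slides that follow are exactly this loop, printed at cycles 0, 1, 10, and 50.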

Page 24:

Folding back: Time zero (cycle length = 1 year)

Watchful waiting:   well 1000 pts   well, postop 0 pts   dead 0 pts
Prophylactic reop:  well, postop 970 pts   dead 30 pts

Page 25:

Folding back: Cycle 1 (cycle length = 1 year)

Watchful waiting:   well 937 pts   well, postop 4 pts   dead 59 pts
Prophylactic reop:  well, postop 923 pts   dead 77 pts

Page 26:

Folding back: Cycle 10 (cycle length = 1 year)

Watchful waiting:   well 540 pts   well, postop 12 pts   dead 448 pts
Prophylactic reop:  well, postop 556 pts   dead 444 pts

Page 27:

Folding back: Cycle 50 (cycle length = 1 year)

Watchful waiting:   well 0 pts   well, postop 0 pts   dead 1000 pts
Prophylactic reop:  well, postop 0 pts   dead 1000 pts

Page 28:

Decision analysis with Markov models

• Four components
  – structuring the model (and how it runs)
  – probabilities
  – assigning values to outcomes
  – analysis

Page 29:

Probabilities

• Events with short time horizons (e.g., operative risk): constant probabilities, fine for simple trees
• Events that occur over time (e.g., valve failure): changing probabilities, need Markov models

Page 30:

Events with short time horizons (easy)

[Same whole-model diagram as before, with * marking the one-time chance events (e.g., operative risk) whose constant probabilities can be used directly]

1. Alternatives  2. Health states  3. Cycle trees

Page 31:

Events occurring over time (harder)

[Same whole-model diagram, with * marking the chance events that occur over time (e.g., strut fracture), whose cycle-specific probabilities must be derived]

1. Alternatives  2. Health states  3. Cycle trees

Page 32:

What you need: probabilities of events occurring during each cycle of the model (aka transition probabilities)

Page 33:

Problem: published studies don’t give you cycle-specific probabilities

Page 34:

What do you get from literature?

• Cumulative incidence
  – e.g., “stroke incidence at 5 years was 24%…”
• Kaplan-Meier plots
  – proportion of the population free of the event
• Number of events and size of the population

Page 35:

Deriving transition probabilities

Data from source studies → [Step 1] → Annual rates → [Step 2] → Cycle-specific probabilities

Page 36:

Step 1. Deriving rates

Source data                                           Rate formula
x-year probability p (% of pts with event at x yrs)   r = −ln(1 − p) / t
survival / event-free curve (fraction f event-free)   r = −ln(f) / t
number of events                                      r = events / patient-years of follow-up
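The three conversions in the table, written as code (a sketch; the function names are mine):

```python
import math

def rate_from_cum_probability(p, t):
    """x-year cumulative probability p of the event -> rate per year."""
    return -math.log(1 - p) / t

def rate_from_event_free_fraction(f, t):
    """Fraction f still event-free at t years (Kaplan-Meier) -> rate per year."""
    return -math.log(f) / t

def rate_from_counts(events, patient_years):
    """Event count over observed patient-years of follow-up."""
    return events / patient_years

# slide example: 89.5% free of outlet strut fracture at 8 years
print(round(rate_from_event_free_fraction(0.895, 8), 4))  # 0.0139 per year
```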

Page 37:

Deriving rates: Example

• Kaplan-Meier plot: 89.5% free of OSF at 8 years
• rate = −ln(f) / t = −ln(0.895) / 8 = 0.0139 / yr = 1.39% / yr

[Figure: Kaplan-Meier plot, % free of OSF (0 to 100) vs. time (0 to 8 yrs), ending at 89.5%]

Page 38:

Deriving transition probabilities

Data from source studies → [Step 1] → Rates → [Step 2] → Cycle-specific probabilities

Page 39:

Step 2. Converting rates to transition probabilities

• Transition probability: p = 1 − e^(−r·t)
  – r = rate, t = cycle length
• Example:
  – strut fracture rate = 0.0139/yr; 3-month cycle length
  – p = 1 − e^(−0.0139 × 0.25) = 0.00347
• TreeAge function: “RateToProb”
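Step 2 as code, a minimal stand-in for what TreeAge's RateToProb does (the function name here is mine):

```python
import math

def rate_to_prob(rate, cycle_length_years):
    """Convert a constant rate to a per-cycle transition probability."""
    return 1 - math.exp(-rate * cycle_length_years)

# slide example: strut fracture rate 0.0139/yr, 3-month (0.25 yr) cycles
p = rate_to_prob(0.0139, 0.25)
print(round(p, 5))  # 0.00347
```

Note the asymmetry with Step 1: rates add and scale across time intervals; probabilities do not, which is why the detour through rates is needed at all.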

Page 40:

Page 41:

Page 42:

Page 43:

Decision analysis with Markov models

• Four components
  – structuring the model (and how it runs)
  – probabilities
  – assigning values to outcomes
  – analysis

Page 44:

Assigning values (rewards)

• Simple trees:
  – one value assigned to each terminal node
• Markov models:
  – values assigned at each health state
  – can be credited multiple times (with each cycle of the model)

Page 45:

Assigning values to health states (“rewards”)

• Measures of expected value
  – costs ($), years of life, QALYs

Page 46:

Accumulating rewards: Cycle 0 to 1 (incremental reward = cycle length = 1 year)

Watchful waiting:   well 1000 pts   well, postop 0 pts   dead 0 pts
  cycle reward 1000, cumulative 1000
Prophylactic reop:  well, postop 970 pts   dead 30 pts
  cycle reward 970, cumulative 970

Page 47:

Accumulating rewards: Cycle 1 to 2 (incremental reward = cycle length = 1 year)

Watchful waiting:   well 937 pts   well, postop 4 pts   dead 59 pts
  cycle reward 941, cumulative 1941
Prophylactic reop:  well, postop 923 pts   dead 77 pts
  cycle reward 923, cumulative 1893

Page 48:

Accumulating rewards: Cycle 10 to 11 (incremental reward = cycle length = 1 year)

Watchful waiting:   well 540 pts   well, postop 12 pts   dead 448 pts
  cycle reward 552, cumulative 7822
Prophylactic reop:  well, postop 556 pts   dead 444 pts
  cycle reward 556, cumulative 7810

Page 49:

Accumulating rewards: Cycle 50 (incremental reward = cycle length = 1 year)

Watchful waiting:   well 0 pts   well, postop 0 pts   dead 1000 pts
  cycle reward 0, cumulative 17,145
Prophylactic reop:  well, postop 0 pts   dead 1000 pts
  cycle reward 0, cumulative 17,411

Page 50:

Baseline analysis

Strategy                   Cumulative reward   Expected value
Watchful Waiting           17,145
Prophylactic Reoperation   17,411

Page 51:

Baseline analysis

Strategy                   Cumulative reward   Expected value
Watchful Waiting           17,145              17.1 years/person
Prophylactic Reoperation   17,411              17.4 years/person
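The expected values in the table are just the cumulative reward divided by the cohort size, which a two-line check confirms:

```python
# cumulative life-years from the cohort run, divided by cohort size,
# gives expected value per person (numbers from the slide)
cohort_size = 1000
for strategy, cum_reward in [("Watchful Waiting", 17145),
                             ("Prophylactic Reoperation", 17411)]:
    print(f"{strategy}: {cum_reward / cohort_size:.1f} years/person")
```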

Page 52:

How do you incorporate utilities?

Page 53:

Accumulating rewards: Cycle 0 to 1 (incremental reward = 0.5 × 1 year)

Watchful waiting:   well 1000 pts   well, postop 0 pts   dead 0 pts
  cycle reward 500, cumulative 500
Prophylactic reop:  well, postop 970 pts   dead 30 pts
  cycle reward 485, cumulative 485
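Utility weighting is a one-line change to the reward calculation: each cycle credits utility times cycle length per patient alive, which is where the 500 and 485 above come from. A sketch (the function name is mine):

```python
def cycle_reward(n_alive, utility=0.5, cycle_length=1.0):
    """QALYs credited in one cycle: patients alive x utility x cycle length."""
    return n_alive * utility * cycle_length

print(cycle_reward(1000))  # 500.0 (watchful waiting, cycle 0 to 1)
print(cycle_reward(970))   # 485.0 (prophylactic reop, cycle 0 to 1)
```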

Page 54:

?

Page 55:

Building a Decision Model

Your patient is a 65-year-old white male with a large abdominal aortic aneurysm. Although asymptomatic, the aneurysm has grown substantially over the last year, from 4.6 cm to 6.0 cm. You have decided that the aneurysm needs repair. However, the patient also has severe angina and a positive stress test, and cardiac catheterization reveals good ventricular function but severe coronary disease that is not amenable to PTCA or stenting.

Question: Should this patient undergo AAA repair only, AAA repair followed by CABG, or CABG prior to AAA repair?

Page 56:

Basic Steps

1. Structure a decision model
2. Enter probabilities, life expectancy, and utility values
3. Perform a baseline analysis
4. Perform sensitivity analysis