Evaluation of information systems
Nicolette de Keizer, Dept Medical Informatics, AMC - University of Amsterdam
Page 1:

Evaluation of information systems

Nicolette de Keizer, Dept Medical Informatics

AMC - University of Amsterdam

Page 2:

Outline

Significance of evaluation
Process of evaluation
Evaluation questions
Methods – study design, data collection
Triangulation

Page 3:

• Reduction of medication errors by CPOE
• Better treatment and diagnostics by decision-support systems
• Increased quality of documentation
• Reduced costs through telemedical applications

IT in health care: possible benefits

Page 4:
Page 5:

Unintended Consequences of Information Technologies

Aim: Determine the effect on mortality of introducing CPOE into Pittsburgh children's hospital.

Methods: Demographic, clinical and mortality data were collected on all children transported to a hospital where CPOE was implemented institution-wide in 6 days. Trends for 13 months prior and 5 months after were compared.

Results: The mortality rate increased from 2.80% (39 of 1394) to 6.57% (36 of 548). After adjustment for other covariables, CPOE was independently associated with increased odds of mortality (odds ratio 3.28, 95% CI 1.94-5.55).
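For concreteness, the crude (unadjusted) odds ratio can be recomputed from the counts quoted above; this is a minimal Python sketch, not part of the original slides. The published odds ratio of 3.28 is adjusted for covariables, so it differs from this crude figure.

```python
# Crude (unadjusted) odds ratio for the reported mortality counts:
# 39 deaths among 1394 children before CPOE, 36 among 548 after.
import math

deaths_pre, total_pre = 39, 1394
deaths_post, total_post = 36, 548

odds_pre = deaths_pre / (total_pre - deaths_pre)       # 39 / 1355
odds_post = deaths_post / (total_post - deaths_post)   # 36 / 512
crude_or = odds_post / odds_pre

# Approximate 95% confidence interval on the log-odds scale
se_log_or = math.sqrt(1 / deaths_post + 1 / (total_post - deaths_post)
                      + 1 / deaths_pre + 1 / (total_pre - deaths_pre))
ci_low = math.exp(math.log(crude_or) - 1.96 * se_log_or)
ci_high = math.exp(math.log(crude_or) + 1.96 * se_log_or)

print(f"Crude OR = {crude_or:.2f}, 95% CI {ci_low:.2f}-{ci_high:.2f}")
```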

Page 6:

Conclusion: When implementing CPOE systems, institutions should continue to evaluate mortality effects, in addition to medication error rates.

Importance: Received disproportionate media attention due to its reactionary message. A follow-on study in Seattle, using the same vendor system and also published in Pediatrics, showed no increase in mortality.

Unintended Consequences of Information Technologies

Page 7:

• Brigham and Women's Hospital, Boston, introduced a CPOE system that allows physicians to order medication online (and no longer on paper).

• After implementation, the rate of intercepted adverse drug events (ADEs) doubled!

• Reason: The system made it easy to order much too large doses of potassium chloride without clearly indicating that they should be given in divided doses.

Bates et al The impact of computerized physician order entry on medication error prevention. JAMIA 1999, 6(4), 313-21.

Negative effects of CPOE: Example 2

Page 8:
Page 9:

Unintended Consequences of Information Technologies

Reference: Linder et al., Arch Intern Med. 2007 Jul 9;167(13):1400-5. [Brigham & Women's Hospital]

Aim: Assess the effects of Electronic Health Records on the quality of care delivered in ambulatory settings.

Methods: Retrospective, cross-sectional analysis of 17 quality measures from the 2003-2004 National Ambulatory Medical Care Survey, correlated with use of EHRs.

Page 10:

Results: EHRs were used in 18% of 1.8 billion visits. For 14 of 17 quality measures, the fraction of visits where recommended best practice occurred was no different in EHR settings than in manual-records settings.

2 measures were better with EHRs: avoiding benzodiazepines in depression, avoiding routine urinalysis.

1 measure was worse with EHRs: prescribing statins for hypercholesterolemia (33% vs. 47%, p=0.01).

Conclusion: As implemented, EHRs were not associated with better quality ambulatory care.

Unintended Consequences of Information Technologies

Page 11:

Reference: Linder et al., Arch Intern Med. 2007 Jul 9;167(13):1400-5.

Importance: Received disproportionate media attention due to its reactionary message.

Lost in the media hype: Less than 40% of EHR implementations have all the elements important for effects on quality (e-prescribing, test ordering, results, clinical notes, decision support). Best performance regardless of infrastructure was suboptimal (< 50% adherence to best practice).

Unintended Consequences of Information Technologies

Page 12:

• The London Ambulance Dispatch System collapsed due to inadequate testing. Thousands of emergency calls were answered late or not at all.

http://www.cs.ucl.ac.uk/staff/A.Finkelstein/las.html

• The malfunction of Therac-25, a medical linear accelerator, caused the deaths of three patients in the late 1980s.

http://courses.cs.vt.edu/~cs3604/lib/Therac_25/Therac_1.html

• A university hospital stopped the introduction of an Order Entry System due to a user boycott. Costs: up to 34 million dollars.

Ornstein C (2003) Los Angeles Times Jan 22nd, 2003

Other examples

Page 13:

Insufficiently designed, badly integrated or wrongly used IT systems can lead to user frustration and errors.

„Bad health informatics can kill“

Ammenwerth E, Shaw NT. Bad health informatics can kill - is evaluation the answer? Methods of Information in Medicine 2005;44:1-3.

Examples: http://iig.umit.at/efmi/ -> Bad Health Informatics can Kill

Page 14:

Need for Evaluation

IT systems can have a large impact on the quality of care.
IT systems are costly.
IT systems can fail or be sub-optimally designed.

Evaluation is a way to provide better IT systems.

Page 15:

Systematic Evaluation of IT is essential

Formative (constructive): evaluation looking forward

Summative: evaluation looking backward

Page 16:

Evaluation is
• the act of measuring or exploring properties of a health information system (in planning, in development, in implementation, or in operation),
• the result of which informs a decision to be made concerning that system in a specific context.

Ammenwerth E, Brender J, Nykänen P, Prokosch U, Rigby M, Talmon J. Visions and strategies to improve evaluations of health information systems - reflections and lessons based on the HIS-EVAL workshop in Innsbruck. Int J Med Inform 2004 Jun 30;73(6):479-91.

Evaluation: Definition (1/2)

Page 17:

[Figure: bar chart of evaluated types of information systems, 1982-2002 (n = 1,035), as a percentage of studies. Evaluated system types include: expert & knowledge-based systems, teleconsultation systems, clinical information systems (general, other), PACS, patient information systems, GP systems, CPOE, nursing systems, pharmaceutical systems, telemedical systems, anaesthesia systems, radiological information systems, lab information systems, patient data management systems, and operation unit systems.]

Page 18:

[Figure: annually published papers in PubMed on IT evaluation in health care (PubMed analysis), 1982-2002; number of studies published each year, annotated at 0.6% and 1.0% of all papers. Ammenwerth, de Keizer (2004).]

Page 19:

Evaluation in Informatics is Notoriously Difficult

We live in a pluralistic world. There will be many points of view on need, benefit, quality.

Real people using deployed technology: things can go wrong or right for very complicated reasons

Intersection of 3 domains where progress is very rapid:
• work in the domain (health care, bio-science)
• information technology
• evaluation methods

Sometimes they don’t really want to know...

Page 20:

Outline

Significance of evaluation
Process of evaluation
Evaluation questions
Methods – study design, data collection
Triangulation
Publication bias

Page 21:

The General Process of Evaluation

[Diagram: Negotiation → "Contract" → Questions → Investigation → Report → Decisions]

Page 22:

Negotiation and Contract

Identify the primary audience(s) and interact with them

Page 23:

Roles in Evaluation: The Playing Field

[Diagram: the evaluation "playing field". Roles include the development team (director, staff, funder, the director's superiors), the evaluation team (director, staff, funder), resource users and their clients, peers and relations of clients, those who use similar resources, and public interest groups and professional societies.]

Page 24:

Negotiation and Contract

Identify the primary audience(s) and interact with them
Set general goals and purposes of the study
Identify, in general, the methods to be used
Identify permissions, accesses, confidentiality issues and other key administrative aspects of the study
Describe the result reporting process
Reflect this in a written agreement

Page 25:

Questions

Specific questions are derived from the general goals
Maximum 5-10 questions
They do not have to be stated as hypotheses
Depending on the methods used, the questions can change over the course of the study

Page 26:

Investigation

Choose data collection methods
There are two major families of investigational approaches: objectivist and subjectivist
Although some studies use both families, typically you will choose one or the other

Page 27:

Report

Process of communicating findings: reporting is often done in stages

It doesn’t have to be a written document exclusively

Targeted at the audience(s), in language they can understand

The report must conform to the evaluation agreement

Page 28:

Subjectivistic vs. objectivistic

Subjectivistic (interpretative, explorative) approaches: generate hypotheses; open and broad; detect relationships; inductive; focus on qualitative methods.

Objectivistic (positivistic, explanative) approaches: test hypotheses; focused and exact; prove relationships; deductive; focus on quantitative methods.

Page 29:

Group work

Judge abstracts 1 and 2: Subjectivist or objectivist study?

Page 30:

Outline

Significance of evaluation
Process of evaluation
Evaluation questions
Methods – study design, data collection
Triangulation

Page 31:

Evaluation and IT life cycle

[Diagram: evaluation questions across the IT life cycle: information needs; software engineering (verification, validation); usability, stability, …; resource impact (effects, acceptance, costs, benefits, …).]

Page 32:

Group work

What kinds of aspects would you want to evaluate if your hospital implemented a nursing documentation system or a DSS application? Consider the perspectives of the physician, nurse, manager and developer.

Page 33:

Measures for IT evaluation studies

1. Static IT attributes (hardware and software quality) and static user attributes (computer knowledge)

2. Quality of the interaction between IT and user (e.g. usage patterns, user satisfaction, data quality)

3. Effects of IT on the process quality of care (efficiency, appropriateness, organisational aspects)

4. Effects of IT on the outcome quality of care (quality of care, costs of care, patient satisfaction)

Page 34:

[Figure: evaluated aspects in evaluation studies 1982-2002 (n = 983), percentage of studies (Ammenwerth, de Keizer 2004). Aspects include: 1.1 hardware or technical quality, 1.2 software quality, 1.3 general computer knowledge/attitudes, 2.1 quality of documented/processed information, 2.2 costs of information processing, 2.3 user satisfaction, 2.4 usage patterns, 3.1 efficiency of working processes, 3.2 appropriateness of care, 3.3 organisational and social quality, 4.1 quality of patient care, 4.2 costs of patient care, 4.3 patient satisfaction with patient care, 4.4 patient-related knowledge or behaviour.]

Page 35:

Why formulate questions ?

Crystallize thinking of evaluators and key "stakeholders"
There is a need to focus and prioritize
It converts broad aims into specific questions that can potentially be answered

Page 36:

Further benefits of identifying questions

Stakeholders can see where their concerns are being addressed

The choice of methods follows from the question

A list discourages evaluators from focusing only on questions amenable to their preferred methods

Page 37:

• Evaluation is part of the whole IT life cycle

• Any evaluation must have a clear evaluation question (there is no „global“ or „absolute“ evaluation).

• The evaluation questions should be decided by the stakeholders.

Evaluation question: Recommendations

Take the time to elaborate clear and agreed evaluation questions!

Page 38:

Outline

Significance of evaluation
Process of evaluation
Evaluation questions
Methods – data collection & study design
Triangulation

Page 39:

[Diagram: evaluation question → evaluation methods → data → answer to evaluation question]

Page 40:

Evaluation generates data to answer questions

„I like it“

„It does not work!“

Physician curses at the computer.

Nurse tries to enter password several times.

Mean time for data entry: 3.5 min.

Mean user satisfaction: 1.9 (from 5 max.)

85% of care plans are incomplete.

4 medication errors per day.

Costs of 3.500 Euro per workstation.

Which types of data are represented?

Page 41:

Data

Quantitative data (numbers): count, measure, weigh, …; generate exact results; easy to work with; easier to aggregate.

Qualitative data (text, videos, …): describe, observe, …; rich in content; need less standardization; need no large numbers.

Positive attributes?

Page 42:

[Figure: quantitative vs. qualitative methods in evaluation studies 1982-2002 (n = 983); percentage of studies per year using more quantitative methods, more qualitative methods, or mixed/unclear. Ammenwerth, de Keizer (2004).]

Page 43:

Example 1a: Does a documentation system reduce time?

Hypothesis (causal relationship): introduction of a nursing documentation system reduces the time effort for documentation.

Approach: quantitative RCT study.

Page 44:

Example 1b: What are the effects of a documentation system?

Possible effects: improve transparency of nursing care, reduce time effort for documentation, improve communication with physicians, improve communication in the team, improve quality of nursing care, improve IT skills, improve quality of documentation.

Approach: qualitative ethnographic study.

Page 45:

Example 2a: Which factors determine user satisfaction with a nursing documentation system?

Candidate factors: computer experience (in years), attitude towards the nursing process, attitude towards computers in nursing, age of nurse, quality of training, quality of support, performance and stability of the system.

Approach: qualitative interview study.

Page 46:

Example 2b: Does age or quality of training determine user satisfaction with a nursing documentation system?

Hypotheses (relationships): quality of training → user satisfaction; age of nurse → user satisfaction.

Approach: quantitative study. Design: e.g. RCT, observational.

Page 47:

Kinds of evaluation study

Evaluation studies: objectivist studies and subjectivist studies.

Page 48:

Kinds of evaluation study

Evaluation studies split into objectivist studies and subjectivist studies. Objectivist studies comprise measurement studies (reliability studies, validity studies) and demonstration studies (descriptive, correlational and comparative studies).

Page 49:

Descriptive studies

Aim: to describe something

Example: how often do doctors use a CPOE?

Methods: survey, log file, observation, case note audit…

Variables: single variable of interest – the “dependent” variable (usage rate)

Analysis: simple descriptive statistics – mean & SD; median & inter-quartile range…
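A minimal sketch of what such a descriptive analysis can look like, assuming a hypothetical "orders per doctor per day" variable extracted from log files; all values below are invented for illustration and are not from the slides.

```python
# Descriptive analysis of a single variable of interest: CPOE orders per doctor per day.
import statistics

usage_per_doctor = [12, 30, 7, 45, 22, 18, 0, 33, 27, 15]  # invented log-file counts

mean = statistics.mean(usage_per_doctor)
sd = statistics.stdev(usage_per_doctor)
median = statistics.median(usage_per_doctor)
q1, q2, q3 = statistics.quantiles(usage_per_doctor, n=4)   # quartiles

print(f"mean {mean:.1f} (SD {sd:.1f}), median {median} (IQR {q1}-{q3})")
```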

Page 50:

Correlational studies

Aim: to correlate something with something else

Example: is CPOE use associated with fewer calls from pharmacy to the department?

Methods: survey, log file, observation, case note audit…

Variables: “dependent” variable (calls) + independent variables (usage rate, age…)

Analysis: univariate or multivariate regression
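A minimal sketch of a univariate analysis along these lines, with invented department-level numbers; a real study would typically extend this to multivariable regression with further independent variables.

```python
# Univariate correlational analysis: does higher CPOE usage go together with
# fewer clarification calls from pharmacy? Numbers are invented for illustration.
from scipy import stats

cpoe_usage_rate = [0.10, 0.25, 0.40, 0.55, 0.70, 0.85, 0.95]  # fraction of orders via CPOE
pharmacy_calls = [42, 37, 35, 28, 25, 20, 18]                  # calls per week

result = stats.linregress(cpoe_usage_rate, pharmacy_calls)
print(f"slope {result.slope:.1f} calls per unit usage, "
      f"r = {result.rvalue:.2f}, p = {result.pvalue:.3f}")
```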

Page 51:

Comparative studies

Aim: to assess cause and effect
Example: does CPOE cause fewer medication errors?
Methods: experiments: before-after, interrupted time series, randomised trial…
Variables: "dependent" variable (errors) + independent variables (allocation to CPOE, actual usage rate, patient age…)
Analysis: hypothesis testing or estimation (t tests, chi-squared, analysis of variance…)
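A minimal sketch of the hypothesis-testing step, using invented counts and timings purely for illustration (chi-squared for proportions, t-test for a continuous outcome).

```python
# Comparative analysis: medication error counts and order times in a CPOE group
# vs. a paper-based control group. All numbers are invented for illustration.
from scipy import stats

# 2x2 table: rows = group (CPOE, control), columns = (orders with error, orders without error)
table = [[18, 982],   # CPOE: 18 erroneous orders out of 1000
         [45, 955]]   # control: 45 erroneous orders out of 1000
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi-squared = {chi2:.1f}, p = {p:.4f}")

# For a continuous outcome (e.g. minutes per order), a t-test would be used instead
cpoe_times = [2.1, 3.0, 2.4, 2.8, 3.5, 2.2]
paper_times = [3.9, 4.2, 3.1, 4.8, 3.6, 4.0]
t, p_t = stats.ttest_ind(cpoe_times, paper_times)
print(f"t = {t:.2f}, p = {p_t:.4f}")
```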

Page 52:

Structure of a comparative study

[Diagram: an allocation process assigns subjects either to group 1 (study), which receives the intervention (information resource), or to group 2 (control). Measure 1 and measure 2 are then compared; the observed difference may be caused by the intervention or by bias (unintended factors).]

Page 53:

Quasi-experimental study types

(X = intervention, O = observation/measurement)

1. One group posttest. Intervention group: X O2
2. One group pretest posttest. Intervention group: O1 X O2
3. Posttest with non-equivalent control group. Intervention group: X O1; Control group: O1
4. Pretest posttest with non-equivalent control group. Intervention group: O1 X O2; Control group: O1 O2
5. Interrupted time-series study. Intervention group: O1 O2 O3 X O4 O5 O6

Harris AD et al. JAMIA. 2006 Jan-Feb;13(1):16-23.

Page 54:

Group work

Study design of abstracts 2, 3, 4, 5 and 6

Page 55:

One group posttest

Nebeker JR, Hoffman JM, Weir CR, Bennett CL, Hurdle JF. High rates of adverse drug events in a highly computerized hospital. Arch Intern Med. 2005 May 23;165(10):1111-6.

Page 56:

Nebeker JR, Hoffman JM, Weir CR, Bennett CL, Hurdle JF. High rates of adverse drug events in a highly computerized hospital. Arch Intern Med. 2005 May 23;165(10):1111-6.

One group posttest

Page 57:

One group pretest posttest

Potts AL, Barr FE, et al. Computerized physician order entry and medication errors in a pediatric critical care unit. Pediatrics. 2004 Jan;113(1 Pt 1):59-63.

Page 58:

One group pretest posttest

Potts AL, Barr FE, et al. Computerized physician order entry and medication errors in a pediatric critical care unit. Pediatrics. 2004 Jan;113(1 Pt 1):59-63.

Page 59:

Posttest with non-equivalent control group

PE = prescription error; CDOE = computerized drug order entry system

Oliven A, Michalake I, Zalman D, Dorman E, Yeshurun D, Odeh M. Prevention of prescription errors by computerized, on-line surveillance of drug order entry. Int J Med Inform. 2005 Jun;74(5):377-86.

Page 60:

Posttest with non-equivalent control group

Oliven A, Michalake I, Zalman D, Dorman E, Yeshurun D, Odeh M. Prevention of prescription errors by computerized, on-line surveillance of drug order entry. Int J Med Inform. 2005 Jun;74(5):377-86.

Page 61:

Pretest posttest with non-equivalent control group

King WJ, Paice N, Rangrej J, Forestell GJ, Swartz R. The effect of computerized physician order entry on medication errors and adverse drug events in pediatric inpatients. Pediatrics. 2003 Sep;112(3 Pt 1):506-9.

Page 62:

King WJ, Paice N, Rangrej J, Forestell GJ, Swartz R. The effect of computerized physician order entry on medication errors and adverse drug events in pediatric inpatients. Pediatrics. 2003 Sep;112(3 Pt 1):506-9.

Pretest posttest with non-equivalent control group

Page 63:

Problems with (one group) before-after studies

Other internal changes: insights from health care redesign, staff training, re-deployment…

External changes in health policies, practice guidelines, technologies, professional training, staff shortages, patient case-mix, expectations, eHealth…

Page 64:

Problems with (one group) before-after studies

Result: confounding - were differences due to system, to redesign, to improved data, or something else ?

Need to control for confounding: option 1: carefully chosen external and internal controls

Page 65:

Controlled before-after studies

External control: the same data / practice in one or more matched external groups of practitioners, subject to the same confounders, not exposed to the intervention.

Internal control: similar data / practice in the same target practitioners, subject to the same confounders, not susceptible to the intervention.
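One common way to summarise such a controlled before-after comparison is a difference-in-differences: the before-after change in the exposed group minus the change in a control. The sketch below uses invented completeness percentages that loosely mirror the asthma/diabetes example on the next slide.

```python
# Difference-in-differences summary of a controlled before-after study.
# Percentages of complete data are invented for illustration only.
site_a_asthma = {"before": 30.0, "after": 60.0}    # exposed: system implemented at site A
site_b_asthma = {"before": 32.0, "after": 38.0}    # external control: same condition, other site
site_a_diabetes = {"before": 28.0, "after": 34.0}  # internal control: same site, condition not targeted

change_exposed = site_a_asthma["after"] - site_a_asthma["before"]
change_external = site_b_asthma["after"] - site_b_asthma["before"]
change_internal = site_a_diabetes["after"] - site_a_diabetes["before"]

print(f"Effect vs. external control: {change_exposed - change_external:+.1f} percentage points")
print(f"Effect vs. internal control: {change_exposed - change_internal:+.1f} percentage points")
```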

Page 66:

Controlled before-after study

[Figure: % of data complete, before vs. after, for asthma at site A, diabetes at site A, and asthma at site B; the system was implemented at site A.]

Page 67:

Problems with before-after studies

Result: confounding - were differences due to system, to redesign, to improved data, or something else ?

Need to control for confounding:
option 1: carefully chosen external and internal controls
option 2: interrupted time series

Page 68:

Interrupted time-series study

Koide D, Ohe K, Ross-Degnan D, Kaihara S. Computerized reminders to monitor liver function to improve the use of etretinate. Int J Med Inform. 2000 Jan;57(1):11-9.

Page 69:

Interrupted time series

At least 3 pre- and 3 post-intervention measurements
Aim to demonstrate regression discontinuity
Problems: cost of making repeated measurements (use routine data); difficulty separating the intervention from baseline drift, seasonal effects...
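The usual analysis is a segmented regression that estimates a level change and a slope change at the intervention point; a minimal sketch with invented monthly values, not taken from the cited study.

```python
# Segmented (interrupted time series) regression: baseline trend plus
# a level change and a slope change after the intervention.
import numpy as np

monthly_rate = np.array([41, 43, 40, 44, 42, 45,    # 6 pre-intervention months (invented)
                         30, 29, 27, 26, 25, 24])   # 6 post-intervention months (invented)
n = len(monthly_rate)
time = np.arange(n)                             # overall time trend
post = (time >= 6).astype(float)                # level change after intervention
time_after = np.where(post == 1, time - 6, 0)   # slope change after intervention

X = np.column_stack([np.ones(n), time, post, time_after])
coef, *_ = np.linalg.lstsq(X, monthly_rate, rcond=None)
intercept, baseline_slope, level_change, slope_change = coef
print(f"baseline slope {baseline_slope:.2f}, level change {level_change:.2f}, "
      f"slope change {slope_change:.2f}")
```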

Page 70:

Problems with before-after studies

Result: confounding - were differences due to system, to redesign, to improved data, or something else ?

Need to control for confounding:
option 1: carefully chosen external and internal controls
option 2: interrupted time series
option 3: randomized controlled trial

Page 71:

Randomization

Berner ES et al. Improving ambulatory prescribing safety with a handheld decision support system: a randomized controlled trial. J Am Med Inform Assoc. 2006 Mar-Apr;13(2):171-9.

CDSS = computerized decision support system

NSAID = nonsteroidal anti-inflammatory drug

Page 72:

„The randomized controlled trial (RCT) is the gold standard of evaluation“

(Rotman 1996)

What do you think?

For which questions is the RCT useful?
Are there situations where an RCT is not possible?
For which questions is the RCT not useful?

Study design

Page 73:

Level of evidence in Health Informatics

Quasi-experimental study designs ("natural experiments"):
A1 One group posttest
A2 One group pretest – posttest
…
B1 Posttest with non-random control group
C1 Pretest – posttest with non-random control group
…
D Interrupted time series (multiple pretest and multiple posttest)

Experiment:
Randomized controlled trial

Aggregation of evidence:
Systematic review
Meta-analysis

Harris AD et al. JAMIA. 2006 Jan-Feb;13(1):16-23.

Page 74:

Problems in RCTs: contamination

Explanation: carry-over from intervention to control groups

Risk factors: if nurses / doctors can apply information (eg. guidelines) to control patients, share information with control staff; cross cover, rotations...

Result: reduce apparent effect (type II error)

Solutions: quantify in a pilot; randomise clusters (clinician, team, hospital…) instead of individual patients
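A minimal sketch of the cluster-randomisation idea: whole units (hypothetical wards here, invented for illustration) rather than individual patients are allocated at random, so that staff caring for control patients are not also exposed to the intervention.

```python
# Cluster randomisation: allocate wards, not individual patients.
import random

wards = ["ward A", "ward B", "ward C", "ward D", "ward E", "ward F"]

random.seed(42)                 # fixed seed so the allocation is reproducible
random.shuffle(wards)
intervention = sorted(wards[:len(wards) // 2])
control = sorted(wards[len(wards) // 2:])

print("CPOE wards:   ", intervention)
print("Control wards:", control)
```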

Page 75:

Cluster trials

Problems:
Need 30-100% more patients (see the sketch below)
Risk of a "unit of analysis" error
Analyses are more complicated
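The extra patients come from the design effect, which inflates the sample size needed under individual randomisation; a minimal sketch with assumed values for cluster size and intracluster correlation (not from the slides).

```python
# Design effect for a cluster-randomised trial: DE = 1 + (m - 1) * ICC,
# where m is the average cluster size and ICC the intracluster correlation.
n_individual = 400      # patients needed if individuals were randomised (assumed)
cluster_size = 25       # average patients per ward (assumed)
icc = 0.02              # intracluster correlation coefficient (assumed)

design_effect = 1 + (cluster_size - 1) * icc
n_cluster_trial = n_individual * design_effect

print(f"design effect = {design_effect:.2f}")
print(f"patients needed with cluster randomisation: {n_cluster_trial:.0f}")
```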

Page 76:

Problems in RCTs: confounding

Explanation: extra actions in addition to intended intervention

Risk factors: training or extra support given only to information resource users

Result: exaggerates the apparent effect (type I error)

Solutions: avoid the co-intervention, or apply it to the control group as well

Page 77:

Conclusions

Demonstration studies: descriptive, correlational, comparative

RCTs are gold standard for answering effectiveness questions

It may prove difficult or unethical to carry out an RCT; several variant designs may help

Subjectivist studies: why does it (not) work

Page 78:

Outline

Significance of evaluation
Process of evaluation
Evaluation questions
Methods – study design, data collection
Triangulation

Page 79:
Page 80:

Views

The choice of methods determines which data you will obtain and which insights you will be able to get.

Broader picture of reality and improved reliability

Page 81:

Triangulation

Evaluation: the multiple employment of data sources, observers, methods and theories in the investigation of the same phenomenon.

Aims: increasing the completeness of results; validating results by comparing findings.

Page 82:

Triangulation

Data triangulation: various data sources are used. Time: questionnaires repeated at different times. Person: interviews with various health care professionals.

Methods triangulation: Combination of methods for data collection and data analysis (e.g. interviews and observation)

Investigator triangulation: Researchers with different backgrounds take part in study

Page 83:

Evaluation methods: Recommendations

Be aware of the large number of available methods.

Choose adequate methods with regard to your study question.

Consider applying triangulation

Page 84:

• Evaluation is an ethical imperative for health informaticians

DO IT!

• Methods and approaches must be adequately selected to ensure that the important questions are answered

AND DO IT RIGHT!

Final comments