Page 1: Human error

HUMAN ERROR

José Luis García-Chico ([email protected])
San Jose State University

ISE 105, Spring 2006
April 24, 2006

“To err is human…”

(Cicero, 1st century BC)

“…to understand the reasons why humans err is science”

(Hollnagel, 1993)

Page 2: Human error


What is important to know about human error?

• Human error is in our nature.
– It can happen to anyone, at any time, in any context.

• Some errors are preventable through procedures, system design, and automation.
– But be careful: these measures may introduce new opportunities for erring.
– Emphasis should be put on error-tolerant systems: error recovery rather than prevention of erroneous actions.

• Human error might not be an accident cause in itself; it may itself be the product of multiple factors.
– Do not blame the last human operator alone.

Page 3: Human error


Human error in nuclear power plants

• Three Mile Island (1979)
– Due to a failure, the temperature in the reactor increased rapidly. The emergency cooling system should have come into operation, but maintenance staff had left two valves closed, which blocked the flow. A relief valve opened to relieve temperature and pressure, but stuck open. Radioactive water poured into the containment area and basement for two hours.

• Operators failed to detect the stuck-open valve. The installed indicator showed only that the valve had been commanded to shut, not the actual status of the valve.

• A small amount of radioactivity was released to the environment.

Page 4: Human error


Human error in aviation

• Überlingen (2002)
• A B757 and a Tu-154 collided in German airspace, under Zurich control. 71 people were killed.

• Only one controller was in charge of two positions during a night shift (two separate displays). The telephone system and the STCA (short-term conflict alert) were under maintenance.

• ATC detected the conflict between the two aircraft late and instructed the Tu-154 to descend. The TCAS units on board the Tu-154 and the B757 instructed the pilots to climb and descend, respectively. The Tu-154 pilot opted to obey the controller's instruction and began a descent to FL 350, where it collided with the B757, which had followed its own TCAS advisory to descend.

Page 5: Human error


Definition of Human Error

• Error will be taken as a generic term to encompass all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some chance agency (Reason, 1990).

• Human error occurrences are defined by the behavior of the total man-task system (Rasmussen, 1987).

• Actions by human operators can fail to achieve their goal in two different ways: the actions can go as planned, but the plan can be inadequate, or the plan can be satisfactory, but the performance can still be deficient (Hollnagel, 1993)

Page 6: Human error


Human error performance

[Diagram (Norman, 1983): stages of performance, sense/interpret, plan/cognition, and execution. Mistakes arise in situation assessment and intent of action; slips and errors of omission/commission arise in execution.]

Page 7: Human error


Human error taxonomies

• Errors of omission (not doing the required thing)
– Forgetting to do it
– Deliberately choosing not to do it

• Errors of commission (doing the wrong thing)
– Slips, in which the operator has the correct motivation or intention but carries out the wrong execution
• Sequence: wrong order of execution
• Timing: too fast or too slow
– Errors based on erroneous expectations and schemata (schemata are sensory-motor knowledge structures stored in memory and used to guide behavior: efficient and low-effort).

Page 8: Human error


Human error taxonomies: SRK model of behavior (Rasmussen, 1982)

Errors depend on the level of behavior:
• Skill-based
• Rule-based
• Knowledge-based

Page 9: Human error


Error distinctions

Dimension | Skill-based error | Rule-based error | Knowledge-based error
Activity | Routine | Problem solving | Problem solving
Focus of attention | Something other than the task in hand | Directed to the problem | Directed to the problem
Control mode | Highly automated (schemata) | Automated (rules: if X then Y) | Conscious process
Detection | Rapid and effective | Difficult, often through external intervention | Difficult, often through external intervention

Page 10: Human error


Generic Error Modeling System-GEMS (Reason, 1990)

[Diagram: GEMS, showing the dynamics between the skill-based, rule-based, and knowledge-based levels of performance.]

Page 11: Human error


Human Error Distribution

• Humans are prone to slips and lapses in familiar tasks:
– 61% of errors are skill-based.

• Humans are prone to mistakes when tasks become difficult:
– 28% of errors are rule-based.
– 11% of errors are knowledge-based, requiring novel reasoning from first principles.

Approximate figures (Reason, 1990), obtained by averaging three studies.

Page 12: Human error


Humans are error prone, but… is that all?

• The human operator often appears responsible for system disasters simply because operators are the last and most visible link in system performance.

• Distinction between:
– Active errors: errors associated with the performance of front-line operators, i.e. pilots, air traffic controllers, control room crews, etc.
– Latent errors: errors related to activities removed in time and space from the direct control interface, i.e. designers, managers, maintenance, supervisors.

Page 13: Human error


Model of Human Error causation (Reason, 1990)

[Diagram: layers of latent and active failures lining up to produce an accident/mishap. Adapted from Shappell (2000).]

Page 14: Human error


Building solutions

• Each system will require a particular instantiation of the approach, but some general recommendations include (see the sketch below):
– Prevent errors: procedures, training, safety awareness, UI design (allow only valid choices).
– Tolerate errors: UI design (constraints on inputs), decision support tools.
– Recover from errors: undo capability, confirmation.
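
A minimal, hypothetical sketch (not from the slides) of how those three layers can appear in an interface, written in Python: a setting only accepts values from a fixed whitelist (prevention), out-of-range requests are rejected rather than applied (tolerance), and every change must be confirmed and remains on an undo stack (recovery). The flight-level example and all names are invented for illustration.

# Hypothetical sketch: error prevention, tolerance, and recovery in a simple setting editor.

VALID_LEVELS = range(100, 410, 10)        # prevention: expose only valid flight levels (FL100-FL400)


class FlightLevelEditor:
    def __init__(self, current=350):
        self.current = current
        self._history = []                # recovery: undo stack of previous values

    def set_level(self, requested, confirm):
        # Tolerance: reject invalid input instead of propagating it into the system.
        if requested not in VALID_LEVELS:
            raise ValueError(f"FL{requested} is not a valid flight level")
        # Recovery aid: require explicit confirmation before committing the change.
        if not confirm(f"Change FL{self.current} -> FL{requested}?"):
            return self.current
        self._history.append(self.current)
        self.current = requested
        return self.current

    def undo(self):
        # Recovery: restore the previous value, if any.
        if self._history:
            self.current = self._history.pop()
        return self.current


if __name__ == "__main__":
    editor = FlightLevelEditor()
    editor.set_level(360, confirm=lambda msg: True)   # accepted after confirmation
    editor.undo()                                     # back to FL350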

Page 15: Human error


Learning from past accidents/incidents

• A great source of lessons to be learnt… not of facts to blame.

• Careful considerations to keep in mind:
– Most people involved in accidents are neither stupid nor reckless. They may simply be blind to the consequences of their actions.
– Be aware of possible influencing situational factors.
– Be aware of the hindsight bias of the retrospective analyst.

Hindsight bias: possession of outcome knowledge profoundly influences the way we analyze and judge past events. It can impose on the observer a deterministic logic about the unfolding events that the individual at the time of the incident would not have had.

Page 16: Human error


Nine steps to move forward from error (Woods & Cook, 2002)

• Pursue second stories beneath the surface to discover multiple contributors.
• Escape the hindsight bias.
• Understand work as performed at the sharp end of the system.
• Search for systemic vulnerabilities.
• Study how practice creates safety.
• Search for underlying patterns.
• Examine how changes create new vulnerabilities.
• Use new technology to support and enhance human expertise.
• Tame complexity through new forms of feedback.

Page 17: Human error

A case study:

HUMAN FACTOR ANALYSIS OF OPERATIONAL ERRORS IN AIR TRAFFIC CONTROL

Jose Luis Garcia-Chico, San Jose State University

Master's thesis in Human Factors and Ergonomics

Page 18: Human error


Motivation of the study

• Some figures: air traffic in the USA, 2004 (FAA, 2005)
– 46,752,000 aircraft in en-route operations
– 46,873,000 movements in tower operations
– 1,216 OEs (operational errors)

• The OE rate has been increasing in recent years (FAA, 2005); see the note after this list:
– 0.66%* in 2002
– 0.78% in 2003
– 0.79% in 2004

• Analysis of errors based on initial air traffic controller reports:
– 539 reports (Jan-Jun 2004)
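
For context on rate figures like those above: an operational error rate is the number of OEs divided by the volume of traffic handled over the same period, and the FAA conventionally expresses it per 100,000 facility activities. A minimal sketch of that computation, with invented counts (the real denominators are not given on this slide):

# Hypothetical sketch: computing an operational error (OE) rate per 100,000 facility activities.

def oe_rate_per_100k(oe_count: int, facility_activities: int) -> float:
    """Return OEs per 100,000 facility activities."""
    return oe_count / facility_activities * 100_000

# Invented example counts, chosen only to illustrate the formula.
print(round(oe_rate_per_100k(1_000, 125_000_000), 2))   # 0.8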


Page 19: Human error


Taxonomic study: Initial Results


OE Classification

[Bar chart: number of OEs (0 to 120) per category, split by ARTCC vs. TRACON. Categories include Fail Converging, Control coord, Descend through, Overlooked Trf, Vector inadequate, Hear/Readback, Altitude inadequate, Fail Alt Climb/Descend, Rwy Incursion, Climb through, Fail Overtaking-Trf, Instruction not-intended, temp error-issue, Misapplied Procedure, datablock-misenter, Airspace, Transpose a/c, FPS-misenter, Speed inadequate, Wrong a/c, Ocean Trf, a/c overlap, LOA mis, Cleared below minimum, Misread info, and others. Total OEs = 869 (ARTCC = 565, TRACON = 304).]
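
As a purely illustrative sketch of the kind of tabulation behind a chart like this, assuming each controller report has been hand-coded with an OE category and a facility type (the records, field names, and counts below are invented, not the study's data):

# Hypothetical sketch: cross-tabulating coded OE categories by facility type (ARTCC vs. TRACON).
from collections import Counter

coded_oes = [                                   # invented example records
    {"category": "Fail Converging", "facility": "ARTCC"},
    {"category": "Hear/Readback",   "facility": "TRACON"},
    {"category": "Fail Converging", "facility": "ARTCC"},
    {"category": "Control coord",   "facility": "TRACON"},
]

# Count OEs per (category, facility) pair and per facility overall.
by_category_and_facility = Counter((oe["category"], oe["facility"]) for oe in coded_oes)
by_facility = Counter(oe["facility"] for oe in coded_oes)

for (category, facility), n in by_category_and_facility.most_common():
    print(f"{category:20s} {facility:7s} {n}")
print("Total per facility:", dict(by_facility))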

Page 20: Human error


Top-10 OEs


Top-10 OE distribution:
– Fail Converging: 14%
– Control coord: 9%
– Descend through: 9%
– Overlooked Trf: 9%
– Vector inadequate: 8%
– Hear/Readback: 8%
– Altitude inadequate: 7%
– Fail Alt Climb/Descend: 7%
– Climb through: 6%
– Fail Overtaking-Trf: 4%
– Others: 19%

Page 21: Human error


Proximity of encounters: OE output

Proximity Rating: Combined OEs

[Scatter plot: vertical distance (0 to 2,000 ft) vs. horizontal distance (0 to 5 NM) for OE encounters, with severity ratings A, B, and C.]

Page 22: Human error


Concurrent and contextual factors


Top contributing factors

[Bar chart: counts (0 to 160) of contributing factors, split by ARTCC vs. TRACON: D-side absent, OS absent/CIC, combined/decombined sectors, mishearing, misjudgment, traffic complexity, training in progress, no pilot response/pilot deviation, distraction, poor performance of maneuver, weather complexity, pilot request, lapse in coordination, other complexity, point-out complexity, others, and not enough information.]

Page 23: Human error


Taxonomic study: Initial Results


DEV vs. proximity rating

[Bar chart: share of OEs (0% to 80%) at proximity ratings A, B, C, and None/Unknown, broken down by CPC vs. DEV.]

Page 24: Human error


Proximity in OEs: % proximity rating by OE category

[Bar chart: percentage of proximity ratings (A, B, C, and no rating) within each OE category: Fail Converging, Control coord, Descend through, Overlooked Trf, Vector inadequate, Hear/Readback, Altitude inadequate, Fail Alt Climb/Descend, Climb through, Fail Overtaking-Trf, Instruction not-intended, temp error-issue, Misapplied Procedure, datablock-misenter, Airspace, Transpose a/c, FPS-misenter, Speed inadequate, Wrong a/c, a/c overlap, Misread info, LOA mis, Cleared below minimum, and others.]

Page 25: Human error


Co-occurrence of OE

Page 26: Human error


D-side presence/absence

[Two bar charts: severity ratings (A, B, C) of OEs by category, one with the D-side controller present and one with the D-side absent; counts 0 to 50. Categories: Fail Overtaking, Overlooked Trf, Fail Converging, Descend Through, Climb Through, Altitude inadequate, Vector inadequate, Hear/Readback, Instruction not-intended, Transpose a/c, Control Coordination, Data Block misenter, Fail Alt Climb/Descend, temp error-issue, Misapplied Procedure, and FPS-misenter.]

Page 27: Human error


Time on Position

[Histogram: number of OEs (0 to 40) vs. controller time on position, in bins labeled 5, 15, 25, 35, 45, 55, 65, 75, 85, 95, 110, and >120 minutes.]
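
As a purely illustrative sketch of how such a histogram could be produced, assuming each OE report records the controller's minutes on position and assuming the chart labels are ten-minute bin midpoints up to 95, then 100 to 120, then over 120 (the bin edges and sample values below are invented):

# Hypothetical sketch: binning OEs by controller time on position.
from collections import Counter

# Assumed bins (minutes): [0,10) -> "5", [10,20) -> "15", ..., [90,100) -> "95", [100,120) -> "110", else ">120".
EDGES = [(0, 10, "5"), (10, 20, "15"), (20, 30, "25"), (30, 40, "35"),
         (40, 50, "45"), (50, 60, "55"), (60, 70, "65"), (70, 80, "75"),
         (80, 90, "85"), (90, 100, "95"), (100, 120, "110")]

def bin_label(minutes_on_position: float) -> str:
    for lo, hi, label in EDGES:
        if lo <= minutes_on_position < hi:
            return label
    return ">120"

sample_minutes = [12, 47, 8, 95, 130, 33, 61, 18]      # invented time-on-position values
histogram = Counter(bin_label(t) for t in sample_minutes)
print(histogram)                                        # e.g. Counter({'15': 2, '45': 1, ...})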

Page 28: Human error


Further Reading

• Besnard, D., Greathead, D., & Baxter, G. (2004). When mental models go wrong: Co-occurrences in dynamic, critical systems. International Journal of Human-Computer Studies, 60, 117-128.

• Dekker, S. W. A. (2002). Reconstructing human contributions to accidents: The new view on error and performance. Journal of Safety Research, 33, 371-385.

• Hollnagel, E. (1993). The phenotype of erroneous actions. International Journal of Man-Machine Studies, 39, 1-32.

• Norman, D. A. (1981). Categorization of action slips. Psychological Review, 88(1), 1-15.

• Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 30(3), 286-297.

• Rasmussen, J. (1982). Human errors: A taxonomy for describing human malfunction in industrial installations. Journal of Occupational Accidents, 4, 311-333.

• Rasmussen, J. (1987). The definition of human error and a taxonomy for technical system design. In J. Rasmussen, K. Duncan, & J. Leplat (Eds.), New Technology and Human Error (pp. 23-30). New York, NY: John Wiley & Sons.

• Reason, J. T. (1990). Human Error. Cambridge, England: Cambridge University Press.

• Reason, J. T. (1997). Managing the Risks of Organizational Accidents. Aldershot, England: Ashgate Publishing Company.

• Woods, D. D., & Cook, R. I. (2002). Nine steps to move forward from error. Cognition, Technology & Work, 4, 137-144.