Page 1

Applying Error and Threat Management Concepts to Soaring Operations

Key Dismukes, Chief Scientist for Aerospace Human Factors
NASA Ames Research Center

Page 2

Most Aviation Accidents Attributed to Pilot Error

• What does this mean? Lack of skill, vigilance, or conscientiousness?

  OR inherent vulnerability to error?

• Substantial body of scientific research on pilot performance, error, and safety
  – Mostly directed to airline ops
  – Lack of a comparable database of human factors in soaring accidents (NTSB and ASRS reports)

• Many principles from airline and military flight safety can be applied to soaring
  – Will illustrate with soaring examples

Page 3

Dispense with Fallacies

Fallacy: Pilots who have accidents lack “the right stuff”

Truth:
  – Not supported by data
  – With increasing skill, pilots take on greater challenges
  – Experts in all professions make errors

Fallacy: A flight operation is either “safe” or “unsafe”

Truth:
  – Every operation has a finite degree of risk
  – Identify and assess risks and develop a plan of action

Page 4

Line-Oriented Safety Audits (LOSA)

• Airline crews typically make one or more errors on most flights
  – In spite of training, skill, and frequent practice

• Humans perform tasks far beyond capabilities of computers
  – Work with incomplete and ambiguous data, interpret diverse data sets, project downstream consequences
  – Humans are in airline cockpits to deal with the unexpected

• Cognitive features that enable unique human abilities also produce vulnerability to characteristic forms of error
  – No matter how skillful and conscientious, error rate is never zero

Page 5

Research on Two Major Domains of Pilot Error

1) Judgment and Decision-making

2) Prospective memory - remembering to perform intended actions

– Talk will be technical in places
  • Illustrate with real-life soaring situations
  • Groundwork for practical countermeasures

– Address mistakes of experts, not novices
  • Level of expertise: private pilot rating in gliders
  • How cognitive processes operate at this level of expertise

Page 6

Judgment and Decision-Making

• Most common human factor cited by NTSB
  – But this displays hindsight bias

• Experts make decisions in ways other than formal analyses

  Formal analysis = 1) listing and evaluating all relevant aspects of situation
                    2) identifying all relevant options
                    3) assessing pros and cons of each option
                    4) assigning weighted score to each option
  (a minimal code sketch of this method follows below)

• Gary Klein et al. studied fireground commanders: “recognition-primed decision-making”
  – Quickly scanned scene
  – Identified situation in terms of prototypes from memory
  – Appropriate solution pops into mind

• Naturalistic decision-making
  – Demonstrated in work of professionals in many fields
  – Not something we choose; happens automatically as we develop experience
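A minimal sketch of the four-step formal method above, in Python. All options, criteria, weights, and scores here are hypothetical illustrations (the retrieve scenario is invented), not values from the talk:

```python
# Formal decision analysis: enumerate options, score each relevant
# aspect, apply weights, and compare totals. All numbers hypothetical.
criteria_weights = {"safety": 0.5, "time_saved": 0.2, "cost": 0.3}

options = {
    "aero retrieve":    {"safety": 4, "time_saved": 9, "cost": 6},
    "trailer retrieve": {"safety": 9, "time_saved": 3, "cost": 7},
}

for name, scores in options.items():
    total = sum(w * scores[c] for c, w in criteria_weights.items())
    print(f"{name}: weighted score {total:.1f}")
# aero retrieve: 5.6, trailer retrieve: 7.2 -- slow and effortful,
# but it forces hidden trade-offs into the open; recognition-primed
# decisions skip all of this, and one option simply "pops into mind".
```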

Page 7

Comparison

• Formal problem-solving methods
  – Slow
  – Serial processing
  – Require substantial mental effort
  – Difficult to use under time pressure, stress
  (But can be used to uncover hidden aspects, downstream consequences)

• Naturalistic decision-making
  – Automatic
  – Fast
  – Parallel processing
  – Less mental effort
  (But subject to characteristic biases that can cause errors)

Page 8

Cognitive Biases and Heuristics

• Anchoring and adjustment

• Assimilation bias

• Availability heuristic

• Base rate fallacy

• Confirmation bias

• Conjunction fallacy

• Conservatism

• Endowment effect (aka status quo bias)

• Escalation of commitment

• Estimation overconfidence

• Expectation bias

• Familiarity bias (similar to Assimilation bias)

• Favoring causal explanations

• Framing effect

• Fundamental attribution bias

• Future discounting

• Omission bias

• Prior hypothesis bias

• Reasoning by analogy

• Recency bias

• Representativeness heuristic

Page 9

Aero-retrieve from Indian Valley Airport

a personal experience

Page 10

Cognitive Factors Contributing to My Flawed Judgment and Decision

• Representativeness bias
  – Situation framed by matching similar prototypes in memory
  – Fair amount of experience in off-field landings (50-60?)
    • Appeared to be good match to previous retrieve
  – Statistical sampling problem (see the code sketch below)
    • Previous successes may not give true risk probability (wrong in either direction)
  – Recognition-primed decision-making
    • Does not inform individual of downstream consequences of novel actions: cocking the tow plane
    • Does not anticipate consequences of novel combinations of factors: loose dirt, cocking the tow plane, weeds, narrow runway, gullies
    • Sunk costs: invested time in aero retrieve

– Framing of situation and goal: “make aero retrieve work”

– Wishful thinking: more attractive option (aero retrieve) may have biased perception of risks
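A small sketch of the sampling problem, assuming roughly 55 accident-free off-field landings (the midpoint of the 50-60 estimate above). The "rule of three" is a standard approximation for the 95% upper confidence bound after zero events in n trials:

```python
# An accident-free record bounds the true per-landing risk only loosely.
# Rule of three: after n trials with zero events, the 95% upper
# confidence bound on the event probability is roughly 3/n.
n_landings = 55  # assumed midpoint of the 50-60 estimate above
upper_bound_95 = 3 / n_landings
print(f"95% upper bound on per-landing risk: {upper_bound_95:.1%}")  # ~5.5%
# A perfect record is still consistent with a risk of several percent
# per landing -- the successes alone cannot show the risk is near zero.
```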

Page 11

Implications of this Episode

• Natural decision methods:
  – Fail to alert us to hidden dangers
  – Sometimes bias perception of degree of risk of options

• Judith Orasanu: “Plan continuation error”
  – Failure to revise original plan when circumstances change
  – One of most common forms of error in airline accidents

Page 12

Ways to Reduce Vulnerability to Decision Biases

• Be aware of limitations and biases inherent in normal cognitive processes

• Explicitly identify your assumptions
  – Example: wind direction for off-field landing
  – Voicing assumptions may prompt pilot to search for contrary evidence
  – Share assumptions with other personnel

• Ask: “What if …?”

• Ask: “Is anything different today from previous encounters?”

• If several unusual aspects present, think how they might interact to produce downstream consequences

– Individual factors may be benign but combine to produce threat

• Always have a back door

Page 13

Prospective Memory (PM)

• Remembering to perform intended actions

• Common in everyday life; in aviation can be fatal
  – Several major airline accidents
  – Several friends killed or injured because they failed to hook up a control rod during assembly

• Will discuss prototypical PM situations in soaring
  – Errors most likely in presence of interruptions, distractions, or concurrent task demands

Page 14

1. Performing Habitual Tasks from Memory

• Habitual tasks deeply encoded as procedural memory
  – Not likely to forget how to perform task
  – Vulnerable to omitting a step when interrupted or distracted

• Habitual tasks (e.g., assembling frequently used sailplane)
  – Require minimal mental effort: each step of task “pops” into mind automatically
  – Executing each step automatically triggers retrieval of next step from memory (toy model below)
  – Environmental stimuli (e.g., parts of sailplane) help prompt retrieval of each step
  – Sequential execution of steps and environmental stimuli work together

Prospective memory: Prototypical situations
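A toy model of the cue chain described above: completing one step is the retrieval cue for the next, so an interruption that breaks the chain can silently drop the following step. The step names and the interruption point are hypothetical:

```python
# Toy model of a habitual task chain: each completed step cues the next.
# An interruption after a step removes that cue, so on resuming, the
# next step may never "pop into mind". Step names are hypothetical.
steps = ["insert stab bolt", "finger-tighten bolt", "cinch down bolt", "tape gaps"]

def run_chain(interrupted_after=None):
    performed, cue_lost = [], False
    for step in steps:
        if cue_lost:
            cue_lost = False
            continue  # cue was lost: this step is silently skipped
        performed.append(step)
        if step == interrupted_after:
            cue_lost = True  # interruption breaks the step-to-step chain
    return performed

print(run_chain())  # full sequence, no interruption
print(run_chain(interrupted_after="finger-tighten bolt"))
# ['insert stab bolt', 'finger-tighten bolt', 'tape gaps'] -- bolt never cinched
```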

Page 15

Opportunities for Error While Performing Habitual Tasks

Example (1): Interruptions while assembling sailplane
  • ASW 20 horizontal stab held to vertical stab by bolt
  • Interruption after finger-tightening bolt
    – Breaks chain of one step triggering next step
    – Removes environmental stimulus (view of bolt)
  • Distinct chance will jump to next step, forgetting to cinch down the bolt

Example (2): Landing with gear retracted
  • Several scenarios, for example:
    – Develop habit of extending gear on downwind
      • Visual scene provides cues, stimuli that trigger retrieval from memory
    – Straight-in approach removes normal visual cues
    – Interruptions, distractions, or high workload can have the same effect

Prospective memory

Page 16

2. Intending to Perform a Non-habitual Task that must be Deferred

Example: During assembly you notice main gear tire is low
  – Decide to fill with air after pushing to launch area
  – Get busy with launch preparations and forget to fill tire

• Why do we forget?
  – Brain has no special mechanisms for these situations

• How do we (sometimes) remember to perform deferred intentions?
  – Depends on noticing environmental cues linked in memory to the intention: cues trigger retrieval from memory
  – Happenstance process
    Example: Might notice air bottle by hangar

Prospective memory: Prototypical situations

Page 17

3. Interleaving Several Tasks Concurrently

Example: Entering data or commands in flight computer
  – Going head down interrupts outside visual scan
  – Similar to using cell phone while driving
  – Mentally engaging tasks fully occupy consciousness (focal attention); outside scan is momentarily forgotten
  – Eyes stay down longer than intended

Prospective memory: Prototypical situations

Page 18

Ways to Reduce Vulnerability to Prospective Memory Errors

1) Try to avoid interrupting critical tasks
   – If interrupted, minimize time eyes are away from tasks

2) If interrupted, create salient cue to remind yourself

   Example: When interrupted during assembly, put hands in pockets

3) Use checklist for “killer” items (see the sketch after this list)
   – Know difference between “do” list and checklist
   – Checklist for killer items: short as possible
   – Checklists provide redundancy

4) When deferring tasks:
   – Create salient cue
   – Identify explicitly where and when you intend to perform deferred tasks
   • Example: put tape on canopy to remind about low tire

5) When going head down:
   – Develop/maintain habit of performing only one step at a time
   – Scan horizon between each step
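One way to picture the “do” list vs. checklist distinction in item 3: a do-list drives execution step by step, while a checklist is a short, separate verification pass over the killer items only. The item names below are hypothetical examples, not an endorsed checklist:

```python
# A "do" list drives execution; a checklist is a separate, short
# verification pass over the "killer" items. The redundancy comes from
# checking work already done. Item names are hypothetical examples.
killer_items = ("controls connected and checked", "canopy locked", "dive brakes locked")

def checklist_pass(items_done: set) -> list:
    """Return killer items not confirmed done -- each is a caught error."""
    return [item for item in killer_items if item not in items_done]

# Earlier, the do-list phase executed the full procedure (from memory or
# a written procedure); the checklist now catches anything that slipped.
missed = checklist_pass({"controls connected and checked", "dive brakes locked"})
print("NOT CONFIRMED:", missed)  # ['canopy locked']
```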

Page 19

Error and Threat Management (ETM)

• Latest development in Crew Resource Management (CRM)

• CRM development started in late ‘70s (airlines, USAF, & NASA)
  – Accidents caused by poor communications, failing to grasp all aspects of situation, failing to manage workload effectively, and failing to develop appropriate plans

• CRM originally focused on preventing errors
  – Workload management, communications, situation awareness, decision-making, leadership/followership, and automation management

• ETM emphasizes detecting and managing errors and threats

• Instead of criticizing crews for making mistakes, instructors should reward crews for catching/managing errors

• ETM in early stages of development

Page 20

Applying ETM to Soaring

• Robert Sumwalt, US Airways captain and sailplane pilot
  – Extensive experience in aviation safety
  – Collaborates with airlines/ALPA/NASA

• Remainder of talk combines my ideas with Robert’s ideas on adapting ETM to soaring operations

Page 21

Principles of ETM

• Recognize vulnerability to errors
  – Especially decision-making and prospective memory


Page 22

Principles of ETM

• Recognize vulnerability to errors

• Identify threats
  – Three domains of threat:
    (1) Threats present on every flight (e.g., rope break)
    (2) Threats present only on a particular flight (e.g., my Indian Valley escapade)
    (3) Threats specific to certain situations (e.g., Carl Herald’s topic today)
  – Before takeoff and before each phase of flight: identify threats, ask “what if …”, and develop plan
  – Presence of multiple threats is especially hazardous
    • Vulnerability to error goes way up
    • Treat as red warning flag

Page 23

Principles of ETM

• Recognize vulnerability to errors

• Identify threats

• Treat interruptions, distractions, and deferred tasks as red warning flags
  – Cannot identify threat in advance

Page 24

Principles of ETM

• Recognize vulnerability to errors

• Identify threats

• Treat interruptions, distractions, and deferred tasks as red warning flags

• Redundancy: develop multiple layers of defense
  – Defenses (countermeasures) are essential
  – No defense is perfect (e.g., checklists)
  – Thus need defense in depth
  – Threats and errors much less likely to penetrate multiple layers
  – Be sure layers are independent of each other

Page 25

Layers of Defense to prevent assembly errors

Adapted from Robert Sumwalt

(Diagram: layers of defense)
  • Self-connecting controls
  • Pilot assembles glider
  • Positive control check
  • Independent critical assembly check
  • Wing runner check
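A rough illustration of why independent layers matter, using hypothetical catch probabilities for three of the checks in the diagram (the numbers are invented, not measured):

```python
# If each layer independently catches an assembly error with some
# probability, the chance an error survives them all shrinks
# multiplicatively. Catch probabilities below are hypothetical.
layers = {
    "positive control check": 0.90,
    "independent critical assembly check": 0.80,
    "wing runner check": 0.50,
}

p_survive = 1.0
for check, p_catch in layers.items():
    p_survive *= 1.0 - p_catch

print(f"P(error penetrates all layers) = {p_survive:.3f}")  # 0.010
# 0.10 * 0.20 * 0.50 = 1% -- but only if the layers really are
# independent; a shared blind spot collapses the multiplication.
```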

Page 26

Principles of ETM

• Recognize vulnerability to errors

• Identify threats

• Treat interruptions, distractions, and deferred tasks as red warning flags

• Redundancy: develop multiple layers of defense

• Communicate your perceptions of threats and risk to fellow pilots
  – Soaring mostly single pilot, but always a team operation
    • Other pilots may not have noticed the threat
    • Help each other remember critical items to perform

Page 27

Principles of ETM

• Recognize vulnerability to errors

• Identify threats

• Treat interruptions, distractions, and deferred tasks as red warning flags

• Redundancy: develop multiple layers of defense

• Communicate your perceptions of threats and risk to fellow pilots

• Standardize critical procedures
  – Airlines and military rely on SOPs to reduce errors and accidents
  – Soaring ops are a different situation, but …
  – Each pilot should:
    • Work out explicit procedures for critical tasks
    • Develop strong habits to standardize execution of tasks
  – Standardizing your procedures reduces vulnerability to forgetting items or performing them incorrectly

– Use checklists

Page 28

Integrated Suggestions for Reducing Vulnerability to Error and Managing Threats

1) Recognize areas of vulnerability to errors

2) Identify and voice your assumptions

3) Identify threats

4) Ask “What if …”

5) Communicate your perceptions of threats to fellow pilots

6) Always have a back door

7) Treat interruptions, distractions, and deferred tasks as red flags:
   – Minimize interruptions during critical tasks
   – Create salient cues as reminders

8) Break head-down tasks into small steps and interleave with scanning the horizon

9) Standardize critical procedures

10) Use checklists