
    HOW TO AVOID CATASTROPHE

    A presentation by G-10:

    Shruti Gupta

    Guldeep Singh

    Satyam Kumar

    Puneet Jain

    Karuna Miglani

    Mohit Gupta

    Prajwal Menon


    INTRODUCTION

    Near misses are not just close calls that could have been a lot worse; they can also
    be unremarked small failures that permeate day-to-day business but cause no
    immediate harm.

    People commonly misinterpret or ignore the warnings embedded in these failures.

    Each near miss, rather than raising alarms and prompting investigation, is instead taken
    as an indication that existing methods and safety procedures work.

    Research shows that every disaster studied was preceded by a number of near
    misses, and that most of these misses were ignored or misread.


    COGNITIVE ERROR

    Research shows that two cognitive biases in particular conspire to blind managers to near misses.

    The first is normalization of deviance: the tendency to accept anomalies,
    particularly risky ones, as normal over time.

    The second cognitive error is outcome bias: when people observe successful
    outcomes, they tend to focus on the results rather than on the processes that led
    to them.


    ROOTS OF CRISIS

    Organizational disasters do not have a single cause.

    These disasters are triggered by latent errors.

    Latent errors are seemingly small, unimportant human errors, technological failures,
    or bad business decisions.

    These latent errors, when combined with enabling conditions, lead to significant
    failures.

    Near misses also arise from the same preconditions, but in the absence of enabling
    conditions they produce only small failures and thus go undetected.


    EXAMPLES

    BAD APPLE

    Immediately after the launch of Apple's iPhone 4 in June 2010, customers began
    complaining about dropped calls and poor signal strength.

    Apple initially blamed users for holding the phone the wrong way and
    covering the antenna.

    Customers found Apple's posture disrespectful and filed a lawsuit against the company
    alleging defective design.

    Apple's reputation crisis reached a crescendo when Consumer Reports declined
    to recommend the iPhone 4.

    Apple eventually had to acknowledge software errors and offered updates to
    owners.

    Had Apple recognized this latent error and responded sooner, it would have
    avoided the crisis.


    SPEED WARNING

    On August 28, 2009, California Highway Patrol officer
    Mark Saylor and three family members died in a fiery crash
    after the accelerator pedal of the Lexus sedan he was
    driving stuck, speeding the car to 120 miles per hour.

    Before that, Toyota had received about 2,000 complaints
    of unintended acceleration in its cars.

    Ultimately Toyota halted sales of eight of its models,
    sustaining an estimated $2 billion loss in North America
    alone and enormous damage to its reputation.

    Toyota could have averted this crisis by paying
    attention to customers' complaints and reacting in
    time.


    JET BLACK AND BLUE

    Since it began operations, JetBlue has taken an aggressive approach to bad weather.

    Instead of cancelling flights, it asked its pilots to pull away from the gates in severe weather so as to
    be near the front of the line when runways were cleared for takeoff, even if that meant
    planes would sit for some time on the tarmac.

    For many years this worked fine, but on February 14, 2007, a massive ice storm at New York's John
    F. Kennedy International Airport caused widespread disruption.

    Pilots found themselves stuck on the tarmac with no open gates to return to. Passengers on
    several planes were trapped for up to 11 hours in overheated, foul-smelling cabins with little food
    or water.

    JetBlue managers had ignored the risk and seen only successfully launched flights. This blindness,
    combined with the enabling condition, i.e. the ferocious ice storm, turned the latent error into a full-blown crisis.


    RECOGNIZING AND PREVENTING NEAR MISSES

    Heed high pressure

    The greater the pressure to meet performance goals such as tight schedules, cost, or production targets, the
    more likely managers are to discount near-miss signals or misread them as signs of sound decision
    making.

    When people make decisions under pressure, psychological research shows, they tend to rely on
    heuristics, or rules of thumb, and thus are more easily influenced by biases.

    Organizations should encourage, or even require, employees to examine their decisions during
    pressure-filled periods.


    Learn from deviations

    Managers should seek out operational deviations from the norm and examine whether their
    reasons for tolerating the associated risk have merit.

    E.g.: As the Toyota and JetBlue crises suggest, managers' response when some aspect of
    operations skews from the norm is often to recalibrate what they consider acceptable risk.

    Research shows that in such cases decision makers may clearly understand the statistical risk
    represented by the deviation, but grow increasingly less concerned about it.


    Uncover root causes

    When managers identify deviations, their reflex is often to correct the symptom rather than the
    cause.

    E.g.: Apple's initial response, which suggested that customers address the antenna problem by
    changing the way they held the iPhone.

    E.g.: A near miss at Delnor-Community Hospital, in Geneva, Illinois. Two patients sharing a
    room had similar last names and were prescribed drugs with similar-sounding names, Cytotec and
    Cytoxan. Confused by the similarities, a nurse nearly gave one of the drugs to the wrong patient.
    Luckily, the mistake was caught in time. The hospital immediately separated the patients and instituted a
    policy to prevent patients with similar names from sharing rooms in the future.

    Demand accountability

    Even when people are aware of near misses, they tend to downgrade their importance. One way to
    limit this potentially dangerous effect is to require managers to justify their assessments of near misses.


    Consider worst-case scenarios

    People tend not to think through the possible negative consequences of near misses.

    Apple managers, for example, were aware of the iPhone's antenna problems but probably never
    imagined how bad a consumer backlash could get. If they had considered a worst-case scenario, they
    might have headed off the crisis, research suggests.

    Evaluate projects at every stage

    When things go wrong, managers evaluate what happened.

    When they go well, only a few do formal reviews of the success to capture its lessons. Because near
    misses can look like successes, they often escape scrutiny.

    E.g.: Edward Rogers, chief knowledge officer at NASA, instituted a pause-and-learn process in
    which teams discuss at each project milestone what they have learned.


    Reward owning up

    Seeing and attending to near misses requires organizational alertness, but no amount of
    attention will avert failure if people aren't motivated to expose near misses, or, even worse,
    are discouraged from doing so.

    E.g.: An enlisted seaman on an aircraft carrier.


    THANK YOU

