April 12-14, 2010, Sheraton New Orleans
Effective Techniques for Risk Measurement
Steven Ross
Executive Principal
Risk Masters, Inc.
Agenda
• The Failure of Current Techniques
• A Fresh Approach to Risk Measurement
• The Theory Behind the Techniques
• A Practical Example
The Failure of Current Techniques
Measuring and Managing Risk
• It is axiomatic that “If it can’t be measured, it can’t be managed”
– And yet, standard techniques for continuity risk management do not address risk measurement
• Risk Management is an aspect of many disciplines
– Finance
– Insurance
– Military
– Enterprise Risk Management
• In Business Continuity Management we fall back on the simplistic “classic” formula
The “Classic” Formula
• Risk = Impact X Probability, where
– Impact = Expected cost per incident
– Probability = Expected number of incidents/Time (usually one year)
• The “classic” formula deals with exposure, not risk
– Annualized loss expectancy from predictable causes
– Risk is the measure of the uncertainty of loss
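The “classic” calculation can be sketched in a few lines of Python; the dollar figures below are hypothetical, purely for illustration:

```python
# Sketch of the "classic" formula: Risk = Impact x Probability,
# i.e., annualized loss expectancy (ALE). Figures are hypothetical.

def annualized_loss_expectancy(impact_per_incident, incidents_per_year):
    """Expected cost per incident times expected incidents per year."""
    return impact_per_incident * incidents_per_year

# A $500,000 outage expected once every 10 years:
print(annualized_loss_expectancy(500_000, 1 / 10))  # -> 50000.0
```

This is exactly the exposure number the slides criticize: a single expectation that says nothing about the uncertainty of the loss.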
Failure of the “Classic” Formula
• We do not know what the impact of rare events will be
– Therefore, we fall back on “worst case”
– But the worst case is only one of many outcomes of a given incident
• The rate of occurrence of catastrophic events is unknowable
– Thus differentiated probability is meaningless
• How many airplanes have ever flown into buildings? Tsunamis that killed hundreds of thousands?
• No matter the number of occurrences, it might happen today
Failure of the “Classic” Formula, continued
• Thus, Risk = Impact X Probability is the product of the unknown and the unknowable!
– No wonder the “classic” formula fails
• As a result, we get useless risk assessments that tell us that
– Tornadoes are a risk in Kansas
– Ice storms are not a risk in Miami
– Unmitigated risks are the greatest of all
• Electromagnetic pulse
• Animal or insect infestation
• And yet all the major standards are based on the “classic” formula
What the Standards Say…
• BS 25999
– Risk is “an average effect by summing the combined effect of each possible consequence weighted by the associated likelihood of each consequence”
• ISO 27005
– “Risk estimation [is the] process to assign values to the probability and consequences of a risk”
• NFPA 1600
– “Risk assessment…categorize[s] threats, hazards, or perils by both their relative frequency and severity”
…And Do Not Say
• None of them even mention risk measurement
• None address the underlying rationale for
– Multiplication of impact and probability
– Limiting risk to only impact and probability
Towards a New Formula
• Risk = f(impact, probability)
Towards a New Formula
• Risk = f(impact, probability, credibility, resources, scale, duration, mean time to repair, mean time to recurrence…)
– And many other factors that
• Can be described but not quantified
• Attract differing viewpoints as to values and weighting
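One way to picture the expanded formula is as a record of descriptive factors rather than a single number, since most of the factors can be described but not quantified. The field names and example values below are hypothetical illustrations, not part of the original presentation:

```python
# Sketch of the expanded view of risk: a record of descriptive factors
# rather than a single Impact x Probability product. All field values
# here are hypothetical, illustrative labels.
from dataclasses import dataclass

@dataclass
class RiskProfile:
    impact: str                   # e.g., "significant damage"
    probability: str              # e.g., "rare"
    credibility: str              # e.g., "credible"
    resources: str                # e.g., "premises"
    scale: str                    # e.g., "most of the resource affected"
    duration: str                 # e.g., "days"
    mean_time_to_repair: str      # e.g., "weeks"
    mean_time_to_recurrence: str  # e.g., "years"

flood = RiskProfile(
    impact="significant damage", probability="rare", credibility="credible",
    resources="premises", scale="most of the resource affected",
    duration="days", mean_time_to_repair="weeks",
    mean_time_to_recurrence="years",
)
print(flood.credibility)  # -> credible
```

The point of the structure is that no arithmetic is attempted: the factors attract differing viewpoints and are weighed by people, not multiplied by a formula.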
A Fresh Approach to Risk Measurement
Overview of the Approach
• Measure the effect on critical resources, not the threats to them
• Categorize the impacts
• Scale the categories
• Determine the credibility of each level of risk
• Consider frequency of occurrence
In each step, there are variables to consider for each risk being measured, e.g.:
– Availability of people, premises, information, networks, raw materials
– Total loss, significant damage, moderate damage, minimal damage
– Credible or not credible
– Often, occasionally, rare
Examples of risks that would not fit into the “classic” formula:
– Office facilities intact but not accessible
– VOIP telephony if the Internet is down
– Loss of all personnel vs. loss of a few people
– Not credible that all personnel are lost, but credible to lose some
– Bad weather often, terrorism rarely
Technique #1 – Focus on Resources
• Measure the effect on critical resources, not the threats to them
– The set of causes is infinite and unknowable
– The set of resources is finite and known, e.g.,
• Working premises
• Human resources
• Data
• Equipment
• Information systems
• Voice and data networks
• Raw materials
– Thus the measurement of risk is the consequential effect of disruption of these resources
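Because the resource set is finite and known, it can simply be enumerated. A minimal sketch of Technique #1; the function name and effect labels are hypothetical:

```python
# Sketch of Technique #1: enumerate the finite, known set of critical
# resources and record the consequential effect of disrupting each one,
# instead of cataloguing an unbounded set of threats.
RESOURCES = [
    "working premises", "human resources", "data", "equipment",
    "information systems", "voice and data networks", "raw materials",
]

def assess(effects):
    """Return an effect rating for every resource, defaulting to 'none'."""
    return {r: effects.get(r, "none") for r in RESOURCES}

report = assess({"working premises": "inaccessible", "data": "destroyed"})
print(report["data"])       # -> destroyed
print(report["equipment"])  # -> none
```

Note that the assessment is complete by construction: every resource gets a rating, whereas a threat catalogue can never be complete.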
Technique #2 - Categorization
• Destruction (the resource no longer exists) – Consider the smoking hole
• Inaccessibility (the resource exists but we cannot get to it) – Consider offices on the fiftieth floor when the elevator does not work
• Unavailability (the resource exists but is rendered inoperable) – Consider hacks that stop Internet web sites
• Unusability (the resource exists but it is malfunctioning) – Consider a VOIP telephone system if Internet connectivity is lost
• Incapacity (the resource exists and functions as expected, but not at a sufficient level) – This usually occurs at a gradual pace, but consider a computer virus that slows a network to a crawl
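The five categories above can be captured as a small enumeration, so every disruption is measured by its effect rather than its cause. A minimal sketch:

```python
# Sketch of Technique #2: the five impact categories as an enumeration.
# The value strings paraphrase the definitions given in the slides.
from enum import Enum

class ImpactCategory(Enum):
    DESTRUCTION = "resource no longer exists"
    INACCESSIBILITY = "resource exists but cannot be reached"
    UNAVAILABILITY = "resource exists but is rendered inoperable"
    UNUSABILITY = "resource exists but is malfunctioning"
    INCAPACITY = "resource functions, but not at a sufficient level"

# e.g., a VOIP phone system with no Internet connectivity:
print(ImpactCategory.UNUSABILITY.value)
```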
Technique #2 – Categorization, continued
• There are other categories that might apply in specific circumstances
• Not all categories apply to all resources
– “Unusable” people?
Technique #3 - Scale
• Each of the impact categories might occur at different levels, e.g.,
– Total loss (i.e., worst case)
– Most of the resource affected
– Some of the resource affected
– Unit damage
– Inconsequential effect
• Each of these presents its own distinct risk profile
Technique #4 - Credibility
• Some risks exist but need not be taken seriously, in context
– If a risk is credible, then some response is required
• If only risk acceptance
• The test of credibility is entirely subjective, based on the perspective of the observer
– Multiple observers might provide a better measurement
– Fuzzy but correct sets of data points are better than precisely wrong ones
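Pooling several observers can be sketched as a simple vote: a risk is treated as credible, and therefore requires some response (if only acceptance), when most observers call it credible. The majority threshold is a hypothetical choice, not prescribed by the slides:

```python
# Sketch of Technique #4: credibility is subjective, so pool the
# judgments of several observers. The majority-vote rule is a
# hypothetical aggregation choice.
def is_credible(votes):
    """votes: one True/False judgment per observer."""
    return sum(votes) > len(votes) / 2

# Three observers on "loss of some personnel":
print(is_credible([True, True, False]))   # -> True
# Three observers on "loss of all personnel":
print(is_credible([False, False, True]))  # -> False
```

A fuzzy but correct aggregate like this is exactly the “better than precisely wrong” measurement the slide describes.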
Technique #5 - Frequency
• Related to, but not the same as, probability
• Enables the distinction between high frequency-low impact and low frequency-high impact events
• Fuzzy terminology is helpful in distinguishing levels of risk, e.g.
– Routine
– Frequent
– Sometimes
– Rare
– Never
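Mapping an observed occurrence rate onto those fuzzy terms might look like the sketch below; the numeric cutoffs are hypothetical, since the whole point of the vocabulary is to avoid false numeric precision:

```python
# Sketch of Technique #5: map an occurrence rate onto fuzzy frequency
# terms. The cutoffs are hypothetical; the vocabulary is what separates
# high-frequency/low-impact from low-frequency/high-impact events.
def frequency_term(events_per_year):
    if events_per_year >= 12:
        return "routine"
    if events_per_year >= 1:
        return "frequent"
    if events_per_year >= 0.1:
        return "sometimes"
    if events_per_year > 0:
        return "rare"
    return "never"

print(frequency_term(26))    # -> routine  (e.g., bad weather)
print(frequency_term(0.01))  # -> rare     (e.g., terrorism)
```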
The Theory Behind the Techniques
Risk is Not an Absolute
• Risk measurement depends on
– Who is doing the measuring
– What is at risk
– To what degree of accuracy
– Within which bounds
• Let’s do an experiment!
Accuracy and Precision
• The goal of risk measurement should be accuracy, forsaking precision
• Fuzzy mathematics enables this
– A methodology for systematically handling concepts that embody imprecision and vagueness
Fuzzy Sets and Systems
• The mathematics of fuzzy set theory was originated by L. A. Zadeh in 1965
• Fuzziness describes objects or processes that are not amenable to precise definition or precise measurement
• Fuzzy systems
– Processes that are too complex to be modeled by using conventional mathematical methods
– Vaguely defined, with some uncertainty in their description
– The uncertainty and fuzziness arise from interrelated humanistic types of phenomena
• Fuzziness in thinking and reasoning processes is an asset, since it makes it possible to convey a large amount of information with very few words
– Uncertainty characterized by structures that lack sharp (well-defined) boundaries
– A modeling link between the human reasoning process, which is vague, and computers, which accept only precise data
An Example of Fuzziness
• Conventionally, we might say that temperature is an absolute (e.g., 20°, 30° … 100°)
• But we do not perceive temperature that way
– Rather as very cold, cold, moderate, hot, etc.
• The determination of the temperature is subjective, with varying degrees of certainty
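A minimal sketch of that fuzzy perception: each reading has a degree of membership in several overlapping sets rather than one absolute label. The membership values below are hypothetical; note that each row sums to 1:

```python
# Sketch of fuzzy temperature perception (after Zadeh's fuzzy sets):
# each reading belongs, to some degree, to several overlapping sets.
# The membership degrees are hypothetical; each row sums to 1.
MEMBERSHIP = {  # deg F -> {fuzzy set: degree of membership}
    40: {"very cold": 0.1, "cold": 0.6, "cool": 0.3, "moderate": 0.0},
    55: {"very cold": 0.0, "cold": 0.1, "cool": 0.6, "moderate": 0.3},
    70: {"very cold": 0.0, "cold": 0.0, "cool": 0.2, "moderate": 0.8},
}

for temp, degrees in MEMBERSHIP.items():
    assert abs(sum(degrees.values()) - 1.0) < 1e-9   # rows sum to 1
    print(temp, max(degrees, key=degrees.get))       # most plausible label
```

This is the structure behind the membership table on the next slide: subjective labels, varying degrees of certainty, rows summing to 100%.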
An Example of Fuzziness, continued
• Note that all the rows add up to 1 (or 100%)
[Table: fuzzy membership of temperatures in Fahrenheit in the sets Very cold, Cold, Cool, Moderate, Warm, Hot; each row of membership degrees sums to 1]
Risk Factors vs. Risk Scales (for the resources People, Equipment, Data, Network):
– Destruction: Not credible / Not credible / Rare / Infrequent / Does not apply
– Inaccessibility: Not credible / Rare / Infrequent / Infrequent / Does not apply
– Unavailability: Rare / Rare / Sometimes / Frequent / Does not apply
– Unusability: Does not apply
– Incapacity: Does not apply
Impact and Ranking
• This is an example of a risk assessment derived from fuzzy risk measurement
• What does this tell us?
– Insufficient recoverability established for Company X’s data center and equipment
– After that, their worst cases are inaccessibility and unavailability, not destruction of other resources
[Table: risk assessment for Company X, Risk Factors vs. Impact. Columns: Catastrophic, Significant, Some, Minor, Inconsequential. Rows: Destruction, Inaccessibility, Unavailability, Unusability, and Incapacity for People and for Data; Destruction, Inaccessibility, and Unavailability for Equipment and for Network. Each row carries a single X marking its assessed impact level.]
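Reading such an assessment can also be sketched in code: for each resource, find the impact category that carries the worst rating. The ratings below are hypothetical stand-ins for the table’s X marks, not Company X’s actual values:

```python
# Sketch of interpreting a fuzzy risk assessment: for each resource,
# find which impact category carries the worst rating. Ratings are
# hypothetical stand-ins for the table's X marks.
LEVELS = ["inconsequential", "minor", "some", "significant", "catastrophic"]

assessment = {
    "equipment": {"destruction": "minor", "inaccessibility": "significant",
                  "unavailability": "catastrophic"},
    "data":      {"destruction": "catastrophic", "inaccessibility": "some",
                  "unavailability": "significant"},
}

for resource, ratings in assessment.items():
    worst = max(ratings, key=lambda c: LEVELS.index(ratings[c]))
    print(resource, "->", worst)
# equipment -> unavailability
# data -> destruction
```

This is the kind of conclusion the slide draws: the worst case per resource need not be destruction at all.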
Thank You
“What you don’t know is far more important than what you do know”