9: EARTHQUAKE RECURRENCE
Crucial for hazards, earthquake physics & tectonics (seismic versus aseismic deformation)
Recordings of the east-west component of motion made by Galitzin instruments at De Bilt, the Netherlands. Recordings of the 1922 Parkfield earthquake (shown in black) and the 1934 and 1966 Parkfield events (shown in red) are strikingly similar, suggesting virtually identical ruptures.
EARTHQUAKE FREQUENCY - MAGNITUDE
LOG-LINEAR GUTENBERG-RICHTER RELATION: log10 N = a - bM
LEVEL OF ACTIVITY (a value) VARIES REGIONALLY, BUT b ~ 1
MOMENTS HAVE A SIMILAR CURVE TO MAGNITUDES, but with slope β ~ 2/3
WHY TOO FEW VERY LARGE EARTHQUAKES?
EXPECT β = 2/3; LARGE EVENTS SHOW β = 1
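A minimal sketch of the relation in Python; the a-value here is hypothetical, chosen only to give globally plausible counts (only b ~ 1 comes from the source):

```python
# Gutenberg-Richter: log10 N = a - b*M, N = annual number of events >= M.
a, b = 8.0, 1.0   # a varies regionally (value here is illustrative); b ~ 1

def annual_rate(M):
    """Annual number of earthquakes with magnitude >= M."""
    return 10 ** (a - b * M)

# Moment magnitude: Mw = (2/3) * (log10 M0 - 9.1), with M0 in N*m, so
# log10 N is linear in log10 M0 with slope -(2/3)*b: the 2/3 slope above.
for M in (5, 6, 7, 8):
    M0 = 10 ** (1.5 * M + 9.1)
    print(f"M{M}: ~{annual_rate(M):.0f} events/yr, M0 ~ {M0:.1e} N*m")
```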
Most earthquakes plot between the solid lines of slope 1/3, showing M0 proportional to L3. However, large strike-slip earthquakes (solid diamonds) have moments higher than expected for their fault lengths, because above a certain moment the fault width reaches its maximum, so the fault grows only in length.
Romanowicz, 1992
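A toy version of this scaling argument, with illustrative values for rigidity, maximum seismogenic width, and the slip-to-length ratio (none of these numbers are from the source):

```python
# Seismic moment M0 = mu * L * W * D (rigidity x rupture area x mean slip).
# If slip and width both scale with length, M0 ~ L^3; once W saturates,
# further growth in L alone gives a lower exponent.
mu = 3e10      # rigidity, Pa (typical crustal value)
W_max = 15e3   # assumed maximum seismogenic width, m

def moment(L, alpha=5e-5):
    """M0 for rupture length L (m); mean slip assumed proportional to L."""
    W = min(L, W_max)   # width saturates at W_max
    D = alpha * L       # mean slip grows with length
    return mu * L * W * D

for L_km in (5, 15, 50, 200):
    print(f"L = {L_km:4d} km  ->  M0 ~ {moment(L_km * 1e3):.1e} N*m")
```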
Total global seismic moment release dominated by few largest events
Total moment for 1976-1998 ~1/3 that of giant 1960 Chilean earthquake
[Figure panels: Global Earthquakes | Continental Intraplate]
Stein & Wysession, 2003; Triep & Sykes, 1997
CHALLENGE: INFER UNKNOWN RATE OF LARGEST EARTHQUAKES FROM RECORDED RATE OF SMALLER ONES
Use standard log-linear Gutenberg-Richter relationship
With seismological data only, log-linear relation breaks down
Largest earthquakes (M > 7-7.5) are less frequent than expected, presumably due to fault finiteness (large event lengths >> width)
[Figure: number of earthquakes per year versus magnitude (Ms)]
GUTENBERG-RICHTER RELATIONSHIP: INDIVIDUAL FAULTS
Wasatch: paleoseismic data + instrumental data (Youngs & Coppersmith, 1985)
Basel, Switzerland: paleoseismic data + historical data (Meghraoui et al., 2001)
Largest events deviate in either direction, often when different data types disagree.
When large events are more frequent than expected, they are termed characteristic earthquakes; when less frequent, uncharacteristic earthquakes.
Could these differences - at least in some cases - be artifacts?
[Figure panels: Characteristic | Uncharacteristic]
EARTHQUAKE RECURRENCE IS HIGHLY VARIABLE
M>7 events: mean recurrence 132 yr, standard deviation 105 yr. Estimated probability in next 30 yrs: 7-51%.
Sieh et al., 1989
Extend earthquake history with paleoseismology
POSSIBLE ARTIFACTS CAUSING SPURIOUS CHARACTERISTIC OR UNCHARACTERISTIC EARTHQUAKES
[Figure: earthquake rate versus magnitude]
Undersampling: when the record is comparable to or shorter than the mean recurrence, most events appear characteristic, because a record cannot contain a fraction of an earthquake. Short records can also miss the largest events entirely, an effect that persists for somewhat longer records.
Direct paleoseismic study:
If magnitude is overestimated, events appear characteristic.
If events are missed, recurrence is overestimated and events appear uncharacteristic.
Indirect paleoseismic estimates using assumed geologic slip & earthquake size:
If the long-term slip rate is overestimated or aseismic slip is unaccounted for, events appear characteristic.
LONG RECORDS SHOW RECURRENCE VARIABILITY
σ ~ 0.4 Tav or higher seems a reasonable description of the variability
Log-normal with σ ~ 0.2 Tav (Nishenko & Buland, 1987) can underestimate it
SIMULATIONS
10,000 synthetic earthquake histories for a G-R relation with slope b = 1
Gaussian recurrence times for M > 5, 6, 7
Various history lengths, given in terms of Tav, the mean recurrence time for M > 7
For histories ~0.5 Tav, any M7 earthquakes recorded appear characteristic.
No uncharacteristic ones appear, since a record cannot contain fractions of events.
Such short histories often miss the largest earthquakes entirely (no M7 observed).
SHORT SIMULATIONS
For histories of 1-2 Tav, many earthquakes appear characteristic
By 2 Tav, about as many events appear uncharacteristic as characteristic
LONGER SIMULATIONS
Distributions about Tav tighten up
No bias: as many uncharacteristic as characteristic events
Still likely to over- or underestimate the rate of large events
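A minimal sketch of such a simulation, for the M>7 events alone (Tav, σ, and the history counts here are illustrative; the full experiment also tracks M>5 and M>6 to compare apparent rates across magnitudes):

```python
import numpy as np

rng = np.random.default_rng(0)

def count_m7(length_yr, t_av=1000.0, sigma_frac=0.4):
    """Number of M>=7 events in a catalog of given length (years), with
    Gaussian recurrence times (mean t_av, std sigma_frac * t_av)."""
    t, n = rng.uniform(0.0, t_av), 0   # random phase within a cycle
    while t < length_yr:
        n += 1
        t += max(rng.normal(t_av, sigma_frac * t_av), 1.0)
    return n

t_av = 1000.0
for frac in (0.5, 1.0, 2.0, 10.0):
    counts = np.array([count_m7(frac * t_av) for _ in range(10_000)])
    print(f"history = {frac:4} * Tav: "
          f"fraction with no M7 = {np.mean(counts == 0):.2f}, "
          f"apparent rate = {counts.mean() / (frac * t_av):.2e}/yr")
```

Short histories frequently contain no M7 at all, and the apparent rate scatters widely even when the long history removes the bias.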
CHARACTERISTIC EARTHQUAKE RESULTS VARY WITH SPATIAL SAMPLING
Characteristic earthquakes on Wasatch fault (Chang and Smith, 2002), but not in entire Wasatch front (data from Pechmann and Arabasz, 1995)
ESTIMATING EARTHQUAKE PROBABILITIES
A game of chance, with unknown rules, and very little data from which to infer them
CHALLENGE: DON’T KNOW WHAT PROBABILITY DISTRIBUTION DESCRIBES EARTHQUAKE RECURRENCE TIMES
POISSON DISTRIBUTION
TIME-INDEPENDENT MODEL OF EARTHQUAKE PROBABILITY
Used to describe rare events; examples include volcanic eruptions, radioactive decay, and the number of Prussian soldiers killed by their horses.
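For the time-independent model, the probability of at least one event in the next t years is 1 - exp(-t/Tav). A minimal check, using the 132 yr mean recurrence quoted earlier:

```python
import math

def poisson_prob(t, t_av):
    """Time-independent probability of at least one event in the next
    t years, for mean recurrence t_av: 1 - exp(-t / t_av)."""
    return 1.0 - math.exp(-t / t_av)

print(f"{poisson_prob(30, 132):.0%}")   # ~20%, within the 7-51% range above
```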
TIME-INDEPENDENT VERSUS TIME-DEPENDENT MODEL
GAUSSIAN DISTRIBUTION
TIME-DEPENDENT MODEL OF EARTHQUAKE PROBABILITY
Probability of a large earthquake at time t after the past one is p(t, μ, σ).
Depends on the average and variability of recurrence times, described by the mean μ and standard deviation σ.
p is the probability that the recurrence time for this earthquake will be t, given an assumed distribution of recurrence times.
CONDITIONAL PROBABILITY
Use the fact that we know the next earthquake hasn't already happened
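A sketch of the conditional calculation for a Gaussian model. The 132 yr mean and 105 yr standard deviation are the Pallett Creek values quoted earlier; note that a real study would truncate the Gaussian at zero or use a log-normal, so this is illustrative only:

```python
from scipy.stats import norm

def conditional_prob(elapsed, dt, mu, sigma):
    """P(event within the next dt years | elapsed years since the last),
    for Gaussian recurrence times with mean mu and std sigma."""
    F = norm(mu, sigma).cdf
    return (F(elapsed + dt) - F(elapsed)) / (1.0 - F(elapsed))

# Probability rises as the time since the last event grows:
for elapsed in (50, 100, 150):
    p = conditional_prob(elapsed, 30, 132, 105)
    print(f"{elapsed} yr elapsed: {p:.0%} chance in next 30 yr")
```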
SAN ANDREAS FAULT PALLETT CREEK SEGMENT
Gaussian (time dependent) model
In 1983, the model estimated a 9% probability of an earthquake by 2003; the probability increases with time
SAN ANDREAS FAULT PALLETT CREEK SEGMENT
Poisson (time independent) model
In 1983, the model estimated a 10% probability of an earthquake by 2003; the probability is constant with time
SYNTHETIC EARTHQUAKE HISTORIES
Gaussian model yields more periodic series; Poisson model yields clustering
Which looks more like earthquake history?
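A minimal way to generate the two kinds of histories (the mean recurrence and σ = 0.3 Tav are illustrative): Gaussian intervals give quasi-periodic event times; exponential intervals (a Poisson process) give clusters and long gaps.

```python
import numpy as np

rng = np.random.default_rng(1)
t_av, n = 100.0, 12   # illustrative mean recurrence (yr) and event count

# Gaussian (time-dependent): quasi-periodic intervals
gaussian = np.cumsum(np.clip(rng.normal(t_av, 0.3 * t_av, n), 0.0, None))
# Poisson (time-independent): exponential intervals -> clustering
poisson = np.cumsum(rng.exponential(t_av, n))

print("Gaussian event times:", np.round(gaussian).astype(int))
print("Poisson  event times:", np.round(poisson).astype(int))
```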
SEISMIC GAP MODEL
A long plate boundary, like the San Andreas or an oceanic trench, ruptures in segments
Expect steady plate motion to cause earthquakes that fill in gaps: segments that have not ruptured for a long time
A gap exists when it has been long enough since the last major earthquake that time-dependent models predict an earthquake probability much higher than expected from time-independent models
Sounds sensible, but seems not to work well, for unknown reasons
EARTHQUAKE FORECASTS: EASY TO MAKE, HARD TO TEST
Hard to prove right or wrong
Because the estimates must be tested using data that were not used to derive them, hundreds or thousands of years (multiple recurrences) will be needed to assess how well various models predict large earthquakes on specific faults or fault segments.
The first challenge is to show that a model predicts future earthquakes significantly better than the simple time-independent Poissonian model
Given human impatience, attempts have been made to conduct alternative tests using smaller earthquakes or many faults over a short time interval.
To date, results are not encouraging.
RECENT SEISMICITY MAY NOT REFLECT LONG-TERM PATTERN WELL
Random seismicity simulation for a fault along which the probability of an earthquake is uniform
Apparent seismic gaps develop
They may take a long time to fill compared to the length of the earthquake record
Stein & Wysession, 2003
PARKFIELD, CALIFORNIA SEGMENT OF SAN ANDREAS
Characterized by smaller earthquakes that occur more frequently and appear much more periodic than other segments.
Earthquakes of M 5-6 occurred in 1857, 1881, 1901, 1922, 1934, and 1966.
Average recurrence is 22 yr; linear fit made 1988 likely date of the next event.
In 1985, it was predicted at the 95% confidence level that the next earthquake would occur by 1993.
It actually didn't occur until 2004 (16 years after the predicted 1988 date).
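The arithmetic behind the 22 yr average and the linear fit can be reconstructed roughly as follows (a sketch, not the published analysis):

```python
import numpy as np

dates = np.array([1857, 1881, 1901, 1922, 1934, 1966])

# Mean interval: 109 yr / 5 intervals ~ 21.8 yr -> 1966 + 22 ~ 1988
print((dates[-1] - dates[0]) / (len(dates) - 1))

# Linear fit of date against event number. With the observed 1934 date the
# fit extrapolates to ~1983; treating 1934 as a premature 1944 event (see
# "Problems" below) moves the extrapolation to ~1987-88.
for d in (dates, np.array([1857, 1881, 1901, 1922, 1944, 1966])):
    slope, intercept = np.polyfit(np.arange(1, 7), d, 1)
    print(round(slope * 7 + intercept))
```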
Problems:
Limitations of the statistical approach to prediction (including omission of the 1934 earthquake on the grounds that it was premature and should have occurred in 1944)
Unclear whether Parkfield shows such unusual quasi-periodicity because it differs from other parts of the San Andreas (in which case predicting earthquakes there might not help elsewhere), or simply because, given enough time and fault segments, random seismicity can yield apparent periodicity somewhere
GLOBAL TEST OF SEISMIC GAP HYPOTHESIS
Within 10 years of the prediction, 10 large events occurred in these areas. None were in high- or intermediate-risk areas; 5 were in low-risk areas.
A gap map forecasting the locations of major earthquakes did no better than random guessing.
Many more large earthquakes occurred in areas identified as low risk than in the presumed higher-risk gaps
Result appears inconsistent with ideas of earthquake cycles and seismic gaps
Kagan & Jackson, 1991
EARTHQUAKE PROBABILITY MAPS
Hard to assess utility of such maps for many years
Major uncertainties involved
Perhaps only meaningful to quote probabilities in broad ranges, such as low (<10%), intermediate (10-90%), or high (>90%).
EARTHQUAKE PREDICTION?
Because little is known about the fundamental physics of faulting, many attempts to predict earthquakes have searched for precursors: observable behavior that precedes earthquakes. To date, this search has proved generally unsuccessful.
In one hypothesis, all earthquakes start off as tiny earthquakes, which happen frequently, but only a few cascade via a random failure process into large earthquakes.
This hypothesis draws on ideas from nonlinear dynamics or chaos theory, in which small perturbations can grow to have unpredictably large consequences. These ideas were posed in terms of the possibility that the flap of a butterfly's wings in Brazil might set off a tornado in Texas, or in general that minuscule disturbances do not affect the overall frequency of storms but can modify when they occur.
If so, there is nothing special about those tiny earthquakes that happen to grow into large ones, the interval between large earthquakes is highly variable, and no observable precursors should occur before them. Earthquake prediction is then either impossible or nearly so.
“It’s hard to predict earthquakes, especially before they happen”
PROBABILISTIC SEISMIC HAZARD ASSESSMENT (PSHA)
Seeks to quantify hazard in terms of the maximum expected acceleration in some time period (2% or 10% probability in 50 yr, i.e., roughly once in 2500 or 500 yr)
Maps made by assuming:
Where and how often earthquakes will occur
How large they will be
How much ground motion they will produce
Because these factors are not well understood, especially on slow-moving plate boundaries or in intraplate regions where large earthquakes are rare, hazard estimates have considerable uncertainties, and it will be a long time before we know how well they have done
“A game of chance of which we still don't know all the rules"
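The two conventions are related by T = -t / ln(1 - p), assuming a time-independent model; a quick check:

```python
import math

def return_period(p_exceed, t_years):
    """Return period implied by exceedance probability p_exceed over
    t_years, for a time-independent (Poisson) model."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))   # ~475 yr  ("10% in 50 yr")
print(round(return_period(0.02, 50)))   # ~2475 yr ("2% in 50 yr")
```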
10% EXCEEDANCE PROBABILITY (90% NON-EXCEEDANCE) WITHIN 50 YEARS
Jimenez, Giardini, Grünthal (2003)
SHORT RECORD OF SEISMICITY & HAZARD ESTIMATE
Predicted hazard from historic seismicity is highly variable
Likely overestimated near recent earthquakes, underestimated elsewhere
More uniform hazard seems more plausible, or the opposite if time dependence is considered
Map changes after major earthquakes
Africa-Eurasia convergence rate varies smoothly
GSHAP
NUVEL-1 (Argus et al., 1989)
[Repeat of the preceding slide's maps (GSHAP 1998; NUVEL-1, Argus et al., 1989), now with the M>7 earthquakes of 2003 and 2004 marked.]
Canadian east coast: seismicity clusters
Seismic zone along the eastern coast of the Canadian & US passive continental margin
May reflect reactivation of rift faults from continental breakup, perhaps by deglaciation and/or other stresses
Largest events (M ~7) in Baffin Bay and on the Grand Banks
Are these concentrations a real phenomenon, or artifacts of the short seismic record?
The issue is important for both passive margin tectonics and earthquake hazard
Stein et al., 1979
Years                  100   500   1000   3000   5000   8000
Number of events         3    12     24     72    117    187
Recurrence time (yr)    33    42     42     42     43     43
Stein et al., 1979
Synthetic M>7 earthquake history [figure: event locations along the margin; x-axis distance 0-7000 km]
CLUSTERS COULD EASILY BE ARTIFACT OF SHORT RECORD
LONG-TERM SEISMICITY & HENCE HAZARD COULD BE UNIFORM
HUNGARY: ALTERNATIVE HAZARD MAPS
Peak ground acceleration, 10% probability of exceedance in 50 years
GSHAP (1999): historic seismicity
Present study (Toth et al., 2004): seismicity + geology, diffuse hazard
EASTERN US versus CANADA: ALTERNATIVE HAZARD MAPS
Historic seismicity
Diffuse Hazard
Halchuk and Adams, 1999
IS NEW MADRID AS HAZARDOUS AS CALIFORNIA?
Frankel et al., 1996
Proposed new building code would require California standards
EFFECTS OF ASSUMED GROUND MOTION MODEL
Effect as large as one magnitude unit
Frankel model, developed for maps, predicts significantly greater shaking for M >7
Frankel M 7 similar to other models’ M 8
Frankel & Toro models averaged in 1996 maps; Atkinson & Boore not used
Newman et al., 2001
UNCERTAINTIES IN NMSZ HAZARD MAPS
Areas of predicted significant hazard differ significantly, depending on poorly known parameters.
Assumed Mmax on main fault has largest effect near fault.
Assumed ground motion model has regional effect, because it also influences predicted hazard from earthquakes off main fault.
Newman et al., 2001
UNCERTAINTIES IN NMSZ HAZARD MAPS
Areas of predicted significant hazard differ significantly, depending on poorly known parameters.
Differences have major policy implications (e.g. Memphis & St. Louis).
Uncertainties won't be resolved for hundreds to thousands of years
Uncertainties are dominated by systematic (epistemic) errors and hence likely underestimated
Newman et al., 2001
ASSUMED HAZARD DEPENDS ON TIME WINDOW
Over 100 years, California site much more likely to be shaken strongly than NMSZ one
Over 1000 years, some NMSZ sites shaken strongly a few times; many in California shaken many times
Short time relevant for buildings with 50-100 yr life
[Figure: areas shaken at MMI > VII, from a random seismicity simulation including seismicity & ground motion differences]
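The point can be illustrated with hypothetical annual rates of strong shaking (not values from the study):

```python
# Chance of at least one episode of strong shaking over different windows,
# for two hypothetical sites: a California-like high-rate site and an
# NMSZ-like low-rate site.
for label, rate in (("high-rate site", 1 / 100), ("low-rate site", 1 / 1000)):
    for window in (100, 1000):
        p = 1 - (1 - rate) ** window
        print(f"{label}: {p:.0%} chance of strong shaking in {window} yr")
```

Over 100 yr the high-rate site is far more likely to be shaken; over 1000 yr both sites are very likely to be shaken, though the high-rate site many more times.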
$100M seismic retrofit of the Memphis VA hospital, removing nine floors, bringing it to California standard
Does this make sense?
How can we help society decide?
THERE ARE NO UNIQUE OR CORRECT STRATEGIES, SO SOCIETY HAS TO MAKE TOUGH CHOICES.
Mitigating risks from earthquakes or other natural disasters involves economic & policy issues as well as the scientific one of estimating the hazard and the engineering one of designing safe structures.
SHOULD BUILDINGS IN MEMPHIS MEET CALIFORNIA STANDARD?
New building code IBC 2000, urged by FEMA, would raise to California level (~ UBC 4)
Essentially no analysis of costs & benefits of new code
[Figure: code level versus year; J. Tomasello]
The proposed new code results largely from redefining the hazard from the maximum shaking at a geographic point over 500 yr (10% in 50 yr) to the maximum every 2,500 yr (2% in 50 yr), a period much longer than the life of ordinary structures.
This definition allows the New Madrid hazard to appear similar to that in California, although the annual New Madrid hazard is much lower.
By a similar argument, in a very long (three-million-hand) poker game, the probabilities of getting at least one pair and at least one royal flush are comparable, although in a single hand the probability of a pair is ~43%, whereas that of a royal flush is far lower, ~1/650,000.
Betting on this argument would lose money in a game of ordinary duration.
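A quick check of the analogy, using the single-hand probabilities above:

```python
# Chance of at least one occurrence in n independent hands: 1 - (1 - p)^n.
p_pair = 0.43            # probability of a pair in one 5-card hand
p_royal = 1 / 650_000    # probability of a royal flush in one hand
n = 3_000_000

for name, p in (("pair", p_pair), ("royal flush", p_royal)):
    print(f"at least one {name} in {n:,} hands: {1 - (1 - p) ** n:.3f}")
```

Both probabilities approach 1 over three million hands, even though in any one hand they differ by five orders of magnitude.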
THOUGHT EXPERIMENT: TRADEOFF
Your department is about to build a new building.
The more seismic safety you want, the more it will cost.
You have to decide how much of the construction budget to put into safety. Spending more makes you better off in a future large earthquake. However, you're worse off in the intervening years, because that money isn't available for office and lab space, equipment, etc.
Deciding what to do involves cost-benefit analysis. You try to estimate the maximum shaking expected during the building's life, and the level of damage you will accept.
You consider a range of scenarios involving different costs for safety and different benefits in damage reduction.
You weigh these, accepting that your estimates for the future have considerable uncertainties, and somehow decide on a balance between cost and benefit.
THIS PROCESS, WHICH SOCIETY FACES IN PREPARING FOR EARTHQUAKES, ILLUSTRATES TWO PRINCIPLES:
“There's no free lunch”
Resources used for one goal aren't available for another, also desirable, one. In the public sector there are direct tradeoffs: funds spent strengthening schools aren't available to hire teachers, upgrading hospitals may mean covering fewer uninsured (~$1 K/yr each), and stronger bridges may mean hiring fewer police and fire fighters (~$50 K/yr each), etc.
“There's no such thing as other people's money”
Costs are ultimately borne by society as a whole. Imposing costs on the private sector affects everyone via reduced economic activity (a few percent cost increase may decide whether a building is built at all, or built elsewhere), job loss (or reduced growth), and the resulting reduction in tax revenue and thus social services.
INITIAL COST/BENEFIT ESTIMATES: MEMPHIS AREA
I: Present value: FEMA estimates annual earthquake loss at $17 million/yr, part of which would be eliminated by the new code; this is ~1% of annual construction costs ($2 B).
II: Life-of-building: Use the FEMA estimate to infer the annual fractional loss in building value from earthquakes. If that loss were halved by the new code, then over 50 yr the code saves ~1% of building value.
If the seismic mitigation cost increase for new buildings under IBC 2000 is >> 1%, the code probably wouldn't make sense.
Similar results would likely emerge from a more sophisticated study including variations in structures, the increase in earthquake resistance with time as more structures meet code, interest rates, retrofits, disruption costs, etc.
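A back-of-envelope version of the two estimates above; the annual fractional loss in II is assumed here, chosen only to be consistent with the slide's ~1% figure:

```python
# Estimate I: annual loss as a fraction of annual construction cost.
annual_loss = 17e6           # FEMA-estimated annual earthquake loss, $
annual_construction = 2e9    # annual construction cost, $
print(f"I: loss/construction ~ {annual_loss / annual_construction:.1%}")  # ~0.9%

# Estimate II: assume an annual fractional loss of building value of 4e-4
# (an illustrative value); halving it over a 50 yr building life saves:
print(f"II: ~{0.5 * 4e-4 * 50:.1%} of building value")   # ~1.0%
```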
LIFE SAFETY
U.S. earthquake risk primarily to property; annualized losses estimated at ~$4 billion.
Also ~10 deaths/yr, averaged over larger numbers in major earthquakes. Annual fatalities have been roughly constant since 1800, presumably in part because population growth in hazardous areas is offset by safer construction.
The situation could likely be maintained or improved by strengthening building codes, so the issue is how to balance this benefit with alternative uses of resources (flu shots, defibrillators, highway upgrades, etc.) that might save more lives for less.
Estimated cost to save a life (in the U.S.) varies across applications:
~$50 K: highway improvements
~$100 K: medical screening
~$5 M: auto tire-pressure sensors
Different strategies likely make sense in different areas within the U.S. and elsewhere, depending on earthquake risk, current building codes, and alternative demands for resources.
Hence seismic mitigation costs in the Memphis area - $20-200 M/yr (1-10% of new construction cost) plus any retrofits - could instead insure 20,000-200,000 people and save some lives that way.
Tricky tradeoff here
TAKE TIME TO GET THINGS RIGHT
Because major earthquakes in a given area are infrequent on a human timescale, we generally have time to formulate strategy carefully (no need to rush to a wrong answer)
Time can also help on both the cost and benefit sides.
As older buildings are replaced by ones meeting newer standards, overall earthquake resistance increases. Similarly, even where retrofitting existing buildings isn't cost-effective, higher standards for new ones may be.
Technological advances can make additional mitigation cheaper and more cost-effective.
If understanding of earthquake probabilities becomes sufficient to confidently identify how probabilities vary with time, construction standards could be adjusted accordingly where appropriate.
WE ARE NOT ALONE
There's increasing interest in making mitigation policy more rational for other hazards with considerable uncertainties.
"The direct costs of federal environmental, health, and safety regulations are probably ~$200 billion annually, about the size of all federal domestic, nondefense discretionary spending. The benefits of those regulations are even less certain. Evidence suggests that some recent regulations would pass a benefit-cost test while others would not."
(Brookings Institution & American Enterprise Institute)
Viewing seismology and engineering as part of a holistic approach to hazards mitigation will make our contributions more useful to society.
This utility will grow as we learn more about earthquakes and their effects in different areas.