Past Applications, Lessons Learned, Current Thinking
Levi Brekke (Reclamation, Research & Development Office)
NCPP Quantitative Evaluation of Downscaling Workshop, Boulder, CO
Panel discussion: "Using downscaled data in the real world: Sharing experiences, Part II", 15 August 2013
Traditional climate context in planning

I. Choose Climate Context: Instrumental Records: observed weather (T and P) and runoff (Q)
II. Relate to Planning Assumptions: Supply Variability, Demand Variability, Operating Constraints
III. Conduct Planning Evaluations: System Analysis, Evaluate Study Questions (related to Resource Management Objectives)
I. Decision-Makers: “Keep it simple.”
II. Climate Information Providers: “Here’s the info… use it wisely.”
III. Technical Practitioners (Ushers): “Keep it Manageable.”
1) Survey future climate information over the study region
2.a) Decide whether to cull information, and how…
2.b) Decide how to use retained information…
3) Assess climate change impacts on planning assumptions (e.g., supplies, demands, and/or water management constraints)
4) Assess operations and dependent resource responses; characterize uncertainties (analyses of various responses)
Focusing on CA, Brekke et al. (2008) considered "historical" simulations from 17 GCMs, and found similar skill when enough metrics were considered. Focusing globally, Gleckler et al. (2008) and Reichler et al. (2008) found similar results.

Focusing on CA, projection distributions didn't change much when the GCM-skill assessment (Brekke et al. 2008) was used to reduce the set of 17 GCMs to a "better" set of 9 GCMs.

Santer et al. (PNAS, 2009): results from a global water vapor detection and attribution (D&A) study were largely insensitive to skill-based model weighting. Pierce et al. (PNAS, 2009): results from a western U.S. D&A study were more sensitive to ensemble size than to skill-based model weighting.
Box 2.b) Two Method Classes (generally speaking)

- Period-Change
  - prevalent among impacts studies
  - "perturbed historical" at some milestone future
- Transient
  - time-evolving view, from past to future
  - prevalent in the climate science community ("climate projections")
1) Used UW CIG HB2860 scenarios (Period-Change + Transient)
2) Selected a smaller set of both scenario types

The Period-Change type was of most interest. The goal was to select a set that spans the rest:
- LW = less warming, MW = more warming
- D = drier, W = wetter
- C = central, MC = minimal change

Scenarios were selected for big-basin change… sub-basin changes didn't always reflect the big-basin scenarios (e.g., the Upper Snake is wetter in 5 of 6 scenarios).
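One common way to build such a spanning set is to pick, from an ensemble of projected changes, the members nearest chosen percentile targets of temperature and precipitation change. The sketch below is illustrative only (the ensemble values, percentile choices, and `nearest` helper are assumptions, not from the presentation):

```python
import numpy as np

# Hypothetical ensemble of projected changes: delta-T (degC) and delta-P (%)
rng = np.random.default_rng(1)
dT = rng.normal(2.0, 0.8, 100)
dP = rng.normal(0.0, 8.0, 100)

def nearest(target_t, target_p, dT, dP):
    """Index of the ensemble member closest to a (dT, dP) target,
    with both axes standardized so they contribute comparably."""
    zt = (dT - dT.mean()) / dT.std()
    zp = (dP - dP.mean()) / dP.std()
    tt = (target_t - dT.mean()) / dT.std()
    tp = (target_p - dP.mean()) / dP.std()
    return int(np.argmin((zt - tt) ** 2 + (zp - tp) ** 2))

# Bracketing scenarios: LW/MW = 10th/90th percentile warming,
# D/W = 10th/90th percentile precipitation change, C = central (medians)
scenarios = {
    "LW/D": nearest(np.percentile(dT, 10), np.percentile(dP, 10), dT, dP),
    "LW/W": nearest(np.percentile(dT, 10), np.percentile(dP, 90), dT, dP),
    "MW/D": nearest(np.percentile(dT, 90), np.percentile(dP, 10), dT, dP),
    "MW/W": nearest(np.percentile(dT, 90), np.percentile(dP, 90), dT, dP),
    "C":    nearest(np.percentile(dT, 50), np.percentile(dP, 50), dT, dP),
}
```

As the slide notes, a set chosen to span big-basin change offers no guarantee that sub-basin changes span the same range.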
Period-Change: Overview
- Historical climate variability sequence is retained (space and time)
- "Climate Change" scenarios are defined for perturbing the historical record, where a change is diagnosed from a historical period to a future period
- Studies typically feature a Historical scenario and multiple climate change scenarios in order to reveal impacts uncertainty
- Several methods are available to define scenarios, differing by:
  - time (e.g., change in means, change in distributions)
  - space (e.g., change in regional condition, or change in spatially disaggregated conditions), and
  - amount of information (e.g., single climate projection, or many projections)
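In its simplest form (change in monthly means), the period-change idea can be sketched as follows. This is a minimal illustration, not the presentation's implementation; the series are synthetic and the multiplicative-delta choice is an assumption:

```python
import numpy as np

# Synthetic monthly precipitation series (30 years each), for illustration only
rng = np.random.default_rng(0)
obs = 50 + 20 * rng.random(30 * 12)       # observed record
gcm_hist = 45 + 20 * rng.random(30 * 12)  # GCM historical period
gcm_fut = 40 + 20 * rng.random(30 * 12)   # GCM future period

def monthly_deltas(hist, fut, multiplicative=True):
    """Diagnose month-of-year change factors from a historical
    period to a future period (the 'Climate Change' scenario)."""
    h = hist.reshape(-1, 12).mean(axis=0)  # 12 historical monthly means
    f = fut.reshape(-1, 12).mean(axis=0)   # 12 future monthly means
    return f / h if multiplicative else f - h

deltas = monthly_deltas(gcm_hist, gcm_fut)

# Perturb the observed sequence: the historical variability sequence
# is retained; only the monthly means shift by the diagnosed change.
n_years = obs.size // 12
perturbed = (obs.reshape(n_years, 12) * deltas).ravel()
```

The perturbed series keeps the observed year-to-year sequence exactly, which is the method's main appeal and, per the Cons below, also its main limitation.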
Period-Change: Pros and Cons
- Pros:
  - Retains familiar historical variability patterns
  - Simple frame for exploring system sensitivity
  - Permits "cautious" sampling of temporal aspects from climate projections (e.g., can be simple like change in annual mean, or complex like change in monthly distribution)
- Cons:
  - Less ideal for adaptation planning; climate change timing matters
  - Diagnosing period "Climate Change" is not obvious (more of a problem for DP than for DT)
  - (when single projections inform climate change scenarios) month-to-month changes may seem disorderly or noisy
Transient: Overview
- Historical climate variability sequence is not retained (but the distribution may be retained through climate projection bias-correction…)
- "Climate" projections are selected to define an evolving envelope of climate possibility, representing simulated past to projected future
  - Monthly or daily time series projections are typically used
- Climate projections may be developed using various methods, e.g.:
  - time series outputs from a GCM simulation (or a GCM-RCM simulation)
  - … bias-corrected and spatially downscaled translations of these outputs
  - … stochastically resampled (resequenced) versions of these outputs, reflecting a different frequency reference (observations, paleoproxies)
- Studies need to feature a large ensemble of climate projections to adequately portray an envelope of climate possibility through time
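The bias-correction mentioned above is often done by quantile mapping. A minimal empirical sketch (not the specific method used in the studies cited here; function name and interpolation choices are illustrative):

```python
import numpy as np

def quantile_map(model_hist, model_proj, obs):
    """Empirical quantile-mapping bias correction (sketch).

    Each projected value is mapped through the model-historical CDF,
    then back through the observed CDF, so the corrected series
    inherits the observed distribution while keeping the
    projection's own sequence."""
    # Percentile rank of each projected value within the
    # model-historical distribution
    ranks = np.searchsorted(np.sort(model_hist), model_proj) / model_hist.size
    ranks = np.clip(ranks, 0.0, 1.0)
    # Look up the same percentile in the observed distribution
    return np.quantile(obs, ranks)
```

For example, if the model runs uniformly 10 units too dry relative to observations, the mapping shifts projected values up by about 10 while preserving their order in time.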
Transient: Pros and Cons
- Pros:
  - Avoids challenges of "Climate Change" diagnosis
    - Not discussed here, but a key issue is "multi-decadal variability" in projections
  - Supports "master planning" for CC adaptation
    - schedule of adaptations through time, including project triggers
- Cons:
  - Projection historical sequences differ from experience
  - Requires "aggressive" sampling of temporal information from climate projections (frequencies vary by member, and may be questionable)
  - Information is more complex
    - Requires use of many projections, challenging analytical capacities and requiring probabilistic discussion of results evolving through time… requires a learning phase
Legacy climate context for planning assumptions in water resources studies

I. Choose Climate Context: Instrumental Records: observed weather (T and P) and runoff (Q)
II. Relate to Planning Assumptions: Supply Variability, Demand Variability, Operating Constraints
III. Conduct Planning Evaluations: System Analysis, Evaluate Study Questions (related to Resource Management Objectives)
We've developed ways to blend climate change information into this context.

I. Choose Climate Context: Global Climate Projections (representing various GCMs, forcings), plus Instrumental Records: observed weather (T and P) and runoff (Q)
II. Relate to Planning Assumptions:
  - bias-correction, spatial downscaling → Regional T and P → watershed simulation → Runoff (Supply Variability)
  - Global T and P → Sea Level Rise → Delta Flow-Salinity Relationship → Constraint on Upstream Operations
  - Regional T → Reservoir Operations → Stream Water Temperature analyses
  - Demand Variability; Operating Constraints
III. Conduct Planning Evaluations: Future Operations Portrayal for OCAP BA (flows, storage, deliveries, etc.)

e.g., Reclamation 2008, Mid-Pacific Region's Central Valley Project – Operations Criteria and Plan, Biological Assessment
When using projected climate, future climate & hydrology assumptions typically reflect a blend of observed and projected information.

I. Choose Climate Context: Instrumental Records: observed weather (T and P) and runoff (Q)
II. Relate to Planning Assumptions: Regional T and P → watershed simulation → Runoff (Supply Variability); Demand Variability; Operating Constraints
III. Conduct Planning Evaluations: System Analysis, Evaluate Study Questions (related to Resource Management Objectives)
Global Climate Projections: Representing various GCMs, emissions