Uncertainty and sensitivity analysis – model and measurements
Marian Scott, Ron Smith and Clive Anderson
University of Glasgow / CEH / University of Sheffield
Glasgow, Sept 2006
Page 1

Uncertainty and sensitivity analysis – model and measurements

Marian Scott, Ron Smith and Clive Anderson
University of Glasgow / CEH / University of Sheffield

Glasgow, Sept 2006

Page 2

Outline of presentation

– Errors and uncertainties on measurements
– Sensitivity and uncertainty analysis of models
– Quantifying and apportioning variation in model and data
– A Bayesian approach
– Some general comments

Page 3

Uncertainties on measurement

Page 4

The nature of measurement

– All measurement is subject to uncertainty.
– Analytical uncertainty reflects the fact that every time a measurement is made (under identical conditions), the result is different.
– Sampling uncertainty represents the 'natural' variation in the organism within the environment.

Page 5

The error and uncertainty in a measurement

The error is a single value, which represents the difference between the measured value and the true value

The uncertainty is a range of values, and describes the errors which might have been observed were the measurement repeated under IDENTICAL conditions

Error (and uncertainty) includes a combination of variance and bias

Page 6

Key properties of any measurement

Accuracy refers to the deviation of the measurement from the ‘true’ value (bias)

Precision refers to the variation in a series of replicate measurements (obtained under identical conditions) (variance)

Page 7

Accuracy and precision

[Figure: target diagrams contrasting accurate vs inaccurate and precise vs imprecise measurements]

Page 8

Evaluation of accuracy

In an inter-laboratory study, known-age material is used to define the ‘true’ age

The figure shows a measure of accuracy for individual laboratories

Accuracy is linked to Bias

[Figure: offset (years BP) plotted against laboratory identifier, showing each laboratory's deviation from the known age]

Page 9

Evaluation of precision

– Analysis of the instrumentation/method used to make a single measurement, and the propagation of any errors (theory).
– Repeat measurements (true replicates) using homogeneous material, repeated subsampling, etc. (experimental).

Precision is linked to variance (standard deviation)

Page 10

The uncertainty range

For a measurement of 4509 years with a quoted error (1 sigma) of 20 years, the measurement uncertainty at 2 sigma would be 4509 ± 40 years, or 4469 to 4549 years. We would say that the true age is highly likely to lie within the uncertainty range (95% confidence).
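As a quick check of the arithmetic, a minimal Python sketch using the numbers on the slide:

```python
measured, sigma = 4509, 20            # age (years) and quoted 1-sigma error

half_width = 2 * sigma                # 2-sigma half-width: 40 years
low, high = measured - half_width, measured + half_width
print(f"{measured} +/- {half_width} years, i.e. {low} to {high}")  # 4469 to 4549
```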

Page 11

The uncertainty range on the mean

From a series of 27 replicate measurements made in a single laboratory over a period of several months: the average age of the series is 4497 years and the standard deviation of the series is 30.2 years. The error on the mean is therefore 30.2/√27, or about 6 years, so the uncertainty (at 2 sigma) on the true age is 4497 ± 12 years, or 4485 to 4509 years.
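The same check for the standard error of the mean, again a minimal sketch with the slide's numbers:

```python
import math

n, mean, sd = 27, 4497.0, 30.2        # replicate series from the slide
se = sd / math.sqrt(n)                # standard error of the mean: ~5.8 years
half_width = 2 * se                   # ~12 years at 2 sigma
print(f"{mean:.0f} +/- {half_width:.0f} years "
      f"({mean - half_width:.0f} to {mean + half_width:.0f})")    # 4485 to 4509
```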

Page 12

Is the quoted error realistic?

This is commonly judged by making a series of repeat measurements (replicates) and calculating the standard deviation of the series. For the 27 measurements, the standard deviation is 30.2 years, but the quoted errors on the individual measurements range from 13 to 33 years, so 30 years might be a more realistic individual error.

Page 13

Are two measurements significantly different?

Two examples of measurements of a sample. The measurements were made in two different laboratories and so are assumed statistically independent.

Page 14

Case A

a) 2759 ± 39 years and 2811 ± 20 years. The difference is −52 years and its error is 44 years (√(39² + 20²)), therefore the uncertainty range is −52 ± 88 years and includes 0.

There is no evidence that these two samples do not have the same true age. These two measurements could therefore be legitimately combined in a weighted average.
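A minimal Python sketch of this two-measurement comparison and of the inverse-variance weighted average (the function names are mine; Case B, on the next page, is included for contrast):

```python
import math

def compare(x1, s1, x2, s2, k=2):
    """Difference between two independent measurements and its k-sigma check."""
    diff = x1 - x2
    err = math.sqrt(s1**2 + s2**2)      # independent errors add in quadrature
    lo, hi = diff - k * err, diff + k * err
    return diff, err, (lo <= 0 <= hi)   # True => no evidence of a real difference

def weighted_average(x1, s1, x2, s2):
    """Inverse-variance weighted mean and its error."""
    w1, w2 = 1 / s1**2, 1 / s2**2
    mean = (w1 * x1 + w2 * x2) / (w1 + w2)
    return mean, math.sqrt(1 / (w1 + w2))

print(compare(2759, 39, 2811, 20))   # Case A: (-52, ~44, True)  -> combinable
print(compare(2885, 37, 2781, 30))   # Case B: (104, ~48, False) -> not combinable
```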

Page 15

Case B

a) 2885 ± 37 years and 2781 ± 30 years. The difference is 104 years and its error is 48 years (√(37² + 30²)), therefore the uncertainty range is 104 ± 96 years, or 8 to 200 years, and does not include 0.

We could conclude that, within the individual uncertainties on the measurements, these two samples do not have the same true age. Therefore these two measurements could not be legitimately combined.

Page 16

Can we combine a series of measurements?

The results for 6 samples taken from Skara Brae on the Orkney Islands. The samples consisted of single entities (i.e. individual organisms) that represented a relatively short growth interval. The terrestrial samples were either carbonised plant macrofossils (cereal grains or hazelnut shells) or terrestrial mammal bones (cattle or red deer).

Page 17

The test of homogeneity

For a series of measurements x_i, each with error s_i:
– Null hypothesis: the measurements are the same (within error).
– Calculate the weighted mean x_p = Σ(x_i/s_i²) / Σ(1/s_i²).
– The test statistic T = Σ(x_i − x_p)²/s_i² should have a χ²(n−1) distribution.

(A sketch implementing this test appears after Case B below.)

Page 18

Case A

4555 ± 40, 4605 ± 40, 4525 ± 40, 4530 ± 35, 4270 ± 40, 4735 ± 40.

Using all 6 measurements, the weighted average is 4536.34 years, and T is 72.2789.

T is compared with a χ²(5), for which the critical value is 11.07; thus we would reject the hypothesis that the samples all had the same true age, so they cannot be combined.

Page 19

Case B

4555 ± 40, 4605 ± 40, 4525 ± 40, 4530 ± 35: the weighted average is 4552 years, and T is 2.612.

T is compared with a χ²(3), for which the critical value is 7.81; thus we would not reject the hypothesis that the samples all had the same true age, and so the weighted average (with its error) could be calculated.
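A minimal Python sketch of the homogeneity test from Page 17, reproducing both cases (numpy and scipy assumed):

```python
import numpy as np
from scipy import stats

def homogeneity_test(x, s, alpha=0.05):
    """Chi-squared test that measurements x (with errors s) share one true value."""
    x, s = np.asarray(x, float), np.asarray(s, float)
    w = 1 / s**2
    xp = np.sum(w * x) / np.sum(w)        # inverse-variance weighted mean
    T = np.sum((x - xp)**2 / s**2)        # ~ chi2(n-1) under the null hypothesis
    crit = stats.chi2.ppf(1 - alpha, len(x) - 1)
    return xp, T, crit, T < crit          # True => measurements may be combined

# Case A: all six measurements -> T ~ 72.3 > 11.07, reject H0, do not combine
print(homogeneity_test([4555, 4605, 4525, 4530, 4270, 4735],
                       [40, 40, 40, 35, 40, 40]))

# Case B: first four only -> T ~ 2.6 < 7.81, weighted mean ~ 4552 is legitimate
print(homogeneity_test([4555, 4605, 4525, 4530], [40, 40, 40, 35]))
```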

Page 20

Model uncertainty

Page 21

Uncertainties

– uncertainty in input data
– uncertainty in model parameter values

Conflicting evidence contributes to:
– uncertainty about model form
– uncertainty about the validity of assumptions

Page 22

Conceptual system

[Diagram: data and the conceptual system feed the model through inputs & parameters; model results inform policy, with feedbacks between the stages]

Page 23

Goals

1. Transparent approach to facilitate awareness/identification/inclusion of uncertainties within analysis

2. Provide useful/robust/relevant uncertainty assessments

3. Provide a means to assess consequences

Page 24

Modelling tools – SA/UA

Sensitivity analysis: determining the amount and kind of change produced in the model predictions by a change in a model parameter.

Uncertainty analysis: an assessment/quantification of the uncertainties associated with the parameters, the data and the model structure.

Page 25

Modellers conduct SA to determine:

(a) if a model resembles the system or processes under study,
(b) the factors that mostly contribute to the output variability,
(c) the model parameters (or parts of the model itself) that are insignificant,
(d) if there is some region in the space of input factors for which the model variation is maximum, and
(e) if and which (groups of) factors interact with each other.

Page 26

SA flow chart (Saltelli, Chan and Scott, 2000)

Page 27

Design of the SA experiment

– Simple factorial designs (one at a time)
– Factorial designs (including potential interaction terms)
– Fractional factorial designs

Important difference: in the context of computer-code experiments, random variation due to variation in experimental units does not exist.

Page 28

SA techniques

– Screening techniques: O(ne) A(t) T(ime), factorial and fractional factorial designs, used to isolate a set of important factors
– Local/differential analysis
– Sampling-based (Monte Carlo) methods
– Variance-based methods: variance decomposition of the output to compute sensitivity indices

Page 29

Screening

Screening experiments can be used to identify the parameter subset that controls most of the output variability, with low computational effort.

Page 30

Screening methods

– Vary one factor at a time (NOT particularly recommended)
– Morris OAT design (global):
  – estimate the main effect of a factor by computing a number r of local measures at different points x1, …, xr in the input space, and then average them
  – order the input factors

A simplified sketch of the elementary-effects idea follows.
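This sketch illustrates the elementary-effects idea behind Morris screening, not the full economical trajectory design: it re-runs the model 1 + k times per base point, and the 3-factor test model is hypothetical.

```python
import numpy as np

def elementary_effects(model, bounds, r=20, delta=0.1, seed=0):
    """Simplified Morris-style screening: for r random base points, perturb one
    factor at a time by a fraction delta of its range and record the effect."""
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, float)        # shape (k, 2): [low, high] per factor
    k = len(bounds)
    span = bounds[:, 1] - bounds[:, 0]
    ee = np.empty((r, k))
    for i in range(r):
        x = bounds[:, 0] + rng.random(k) * (1 - delta) * span  # room to step up
        y0 = model(x)
        for j in range(k):
            x_step = x.copy()
            x_step[j] += delta * span[j]
            ee[i, j] = (model(x_step) - y0) / delta
    # mu* (mean |effect|) orders the factors; a large sigma relative to mu*
    # signals nonlinearity or interactions
    return np.abs(ee).mean(axis=0), ee.std(axis=0)

# Hypothetical test model: factor 0 dominates, factor 2 is inert
mu_star, sigma = elementary_effects(lambda x: 10 * x[0] + x[1] ** 2, [[0, 1]] * 3)
print(mu_star, sigma)
```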

Page 31

Local SA

Local SA concentrates on the local impact of the factors on the model. Local SA is usually carried out by computing partial derivatives of the output functions with respect to the input variables.

The input parameters are varied in a small interval around a nominal value. The interval is usually the same for all of the variables and is not related to the degree of knowledge of the variables.
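A minimal sketch of this local, derivative-based approach, using one-sided finite differences around a nominal point (the test model is hypothetical):

```python
import numpy as np

def local_sensitivities(model, nominal, rel_step=0.01):
    """Local SA sketch: finite-difference partial derivatives of the model
    output with respect to each input, around a nominal point."""
    x0 = np.asarray(nominal, float)
    y0 = model(x0)
    grads = np.zeros_like(x0)
    for j in range(len(x0)):
        h = rel_step * (abs(x0[j]) if x0[j] != 0 else 1.0)  # small interval
        x = x0.copy()
        x[j] += h
        grads[j] = (model(x) - y0) / h
    return grads

# Hypothetical model for illustration
print(local_sensitivities(lambda x: x[0]**2 + 3 * x[1], [2.0, 1.0]))  # ~ [4, 3]
```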

Page 32

Global SA

Global SA apportions the output uncertainty to the uncertainty in the input factors, covering their entire range space.

A global method evaluates the effect of x_j while all other x_i, i ≠ j, are varied as well.

Page 33

How is a sampling (global) based SA implemented?

Step 1: define model, input factors and outputs

Step 2: assign p.d.f.’s to input parameters/factors and if necessary covariance structure. DIFFICULT

Step 3: simulate realisations from the parameter pdfs to generate a set of model runs giving the set of output values.

Page 34

Choice of sampling method

– S(imple) or Stratified R(andom) S(ampling): each input factor is sampled independently many times from its marginal distribution to create the set of input values (or randomly sampled from the joint distribution). Relatively expensive in computational effort if the model has many input factors, and may not give good coverage of the entire range space.
– L(atin) H(ypercube) S(ampling): the range of each input factor is categorised into N equal-probability intervals, and one observation of each input factor is made in each interval. A sketch follows.
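A from-scratch sketch of Latin hypercube sampling on the unit cube (assuming numpy); in practice each column is then mapped through the inverse CDF of the distribution assigned to that input factor:

```python
import numpy as np

def latin_hypercube(n, k, seed=None):
    """LHS sketch: split the [0,1) range of each of k factors into n
    equal-probability intervals, draw one point per interval, and randomly
    pair the intervals across factors."""
    rng = np.random.default_rng(seed)
    u = (np.arange(n)[:, None] + rng.random((n, k))) / n  # one draw per stratum
    for j in range(k):
        u[:, j] = rng.permutation(u[:, j])                # decouple the factors
    return u                                              # shape (n, k), in [0,1)

sample = latin_hypercube(100, 3, seed=1)
# map a column to a distribution, e.g. a uniform factor on [1.5, 3.5]:
f = 1.5 + 2.0 * sample[:, 0]
```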

Page 35

SA – analysis

At the end of the computer experiment, the data are of the form (y_i, x_1i, x_2i, …, x_ni), where x_1, …, x_n are the realisations of the input factors.

Analysis includes regression analysis (on raw and ranked values), standard hypothesis tests of distribution (mean and variance) for subsamples corresponding to given percentiles of x, and analysis of variance. One example of the regression route is sketched below.
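A sketch computing standardised regression coefficients (SRCs) from a Monte Carlo sample; the data here are simulated purely for illustration:

```python
import numpy as np

def src(X, y):
    """Standardised regression coefficients: fit a linear surface to the
    sample and scale each coefficient by sd(x_j)/sd(y)."""
    X, y = np.asarray(X, float), np.asarray(y, float)
    A = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(A, y, rcond=None)[0][1:]   # drop the intercept
    return beta * X.std(axis=0) / y.std()

rng = np.random.default_rng(0)
X = rng.random((500, 3))                              # hypothetical inputs
y = 5 * X[:, 0] + X[:, 1] + rng.normal(0, 0.1, 500)   # hypothetical output
print(src(X, y))   # factor 0 dominates, factor 2 ~ 0
```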

Page 36

Some ‘new’ methods of analysis

Measures of importance:

S_j = Var_{X_j}( E(Y | X_j = x_j) ) / Var(Y)

HIM(X_j) = Σ_i y_i y_i′ / N

– Sobol' sensitivity indices
– Fourier Amplitude Sensitivity Test (FAST)
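A crude sketch of estimating the first-order measure Var(E(Y|X_j))/Var(Y) by binning a Monte Carlo sample on X_j; Sobol'/FAST estimators are more efficient, and this only illustrates the definition:

```python
import numpy as np

def first_order_index(x, y, bins=20):
    """Estimate S_j = Var(E(Y|X_j)) / Var(Y): bin the sample on x_j and take
    the variance of the within-bin means of y."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    bin_means = np.array([y[idx == b].mean() for b in range(bins)])
    return bin_means.var() / y.var()

rng = np.random.default_rng(0)
x1, x2 = rng.random(10_000), rng.random(10_000)   # hypothetical input sample
y = 4 * x1 + x2                                   # Var(Y) = 16/12 + 1/12
print(first_order_index(x1, y))                   # ~ 16/17 ~ 0.94
print(first_order_index(x2, y))                   # ~ 1/17  ~ 0.06
```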

Page 37

How can SA/UA help?

SA/UA have a role to play in all modelling stages:
– we learn about model behaviour and 'robustness' to change;
– we can generate an envelope of 'outcomes' and see whether the observations fall within the envelope;
– we can 'tune' the model and identify reasons/causes for differences between model and observations.

Page 38

On the other hand - Uncertainty analysis

– Parameter uncertainty: usually quantified in the form of a distribution.
– Model structural uncertainty: more than one model may be fit; expressed as a prior on model structure.
– Scenario uncertainty: uncertainty about future conditions.

Page 39

Tools for handling uncertainty

– Parameter uncertainty: probability distributions and sensitivity analysis.
– Structural uncertainty: Bayesian framework; one possibility is to define a discrete set of models, another is to use a Gaussian process.

Page 40

An uncertainty example (1)

Wet deposition is rainfall × ion concentration.

Rainfall is measured at approximately 4000 locations, map produced by UK Met Office.

Rain ion concentrations are measured weekly (now fortnightly or monthly) at around 32 locations.

Page 41

An uncertainty example (2)

BUT:
• almost all measurements are at low altitudes
• much of Britain is upland
AND measurement campaigns show:
• rain increases with altitude
• rain ion concentrations increase with altitude

Seeder rain, falling through feeder rain on hills, scavenges cloud droplets with high pollutant concentrations.

Page 42

An uncertainty example (3)

Solutions:
(a) More measurements – but measurements at high altitude are not routine and are complicated.
(b) Derive a relationship with altitude – but rain shadow and wind drift (over about 10 km downwind) confound any direct altitude relationships.
(c) Derive a relationship from the rainfall map – model rainfall in 2 separate components.

Page 43

An uncertainty example (4)

Page 44

An uncertainty example (5)

Wet deposition is modelled by:

r – actual rainfall
s – rainfall on 'low' ground (r = s on 'low' ground, and (r − s) is the excess rainfall caused by the hill)
c – rain ion concentration as measured on 'low' ground
f – enhancement factor (ratio of the rain ion concentration in the excess rainfall to the rain ion concentration in 'low'-ground rainfall)

deposition = s·c + (r − s)·c·f

Page 45

An uncertainty example (6)

[Maps: rainfall and concentration combine to give deposition]

Page 46

An uncertainty example (7)

r: modelled rainfall on 5 km squares provided by UKMO – unknown uncertainty.
– scale issue: rainfall is a point measurement
– measurement issue: rain gauges are difficult to use at high altitude
– optimistic 30%, pessimistic 50%

How is the uncertainty represented? (not, e.g., 30% everywhere)

Page 47

An uncertainty example (8)

s: some sort of smoothed surface (a change in the prevalence of westerly winds means it alters between years).

c: kriged interpolation of annual rainfall-weighted mean concentrations (variogram not well specified); assume 90% of observations are within ±10% of the correct value.

f: campaign measurements indicate values between 1.5 and 3.5.

Page 48

An uncertainty example (9)

Output measures in the sensitivity analysis are the average flux (kg S ha⁻¹ y⁻¹) for (a) GB, and (b) 3 sample areas.

Page 49

An uncertainty example (10)

Morris indices are one way of determining which effects are more important than others, and so of reducing further work – but different parameters are important in different areas.

Page 50

An uncertainty example (11)

100 simulations; Latin hypercube sampling of 3 uncertainty factors:
– enhancement ratio
– % error in rainfall map
– % error in concentration

A sketch of such a run follows.
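A hedged sketch of these 100 LHS runs for a single grid square. The uncertainty ranges (enhancement factor 1.5–3.5 from the campaign data, rainfall error of ±30% as in the 'optimistic' case, concentration within roughly ±10%) come from the slides; the nominal r, s, c values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100

# LHS in [0,1) for the 3 uncertainty factors (as in the earlier LHS sketch)
u = (np.arange(n)[:, None] + rng.random((n, 3))) / n
for j in range(3):
    u[:, j] = rng.permutation(u[:, j])

f_sim = 1.5 + 2.0 * u[:, 0]          # enhancement factor: campaign range 1.5-3.5
r_err = 1 + 0.3 * (2 * u[:, 1] - 1)  # rainfall map error: 'optimistic' +/-30%
c_err = 1 + 0.1 * (2 * u[:, 2] - 1)  # concentration error: ~ +/-10%

# Hypothetical nominal values for one upland grid square (units illustrative)
r0, s0, c0 = 1800.0, 1000.0, 1.2

r = r0 * r_err                        # perturbed rainfall
c = c0 * c_err                        # perturbed concentration
dep = s0 * c + (r - s0) * c * f_sim   # deposition = s.c + (r - s).c.f

print(dep.mean(), dep.std(), np.percentile(dep, [5, 95]))  # typically skewed
```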

Page 51

An uncertainty example (12)

Note skewed distributions for GB and for the 3 selected areas

Page 52

An uncertainty example (13)

[Maps: original deposition, the mean of the 100 simulations, and the standard deviation]

Page 53

An uncertainty example (14)

[Maps: CV from the 100 simulations, and possible bias from the 100 simulations]

Page 54

An uncertainty example (15)

• model sensitivity analysis identifies weak areas
• lack of knowledge of the accuracy of inputs is a significant problem
• there may be biases in the model output which, although probably small in this case, may be important for critical loads

Page 55

Conclusions so far

The world is rich and varied in its complexity. Modelling is an uncertain activity.

SA/UA are important tools in model assessment. Setting the problem in a unified Bayesian framework allows all the sources of uncertainty to be quantified, so a fuller assessment can be performed.

Page 56

Bayesian Approach to Model Uncertainty, Calibration, Sensitivity Analysis …

Page 57

Bayes Essentials

E.g. the experimental determination of a constant:

prior ideas about the constant + data → posterior ideas about the constant

Page 58

Bayes’ Rule

likelihood – from model for data generation
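The equation on this slide was an image that did not survive extraction; the standard statement of Bayes' rule it refers to, with the likelihood supplied by the model for data generation and the prior by the initial ideas, is:

```latex
p(\theta \mid \text{data})
= \frac{p(\text{data} \mid \theta)\, p(\theta)}{p(\text{data})}
\;\propto\;
\underbrace{p(\text{data} \mid \theta)}_{\text{likelihood}}\;
\underbrace{p(\theta)}_{\text{prior}}
```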


Page 59

General form: observations + an unknown, linked by a (statistical) model describing data generation, specified in a likelihood.

For inferences to be coherent they must work in this way.

Page 60

Computer/Numerical Models

Scientific understanding of environmental processes often expressed in a computer/numerical model …

Page 61

[Diagram: the Sheffield Dynamic Global Vegetation Model (SDGVM) – climate, CO2, N and soil drive physiology, biophysics, water & nutrient fluxes, plant structure & phenology, disturbance, and vegetation dynamics]

Page 62

CO2: emissions vs atmospheric increase

‘Sinks for Anthropogenic Carbon’, Physics Today 2002, J L Sarmiento & N Gruber


Page 63

Computer/Numerical Models

• usually deterministic, always wrong
• how to quantify the uncertainty?

Page 64

Statistical Viewpoint on Numerical Models

INPUT → MODEL → OUTPUT

Uncertain as a representation of reality:
• inputs may not be known – uncertainty analysis
• the model may be inadequate – model inadequacy

Page 65

Numerical model: a function mapping inputs into outputs.

[Plot: model output as a function of an input x, with runs available at only a few input values]

What if model outputs are available only at a limited number of inputs? How do we represent knowledge about the model? Emulation.

Page 66

Bayes Formulation

Put a distribution on the space of possible functions; i.e., treat the model as a random function, and use the Bayes machinery to update knowledge about it from runs of the computer model/simulator (Bayes rule!).

The resulting probability distribution of the function is called an emulator.
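The deck itself uses the GEM software described on Page 71; purely as an illustration of the idea, here is a minimal Gaussian-process emulator built with scikit-learn (an assumption, not the authors' tooling), with a hypothetical one-input simulator standing in for an expensive code:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical 'expensive' simulator, run at a handful of design points
simulator = lambda x: np.sin(3 * x) + 0.5 * x
X_train = np.linspace(0, 2, 8).reshape(-1, 1)
y_train = simulator(X_train).ravel()

# GP prior over functions, updated by the training runs -> the emulator
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(X_train, y_train)

# Posterior mean and uncertainty at untried inputs
X_new = np.linspace(0, 2, 50).reshape(-1, 1)
mean, sd = gp.predict(X_new, return_std=True)
```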

Page 67

Numerical Models and Reality – Calibration, Model Inadequacy, Predictive Uncertainty

Main goal of modelling: to learn about reality.

Relation of the numerical model to reality: represent it via a statistical model, and use the inference machinery to learn about it.

One formulation links the observations, the true process and the numerical model, together with observational error, a regression parameter and model inadequacy.
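The formulation on this slide was an equation image; its list of components matches the standard Kennedy & O'Hagan (2001) calibration model (cited in the references), which can be written as:

```latex
y_i \;=\; \rho\,\eta(x_i, \theta) \;+\; \delta(x_i) \;+\; \varepsilon_i
```

where y_i are the observations, η is the numerical model run at inputs x_i with calibration parameters θ, ρ is a regression parameter, δ(·) is the model-inadequacy function and ε_i is observational error.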

Page 68

Treat the model-inadequacy term also as an unknown function.

Earlier, runs of the numerical model were used to learn about the model function and build an emulator. Now, in the same way, the observed data and the emulator are used to learn about the model inadequacy, via Bayes' rule.

Page 69

Calibration: using observed data to learn about the model inputs.

Find the posterior of the calibration inputs and of the parameters of the two GPs via Bayes' rule.

Some of these can be integrated out, and others handled by maximisation, to obtain a distribution summarising information about the calibration inputs.

Page 70

Prediction and predictive uncertainty: what does the model say about the true process at a new input?

Conditionally, the true process is a Gaussian process. Combining the emulator with the model-inadequacy term gives inference about the true process; further combining with the observational-error model gives inference about new observations. Hence predictions and their uncertainty.

Page 71

GEM software (Gaussian Emulation Machine)

Generates a statistical emulator of a computer code from training data consisting of an arbitrary set of inputs and the resulting outputs.

Gives the following:
– prediction of code output at any untried inputs, taking account of uncertainty in one or more of the code inputs
– main effects of each individual input
– joint effects of each pair of inputs
– percentage allocation of the variance to each individual input

Calibrates the code to observations, and quantifies model inadequacy & predictive uncertainty.

GEM-SA, GEM-CAL

Page 72

Some references:

Kennedy, M. C. & O'Hagan, A. (2001) Bayesian calibration of computer models. J. Roy. Statist. Soc. B, 63, 425-464.

Kennedy, M. C., O'Hagan, A. & Higgins, N. (2002) Bayesian analysis of computer code outputs. In Quantitative Methods for Current Environmental Issues, eds C. W. Anderson, V. Barnett, P. Chatwin & A. H. El-Shaarawi. Springer, London.

Oakley, J. E. & O'Hagan, A. (2004) Probabilistic sensitivity analysis of complex models: a Bayesian approach. J. Roy. Statist. Soc. B, 66, 751-769.

Saltelli, A., Chan, K. & Scott, E. M. (2000) Sensitivity Analysis. Wiley.

Royal Society of Chemistry, Analytical Methods Sub-committee (web).

For the GEM software see www.ctcd.shef.ac.uk