Observation Targeting
Andy Lawrence, Predictability and Diagnostics Section, ECMWF
Acknowledgements: Martin Leutbecher, Carla Cardinali, Alexis Doerenbecher, Roberto Buizza
ECMWF Predictability Training Course - April 2006
Contents
Introduction
• What is Observation Targeting?
Targeting Methodology
• Example using a simple model
• Kalman Filter techniques
• Singular Vector techniques
• Summary of research issues
Operational targeting principles
• Previous targeting campaigns
• Operational structure and schedules
• Results from ATReC 2003
• Verification of forecast impacts
• Future of observation targeting
What is observation targeting?
Techniques that optimize a flexible component of the observing network on a day-to-day basis, with the aim of achieving specific forecast improvements.
Estimated ROUTINE observations ≈ 10^6
Estimated TARGETED observations ≈ 10^3
The concept of observation targeting
Forecast an event.
Use an adjoint model transform algorithm to derive data-sensitive areas. Add extra observations.
Improve the forecast?
Verify with the corresponding analysis.
The concept of observation targeting
Q: If we have the capability to add observations in data-sensitive areas to improve the forecast of a specific event, can these locations be determined using objective (model-based) methods?
A: This is an optimization problem with two constraints:
i. Probability of making an analysis error at a particular location
ii. The intrinsic ability of the flow at that location to amplify errors (i.e. its sensitivity)
Where are observations needed to improve an 18-hour forecast?
[Figure: sensitive areas (shaded) and verification region. ECMWF-SAP based on TE-SVs (dry T42) and Z500; targeting time 20031117, 18 UT; verification time 20031118, 12 UT (opt: 18 h); trajectory initialized from fc 20031115, 00 UT +66 h; shading: areas of 8, 4, 2, 1 x10^6 km^2.]
[Figure: ECMWF-SAP based on TE-SVs (dry T42) and Z500; targeting time 20031117, 18 UT; verification time 20031119, 12 UT (opt: 42 h); trajectory initialized from fc 20031115, 00 UT +66 h; shading: areas of 8, 4, 2, 1 x10^6 km^2.]
Where are observations needed to improve a 42-hour forecast?
Where are observations needed to improve a 66-hour forecast?
[Figure: ECMWF-SAP based on TE-SVs (dry T42) and Z500; targeting time 20031117, 18 UT; verification time 20031120, 12 UT (opt: 66 h); trajectory initialized from fc 20031115, 00 UT +66 h; shading: areas of 8, 4, 2, 1 x10^6 km^2.]
The Observation Targeting Question
How do we identify optimal sites for additional observations?
Methodology
OR
How do we predict changes of forecast uncertainty due to assimilation of additional observations?
Information needed to answer this question:
1. Knowledge of the statistics of initial condition errors … and how they change due to the assimilation of additional observations.
– Gaussian error statistics
– Kalman filter techniques
Methodology
For linear perturbation dynamics and Gaussian error statistics, optimal state estimation can be approximated by the Extended Kalman Filter… which can also be used to select optimal sites for additional observations.
2. Knowledge of the perturbation dynamics from the observation time to the forecast verification time
– NWP studies suggest that perturbation dynamics are well approximated by a linear propagator defined by ensemble-based techniques or tangent-linear/adjoint techniques.
Targeted observations in the framework of an Extended Kalman Filter(Illustration using the Lorenz-95 system - Leutbecher, 2003)
Extended Kalman Filter
• A chaotic system, where a time unit of 1 represents 5 days
dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F,   i = 1, 2, …, 40
with cyclic boundary conditions x_0 = x_40, x_{-1} = x_39, x_41 = x_1, and F = 8
A perturbation placed at i = 10 propagates 'eastward' at a speed of 25 degrees/day.
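The system above can be integrated with a short numerical sketch (a minimal illustration; the RK4 integrator and the step size are assumed choices, not specified in the lecture):

```python
import numpy as np

F = 8.0  # forcing value used in the lecture

def l95_tendency(x, F=F):
    """Lorenz-95 tendency dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F;
    np.roll implements the cyclic boundary conditions."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt):
    """One fourth-order Runge-Kutta step of length dt (1 time unit = 5 days)."""
    k1 = l95_tendency(x)
    k2 = l95_tendency(x + 0.5 * dt * k1)
    k3 = l95_tendency(x + 0.5 * dt * k2)
    k4 = l95_tendency(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Perturb the unstable equilibrium x_i = F at i = 10; the disturbance grows
# and propagates 'eastward' as the state falls onto the chaotic attractor.
x = F * np.ones(40)
x[9] += 0.01
for _ in range(1000):
    x = rk4_step(x, 0.05)
```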
For routine observations: observations are constructed by adding noise (representing unbiased, uncorrelated, normally distributed errors) to values taken from a 'truth' run, and become available every 6 hours.
Over land (positions 21-40) we have observations at all locations, with σ_o = 0.05 σ_clim.
Over ocean (positions 1-20) we have observations at 'cloud-free' locations, with σ_o = 0.15 σ_clim.
Planet L95: routine observing network
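This routine network could be simulated as below (a sketch; σ_clim = 3.6 matches the value quoted later in the KF-versus-OI comparison, giving σ_o = 0.18 over land and 0.54 over ocean; the random seed is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_clim = 3.6  # climatological standard deviation quoted later in the lecture

def routine_obs(x_truth, cloud_free_ocean):
    """Simulate one 6-hourly batch of routine observations.

    Land points 21-40 (indices 20-39) are always observed with
    sigma_o = 0.05 * sigma_clim; ocean points 1-20 (indices 0-19) are
    observed only where cloud-free, with sigma_o = 0.15 * sigma_clim."""
    idx, obs, err_std = [], [], []
    for i in range(40):
        if i >= 20:                      # land point
            sd = 0.05 * sigma_clim
        elif cloud_free_ocean[i]:        # cloud-free ocean point
            sd = 0.15 * sigma_clim
        else:                            # cloudy ocean: no observation
            continue
        idx.append(i)
        err_std.append(sd)
        obs.append(x_truth[i] + rng.normal(0.0, sd))  # unbiased Gaussian error
    return np.array(idx), np.array(obs), np.array(err_std)
```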
2-day forecast errors for Europe
Planet L95: forecast errors
2-day forecast errors for 'Europe' (positions 21-28):
• assimilation with the Kalman filter using routine observations
• sample of 1000 daily forecasts
• rms error = 0.23 = 0.06 σ_clim
Now try to reduce the forecast errors using one targeted observation with σ_o = 0.05 σ_clim over sea: a single observation over the ocean (i.e. at i = 1,…,20), with the error characteristics of a land observation, is considered.
The aim is to provide a better forecast over 'Europe' on Planet L95.
Planet L95: targeted observation
Covariance evolution within the Kalman filter:
For routine observations:
Analysis step at time t_j:   (P_a^r)^-1 = (P_f^r)^-1 + H_r^T R_r^-1 H_r
Forecast step t_j → t_j+1:   P_f^r = M P_a^r M^T
Covariance prediction with the Kalman filter (routine + additional observations)
Optimal position i* (the position giving the maximum reduction of forecast error variance):
max_{i=1..20} trace( L_Eu (P_f^r - P_f^i) L_Eu^T )
For routine + additional observations at position i (where i = 1,…,20):
Analysis step at time t_j:   (P_a^i)^-1 = (P_f^r)^-1 + H_r^T R_r^-1 H_r + H_i^T R_i^-1 H_i
Forecast step t_j → t_j+1:   P_f^i = M P_a^i M^T
Optimal position for an additional observation
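For a system small enough to hold the matrices explicitly (as in L95), the two covariance steps and the selection rule combine into the following sketch (the interfaces are assumptions; M is the linear propagator from observation time to verification time):

```python
import numpy as np

def optimal_target(Pf_r, M, Hr, Rr, sigma_tg, L_eu, n_ocean=20):
    """Pick the ocean position i* (0-based) whose extra observation gives the
    largest reduction of forecast error variance in the verification region.

    Pf_r : routine background error covariance at time t_j
    M    : linear propagator from t_j to verification time
    Hr,Rr: routine observation operator and error covariance
    L_eu : projection onto the verification region ('Europe')
    """
    n = Pf_r.shape[0]
    # routine analysis precision: (P_a^r)^-1 = (P_f^r)^-1 + H_r^T R_r^-1 H_r
    prec_r = np.linalg.inv(Pf_r) + Hr.T @ np.linalg.inv(Rr) @ Hr
    base = np.trace(L_eu @ M @ np.linalg.inv(prec_r) @ M.T @ L_eu.T)
    best_i, best_red = -1, -np.inf
    for i in range(n_ocean):
        Hi = np.zeros((1, n)); Hi[0, i] = 1.0        # one extra obs at point i
        prec_i = prec_r + Hi.T @ Hi / sigma_tg**2    # add its information
        var_i = np.trace(L_eu @ M @ np.linalg.inv(prec_i) @ M.T @ L_eu.T)
        if base - var_i > best_red:
            best_red, best_i = base - var_i, i
    return best_i, best_red
```

With an identity background covariance and a propagator that amplifies one component, the chosen position is the one whose error grows most strongly into the verification region.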
Distribution of the optimal position for an additional observation in the L95 'Atlantic' (to improve the 2-day forecast over Europe; full-rank KF)
[Figure: histogram of the fraction of cases versus position of the targeted observation, 2-day forecast range.]
Planet L95: Forecast impacts
How to measure the improvements in forecast skill:
• Compare with randomly placed observations.
• Compare with the impact obtained with observations that actually reduce the forecast error the most (never achievable in practice, as it requires information at verification time; the position depends on the realization of the actual errors!).
• Compare with the impact of an observation added at a fixed site optimized for the forecast goal (a hard test).
– No such test exists yet with an NWP system and real observations.
– How does this last test look in L95?
Fixed location v. adaptive day-to-day location
Distribution of 2-day forecast errors: targeting versus addition at a fixed site
[Figure: CDF of forecast errors (percent) versus rms error (2-day range) for routine obs; routine + obs at position 19; routine + targeted obs. Targeting uses the full-rank Kalman filter for data assimilation and for forecast error variance prediction at verification time.]
The method works well for a simple model, BUT:
• an extended Kalman filter is too expensive for a full NWP model;
• targeting would require running the Kalman filter several times (each configuration of the additional observations considered in the planning process is one among several feasible ones).
Approximations of the Kalman Filter
Solution: calculate the forecast error variance in a small, relevant subspace!
1. Perturbations of ensemble members about the mean: a proxy for the data assimilation scheme (ETKF).
2. Based on singular vector schemes computed with an estimate of the inverse of the routine analysis error covariance metric as the initial-time metric.
Ensemble Transform Kalman Filter (ETKF)
Reduction of forecast error variance using the ETKF (Majumdar, 2001):
Signal = forecast error (routine + additional) - forecast error (routine)
Used in targeting for the Winter Storms Reconnaissance Program (WSRP) (Bishop, Etherton and Majumdar, 2001).
• Assumes linearity: 'linear combination of ensemble perturbations + ensemble mean = model trajectory'.
• Assumes optimal data assimilation.
• No covariance localisation.
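In ensemble space, the ETKF variance prediction can be sketched as follows (a minimal sketch with assumed interfaces: Z_obs holds ensemble perturbations at observation time scaled by 1/sqrt(m-1), and Z_ver the same members propagated to verification time; operational WSRP code differs):

```python
import numpy as np

def etkf_variance(Z_obs, Z_ver, H, R_inv, L):
    """Predicted forecast error variance in the verification region (projection
    L) for a network with observation operator H and inverse error covariance
    R_inv:  Pf ≈ Z_ver (I_m + (H Z_obs)^T R^-1 (H Z_obs))^-1 Z_ver^T."""
    S = H @ Z_obs                                     # observed ensemble signal
    m = Z_obs.shape[1]
    T = np.linalg.inv(np.eye(m) + S.T @ R_inv @ S)    # ETKF transform (squared)
    return np.trace(L @ (Z_ver @ T @ Z_ver.T) @ L.T)
```

The targeting 'signal' is the difference between the variance predicted for the routine network and for routine + additional observations; adding observations can only shrink the predicted variance, and the size of the reduction ranks candidate deployments.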
Singular vectors identify the directions (in phase space) that provide maximum growth over a finite period of time.
Singular Vector Schemes
• Dependent on model characteristics and optimization time.
• Growth is measured by an inner product (metric, or norm). If the correct norm is used, the resulting ensemble captures the largest amount of forecast error variance at optimization time (assuming that the forecast error evolves linearly).
• For targeting: forecast error variance prediction from t_0 to t_v is replaced by variance predictions in a singular vector subspace.
• Data assimilation can use either a full Kalman filter or Optimal Interpolation.
Singular Vector-based reduced-rank estimate:
• The initial-time metric is the inverse of the routine analysis error covariance matrix, (P_a^r)^-1.
• SVs computed with this metric evolve into the leading eigenvectors of the routine forecast error covariance matrix, i.e. they dominate trace( L_Eu P_f L_Eu^T ).
Singular vector targeting method 1
• Compute the variance of forecast errors only in a subspace of leading singular vectors:
trace( Â_n L_Eu P_f L_Eu^T Â_n^T )   instead of   trace( L_Eu P_f L_Eu^T )
Here Â_n denotes the projection onto the subspace of the leading n (left) singular vectors of L_Eu M.
• Data assimilation uses the full Kalman filter.
Reduced-rank approximation of the covariance forecast step
Analysis error covariances in the subspace spanned by the leading n SVs are represented by:
Routine network:    V_n V_n^T,           where V_n^T (P_a^r)^-1 V_n = I
Modified network:   V_n Γ_i Γ_i^T V_n^T, where (V_n Γ_i)^T (P_a^i)^-1 (V_n Γ_i) = I
The transformation matrix Γ_i is the inverse square root of the n x n matrix C_i that expresses the modified analysis error covariance metric in the basis of the singular vectors:
C_i = V_n^T (P_a^i)^-1 V_n = I_n + V_n^T H_i^T R_i^-1 H_i V_n
Using these representations (of the analysis error covariance metric), the forecast error variance in the verification region becomes:
trace( Â_n L_Eu P_f(t_j+1) L_Eu^T Â_n^T ) =
    Σ_{j=1..n} σ_j^2                                (routine network)
    trace( Γ_i Γ_i^T diag(σ_1^2, …, σ_n^2) )        (modified network)
where σ_j denotes the singular value of the j-th SV v_j.
Singular vector targeting method 2
Approximate the full Kalman filter by replacing the routine forecast error covariance matrix P_f^r by a static background error covariance matrix B (Optimal Interpolation scheme):
• in the variance prediction (for targeting), and
• in the assimilation algorithm.
The analysis error covariance matrix is given by
A^-1 = B^-1 + H^T R^-1 H
In the L95 system, the static background error covariance matrix B = <(x_f - x_t)(x_f - x_t)^T> is a sample covariance matrix computed from (forecast - truth) differences over a 1000-day sample → combine with the reduced-rank technique.
Method 1: full KF + SV subspace.   Method 2: Optimal Interpolation + SV subspace.
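The static B and the OI analysis error covariance of Method 2 can be sketched as follows (the sample layout, rows of forecast-minus-truth differences, is an assumption):

```python
import numpy as np

def sample_B(fc_minus_truth):
    """Static background error covariance B = <(x_f - x_t)(x_f - x_t)^T>,
    estimated from (forecast - truth) difference vectors stacked as rows."""
    d = np.asarray(fc_minus_truth, dtype=float)
    return d.T @ d / d.shape[0]

def oi_analysis_covariance(B, H, R):
    """OI analysis error covariance from A^-1 = B^-1 + H^T R^-1 H."""
    A_inv = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
    return np.linalg.inv(A_inv)
```

Because B is static, the same matrix is reused every cycle; the 'magic B' iteration described on the next slide simply re-estimates it from the differences generated with the previous B.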
L95 comparisons : SV reduced-rank schemes
Approximation of the background error covariances
Replace the routine background error covariance matrix P_f^r by a static background error covariance matrix B (OI):
• in the variance prediction (targeting), and
• in the assimilation algorithm.
'Magic B': the static background error covariance matrix is a sample covariance matrix computed from (forecast - truth) differences; several iterations of updating B = <(x_f - x_t)(x_f - x_t)^T> over a 1000-day sample.

KF versus OI performance, global rms errors:
range (d) | 0    | 1    | 2    | 3    | 4    | 5
KF        | 0.11 | 0.17 | 0.25 | 0.38 | 0.57 | 0.80
OI        | 0.33 | 0.49 | 0.74 | 1.04 | 1.38 | 1.74
(σ_clim = 3.6; σ_o,land = 0.18; σ_o,ocean = 0.54)
L95 comparisons : SV full rank v reduced-rank schemes
Distribution of 2-day forecast errors: KF full rank versus KF rank 1
[Figure: CDF of forecast errors (percent) versus rms error over Europe for control, obs at position 19, targeted obs (KF full rank), and targeted obs (KF rank 1). Targeting uses the full-rank Kalman filter for data assimilation and a rank-1 approximation for the forecast error variance prediction at verification time.]

Distribution of 2-day forecast errors over Europe (OI rank 1)
[Figure: CDF of forecast errors (percent) versus rms error for control, obs at position 20, and targeted obs (OI rank 1).]
Distribution of 2-day forecast errors over Europe
Full KF v. Method 1 (reduced-rank Kalman filter)
Full KF v. Method 2 (reduced-rank OI)
• Targeted observations should be directed to ‘sensitive’ regions of the atmosphere.
SV dependency on initial time metric
[Schematic: singular vector evolution from the initial-time metric sphere at t_0 to the final-time metric at t_1]
• For predictability studies, an appropriate metric is based on energy. Total energy provides a dynamical basis (it has no knowledge of error statistics):
||x||_E^2 = x^T E x = ½ ∫∫ ( u^2 + v^2 + (c_p/T_r) T^2 ) dp dS + ½ R_d T_r p_r ∫ (ln p_sfc)^2 dS
• Correct metric is dependent on the purpose for making the targeted observations (study precursor developments or to improve forecast initial conditions).
Hessian Singular Vectors
• The Hessian of the cost-function provides the estimate of the inverse of the analysis error covariance matrix:
J(x) = J_b(x) + J_o(x)
J'' = B^-1 + H^T R^-1 H = A^-1
• Initial error estimates are consistent with the covariance estimates of the variational data assimilation scheme (incorporates error statistics).
[Figure: model level 46 temperature structure of three ECMWF singular vectors, valid Tuesday 23 December 2003: TESV; 'Full' Hessian SV (J = J_b + J_o); 'Partial' Hessian SV (J = J_b).]
Total energy SV v Hessian SV
Hessian reduced-rank estimate
• Similar to the Kalman filter/OI reduced-rank estimate, but based on a subspace of Hessian singular vectors v_i computed with the metric J''_routine (using only observations from the routine network in J_o) (Leutbecher, 2003).
• Estimate of the forecast error variance reduction due to additional observations:
trace( [I - C^-1] diag(σ_1^2, …, σ_n^2) )
where σ_j denotes the singular value of the routine Hessian SV v_j.
• Efficient computation of the Hessian metric for the modified observation network (routine + additional) in the subspace:
C_ij = v_i^T J''_mod v_j = v_i^T ( J''_routine + H_a^T R_a^-1 H_a ) v_j = δ_ij + (H_a v_i)^T R_a^-1 (H_a v_j)
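Given the routine Hessian SVs v_j with singular values σ_j, plus the additional observations' operator H_a and error covariance R_a (all assumed available as explicit arrays), the reduction estimate can be sketched as:

```python
import numpy as np

def hessian_variance_reduction(V, sigma, Ha, Ra_inv):
    """Estimate trace([I - C^-1] diag(sigma^2)) for the augmented network.

    V      : (n_state, n) routine Hessian singular vectors v_1..v_n as columns
    sigma  : (n,) singular values of the routine Hessian SVs
    Ha     : observation operator of the additional observations
    Ra_inv : inverse error covariance of the additional observations"""
    n = V.shape[1]
    S = Ha @ V                            # columns are H_a v_j
    # C_ij = delta_ij + (H_a v_i)^T R_a^-1 (H_a v_j)
    C = np.eye(n) + S.T @ Ra_inv @ S
    return np.trace((np.eye(n) - np.linalg.inv(C)) @ np.diag(sigma**2))
```

The reduction is zero when the additional observations carry no information about the subspace (H_a V = 0) and grows as they project onto the leading SVs.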
Comparison of flight track ranking
Winter Storms Reconnaissance Program 2003: additional observations on 4th Feb 00 UT, for forecast verification time 6th Feb 00 UT.
[Figure: flight track ranking, Hessian reduced-rank estimate versus ETKF]
(Doerenbecher et al., 2003)
Summary & Research Issues
Aim of observation targeting:
Prediction of forecast error variance changes due to modifications of the observing network.
Factors that affect the skill of forecast error variance predictions in operational NWP:
• Covariance estimates
– The skill of forecast error variance predictions depends on the quality of the background error covariance estimate … incorporate a flow-dependent wavelet approach.
– Account for correlations between observation errors in current schemes (important for satellite data with observation error correlations in space and between channels → optimal thinning; Liu & Rabier, 2003).
– Predict the spatial resolution of routine observations. This is harder to do with the day-to-day variability of targeted observations, particularly for satellite data affected by cloud.
Summary & Research Issues
• Error dynamics
– The TL/AD model is simplified in resolution and physical process parameterisation. Advances in formulation (moist processes, sensitivity of observation targeting guidance to spatial resolution) should improve variance predictions.
– Validity of the tangent-linear assumption:
Gilmour et al. 2001: probably not useful beyond 24 h; but the measure of nonlinearity is dominated by small scales.
Reynolds & Rosmond 2003: SVs useful up to 72 h (diagnostic in SV-space and scale-dependent diagnostic).
• Reduced-rank SVs (subspace): how many SVs are needed to reliably predict forecast error variance reductions?
– L95: 1 SV is sufficient; the leading SV explains a large fraction of the total forecast error variance in the verification region (rank-1 KF ≈ full KF).
– NWP: rank(L_Eu M) ≈ number of grid points in the verification region times the number of variables.
Summary & Research Issues
• Observation types
– Forecast error variance reductions can be determined for different observation types in sensitive areas.
– Will the abundance of satellite data eliminate the need for in-situ measurements?
– Satellite sampling is limited through cloud layers → in-situ measurements are useful if dynamically sensitive areas lie beneath clouds.
• Targeting methodology
– Perhaps a combination of SV and ensemble-based approaches? (Expensive, as it requires a dedicated ensemble.)
– Does the validity of the linear transformation in the ETKF technique extend further than the validity of the TL approximation?
• Contribution of model error?
– To the initial condition error at t_0.
– To the growth of error from t_0 to t_v.
Previous targeting campaigns
Targeted observation techniques and methods were tested during numerous operational campaigns:
FASTEX (Fronts and Atlantic Storm-Track Experiment, 1997): improving forecasts of atmospheric cyclone depressions forming in the North Atlantic and reaching the west coast of Europe.
NORPEX (North Pacific Experiment, 1998): North Pacific winter-season storms that affect the United States, Canada, and Mexico.
WSRP (Winter Storms Reconnaissance Program, 1999-2006): North-East Pacific storms affecting the west coast of the United States. Use of targeted observations has a positive effect on forecast skill (Majumdar et al. 2001).
ATReC (Atlantic THORPEX Regional Campaign, 2003): North Atlantic storms affecting the east coast of the United States and Europe.
Operational Structure for Observation Targeting
[Schematic: timeline with steps 1-5 at times t_c, t_d, t_0, t_v; NWP centres (ECMWF, UK Met Office, NCEP, NRL, Météo-France) exchange guidance with an Operations Centre, which tasks an observation control centre and the observing platforms (AMDAR, radiosondes, research aircraft, ASAP ships), with data monitoring feeding back.]
1. Case selection: decision made at t_c on whether or not to commit additional observing resources at t_0. Decide for which forecast verification time t_v and which region R adaptive observations should be taken.
2. Sensitive area prediction: compute which configuration of adaptive observations (to be taken at t_0) is likely to best constrain the error of the forecast for (t_v, R).
3. Select and request additional observations at t_d.
4. Observing platforms deployed at t_0 and observations taken.
5. Forecast verification at t_v.
Atlantic THORPEX Regional Campaign 2003
First field campaign in which multiple observing systems were used.
Dropsondes from research aircraft, ASAP ships, AMDAR, land radiosonde sites.
Observation targeting guidance to predict the sensitive areas was based on:
• UKMO: ETKF based on the ECMWF ensemble
• Météo-France: total energy SVs run on a (possibly perturbed) trajectory
• NRL: SVs and sensitivity to observations
• NCEP: ETKF based on combined ECMWF and NCEP ensembles
• ECMWF: 2 flavours of total energy SVs and Hessian SVs
Config. | Initial norm | TLM res. | TLM physics
TE-d42  | Total Energy | T42      | dry
TE-m95  | Total Energy | TL95     | moist
H-d42   | Hessian      | T42      | dry
[Figure: four sensitive area prediction panels for ATReC case 029_1, obs. time 20031208, 18 UT; verification time 20031211, 00 UT (opt: 54 h); shading: areas of 8, 4, 2, 1 x10^6 km^2:
- Dry TESV (ECMWF): ECMWF-SAP based on TE-SVs (dry T42) and MSL; trajectory initialized from fc 20031206, 00 UT +66 h
- Moist TESV (ECMWF): ECMWF-SAP based on TE-SVs (moist TL95) and MSL; trajectory initialized from fc 20031206, 00 UT +66 h
- ETKF (Met Office using ECMWF ensemble): UKMO-SAP based on ETKF signal and ECMWF MSL; trajectory initialized from fc 20031205, 12 UT +78 h
- Hessian SV (ECMWF): ECMWF-SAP based on Hessian SVs and MSL; trajectory initialized from fc 20031206, 00 UT +66 h]
ECMWF calculated overlap ratios for a = 4 x 10^6 km^2 (Leutbecher et al. 2004):

SAP 1     | SAP 2     | Cases with overlap > 0.5
SV TE-d42 | SV TE-m95 | 42/43 (98%)
SV TE-d42 | SV H-d42  | 35/42 (83%)
SV TE-d42 | ETKF      | 31/67 (46%)
SV H-d42  | ETKF      | 32/67 (48%)
Sensitivity area prediction
How do we determine where to send the observations?
In ATReC 2003, the level of agreement between the sensitive area predictions can be quantified in terms of geographical overlap.
Sensitive area 1: S_j of area a
Sensitive area 2: S_k of area a
Geographical overlap:
O_jk = area( S_j ∩ S_k ) / a
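On a grid, the overlap ratio can be computed from boolean masks of the two sensitive areas (the grid representation and the equal-area masks are assumptions):

```python
import numpy as np

def overlap_ratio(mask_j, mask_k, cell_area):
    """O_jk = area(S_j ∩ S_k) / a for two sensitive-area masks of equal area a."""
    a = np.sum(mask_j * cell_area)                 # area a of each region
    inter = np.sum((mask_j & mask_k) * cell_area)  # area of the intersection
    return inter / a
```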
[Figure: the same four sensitive area predictions (ETKF, Hessian SV, dry TESV, moist TESV) shown again for side-by-side comparison.]
TESV vs Hessian SV vs ETKF
DESIGNATED TARGET AREA
ETKF sensitivity stretches across the Atlantic, with the main area at 40-55N, 20-30W and a secondary maximum at 45N, 60W (also a secondary area for the Hessian SVs). The main Hessian SV and dry TESV centre is the southern tip of Greenland (extending down to 50N and back into Canada at 60N). Moist TESVs have a significant area at 35N, 65W (Odette).
Case 43: Obs. time 8th Dec 18UT, Verif. time 11th Dec 00UT
Radiosonde, Satellite rapid-scan winds, AMDAR flights
Case 47: Obs time 11th Dec 18UT, Verif. Time 13th Dec 12UT
Radiosondes, ASAP, Satellite, AMDAR
Case 36: Obs time 4th Dec 18UT, Verif. Time 6th Dec 12UT
Radiosonde, ASAP, AMDAR
Case 37: Obs. Time 5th Dec 18UT, Verif. Time 7th Dec 12UT
Radiosonde, ASAP, Satellite, AMDAR and Dropsondes
East Coast USA storm 5-7th December 2003
A major winter storm impacted parts of the Mid-Atlantic and Northeast United States during the 5th-7th. Snowfall accumulations of one to two feet were common across areas of Pennsylvania northward into New England. Boston, MA received 16.2 inches while Providence RI had the greatest single snowstorm on record with 17 inches, beating the previous record of 12 inches set December 5-6, 1981. (from http://www.met.rdg.ac.uk/~brugge/world2003.html)
NASA-GSFC, data from NOAA GOES
Control (routine observations)
ATReC (routine + additional observations)
Summary of ATReC forecast Impacts
ATReC (routine + additional observations in target area)
Case 37: Obs. Time 5th Dec 18UT, Verif. Time 7th Dec 12UT
Control
ATReC (routine + additional observations)
ATReC (routine + additional observations in target area)
• Statistical verification of forecast impacts:
– A large sample is needed to estimate forecast skill differences in verification regions.
– Data denial experiments may provide a larger sample size.
• Sensitive area prediction method:
– Sample similar observational coverage for both SV and ETKF sensitive regions … not possible during ATReC.
• ATReC can be used to plan future adaptive observation campaigns:
– Use observation targeting for the medium range.
– Improve efficiency so as to shorten warning time for observation providers.
– Identify cost-effective observation platforms (e.g. dropsondes).
– Cancel cases that show a sharp reduction in uncertainty as observation time approaches.
Some ATReC 2003 conclusions
Forecast sensitivity
J measures the forecast error; its gradient with respect to the observation vector y gives the forecast error sensitivity with respect to the observations used in the initial condition of the model forecast:
∂J/∂y = (∂x_a/∂y)^T ∂J/∂x_a
where ∂J/∂x_a is the sensitivity with respect to the initial condition x_a, and ∂x_a/∂y is the analysis sensitivity with respect to the observations.
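For a linear analysis x_a = x_b + K (y - H x_b) and a linear forecast x_f = M x_a (an illustrative setting, with the gain K and the propagator M assumed given), the chain rule above can be sketched as:

```python
import numpy as np

def forecast_sensitivity_to_obs(K, M, dJ_dxf):
    """dJ/dy = (dx_a/dy)^T dJ/dx_a = K^T M^T dJ/dx_f, since dx_a/dy = K for
    the linear analysis and dJ/dx_a = M^T dJ/dx_f for the linear forecast."""
    dJ_dxa = M.T @ dJ_dxf   # forecast sensitivity w.r.t. the initial condition
    return K.T @ dJ_dxa     # forecast sensitivity w.r.t. the observations
```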
From Cardinali & Buizza, 2003
[Figure: total contribution Σ ∂J/∂y and mean contribution (1/N) Σ ∂J/∂y to the forecast, by observation type (AIREP u, v, T; TEMP u, v, T, q), for targeted versus non-targeted observations.]
Observation Contribution to Forecast
[Figure: total and mean forecast sensitivity and analysis sensitivity, by observation type (AIREP u, v, T; TEMP u, v, T, q).]
Forecast and Analysis Sensitivity
• Forecast sensitivity to observations has been computed for the campaign cases showing a forecast impact |(ATReC - Control)/Control| ≥ 10%.
• In general, results show an impact in 13 cases out of 38 (9 positive, 4 negative):
– Targeted observations decrease the forecast error in the verification area in 60% of cases (although the control forecasts already have high skill).
– Differences in forecast impact also come from the continuous assimilation cycling, which provides different model trajectories.
• For the campaign of 5th Dec at 18 UTC, targeted observations improved the forecast of a cyclone moving along the east coast of North America for which severe weather impact was forecast.
Forecast impact of observations
Adding targeted observations in the Pacific is expected to have a rather small impact (5% on z500/z1000 forecast errors over North America)
Adding targeted observations in the Atlantic is expected to have an even smaller impact (3% on z500/z1000 forecast errors over Europe)
The value of observations taken in oceans is regionally dependent and depends on the underlying observing system.
Up to the optimization time, the value of observations taken in SV-target areas is higher than the value of observations taken in random areas of similar size.
Future of Observation Targeting
• Studies of the value of targeted observations:
Buizza et al. (2005) performed experiments on 100+ cases to:
(a) assess the impact of observations taken in the Pacific and verified over North America;
(b) assess the impact of observations taken in the Atlantic and verified over Europe;
(c) compare the impact of observations taken in SV-target areas to observations taken in random areas;
(d) estimate the average impact of targeted observations.
Future of Observation Targeting
• Data denial studies (Kelly, Buizza, Thepaut, Cardinali)
– Relatively inexpensive and incorporate a large number of cases (~6 months).
– Assess the value and impact of specific types of observation platforms.
• Hurricane targeting (Majumdar et al. 2005, 2006)
– Improve short-range forecasts of cyclone tracks.
– Provides a useful comparison of ETKF and SV targeting techniques.
• Satellite targeting
– Targeting enables the use of 'dynamical thinning' techniques.
– Improved channel selection can reduce problems in cloudy conditions.
Future of Observation Targeting
• Targeted observation campaigns:
– Winter Storms Reconnaissance Program (ongoing) (NCEP/NRL)
– European THORPEX Regional Campaign 2007: a 'virtual' targeting experiment
– African Monsoon Multidisciplinary Analyses (AMMA) Observing System test 2005-2010
– Pacific-Asian Regional Campaign (PARC) 2008
• Improve adaptive parts of the observing network:
– New platforms are being developed, e.g. driftsondes
– Rocketsondes
– Aerosondes