Page 1: Forecast Verification Research

Forecast Verification Research

Beth Ebert and Laurie Wilson, JWGFVR co-chairs

WWRP-JSC meeting, Geneva, 21-24 Feb 2011

Page 2: Forecast Verification Research

Aims

Verification component of WWRP, in collaboration with WGNE, WCRP, CBS

• Develop and promote new verification methods

• Training on verification methodologies

• Ensure forecast verification is relevant to users

• Encourage sharing of observational data

• Promote importance of verification as a vital part of experiments

• Promote collaboration among verification scientists, model developers and forecast providers

Page 3: Forecast Verification Research

Working group members

Beth Ebert (BOM, Australia)
Laurie Wilson (CMC, Canada)
• Barb Brown (NCAR, USA)
• Barbara Casati (Ouranos, Canada)
• Caio Coelho (CPTEC, Brazil)
• Anna Ghelli (ECMWF, UK)
• Martin Göber (DWD, Germany)
• Simon Mason (IRI, USA)
• Marion Mittermaier (Met Office, UK)
• Pertti Nurmi (FMI, Finland)
• Joel Stein (Météo-France)
• Yuejian Zhu (NCEP, USA)

Page 4: Forecast Verification Research

FDPs and RDPs

Sydney 2000 FDP

Beijing 2008 FDP/RDP

SNOW-V10 RDP

Sochi 2014

MAP D-PHASE

Severe Weather FDP

Typhoon Landfall FDP

Page 5: Forecast Verification Research

Beijing 2008 FDP

Real Time Forecast Verification (RTFV) system

• Fast qualitative and quantitative feedback on forecast system performance in real time
– Verification products generated whenever new observations arrive
• Ability to inter-compare forecast systems
• 3 levels of complexity
– Visual (quick look)
– Statistics (quantitative)
– Diagnostic (more information)

Page 6: Forecast Verification Research

Training

• In person
• Online

Page 7: Forecast Verification Research

B08FDP lessons for real time verification

• Real time verification considered very useful

• Forecasters preferred scatterplots and quantile-quantile plots (see the sketch after this list)

• Format and standardization of nowcast products were critical to making a robust verification system

• Difficult to compare "like" products created with different aims (e.g., QPF for warning vs hydrological applications)

• Verification system improvements

– User-friendly web display

– More user options for exploring results
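A minimal sketch (our own illustration, not the B08FDP RTFV code) of the kind of quantile-quantile comparison mentioned above, using hypothetical rainfall values:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical paired forecast and observed values (e.g. 1-h rainfall, mm)
fcst = np.array([0.0, 0.2, 1.5, 3.0, 5.5, 8.0, 12.0])
obs = np.array([0.0, 0.0, 1.0, 2.5, 6.0, 9.5, 15.0])

# A Q-Q plot compares the sorted (empirical quantile) values of the two samples;
# points below the 1:1 line indicate the forecast under-predicts large values.
plt.plot(np.sort(obs), np.sort(fcst), "o")
plt.plot([0, 15], [0, 15], "k--")   # 1:1 reference line
plt.xlabel("observed quantiles (mm)")
plt.ylabel("forecast quantiles (mm)")
plt.show()
```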

Page 8: Forecast Verification Research

SNOW-V10

• Verification strategy
– User-oriented verification of all forecasts for the Olympic period, tuned to VANOC decision points
– Verification of parallel model forecasts for January to August 2010
– Nowcast and regional model verification
• Rich dataset

Page 9: Forecast Verification Research

Suggested categories for SNOW-V10 verification (Table 5, 2nd revised suggestion):

Temperature (°C): T < -25 | -25 ≤ T < -20 | -20 ≤ T < -4 | -4 ≤ T < -2 | -2 ≤ T < 0 | 0 ≤ T < +2 | +2 ≤ T < +4 | T ≥ +4
RH (%): RH < 30 | 30 ≤ RH < 65 | 65 ≤ RH < 90 | 90 ≤ RH < 94 | 94 ≤ RH < 98 | RH ≥ 98
Wind speed (m/s): w < 3 | 3 ≤ w < 4 | 4 ≤ w < 5 | 5 ≤ w < 7 | 7 ≤ w < 11 | 11 ≤ w < 13 | 13 ≤ w < 15 | 15 ≤ w < 17 | w ≥ 17
Wind gust (m/s): same categories as wind speed
Wind direction (°): 339 ≤ d or d < 24 (N) | 24 ≤ d < 69 (NE) | 69 ≤ d < 114 (E) | 114 ≤ d < 159 (SE) | 159 ≤ d < 204 (S) | 204 ≤ d < 249 (SW) | 249 ≤ d < 294 (W) | 294 ≤ d < 339 (NW)
Visibility (m): v < 30 | 30 ≤ v < 50 | 50 ≤ v < 200 | 200 ≤ v < 300 | 300 ≤ v < 500 | v ≥ 500
Ceiling (m): c < 50 | 50 ≤ c < 120 | 120 ≤ c < 300 | 300 ≤ c < 750 | 750 ≤ c < 3000 | c ≥ 3000
Precip rate (mm/hr): r = 0 (None) | 0 < r ≤ 0.2 (Trace) | 0.2 < r ≤ 2.5 (Light) | 2.5 < r ≤ 7.5 (Moderate) | r > 7.5 (Heavy)
Precip type: No precip | Liquid | Freezing | Frozen | Mixed (w/Liquid) | Unknown
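Before categorical scores can be computed, forecasts and observations have to be mapped onto these bins. A minimal sketch of one way to do this (our own illustration, not SNOW-V10 code), using the temperature bin edges from the table above:

```python
import numpy as np

# Bin edges (deg C) taken from the temperature row of the table above.
# np.digitize returns the category index for each value:
# 0 = "T < -25", 1 = "-25 <= T < -20", ..., 7 = "T >= +4".
temp_edges = [-25, -20, -4, -2, 0, 2, 4]

temps = np.array([-30.0, -3.5, 1.2, 6.0])   # example observations (deg C)
print(np.digitize(temps, temp_edges))        # -> [0 3 5 7]
```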

Page 10: Forecast Verification Research

Example: visibility verification
lam1k minimum visibility (m) at VOL, HSS = 0.095 (rows = forecast category, columns = observed category)

Forecast \ Observed | < 30 | 30 ≤ x < 50 | 50 ≤ x < 200 | 200 ≤ x < 300 | 300 ≤ x < 500 | > 500 | Total
< 30 | 0 | 0 | 0 | 0 | 0 | 0 | 0
30 ≤ x < 50 | 0 | 0 | 0 | 0 | 0 | 0 | 0
50 ≤ x < 200 | 0 | 0 | 52 | 20 | 22 | 43 | 137
200 ≤ x < 300 | 0 | 0 | 76 | 18 | 19 | 103 | 216
300 ≤ x < 500 | 0 | 1 | 26 | 15 | 12 | 60 | 114
> 500 | 0 | 9 | 831 | 246 | 170 | 3743 | 4999
Total | 0 | 10 | 985 | 299 | 223 | 3949 | 5466
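For reference, the HSS quoted above follows from the table by comparing the observed proportion correct with the proportion expected by chance. A minimal Python sketch (the counts are from the slide; the function itself is our own illustration):

```python
import numpy as np

# Contingency table from the slide: rows = forecast category, columns = observed
# category, for lam1k minimum visibility (m) at VOL.
table = np.array([
    [0, 0,   0,   0,   0,    0],   # fcst < 30
    [0, 0,   0,   0,   0,    0],   # 30 <= x < 50
    [0, 0,  52,  20,  22,   43],   # 50 <= x < 200
    [0, 0,  76,  18,  19,  103],   # 200 <= x < 300
    [0, 1,  26,  15,  12,   60],   # 300 <= x < 500
    [0, 9, 831, 246, 170, 3743],   # > 500
])

def heidke_skill_score(counts):
    """Multi-category Heidke Skill Score from a square contingency table."""
    n = counts.sum()
    pc = np.trace(counts) / n                                      # proportion correct
    exp = (counts.sum(axis=1) * counts.sum(axis=0)).sum() / n**2   # chance agreement
    return (pc - exp) / (1.0 - exp)

print(round(heidke_skill_score(table), 3))   # -> 0.095, matching the slide
```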

Page 11: Forecast Verification Research

Sochi 2014

Standard verification

Possible verification innovations:

• Road weather forecasts

• Real-time verification

• Timing of events – onset, duration, cessation

• Verification in the presence of observation uncertainty

• Neighborhood verification of high-resolution NWP, including in time-height plane

• Spatial verification of ensembles

• User-oriented probability forecast verification

Page 12: Forecast Verification Research

Collaboration

• WWRP working groups
• THORPEX
– GIFS-TIGGE
– Subseasonal prediction
– Polar prediction
• CBS
– Severe Wx FDPs
– Coordination Group on Forecast Verification
• SRNWP
• COST 731
• ECMWF TAC subgroup on verification measures

Page 13: Forecast Verification Research

Spatial Verification Method Intercomparison Project

• International comparison of many new spatial verification methods

• Methods applied by researchers to same datasets (precipitation; perturbed cases; idealized cases)

• Subjective forecast evaluations

• Workshops: 2007, 2008, 2009

• Weather and Forecasting special collection

http://www.rap.ucar.edu/projects/icp

Page 14: Forecast Verification Research

Spatial Verification Method Intercomparison Project

Page 15: Forecast Verification Research

Spatial Verification Method Intercomparison Project

• Future variables
– "Messy" precipitation
– Wind
– Cloud
• Future datasets
– MAP D-PHASE / COPS
– SRNWP / European data
– Nowcast dataset(s)
• Verification test bed

Page 16: Forecast Verification Research

Publications

Recommendations for verifying deterministic and probabilistic quantitative precipitation forecasts

Recommendations for verifying cloud forecasts (this year)

Recommendations for verifying tropical cyclone forecasts (next year)

January 2008 special issue of Meteorological Applications on forecast verification

2009-2010 special collection of Weather & Forecasting on spatial verification

DVD from 2009 Helsinki Verification Tutorial

Page 17: Forecast Verification Research

Outreach

• Verification workshops and tutorials
– On-site, travelling

• EUMETCAL training modules

• Verification web page

• Sharing of tools

http://www.cawcr.gov.au/projects/verification/

Page 18: Forecast Verification Research

International Verification Methods Workshops

4th Workshop – Helsinki 2009

Tutorial
• 26 students from 24 countries
• 3 days
• Lectures, hands-on (took tools home)
• Group projects - presented at workshop

Workshop
• ~100 participants
• Topics:
– User-oriented verification
– Verification tools & systems
– Coping with obs uncertainty
– Weather warning verification
– Spatial & scale-sensitive methods
– Ensembles
– Evaluation of seasonal and climate predictions

Page 19: Forecast Verification Research

5th International Verification Methods Workshop

• Melbourne, December 2011

• 3-day tutorial + 3-day scientific workshop

• Additional tutorial foci
– Verifying seasonal predictions
– Brief intro to operational verification systems

• Capacity building for FDPs/RDPs, SWFDP, etc.

Page 20: Forecast Verification Research

New focus areas for JWGFVR research

"Seamless verification" - consistent across space/time scales

[Figure: spatial scale (point, local, regional, global) vs. forecast lead time (minutes, hours, days, weeks, months, years, decades), spanning nowcasts, very short range, NWP, sub-seasonal prediction, seasonal prediction, decadal prediction, and climate change]

Approaches:
• deterministic / categorical
• probabilistic
• distributional
• other?
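To make the first two approaches concrete, a small illustration with invented numbers (not from the presentation): the same rain/no-rain observations can be scored categorically (fraction correct) or probabilistically (Brier score).

```python
import numpy as np

# Hypothetical "rain occurred" observations (1/0) and matching forecasts
obs = np.array([1, 0, 1, 1, 0])
det_fcst = np.array([1, 0, 0, 1, 0])               # categorical yes/no forecasts
prob_fcst = np.array([0.9, 0.2, 0.4, 0.7, 0.1])    # probabilistic forecasts

accuracy = np.mean(det_fcst == obs)          # categorical: fraction correct
brier = np.mean((prob_fcst - obs) ** 2)      # probabilistic: Brier score

print(accuracy, round(brier, 3))             # -> 0.8 0.102
```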

Page 21: Forecast Verification Research

New focus areas for JWGFVR research

Spatial methods for verifying ensemble predictions
• Neighborhood, scale-separation, feature-based, deformation

[Figure: rain-object attributes compared in feature-based verification: rain area, average rain, maximum rain, rain volume]
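One widely used neighborhood approach is the Fractions Skill Score. The following is a minimal sketch (our own, not JWGFVR code) of how a neighborhood comparison of a single forecast member against observations might look, using SciPy's uniform_filter for the neighborhood averaging; the test fields are synthetic:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fss(fcst, obs, threshold, window):
    """Fractions Skill Score of a 2-D forecast field against a 2-D observed field.

    `threshold` converts both fields to binary rain/no-rain; `window` is the
    neighborhood width in grid points over which event fractions are compared.
    """
    f = uniform_filter((fcst >= threshold).astype(float), size=window)
    o = uniform_filter((obs >= threshold).astype(float), size=window)
    mse = np.mean((f - o) ** 2)
    mse_ref = np.mean(f ** 2) + np.mean(o ** 2)
    return 1.0 - mse / mse_ref if mse_ref > 0 else np.nan

# Illustrative use: synthetic rain field, "observations" shifted by 3 grid points,
# compared within 5x5-grid-point neighborhoods at a 1 mm/h threshold.
rng = np.random.default_rng(0)
fcst = rng.gamma(0.5, 2.0, size=(100, 100))
obs = np.roll(fcst, 3, axis=1)
print(round(fss(fcst, obs, threshold=1.0, window=5), 2))
```

Scanning `window` from one grid point up to the domain size shows the spatial scale at which the forecast becomes skillful.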

Page 22: Forecast Verification Research

New focus areas for JWGFVR research

Extreme events

Page 23: Forecast Verification Research

New focus areas for JWGFVR research

Warnings, including timing

[Figures: hit rate plotted against success ratio (1-FAR); time axis in hours 0-23]
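As a small illustration (invented counts, not from the slide) of how the two plotted quantities follow from a 2x2 warning contingency table:

```python
# Hypothetical warning verification counts (illustrative only)
hits, misses, false_alarms = 42, 18, 25

hit_rate = hits / (hits + misses)              # probability of detection
success_ratio = hits / (hits + false_alarms)   # 1 - false alarm ratio (FAR)

print(f"hit rate = {hit_rate:.2f}, success ratio = {success_ratio:.2f}")
```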

Page 24: Forecast Verification Research

Thank you