Group E1: Data Quality Control and Quality Assurance
Junhong (June) Wang, Scot Loehrer
To introduce the following areas, determine their priorities and make recommendations
• Where we are with regard to the service being discussed
• What trends we have observed
• What challenges we see in the future
Facilities to cover:
1. Sounding system: Kate Young/June Wang
2. ISS: Bill Brown
3. ISFS: Steve Oncley
4. S-Pol: Bob Rilling
5. ELDORA: Wen-Chau/Michael
6. REAL: Bruce Morley
7. CSU CHILL: Pat Kennedy
8. WCR: Samuel Haimov
9. Airborne: Al Schanot
10. Composite data: Scot Loehrer
Specific possible areas to cover for Data Quality Control & Quality Assurance (Priority, Priority, Priority)
1. Data delivery (timeliness and quality)
• Trends: toward real-time data delivery
• Challenges:
• communication between users and providers
• different delivery times for multiple facilities
• going too far with quick-look data
• Solutions:
• better coordination within EOL across multiple facilities
• asking PIs to prioritize data requests
2. Automated and in-field QC/QA for real-time data QC/QA and delivery
• Trends: more requests for real-time data QC/QA and delivery
• Challenges:
• different communities have different needs (DA / quick look)
• requirement for combining data from different sensors
• Solutions:
• collaboration and communication among communities
• hardware engineers in the field
• automated QC/QA based on multiple years of experience
3. Value-added data and "statistical views of data"
• Trend: more requests
• Challenges:
• Where to set the threshold?
• Defining user requirements
• Different ways to calculate certain parameters
• Solutions:
• VAD is a good check on original data quality
• With new techniques, there are probabilistic evaluations of data; leave the decision to PIs
• For the long term, it is better not to remove the "bad" data, since that might mean removing good data; important not to remove marginal data
• Provide a list of commonly used algorithms
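The "flag, don't delete" advice above can be made concrete with a small sketch. The thresholds, margin, and field below are illustrative placeholders, not EOL's actual QC limits:

```python
# Sketch: assign QC flags to suspect values instead of deleting them,
# so PIs can make the final call. Thresholds here are hypothetical.

GOOD, MARGINAL, BAD = 0, 1, 2

def flag_temperature(values, lo=-90.0, hi=60.0, margin=5.0):
    """Return a QC flag per value; no value is ever removed."""
    flags = []
    for v in values:
        if lo <= v <= hi:
            flags.append(GOOD)
        elif lo - margin <= v <= hi + margin:
            flags.append(MARGINAL)   # kept, but marked for PI review
        else:
            flags.append(BAD)        # also kept; downstream users filter
    return flags

print(flag_temperature([20.0, 62.0, 150.0]))  # [0, 1, 2]
```

Because every value keeps its flag, a later reprocessing pass can still recover marginal data that a hard delete would have lost.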
4. Composites and operational data sources: common QC/QA
• Trends: more needs
• Challenges:
• Access to consistent, centralized, detailed metadata from all networks
• Adequately obtaining the operational data
• Different versions of QCed operational data not produced by
• Solutions:
• a good reference on metadata definition (Fed., …, USGS)
• Development of a metadata database for networks (e.g. Fac. Assessment)
5. Formal characterization of measurement uncertainties
• Trends: the community needs such information; otherwise they have to guess
• Challenges:
• More intensive work needs to be done on this
• Inaccuracy of manufacturers' accuracy information from their spec sheets
• Easy for surface sensors, but hard for airborne sensors
• Solutions:
• Awareness of the importance of this activity
• Collaboration with manufacturers
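One common way to characterize a combined measurement uncertainty, once the individual components are known, is a root-sum-square of independent 1-sigma terms (GUM-style combination). This is a generic illustration, not the group's specific procedure, and the numbers are invented:

```python
import math

def combined_standard_uncertainty(components):
    """Root-sum-square of independent 1-sigma uncertainty components.
    Assumes the error sources are uncorrelated (GUM-style combination)."""
    return math.sqrt(sum(u * u for u in components))

# Illustrative terms only: sensor spec, calibration, and exposure effects.
u_total = combined_standard_uncertainty([0.3, 0.4, 0.0])
print(round(u_total, 2))  # 0.5
```

If components are correlated (as manufacturer spec-sheet numbers often are with calibration terms), the simple root-sum-square understates the true uncertainty, which is one reason formal characterization takes real work.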
6. Other QC/QA approaches:
• Integration and inter-comparison of the same parameters from different instruments
• Too much data QC/QA vs. your specific needs
• Successful communication with users on what has and has not been done
• Documentation of data QC/QA procedures for different versions, especially old versions
• Interaction between data QC/QA staff and users
• Dataset tracking of different QC/QA versions
• Education: instrument accuracy, collection procedures, …
Spectral Analysis: response time, flux calculations
ISS: Integrated Sounding System
MAPR at ISPA
MISS at T-REX
Wind Profiler QC:
• NIMA: NCAR Improved Moment Algorithm
• fuzzy-logic image processing
• removes bad data, extends range, improves accuracy
[Figure: profiler data before NIMA and with NIMA QC]
• MAPR: Multiple Antenna Profiler Radar
• Developing fuzzy logic
• Some success in cleaning data
• Bird removal tricky
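To make the fuzzy-logic idea concrete: a minimal sketch of combining membership functions into a per-gate quality score. This is a generic illustration, not NIMA's actual algorithm; the inputs, breakpoints, and weights are all hypothetical:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal fuzzy membership: 0 below a, ramps up to 1 on [b, c],
    ramps back to 0 at d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def gate_is_good(snr_db, vel_continuity, threshold=0.5):
    """Combine memberships with a weighted mean, then threshold the score.
    snr_db and vel_continuity stand in for real NIMA feature fields."""
    m_snr = trapezoid(snr_db, -5.0, 5.0, 40.0, 60.0)
    m_cont = trapezoid(vel_continuity, 0.0, 0.6, 1.0, 1.01)
    score = 0.6 * m_snr + 0.4 * m_cont
    return score >= threshold

print(gate_is_good(20.0, 0.9))   # True
print(gate_is_good(-10.0, 0.2))  # False
```

The appeal of the fuzzy approach is that each feature contributes a graded vote rather than a hard pass/fail, so marginal gates are scored rather than discarded outright.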
S-Polka Data Quality
• Radar power (reflectivity) calibration
– S-band horizontal and vertical polarizations
– Ka-band horizontal and vertical polarizations
– Engineering measurements
– Solar measurements
– Self-consistency of dual-polarimetric measurements
• ZDR calibration
– Vertical pointing in light rain
– Cross-polar power analysis
• S-band pointing and ranging
– Solar
– Towers
• S- and Ka-band beam and range gate alignment
• System stability monitoring
• Redundant RDAs and data recording
– Instantaneous backup
S-Polka Data Quality
• Newly installed Automatic Test Equipment
– Streamlines setup
– Daily updates of calibration measurements
– Goal: real-time "final data set"
• Real-time ground clutter mitigation (CMD)
– Identifies clutter in the processor
– Applies filter to clutter before final moment computation
– Avoids filter bias in pure weather echoes
• Hydrometeor ID
• Mitigation of range folding through phase-coded pulses
• Increased sensitivity
Utilizes an inverse-distance-weighting objective analysis method adapted from Cressman (1959) and Barnes (1964). The deviation between the measured value and the value expected from the objective analysis is subjected to dynamically determined limits (sensitive to diurnal and intra-seasonal variations, and dependent on spatial and temporal continuity).
Parameters: SLP, Calc SLP, T, Td, WS, WD
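The objective-analysis step can be sketched with the classic Cressman weight function; this is a one-pass, single-point illustration, not the actual composite QC code, and the influence radius and observations are invented:

```python
import math

def cressman_weight(r, R):
    """Classic Cressman (1959) weight: (R^2 - r^2) / (R^2 + r^2) inside
    the influence radius R, zero outside."""
    if r >= R:
        return 0.0
    return (R * R - r * r) / (R * R + r * r)

def analyze(grid_x, grid_y, obs, R=200.0):
    """One-pass objective analysis of scattered obs (x, y, value) at a point.
    The deviation |measured - analyzed| could then be checked against
    dynamically determined QC limits, as described above."""
    num = den = 0.0
    for x, y, v in obs:
        r = math.hypot(x - grid_x, y - grid_y)
        w = cressman_weight(r, R)
        num += w * v
        den += w
    return num / den if den > 0 else float("nan")

# Hypothetical obs in km; the third lies outside the influence radius.
obs = [(0.0, 0.0, 10.0), (100.0, 0.0, 12.0), (500.0, 0.0, 99.0)]
print(round(analyze(0.0, 0.0, obs), 2))  # 10.75
```

Observations far from the analysis point get zero weight, so the outlier at 500 km never contaminates the analyzed value, which is exactly what makes the analysis a usable QC reference.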
ELDORA Airborne Doppler Data Processing Steps
1. * Translate the raw ELDORA field format data into DORADE sweep files and inspect for errors.
2. * Calculate navigation correction factors (cfac files) for each flight
3. Fine-tune navigation corrections for each leg of data
4. Edit the data to remove ground echo, noise, clutter, and radar side-lobes, and to unfold velocities.
5. Interpolate and synthesize the data to obtain the 3-dimensional wind field and derived quantities.
* Steps performed at NCAR by EOL staff
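Step 4's velocity unfolding can be illustrated with a minimal 1-D dealiasing sketch. This is a toy under strong assumptions (known Nyquist velocity, correct first gate, gate-to-gate continuity), not the editor actually used for ELDORA data:

```python
def unfold(velocities, v_nyquist):
    """Minimal 1-D velocity dealiasing: add or subtract 2*Vn whenever the
    gate-to-gate jump exceeds the Nyquist velocity. Assumes the first gate
    is already correct; real editors use far more spatial context."""
    out = [velocities[0]]
    for v in velocities[1:]:
        while v - out[-1] > v_nyquist:
            v -= 2.0 * v_nyquist
        while v - out[-1] < -v_nyquist:
            v += 2.0 * v_nyquist
        out.append(v)
    return out

# An aliased jump from +14 to -15 m/s with Vn = 16 m/s unfolds to +17 m/s.
print(unfold([14.0, -15.0, -14.0], 16.0))  # [14.0, 17.0, 18.0]
```

A single bad gate early in the ray can propagate wrong fold decisions down the entire profile, which is why interactive editing and inspection (steps 1 and 4) remain part of the ELDORA workflow.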
ELDORA Navigation Corrections
• Accurate knowledge of the aircraft orientation and radar beam pointing angle is essential to airborne Doppler analysis
(Lee et al. 1994; Testud et al. 1995; Georgis et al. 2000; Bosart et al. 2002)
ELDORA Editing & Synthesis
• EOL provides assistance and advice to users on editing and synthesis of data as an additional form of Quality Assurance
• For more information about ELDORA QC/QA and analysis, see Michael Bell or Wen-Chau Lee
REAL Data Highlights
• Data from CHATS, 15 March to 11 June 2007