Benchmarking Experiments for Criticality Safety and Reactor
Physics Applications – II – Tutorial
John D. Bess and J. Blair Briggs – INL
Ian Hill (IDAT) – OECD/NEA
This paper was prepared at Idaho National Laboratory for the U.S. Department of Energy under Contract Number (DE-AC07-05ID14517)
2012 ANS Annual Meeting, Chicago, Illinois, June 24-28, 2012
2
Purpose of this Tutorial
•Discuss the benchmark process for ICSBEP and IRPhEP
•Provide brief demonstration of DICE and IDAT
3
Acknowledgements
• ICSBEP and IRPhEP are collaborative efforts that involve numerous scientists, engineers, administrative support personnel and program sponsors from 24 different countries and the OECD/NEA.
• The authors would like to acknowledge the efforts of all of those dedicated individuals without whom those two projects would not be possible.
4
Outline
I. Introduction to Benchmarking
   a. Overview
   b. ICSBEP/IRPhEP
II. Benchmark Experiment Availability
   a. DICE Demonstration
   b. IDAT Demonstration
III. Dissection of a Benchmark Report
   a. Experimental Data
   b. Experiment Evaluation
   c. Benchmark Model
   d. Sample Calculations
   e. Benchmark Measurements
IV. Benchmark Participation: NRAD
5
Outline
I. Introduction to Benchmarking
   a. Overview
   b. ICSBEP/IRPhEP
II. Benchmark Experiment Availability
   a. DICE Demonstration
   b. IDAT Demonstration
III. Dissection of a Benchmark Report
   a. Experimental Data
   b. Experiment Evaluation
   c. Benchmark Model
   d. Sample Calculations
   e. Benchmark Measurements
IV. Benchmark Participation
6
INTRODUCTION TO BENCHMARKING
7
What Is a Benchmark?
• Merriam-Webster:
  – "a point of reference from which measurements may be made"
  – "something that serves as a standard by which others are measured or judged"
  – "a standardized problem or test that serves as a basis for evaluation or comparison (as of computer system performance)"
The majority of data testing utilizes critical benchmark models defined in the ICSBEP Handbook. CENDL-3.1 was also extensively tested with ICSBEP Handbook data.
12
Monte Carlo Code Validation
• Use of select ICSBEP benchmarks to validate code performance
• Useful to validate other codes: SCALE, SERPENT, TRIPOLI, etc.
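Validation against ICSBEP benchmarks typically boils down to tabulating calculated-to-experimental (C/E) ratios of k-effective. The sketch below illustrates that bookkeeping in Python; the case names are real ICSBEP identifiers, but all k-effective values and uncertainties are illustrative placeholders, not actual benchmark data.

```python
# Sketch: tabulating C/E comparisons of calculated k-eff against ICSBEP
# benchmark values. All numbers are illustrative, not real benchmark data.

benchmarks = {
    # case id: (benchmark k-eff, benchmark 1-sigma uncertainty, calculated k-eff)
    "HEU-MET-FAST-001":   (1.0000, 0.0010, 0.9995),
    "PU-MET-FAST-002":    (1.0000, 0.0020, 1.0018),
    "LEU-COMP-THERM-008": (1.0007, 0.0016, 0.9998),
}

for case, (k_bench, sigma, k_calc) in benchmarks.items():
    ce = k_calc / k_bench                # C/E ratio
    dev_pcm = (k_calc - k_bench) * 1e5   # deviation in pcm (1 pcm = 1e-5)
    n_sigma = (k_calc - k_bench) / sigma # deviation in benchmark sigmas
    print(f"{case}: C/E = {ce:.5f}, {dev_pcm:+.0f} pcm, {n_sigma:+.1f} sigma")
```

A real validation suite would read calculated values from code output and benchmark values from the Handbook; the arithmetic per case is exactly this.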
13
Prior to the ICSBEP and IRPhEP
[Figure: individuals labeled LARRY, JOHN, and STEVE]
14
Physics Criteria for Benchmarks
The Physics Criteria for Benchmarks Working Group was initiated in the 1980s as part of the U.S. DOE-sponsored Nuclear Criticality Technology and Safety Project (NCT&SP):
(1) Method used to determine k-effective.
(2) Consistency among experimentally measured parameters. (e.g., fundamental mode multiplication should be determined by more than one method in order to ensure consistency.)
(3) A rigorous and detailed description of the experimental mockup, its mechanical supports, and its surroundings is necessary. (Accompanying photographs and drawings are essential.)
(4) A complete specification of the geometry dimensions and material compositions, including the methods of determination and the known sources of error and their potential propagation, is necessary. Also, for completeness, unknown but suspected sources of error should be listed.
(5) A series of experiments is desirable in order to demonstrate the reproducibility of the results.
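Criterion (5) asks for a series of experiments to demonstrate reproducibility. One common way to combine such a series is an uncertainty-weighted mean; the sketch below uses made-up k-effective values and uncertainties purely to illustrate the arithmetic.

```python
# Sketch: uncertainty-weighted mean of k-eff from a series of nominally
# identical experiments (criterion 5, reproducibility). Values are made up.

measurements = [  # (k-eff, one-sigma uncertainty)
    (1.0002, 0.0010),
    (0.9998, 0.0012),
    (1.0005, 0.0011),
]

weights = [1.0 / s ** 2 for _, s in measurements]             # inverse-variance weights
k_mean = sum(w * k for (k, _), w in zip(measurements, weights)) / sum(weights)
sigma_mean = (1.0 / sum(weights)) ** 0.5                       # uncertainty of the mean
print(f"weighted mean k-eff = {k_mean:.5f} +/- {sigma_mean:.5f}")
```

If the individual results scatter about this mean by much more than their quoted uncertainties, reproducibility is in question and the evaluation should say so.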
15
Physics Criteria for Benchmarks (continued)
Physics Criteria for Benchmarks Working Group (continued)
(6) A description of the experiment and results should appear in a refereed publication.
(7) These criteria were established primarily to provide guidelines for future experiments.
(8) Many of the earlier experiments do not satisfy all of these criteria.
(9) Failure to meet these criteria does not automatically disqualify an experiment from being considered as acceptable for use as a benchmark.
(10) An attempt is being made here to supplement the originally published data, through the evaluation process, to meet these criteria.
16
ANSI/ANS-19.5-1995 (Currently being Revised)
• Requirements for Reference Reactor Physics Measurements
  – Criteria for Evaluation of Data (to qualify as reference data):
    • Measurements performed with accepted and proven techniques, or techniques shall be adequately demonstrated to be valid
    • Reported uncertainties should be consistent with evidence
    • Compare with similar independent configurations or with a wide range of data sets using accepted calculational methods
    • Check for consistency
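The "reported uncertainties should be consistent with evidence" criterion can be checked quantitatively: the chi-square per degree of freedom of repeated measurements about their mean should be near one if the quoted sigmas match the observed scatter. A minimal sketch, with illustrative values and an assumed common quoted uncertainty:

```python
# Sketch: checking that quoted uncertainties are consistent with the observed
# scatter of repeated measurements. Values below are illustrative only.

values = [1.0002, 0.9998, 1.0005, 0.9999]  # repeated k-eff results (made up)
quoted_sigma = 0.0003                       # assumed common quoted 1-sigma

mean = sum(values) / len(values)
# Chi-square per degree of freedom: ~1 means the quoted sigma matches the
# scatter; >>1 suggests the quoted uncertainty is too optimistic.
chi2_dof = sum(((v - mean) / quoted_sigma) ** 2 for v in values) / (len(values) - 1)
print(f"chi-square per dof = {chi2_dof:.2f}")
```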
17
Dilbert’s Incorrect View of Benchmarking
18
Purpose of the ICSBEP
• Compile benchmark-experiment data into a standardized format that allows analysts to easily use the data to validate calculational techniques and cross section data.
• Evaluate the data and quantify overall uncertainties through various types of sensitivity analyses
• Eliminate a large portion of the tedious and redundant research and processing of experiment data
• Streamline the necessary step of validating computer codes and nuclear data with experimental data
• Preserve valuable experimental data that will be of use for decades
19
History of ICSBEP
• Initiated in October of 1992 by the Department of Energy’s Defense Programs
• Organized and managed through the Idaho National Laboratory (INL)
• Involves nationally known criticality safety experts
20
ICSBEP U.S. Participants
[Map: LLNL, LANL, RFP, SNL, Hanford, ANL, BAPL, SRNL, ORNL, Oak Ridge Y-12, INL]
21
History of ICSBEP (continued)
• An International Criticality Safety Data Exchange component was added to the project in 1994
• The ICSBEP became an official activity of the OECD/NEA’s Nuclear Science Committee (NSC) in 1995
• Similar to the ICSBEP
• Focus to collect data regarding the numerous experiments in support of nuclear energy and technology performed at research laboratories
• Experiments represent significant investments of time, infrastructure, expertise, and cost that might not have received adequate documentation
• Measurements also include data regarding reactivity measurements, reaction rates, buckling, burnup, etc., that are of significant worth for current and future research and development efforts
24
History of IRPhEP
• Initiated as a pilot activity in 1999 by the OECD NEA Nuclear Science Committee – INL involvement from the beginning
• Endorsed as an official activity of the NSC in June of 2003
• Patterned after its predecessor, the ICSBEP
• Focuses on other integral measurements such as buckling, spectral characteristics, reactivity effects, reactivity coefficients, kinetics measurements, reaction-rate and power distributions, nuclide compositions and other miscellaneous types of measurements in addition to the critical configuration
• Involves internationally known reactor physics experts
• Technical Review Group managed through the Idaho National Laboratory (INL) for the OECD/NEA
As a result of ICSBEP and IRPhEP efforts:
• A large portion of the tedious and redundant research and processing of critical experiment data has been eliminated
• The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined
• Valuable criticality safety experimental data are preserved and will be of use for decades
31
Accomplishments of the ICSBEP and IRPhEP (continued)
The work of the ICSBEP and IRPhEP has:
• Highlighted gaps in data
• Retrieved "lost" data
• Helped to identify deficiencies and errors in cross section processing codes and neutronics codes
• Improved experimental planning, execution and reporting
32
Accomplishments of the ICSBEP and IRPhEP (continued)
Over 400 scientists from 24 different countries have combined their efforts to produce the ICSBEP and IRPhEP Handbooks.
These two handbooks continue to grow and provide high-quality integral benchmark data that will be of use to the criticality safety, nuclear data, and reactor physics communities for future decades.
33
The ICSBEP is Featured in the September & October 2003 Issues of NS&E
34
ICSBEP Evaluation Content & Format
All ICSBEP evaluations follow the same general format:
1. Describe the Experiments
2. Evaluate the Experiments
3. Derive Benchmark Specifications
4. Provide Results from Sample Calculations
A. Typical Input Listings
B. Supporting Information
35
IRPhEP Evaluation Format
1.0 DETAILED DESCRIPTION
1.1 Description of the Critical or Subcritical Configuration
  1.1.1 Overview of Experiment
  1.1.2 Description of Experimental Configuration
  1.1.3 Description of Material Data
  1.1.4 Temperature Data
  1.1.5 Additional Information Relevant to Critical and Subcritical Measurements
36
IRPhEP Evaluation Format (continued)
1.2 Description of Buckling and Extrapolation Length Measurements
1.3 Description of Spectral Characteristics Measurements
1.4 Description of Reactivity Effects Measurements
1.5 Description of Reactivity Coefficient Measurements
1.6 Description of Kinetics Measurements
1.7 Description of Reaction Rate Distribution Measurements
1.8 Description of Power Distribution Measurements
1.9 Description of Isotopic Measurements
1.10 Description of Other Miscellaneous Types of Measurements
Etc.
37
Benchmark Process General Overview
1. Identify Experiment
2. Evaluate Experiment
   a. Prepare Benchmark Report
3. Internal Review of Benchmark Report
4. Submit Benchmark Experiment to ICSBEP/IRPhEP
5. Independent Review of Benchmark Report
6. Distribution of Benchmark Report to Technical Review Group