Transcript
Page 1: Benchmark Tutorial -- I - Introduction

www.inl.gov

Benchmarking Experiments for Criticality Safety and Reactor

Physics Applications – II – Tutorial

John D. Bess and J. Blair Briggs – INL

Ian Hill (IDAT) – OECD/NEA

This paper was prepared at Idaho National Laboratory for the U.S. Department of Energy under Contract Number (DE-AC07-05ID14517)

2012 ANS Annual Meeting, Chicago, Illinois, June 24-28, 2012

Page 2: Benchmark Tutorial -- I - Introduction

2

Purpose of this Tutorial

•Discuss the benchmark process for ICSBEP and IRPhEP

•Provide brief demonstration of DICE and IDAT

Page 3: Benchmark Tutorial -- I - Introduction

3

Acknowledgements

• ICSBEP and IRPhEP are collaborative efforts that involve numerous scientists, engineers, administrative support personnel and program sponsors from 24 different countries and the OECD/NEA.

• The authors would like to acknowledge the efforts of all of those dedicated individuals without whom those two projects would not be possible.

Page 4: Benchmark Tutorial -- I - Introduction

4

Outline

I. Introduction to Benchmarking
   a. Overview
   b. ICSBEP/IRPhEP

II. Benchmark Experiment Availability
   a. DICE Demonstration
   b. IDAT Demonstration

III. Dissection of a Benchmark Report
   a. Experimental Data
   b. Experiment Evaluation
   c. Benchmark Model
   d. Sample Calculations
   e. Benchmark Measurements

IV. Benchmark Participation – NRAD

Page 5: Benchmark Tutorial -- I - Introduction

5

Outline

I. Introduction to Benchmarking
   a. Overview
   b. ICSBEP/IRPhEP

II. Benchmark Experiment Availability
   a. DICE Demonstration
   b. IDAT Demonstration

III. Dissection of a Benchmark Report
   a. Experimental Data
   b. Experiment Evaluation
   c. Benchmark Model
   d. Sample Calculations
   e. Benchmark Measurements

IV. Benchmark Participation

Page 6: Benchmark Tutorial -- I - Introduction

6

INTRODUCTION TO BENCHMARKING

Page 7: Benchmark Tutorial -- I - Introduction

7

What Is a Benchmark?

• Merriam-Webster:
  – "a point of reference from which measurements may be made"
  – "something that serves as a standard by which others are measured or judged"
  – "a standardized problem or test that serves as a basis for evaluation or comparison (as of computer system performance)"

© Gary Price

Page 8: Benchmark Tutorial -- I - Introduction

8

WHY ARE YOU INTERESTED IN BENCHMARKS?

?

Page 9: Benchmark Tutorial -- I - Introduction

9

How Does Benchmark Design Apply to You?

Page 10: Benchmark Tutorial -- I - Introduction

10

Why Do We Have Nuclear Benchmarks?

• Nuclear Safety
  – Plant Operations
    • Training
  – Transportation
  – Waste Disposal
  – Experimentation
  – Accident Analysis
  – Standards Development
  – Homeland Security

• Materials
  – Testing
  – Physics Validation
  – Interrogation

• Research and Development
  – New Reactor Designs
  – Design Validation

• Computational Methods
  – Cross-Section Data
  – Code Verification

• Fundamental Physics
  – Model Validation

• Fun

Page 11: Benchmark Tutorial -- I - Introduction

11

Cross Section Evaluation Working Group (CSEWG)

The majority of data testing utilizes critical benchmark models defined in the ICSBEP Handbook.

CENDL-3.1 was also extensively tested with ICSBEP Handbook data.

Page 12: Benchmark Tutorial -- I - Introduction

12

Monte Carlo Code Validation

Use of select ICSBEP benchmarks to validate code performance.

Also useful to validate other codes: SCALE, SERPENT, TRIPOLI, etc.
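In practice, such validation reduces to comparing calculated k-effective values against the benchmark (experimental) values, usually as C/E ratios judged against the benchmark uncertainty. A minimal sketch of that comparison; the benchmark identifiers follow the ICSBEP naming style, but all numbers here are invented for illustration:

```python
# Hypothetical results: benchmark k-eff, its 1-sigma uncertainty (delta-k),
# and the calculated k-eff for that configuration.
results = {
    "HEU-MET-FAST-001": (1.0000, 0.0010, 0.9992),
    "PU-SOL-THERM-011": (1.0000, 0.0033, 1.0041),
    "LEU-COMP-THERM-008": (1.0007, 0.0016, 0.9985),
}

for name, (k_exp, sigma, k_calc) in results.items():
    ce = k_calc / k_exp                # C/E ratio (calculated over experimental)
    dev = (k_calc - k_exp) / sigma     # deviation in benchmark sigmas
    flag = "ok" if abs(dev) <= 3 else "check"
    print(f"{name}: C/E = {ce:.5f} ({dev:+.1f} sigma) {flag}")
```

Real validation suites run hundreds of such benchmarks and examine trends in C/E against spectrum, fuel type, and enrichment rather than individual deviations alone.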

Page 13: Benchmark Tutorial -- I - Introduction

13

Prior to the ICSBEP and IRPhEP


Page 14: Benchmark Tutorial -- I - Introduction

14

Physics Criteria for Benchmarks

The Physics Criteria for Benchmarks Working Group was initiated in the 1980s as part of the U.S. DOE-sponsored Nuclear Criticality Technology and Safety Project (NCT&SP):

(1) Method used to determine k-effective.

(2) Consistency among experimentally measured parameters (e.g., fundamental mode multiplication should be determined by more than one method in order to ensure consistency).

(3) A rigorous and detailed description of the experimental mockup, its mechanical supports, and its surroundings is necessary. (Accompanying photographs and drawings are essential.)

(4) A complete specification of the geometry dimensions and material compositions, including the methods of determination and the known sources of error and their potential propagation, is necessary. Also, for completeness, unknown but suspected sources of error should be listed.

(5) A series of experiments is desirable in order to demonstrate the reproducibility of the results.
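Criterion (4) asks for known sources of error and their propagation. A minimal sketch of the usual approach, combining independent 1-sigma uncertainty components in quadrature into a total benchmark k-effective uncertainty; the component names and values below are hypothetical, whereas real evaluations derive each component from sensitivity analyses:

```python
import math

# Hypothetical, independent 1-sigma uncertainty components on k-effective
# (delta-k), e.g. from perturbing each parameter in a transport model.
components = {
    "fuel enrichment": 0.0012,
    "geometry dimensions": 0.0008,
    "impurity content": 0.0005,
    "temperature": 0.0003,
}

# Independent components combine in quadrature (root-sum-square).
total = math.sqrt(sum(u**2 for u in components.values()))
print(f"total benchmark uncertainty: +/- {total:.5f} delta-k")
```

Correlated components require a covariance treatment instead of simple quadrature, which is one reason the evaluation step is non-trivial.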

Page 15: Benchmark Tutorial -- I - Introduction

15

Physics Criteria for Benchmarks (continued)

Physics Criteria for Benchmarks Working Group (continued)

(6) A description of the experiment and results should appear in a refereed publication.

(7) These criteria were established primarily to provide guidelines for future experiments.

(8) Many of the earlier experiments do not satisfy all of these criteria.

(9) Failure to meet these criteria does not automatically disqualify an experiment from being considered as acceptable for use as a benchmark.

(10) An attempt is being made here to supplement the originally published data, through the evaluation process, to meet these criteria.

Page 16: Benchmark Tutorial -- I - Introduction

16

ANSI/ANS-19.5-1995 (Currently being Revised)

• Requirements for Reference Reactor Physics Measurements
  – Criteria for Evaluation of Data:
    • To qualify as reference data:
      – Measurements performed with accepted and proven techniques,
      – or techniques shall be adequately demonstrated to be valid
    • Reported uncertainties should be consistent with evidence
    • Compare with similar independent configurations or with a wide range of data sets using accepted calculational methods
    • Check for consistency

Page 17: Benchmark Tutorial -- I - Introduction

17

Dilbert’s Incorrect View of Benchmarking

Page 18: Benchmark Tutorial -- I - Introduction

18

Purpose of the ICSBEP

• Compile benchmark-experiment data into a standardized format that allows analysts to easily use the data to validate calculational techniques and cross section data.

• Evaluate the data and quantify overall uncertainties through various types of sensitivity analyses

• Eliminate a large portion of the tedious and redundant research and processing of experiment data

• Streamline the necessary step of validating computer codes and nuclear data with experimental data

• Preserve valuable experimental data that will be of use for decades

Page 19: Benchmark Tutorial -- I - Introduction

19

History of ICSBEP

• Initiated in October of 1992 by the Department of Energy’s Defense Programs

• Organized and managed through the Idaho National Laboratory (INL)

• Involves nationally known criticality safety experts

Page 20: Benchmark Tutorial -- I - Introduction

20

ICSBEP U.S. Participants

LLNL, LANL, RFP, SNL, Hanford, ANL, BAPL, SRNL, ORNL, Oak Ridge Y-12, INL

Page 21: Benchmark Tutorial -- I - Introduction

21

History of ICSBEP (continued)

• An International Criticality Safety Data Exchange component was added to the project in 1994

• The ICSBEP became an official activity of the OECD/NEA’s Nuclear Science Committee (NSC) in 1995

Page 22: Benchmark Tutorial -- I - Introduction

22

ICSBEP International Partners – 20 Countries

1992: United States
1994: France, Hungary, Japan, Russian Federation, United Kingdom
1996: Republic of Korea
1998: Slovenia
1999: Kazakhstan, Serbia
2000: Israel, Spain
2004: Brazil, Czech Republic
2005: Canada, India, Poland
2006: China
2007: Sweden
2008: Argentina

Page 23: Benchmark Tutorial -- I - Introduction

23

Purpose of the IRPhEP

• Similar to the ICSBEP

• Focus is to collect data regarding the numerous experiments performed at research laboratories in support of nuclear energy and technology

• Experiments represent significant investments of time, infrastructure, expertise, and cost that might not have received adequate documentation

• Measurements also include data regarding reactivity measurements, reaction rates, buckling, burnup, etc., that are of significant worth for current and future research and development efforts

Page 24: Benchmark Tutorial -- I - Introduction

24

History of IRPhEP

• Initiated as a pilot activity in 1999 by the OECD NEA Nuclear Science Committee – INL involvement from the beginning

• Endorsed as an official activity of the NSC in June of 2003

• Patterned after its predecessor, the ICSBEP

• Focuses on other integral measurements such as buckling, spectral characteristics, reactivity effects, reactivity coefficients, kinetics measurements, reaction-rate and power distributions, nuclide compositions and other miscellaneous types of measurements in addition to the critical configuration

• Involves internationally known reactor physics experts

• Technical Review Group managed through the Idaho National Laboratory (INL) for the OECD/NEA

Page 25: Benchmark Tutorial -- I - Introduction

25

IRPhEP International Partners – 16 Countries

Page 26: Benchmark Tutorial -- I - Introduction

26

Purpose of the ICSBEP and IRPhEP

Page 27: Benchmark Tutorial -- I - Introduction

27

What Gets Benchmarked?

• Criticality/Subcriticality
• Buckling/Extrapolation Length
• Spectral Characteristics
• Reactivity Effects
• Reactivity Coefficient Data
• Kinetics Measurements Data
• Reaction-Rate Distributions
• Power Distribution Data
• Isotopic Measurements
• Miscellaneous

If it is worth measuring, then it is worth evaluating.
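As an illustration of how one of these measured quantities feeds a benchmark model: in one-group diffusion theory the measured buckling ties k-effective to the infinite-medium multiplication factor through the migration area. A sketch under those textbook assumptions, with invented numbers:

```python
# One-group diffusion relation: k_eff = k_inf / (1 + M2 * B2),
# where M2 is the migration area (cm^2) and B2 the buckling (cm^-2).
# The values below are invented for illustration only.
k_inf = 1.30      # infinite-medium multiplication factor
M2 = 60.0         # migration area, cm^2
B2 = 0.0040       # measured buckling, cm^-2

k_eff = k_inf / (1.0 + M2 * B2)
print(f"k_eff = {k_eff:.4f}")

# The critical buckling is the B2 at which k_eff = 1:
B2_crit = (k_inf - 1.0) / M2
print(f"critical buckling = {B2_crit:.5f} cm^-2")
```

Comparing a measured buckling against the value a code predicts for the same lattice is one of the non-criticality checks these handbooks preserve.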

Page 28: Benchmark Tutorial -- I - Introduction

28

International Handbook of Evaluated Criticality Safety Benchmark Experiments

September 2011 Edition
• 20 Contributing Countries
• Spans over 62,600 Pages
• Evaluation of 532 Experimental Series
• 4,550 Critical, Subcritical, or K∞ Configurations
• 24 Criticality-Alarm/Shielding Benchmark Configurations – numerous dose points each
• 155 fission rate and transmission measurements and reaction rate ratios for 45 different materials

http://icsbep.inl.gov/
http://www.oecd-nea.org/science/wpncs/icsbep/

Sept 2012: 617 & 4,703

Page 29: Benchmark Tutorial -- I - Introduction

29

International Handbook of Evaluated Reactor Physics Benchmark Experiments

March 2012 Edition
• 16 Contributing Countries
• Data from 56 Experimental Series performed at 32 Reactor Facilities
• Data from 52 of the 56 series are published as approved benchmarks
• Data from 4 of the 56 series are published in DRAFT form

http://irphep.inl.gov
http://www.oecd-nea.org/science/wprs/irphe/

Page 30: Benchmark Tutorial -- I - Introduction

30

Accomplishments of the ICSBEP and IRPhEP

As a result of ICSBEP and IRPhEP efforts:
• A large portion of the tedious and redundant research and processing of critical experiment data has been eliminated
• The necessary step in criticality safety analyses of validating computer codes with benchmark critical data is greatly streamlined
• Valuable criticality safety experimental data are preserved and will be of use for decades

Page 31: Benchmark Tutorial -- I - Introduction

31

Accomplishments of the ICSBEP and IRPhEP(continued)

The work of the ICSBEP and IRPhEP has:
• Highlighted gaps in data
• Retrieved "lost" data
• Helped to identify deficiencies and errors in cross section processing codes and neutronics codes
• Improved experimental planning, execution and reporting

Page 32: Benchmark Tutorial -- I - Introduction

32

Accomplishments of the ICSBEP and IRPhEP(continued)

Over 400 scientists from 24 different countries have combined their efforts to produce the ICSBEP and IRPhEP Handbooks.

These two handbooks continue to grow and provide high-quality integral benchmark data that will be of use to the criticality safety, nuclear data, and reactor physics communities for future decades.

Page 33: Benchmark Tutorial -- I - Introduction

33

The ICSBEP is Featured in the September & October 2003 Issues of NS&E

Page 34: Benchmark Tutorial -- I - Introduction

34

ICSBEP Evaluation Content & Format

All ICSBEP evaluations follow the same general format:

1. Describe the Experiments
2. Evaluate the Experiments
3. Derive Benchmark Specifications
4. Provide Results from Sample Calculations
   A. Typical Input Listings
   B. Supporting Information

Page 35: Benchmark Tutorial -- I - Introduction

35

IRPhEP Evaluation Format

1.0 DETAILED DESCRIPTION

1.1 Description of the Critical or Subcritical Configuration
  1.1.1 Overview of Experiment
  1.1.2 Description of Experimental Configuration
  1.1.3 Description of Material Data
  1.1.4 Temperature Data
  1.1.5 Additional Information Relevant to Critical and Subcritical Measurements

Page 36: Benchmark Tutorial -- I - Introduction

36

IRPhEP Evaluation Format (continued)

1.2 Description of Buckling and Extrapolation Length Measurements
1.3 Description of Spectral Characteristics Measurements
1.4 Description of Reactivity Effects Measurements
1.5 Description of Reactivity Coefficient Measurements
1.6 Description of Kinetics Measurements
1.7 Description of Reaction Rate Distribution Measurements
1.8 Description of Power Distribution Measurements
1.9 Description of Isotopic Measurements
1.10 Description of Other Miscellaneous Types of Measurements

Etc.

Page 37: Benchmark Tutorial -- I - Introduction

37

Benchmark Process General Overview

1. Identify Experiment
2. Evaluate Experiment
   a. Prepare Benchmark Report
3. Internal Review of Benchmark Report
4. Submit Benchmark Experiment to ICSBEP/IRPhEP
5. Independent Review of Benchmark Report
6. Distribution of Benchmark Report to Technical Review Group
7. Technical Review Meeting
8. Resolve Action Items
9. Handbook Publication

Page 38: Benchmark Tutorial -- I - Introduction

38

Questions?