www.inl.gov
Benchmarking Experiments for Criticality Safety and Reactor Physics Applications – II – Tutorial
John D. Bess and J. Blair Briggs – INL
Ian Hill (IDAT) – OECD/NEA
This paper was prepared at Idaho National Laboratory for the U.S. Department of
Energy under Contract Number (DE-AC07-05ID14517)
2012 ANS Annual Meeting
Chicago, Illinois
June 24-28, 2012
Outline
I. Introduction to Benchmarking
a. Overview
b. ICSBEP/IRPhEP
II. Benchmark Experiment Availability
a. DICE Demonstration
b. IDAT Demonstration
III. Dissection of a Benchmark Report
a. Experimental Data
b. Experiment Evaluation
c. Benchmark Model
d. Sample Calculations
e. Benchmark Measurements
IV. Benchmark Participation
BENCHMARK PARTICIPATION
Getting Started
• Criticality Safety Analyses
– Most of the work has already been performed
– Typically only uncertainty and bias calculations remain (see the sketch after this list)
• Reactor Operations
– Occur on a daily basis worldwide
– Require reorganization and evaluation of activities performed
• Student and Young Generation Involvement
– Provide education opportunities prior to incorporation into the workforce
– Expand upon training at facilities/workplace
• Especially when handbook data are already widely utilized
• Computational analysis is rapidly being incorporated in the nuclear community
• Just for Fun
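The uncertainty and bias calculations mentioned above generally compare calculated k-eff values against the benchmark-model k-eff values published in the handbook. The sketch below shows one minimal, unweighted way to tabulate a bias and its spread over a small suite of benchmarks; the case data are invented placeholders and the simple statistics are an illustrative assumption, not prescribed ICSBEP/IRPhEP methodology.

```python
# Minimal, illustrative bias bookkeeping for a criticality-safety validation suite.
# The benchmark values below are placeholders, not real handbook data.
import math

# (calculated k-eff, benchmark-model k-eff, benchmark 1-sigma uncertainty)
cases = [
    (0.9975, 1.0000, 0.0012),
    (1.0018, 1.0000, 0.0030),
    (0.9960, 0.9985, 0.0021),
]

# Bias of each case: calculated minus benchmark (negative => code underpredicts)
biases = [calc - bench for calc, bench, _sigma in cases]

n = len(biases)
mean_bias = sum(biases) / n

# Sample standard deviation of the biases about their mean
std_bias = math.sqrt(sum((b - mean_bias) ** 2 for b in biases) / (n - 1))

print(f"mean bias      = {mean_bias:+.4f} delta-k")
print(f"bias std. dev. = {std_bias:.4f} delta-k")
```

A production validation would typically weight each case by its benchmark uncertainty and apply a statistical tolerance method to set an upper subcritical limit; the unweighted version above only shows the bookkeeping.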
Getting Started (Continued)
• ICSBEP Handbook
– Uncertainty Guide
– Critical/Subcritical Guide
– Alarm/Shielding Guide
– Fundamental Physics Guide
– DICE User’s Manual
– Reviewer Checklist
• IRPhEP Handbook
– Evaluation Guide
– Uncertainty Guide
• Look at available benchmark data
• Talk with current and past participants in the benchmark programs
• What are you interested in benchmarking?
• What if you can’t do a complete benchmark?
Past and Present Student Involvement
• Since 1995, ~30 students have participated in the ICSBEP and/or IRPhEP
– 14 directly at INL
• Students have authored or coauthored over 60 benchmark evaluations
– 28 at INL, with 2+ more in progress
• They have also submitted technical papers to various conferences and journals
Opportunities for Involvement
• Each year the ICSBEP hosts one or two summer interns at the Idaho National Laboratory
• INL has also funded benchmark development via the CSNR Next Degree Program
• Students have participated in the projects as subcontractors through various universities and laboratories
• Benchmark development represents excellent work for collaborative senior design projects, Master of Engineering projects, or Master of Science thesis topics
• Further information can be obtained by contacting the Chairman of the ICSBEP
– J. Blair Briggs, [email protected]
• Research Reactors
– More TRIGAs
– AGN
– PULSTAR
– Argonaut
– Pool-type such as MSTR
– MNSR
– HFIR
– ACRR
– Fast Burst Reactors
– Kodak CFX
• Experiments at ORCEF
– Mihalczo HEU Sphere
– USS Sandwich
– Moderated Jemima
– Numerous HEU Cases
– SNAP Water Immersion
– “Libby” Johnson Cases
• And many, many more
– Similar experiments validate the benchmark process and further improve nuclear data
– See the OECD/NEA IRPhEP website for reports with data for benchmarking
General Benchmark Evaluation/Review Process
• ICSBEP and IRPhEP benchmarks are subject to extensive review
– Evaluator(s) – primary assessment of the benchmark
– Internal Reviewer(s) – in-house verification of the analysis and adherence to procedure
– Independent Reviewer(s) – external (often foreign) verification of the analysis via international experts
– Technical Review Workgroup Meeting – annual international effort to review all benchmarks prior to inclusion in the handbook
• Sometimes a subgroup is assigned to assess any final workgroup comments and revisions prior to publication
– Benchmarks are determined to be Acceptable or Unacceptable for use depending on uncertainty in results (a simple illustrative check follows this list)
• All Approved benchmarks are retained in the handbook
• Unacceptable data are published for documentation purposes, but benchmark specifications are not provided
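As a rough illustration of the acceptability judgment noted above, the sketch below combines independent uncertainty components in quadrature and compares the total against a ceiling. The component values and the 0.01 delta-k ceiling are assumptions for illustration only; actual acceptance decisions are made by the technical review group, not by a fixed formula.

```python
# Illustrative (not official) acceptability check based on total benchmark uncertainty.
import math

# Hypothetical 1-sigma uncertainty components in k-eff (delta-k)
components = {
    "fuel composition": 0.0020,
    "geometry tolerances": 0.0015,
    "model simplifications": 0.0010,
}

# Independent components combine in quadrature
total = math.sqrt(sum(u ** 2 for u in components.values()))

CEILING = 0.01  # assumed ceiling for this sketch only

verdict = "acceptable" if total <= CEILING else "unacceptable"
print(f"total 1-sigma uncertainty = {total:.4f} delta-k -> {verdict}")
```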
Quality Assurance
Each experiment evaluation included in the Handbook undergoes a thorough internal review by the evaluator's organization. Internal reviewers are expected to verify:
1. The accuracy of the descriptive information given in the evaluation by comparison with original documentation (published and unpublished).
2. That the benchmark specification can be derived from the descriptive information given in the evaluation.
3. The completeness of the benchmark specification.
4. The results and conclusions.
5. Adherence to format.
Quality Assurance (continued)
In addition, each evaluation undergoes an independent peer review by another Technical Review Group member at a different facility. Starting with the evaluator's submittal in the appropriate format, independent peer reviewers are expected to verify:
1. That the benchmark specification can be derived from the descriptive information given in the evaluation.
2. The completeness of the benchmark specification.
3. The results and conclusions.
4. Adherence to format.
Quality Assurance (continued)
A third review by the Technical Review Group verifies that the benchmark specifications and the conclusions were adequately supported.
Annual Technical Review Meeting
• Typically 30 international participants
• Usually held at OECD/NEA Headquarters in Paris, France
• The technical review group convenes to discuss each benchmark, page by page, prior to approval for entry into the handbook
• A list of action items is recorded; these must be resolved prior to publication
– Final approval is given by the Internal and Independent Reviewers and, if necessary, a Subgroup of additional reviewers
• Final benchmark report sent to INL for publication
What to Expect at Tech Review Group Meetings
Developing International Camaraderie
Annual Important Dates (Guidelines)
Activity                              ICSBEP                      IRPhEP
Benchmark Evaluation                  Year-round                  Year-round
Internal Review                       Year-round                  Year-round
Independent Review                    February – March            August – September
Technical Review Group Distribution   Late March – Early April    Late September – Early October
Technical Review Meeting              Early May                   Late October
Resolution of Action Items            May – July                  November – January
Submission of Final Report            Before Late July            Before Late January
Handbook Preparation                  June – September            November – March