Bayesian Methods in Probability of Detection Estimation and Model-assisted Probability of Detection (MAPOD) Evaluation
John C. Aldrin*, Jeremy S. Knopp, Harold A. Sabbagh†
Nondestructive Evaluation Branch (AFRL/RXLP), Materials and Manufacturing Directorate, Air Force Research Laboratory, Wright-Patterson AFB, Ohio, USA
*Computational Tools, Gurnee, Illinois, USA
†Victor Technologies LLC, Bloomington, Indiana, USA
Review of Progress in Quantitative NDE, Denver, CO, USA, July 19, 2012
Outline
• Background
• Demonstrations:
– Multiparameter Regression Models
– Integration of Physics-based Models
– Hierarchical Models in POD (MAPOD) Evaluation
• Discussion:
– Software Tools
– Challenges / Future Work
What is Probability of Detection (POD) and Model Assisted POD (MAPOD)?
• Probability of detection (POD): the probability of detecting a certain discontinuity, expressed as a function of some size metric, given a defined inspection technique and target population.
• We define “MAPOD” as the collection of approaches that use models of inspections as some portion of the inputs that are processed to yield an estimate of POD.
[Figure: â-versus-a plot (EXAMPLE1âvsa.xls) of response (arbitrary units) against discontinuity size (arbitrary units), showing the data, noise in the absence of a discontinuity, the decision threshold, false calls, and the linear fit with intercept β0, slope β1, and scatter ε.]
ln â = β0 + β1 ln a + ε

POD(a) = 1 − Φ((µ − ln a)/σ), where µ = (y_th − β0)/β1 and σ = σ_ε/β1

Φ: cumulative normal distribution function
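As a worked check on the POD formulas on this slide, the sketch below evaluates POD(a) = 1 − Φ((µ − ln a)/σ) directly from the regression parameters β0, β1, the decision threshold y_th, and the residual scatter. All numeric values are illustrative placeholders, not values from the study.

```python
import math

def pod(a, beta0, beta1, y_th, sigma_eps):
    """POD(a) = 1 - Phi((mu - ln a)/sigma), with
    mu = (y_th - beta0)/beta1 and sigma = sigma_eps/beta1."""
    mu = (y_th - beta0) / beta1
    sigma = sigma_eps / beta1
    z = (mu - math.log(a)) / sigma
    # Phi via the error function: Phi(z) = 0.5*(1 + erf(z/sqrt(2)))
    return 1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# With beta0 = -1, beta1 = 2, y_th = 0: mu = 0.5, so POD = 0.5 at a = exp(0.5)
print(pod(math.exp(0.5), -1.0, 2.0, 0.0, 0.5))  # -> 0.5
```

Note that POD(a) is monotonically increasing in a, and a50 (the size detected half the time) is exp(µ).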
Mitigating Cost of POD Study through Improved Model Accuracy
Evaluating Reliability Using Simulated and Empirical Data:
• To mitigate the cost of a validation study, one must better assess the critical sources of error and variation in reliability performance
• Hoppe [2009] presented a historical case highlighting the benefit of improving the measurement model by including both crack length and depth in the fit
• Physics-based models provide an opportunity to reduce experimental samples and cost

Objective: Explore Case Studies to Assess Impact of Measurement Model Quality on POD Estimation and Sample Number
Increasing model accuracy reduces residuals in the model fit, improves bounds on parameter estimates (POD), and impacts experimental sampling requirements:

â = β0 + β1 a1 + ε
â = β0 + β1 a1 + β2 a2 + ε
â = β0 + β1 f(a1, a2) + ε
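The benefit of moving from a single-parameter to a multiparameter measurement model can be illustrated with a small least-squares sketch: a synthetic response driven by both crack length and depth is fit with and without the depth term, and the residual scatter shrinks when depth is included. The data-generating coefficients here are arbitrary assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
a1 = rng.uniform(0.02, 0.20, n)   # crack length (arbitrary units)
a2 = rng.uniform(0.01, 0.10, n)   # crack depth (hypothetical second factor)
# Synthetic response depending on both factors, plus measurement noise
ahat = 0.1 + 1.0 * a1 + 0.8 * a2 + rng.normal(0.0, 0.01, n)

# Model 1: ahat = b0 + b1*a1 + eps (length only)
X1 = np.column_stack([np.ones(n), a1])
r1 = ahat - X1 @ np.linalg.lstsq(X1, ahat, rcond=None)[0]

# Model 2: ahat = b0 + b1*a1 + b2*a2 + eps (length and depth)
X2 = np.column_stack([np.ones(n), a1, a2])
r2 = ahat - X2 @ np.linalg.lstsq(X2, ahat, rcond=None)[0]

print(r1.std(), r2.std())  # adding depth shrinks the residual scatter
```

Smaller residual scatter propagates directly into tighter POD confidence bounds, which is the mechanism by which better measurement models reduce required sample counts.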
Model-Assisted POD Model Building Process [MIL-HDBK-1823A, Appendix H (2009)]
• Uncertainty Propagation: model error; input parameter variability (distributions); stochastic models
• Model 'Calibration': revise model estimates using Bayesian methods
• Confidence Bounds (Limited Samples)
Objective: Leverage Bayesian Methods in MAPOD Evaluation
Assess Key Factors (Joint PDFs) using Bayesian Methods
Approach: Integrate Modeling and Simulations with Empirical Studies
• Bayesian methods are necessary to incorporate empirical data with NDE models (prior information)
• Application of Bayes' rule: p(θ|x) = p(x|θ) p(θ) / p(x)
  – p(θ): prior probability of θ
  – p(x|θ): conditional probability (likelihood) of new evidence (data) x, given θ
  – p(θ|x): posterior probability of θ given new evidence x
• The posterior distribution can be evaluated, providing a refinement of the original prior distribution, through numerical methods such as Markov chain Monte Carlo (MCMC) simulation
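A minimal sketch of the refinement step: random-walk Metropolis sampling of a posterior that combines a normal prior with a normal likelihood. This is a generic stand-in for the WinBUGS/DRAM samplers discussed later; the data values, noise level, and prior are all assumptions chosen so the exact posterior is known for checking.

```python
import math
import random

random.seed(1)
# Synthetic evidence x: repeated noisy measurements of an unknown theta
data = [1.9, 2.1, 2.0, 2.2, 1.8]
sigma = 0.2                       # known measurement noise (assumed)
prior_mu, prior_sd = 0.0, 1.0     # p(theta): weakly informative prior

def log_post(theta):
    # log p(theta | x) = log p(x | theta) + log p(theta) + const
    ll = sum(-0.5 * ((x - theta) / sigma) ** 2 for x in data)
    lp = -0.5 * ((theta - prior_mu) / prior_sd) ** 2
    return ll + lp

# Random-walk Metropolis
theta, samples = 0.0, []
for i in range(20000):
    prop = theta + random.gauss(0.0, 0.3)
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop
    if i >= 5000:                 # discard burn-in
        samples.append(theta)

post_mean = sum(samples) / len(samples)
print(post_mean)  # near the data mean (2.0), pulled slightly toward the prior
```

The posterior mean lands between the prior mean and the sample mean, weighted by their precisions, which is exactly the "refinement of the prior" the slide describes.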
Bayesian Methods in MAPOD/POD Evaluation - Prior Work
General References:
1. Gelman, A., Carlin, J.B., Stern, H.S., Rubin, D.B., Bayesian Data Analysis, 2003.
2. Lunn, D., Spiegelhalter, D., Thomas, A., Best, N., "The BUGS Project: Evolution, Critique and Future Directions," Statistics in Medicine, Vol. 28, 2009, pp. 3049-3067.
3. Christensen, R., Johnson, W., Branscum, A., Bayesian Ideas and Data Analysis: An Introduction for Scientists and Statisticians, CRC Press, 2010.

NDE References:
1. Meeker, W.Q. and Escobar, L.A., "Introduction to the Use of Bayesian Methods for Reliability Data," Statistical Methods for Reliability Data, Wiley, 1998, pp. 343-368.
2. Leemans, D.V. and Forsyth, D., "Bayesian Approaches to Using Field Test Data in Determining the Probability of Detection," Materials Evaluation, 2004. - early demonstration of Bayesian methods in hit/miss POD evaluation
4. Wang, Y., "Advanced Statistical Methods for Analysis of NDE Data," Dissertation, 2006. (Advisor: W.Q. Meeker.)
5. Thompson, R.B., "A Bayesian Approach to the Inversion of NDE and SHM Data," Rev. Prog. Quant. Nondestr. Eval., Vol. 29, 2010, pp. 679-686.
Bayesian Methods in MAPOD/POD Evaluation - Recent Work
NDE References (cont.):
6. Li, Meeker, and Hovey, "Joint Estimation of NDE Inspection Capability and Flaw-size Distribution for In-service Aircraft Inspections," RNDE, 2012. - evaluates a noise-interference POD model jointly with the crack-size distribution
7. Kanzler, D., Muller, C., Pitkanen, J., Ewert, U., "Bayesian Approach for the Evaluation of the Reliability of Non-Destructive Testing Methods," WCNDT 2012.

Related Recent Work for AFRL:
1. Statistical analysis of hit/miss data using Bayes factors (model selection) [Knopp and Zeng, 2012, submitted for publication]
2. Application of Gaussian process models for quantifying the accuracy and capability of nondestructive sensing methods for damage characterization - Victor Technologies Phase I SBIR [Aldrin et al., 2012; ASNT Fall Conference]
Objectives of Presentation:
• Present Bayesian methods for POD evaluation with NDE measurement models of increasing complexity:
  – Multivariate models
  – Physics-based models
• Explore Bayesian methods for stochastic model parameter estimation
Demo 1: Eddy Current Inspection of Surface-breaking Cracks in Ti-6Al-4V
Software Tools:
• WinBUGS / OpenBUGS
  – Comprehensive tools for MCMC simulation; the 'model' is defined in a separate file
  – Guide to running WinBUGS and OpenBUGS from R: http://www.stat.columbia.edu/~gelman/bugsR/
  – Interface with R / Matlab for Bayesian POD evaluation; R: function (x1, y1, a.hat.decision, model.file, winbugs.path)
  – Very difficult to embed numerical model results in the 'model'
• Matlab - DRAM (Delayed Rejection Adaptive Metropolis) [http://www.helsinki.fi/~mjlaine/dram/]
  – Provides a means to embed Matlab function calls in MCMC, facilitating integration of physics-based (surrogate) models
  – Not as general and robust as OpenBUGS
• Matlab - Statistics Toolbox
• PyMC: Python toolkit for Markov chain Monte Carlo sampling
Demo 2: EC Inspection of Fastener Sites for Fatigue Cracks
• C-5 Wing Splice Fatigue Crack Specimens:
  – Two-layer specimens, 14" long and 2" wide
  – 0.156" top layer, 0.100" bottom layer
  – 90% of fasteners were titanium, 10% were steel
  – Fatigue cracks positioned at the 6 and 12 o'clock positions
  – Crack length ranged from 0.027" to 0.169" (2nd layer)
  – Crack location varied: at both 1st and 2nd layers
• Data acquired by AFRL/UDRI (Hughes, Dukate, Martin)
• Example: Eddy Current Inspection of Cracks at Fastener Sites
• Case Study for Physics-based Model Evaluation:
  â = β0 + β1 f(a1, a2) + ε, where f() is a function call to a physics-based model (i.e., VIC-3D)
• Bayesian POD Analysis Performed in Matlab + R:
  – MCMC library in Matlab used for the Bayesian analysis
  – Matlab provides the option to integrate the model function in the Bayesian fit
• Compare the â-vs-a fit (MLE, Wald bounds) and the physics-based model fit
Bayesian Methods for POD / MAPOD Evaluation (2)
â = β0 + β1 f(a1, a2) + ε

[Figure: crack geometry (length a, depth b, liftoff z) and POD versus a1 (in., 0 to 0.18), comparing the â-vs-a fit (MLE, Wald bounds) with the physics-based model fit (Bayes/MCMC). The result is a more accurate representation of the data.]
• Example: Eddy Current Inspection of Cracks at Fastener Sites
• Case Study for Physics-based Model Evaluation:
  â = β0 + β1 f(a; β2, β3) + ε, where f() is a function call to a physics-based model
  – β0, β1 = model calibration parameters
  – β2 = random variable associated with crack aspect ratio (b/a)
  – β3 = random variable associated with liftoff variation
• Results: Fit the POD model and estimate the variation in aspect ratio [using non-informative priors]
• Issues with the 'naive' approach:
  – Need a true estimate of the variance of the crack aspect ratio random variable → use hierarchical models
  – Must address correlated / confounded parameters in the estimation problem → use informative priors
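The calibration structure â = β0 + β1 f(a; β2) + ε can be sketched with a toy surrogate in place of the physics model (VIC-3D is not available here, so `f` below is a hypothetical stand-in, and all coefficients are invented for illustration). With the aspect-ratio parameter β2 held fixed, calibrating β0, β1 reduces to linear least squares:

```python
import numpy as np

def f(a, beta2):
    """Hypothetical surrogate standing in for a physics-model call:
    response of a crack of length a with aspect ratio b/a = beta2."""
    return a * (0.5 + beta2)   # assumed functional form, illustration only

rng = np.random.default_rng(2)
a = rng.uniform(0.03, 0.17, 50)            # crack lengths (in.)
true_b0, true_b1, true_b2 = 0.01, 1.2, 0.75
ahat = true_b0 + true_b1 * f(a, true_b2) + rng.normal(0.0, 0.005, 50)

# With beta2 fixed (e.g., by an informative prior), calibrating beta0 and
# beta1 is an ordinary linear least-squares problem:
X = np.column_stack([np.ones_like(a), f(a, true_b2)])
b0, b1 = np.linalg.lstsq(X, ahat, rcond=None)[0]
print(b0, b1)   # recovers values near (0.01, 1.2)
```

If β2 were also left free, β1 and β2 would trade off against each other (both scale the surrogate output), which is the confounding the slide flags as the motivation for informative priors.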
Hierarchical Models for Estimating Variance of a Random Variable
• Example: Eddy Current Inspection of Cracks at Fastener Sites
• Challenge: Address non-constant variance with respect to flaw size
• Case Study for Physics-based Model Evaluation:
  â = β0 + β1 f(a; β2) + ε, with ε ~ N(0, σ_ε²), where f() is a function call to a physics-based model
  – β0, β1 = model calibration parameters
  – β2 = random variable associated with crack aspect ratio (b/a)
• Simulated examples versus experimental results, with β0 = 0.0, β1 = 1.0, µ_β2 = 0.75, σ_β2 = 0.12, σ_ε = 0.0

[Figure: crack geometry (length a, depth b, liftoff z) and â versus a1 (in., 0 to 0.2) for cracks in the 1st and 2nd layers.]
Hierarchical Models for Estimating Variance of a Random Variable
• Example: Eddy Current Inspection of Cracks at Fastener Sites
• Challenge: Address non-constant variance with respect to flaw size
• Hierarchical NDE Measurement Models:
  physics-based model: â = β0 + β1 f(a; β2) + ε_â, with β2 ~ N(µ_β2, σ_β2²) and ε_â ~ N(0, σ_â²)
  statistical model: â = β0 + (β1 + η) a + ε_â, with η = ε_η, ε_η ~ N(0, σ_η²), and ε_â ~ N(0, σ_â²)
  – f() is a function call to a physics-based model
  – β0, β1 = model calibration parameters
  – η = random variable (varying-slope model); σ_η² = variance in the slope parameter
  – β2 = random variable associated with crack aspect ratio (b/a)
• Simple Test Case: Fit data from a model with varying slope >> noise.

References:
• Gelman, A., Carlin, J.B., Stern, H.S., Rubin, D.B., Bayesian Data Analysis, 2003.
• Gelman, A. and Hill, J., Data Analysis Using Regression and Multilevel/Hierarchical Models, 2007.
• Simple Test Case: Fit data from a model with varying slope >> noise.
• Hierarchical NDE Measurement Models:
  â = β0 + (β1 + η) a + ε_â, with ε_â ~ N(0, σ_â²), η = ε_η, ε_η ~ N(0, σ_η²)
• Results (Ns = 100):

  Ns = 100   | β0     | β1     | σ_η    | σ_ε
  True Value | 0.0000 | 1.0000 | 0.3000 | 0.00100
  WinBUGS    |        |        |        |
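The varying-slope test case (β0 = 0, β1 = 1, σ_η = 0.3, σ_ε ≈ 0.001) can be simulated in a few lines: each specimen draws its own slope perturbation η, and the spread of per-specimen fitted slopes recovers σ_η, whereas a single pooled fit would absorb that spread into an inflated residual term. This sketch uses simple per-specimen least squares rather than a full hierarchical MCMC fit.

```python
import random
import statistics

random.seed(3)
b0, b1, sigma_eta, sigma_e = 0.0, 1.0, 0.3, 0.001  # true values of the test case

slopes = []
for _ in range(100):                     # Ns = 100 specimens
    eta = random.gauss(0.0, sigma_eta)   # specimen-level slope perturbation
    xs = [0.02 * k for k in range(1, 11)]
    ys = [b0 + (b1 + eta) * x + random.gauss(0.0, sigma_e) for x in xs]
    # Per-specimen slope by least squares through the origin (b0 = 0 here)
    slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    slopes.append(slope)

# Mean slope recovers beta1; the spread of slopes recovers sigma_eta
print(statistics.mean(slopes), statistics.stdev(slopes))
```

Because σ_ε is tiny relative to σ_η, nearly all the specimen-to-specimen variation here is genuine slope variation, which is exactly the regime ("varying slope >> noise") the slide's test case targets.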
Uncertainty quantification (UQ) community: methods for the use of computational models with observational data
• SAMSI program on UQ (2012)
• SIAM UQ Conference 2012: http://www.siam.org/meetings/uq12/

Key Insights / Research Directions:
• 1) Model discrepancy must be included and not treated as random error.
  – Calibrating (inverting, tuning) a wrong model gives parameter estimates that are wrong (not equal to their true physical values) [O'Hagan, 2012]
  – Gaussian process (GP) models are typically used to fit model discrepancy [Kennedy and O'Hagan, 2001]
• 2) Use of prior information in a Bayesian framework can greatly help.
  – To learn about model parameters in the presence of discrepancy, better prior information is needed [Bayarri, 2012]
  – Elicitation of expert opinion is an active research topic [O'Hagan, 2012]
• 3) Model-form uncertainty (assessment) approaches should be leveraged.
  – To identify the best models and address limitations cited by the UQ community [Grandhi et al., Wright State University]
Acknowledgements
• This work was partially supported by the U.S. Air Force Research Laboratory under UTC Prime Contract, FA8650-10-D-5210
• Charles Annis, Statistical Engineering
• David Forsyth, TRI/Austin
• Eric Lindgren, AFRL
Bayesian POD Evaluation Examples and Code Coming Soon www.computationaltools.com