A New Guaranteed Adaptive Trapezoidal Rule Algorithm

Fred J. Hickernell
Department of Applied Mathematics, Illinois Institute of Technology
[email protected]  www.iit.edu/~hickernell

Joint work with Martha Razo (IIT BS student) and Sunny Yun (Stevenson High School 2014 graduate)

Supported by NSF-DMS-1115392

Meshfree Methods Seminar, February 18, 2015
Outline: Motivation · New algorithm integral · Computational Cost of integral · Discussion · References
We Need Adaptive Algorithms
We Need Adaptive Numerical Algorithms
- We rely on numerical software to solve mathematical and statistical problems: the NAG library (The Numerical Algorithms Group, 2013), MATLAB (The MathWorks, 2014), Mathematica (Wolfram Research Inc., 2014), and R (R Development Core Team, 2014).
- Functions like cos and erf give us the answer with the desired accuracy automatically.
- Many numerical algorithms that we use are adaptive: MATLAB's integral, fminbnd, and ode45, and the Chebfun MATLAB toolbox (Hale et al., 2014). They determine how much effort is needed to satisfy the error tolerance.
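As a rough illustration of what "adaptive" means here, a minimal Python sketch (my own, not the talk's algorithm or MATLAB's integral) of a trapezoidal rule that doubles the number of subintervals until two successive estimates agree:

```python
import math

def adaptive_trapezoid(f, a, b, tol=1e-8, max_doublings=25):
    """Double the number of subintervals until two successive
    trapezoidal estimates agree to within tol (a heuristic test)."""
    n = 1
    estimate = 0.5 * (b - a) * (f(a) + f(b))
    for _ in range(max_doublings):
        h = (b - a) / n                      # current subinterval width
        midpoints = sum(f(a + (i + 0.5) * h) for i in range(n))
        refined = 0.5 * estimate + 0.5 * h * midpoints  # T_{2n} built from T_n
        if abs(refined - estimate) <= tol:
            return refined
        estimate, n = refined, 2 * n
    return estimate

result = adaptive_trapezoid(math.exp, 0.0, 1.0)
print(result)  # close to e - 1 = 1.71828...
```

The routine decides its own sample size from the data, which is exactly the convenience the bullet points describe.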
We Need Better Adaptive Numerical Algorithms
Most adaptive algorithms use heuristics. There are no guarantees that they actually do what they claim. Exceptions are

- guaranteed algorithms for finding one zero of a function and for finding minima of unimodal functions that date from the early 1970s (Brent, 2013),
- guaranteed adaptive multivariate integration algorithms using Monte Carlo (Hickernell et al., 2014) and quasi-Monte Carlo methods (Hickernell and Jimenez Rugama, 2014; Jimenez Rugama and Hickernell, 2014),
- guaranteed adaptive algorithms for univariate function approximation (Clancy et al., 2014) and optimization of multimodal univariate functions (Tong, 2014) using linear splines, and
- a guaranteed adaptive trapezoidal rule for univariate integration (Clancy et al., 2014).
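To see why a heuristic stopping rule carries no guarantee (the theme of Lyness, 1983, in the references), a hypothetical Python sketch: a rule that accepts the answer when two trapezoidal estimates agree is fooled by an integrand that vanishes at every sampled point:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

def heuristic_integrate(f, a, b, tol=1e-6):
    """Heuristic: accept T_8 if it agrees with T_4 -- no guarantee."""
    coarse, fine = trapezoid(f, a, b, 4), trapezoid(f, a, b, 8)
    if abs(fine - coarse) <= tol:
        return fine
    raise RuntimeError("needs refinement")

# A bump that is (numerically) zero at every multiple of 1/8,
# so both estimates see essentially nothing.
bump = lambda x: math.sin(8 * math.pi * x) ** 2

result = heuristic_integrate(bump, 0.0, 1.0)
print(result)  # essentially 0, yet the true integral is 0.5
```

Both estimates sample only where the integrand vanishes, so the heuristic "converges" to a badly wrong answer; a guaranteed algorithm must rule such integrands out of its function class instead.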
Bounds on the Computational Cost of integral
Theorem
Let N(f, ε) denote the final number of trapezoids required by integral(f, a, b, ε). Then this number is bounded below and above in terms of the true, yet unknown, Var(f′).
\[
\max\left(\left\lfloor \frac{2(b-a)}{h} \right\rfloor + 1,\;
\left\lceil (b-a)\sqrt{\frac{\operatorname{Var}(f')}{8\varepsilon}}\,\right\rceil\right)
\le N(f,\varepsilon)
\le 2\min_{0<\alpha\le 1}\max\left(\left\lfloor \frac{2(b-a)}{\alpha h} \right\rfloor + 1,\;
\left\lceil (b-a)\sqrt{\frac{C(\alpha h)\operatorname{Var}(f')}{8\varepsilon}}\,\right\rceil\right).
\]
The number of function values required by integral(f, a, b, ε) is N(f, ε) + 1.
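A numerical check of the lower-bound term, with Var(f′) worked out by hand for f(x) = x² (an illustrative choice, not from the slides): taking n equal to the ceiling term already meets the tolerance for this f, since the standard trapezoidal error bound is (b − a)² Var(f′)/(8n²).

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoidal rule with n subintervals."""
    h = (b - a) / n
    return h * (0.5 * f(a) + sum(f(a + i * h) for i in range(1, n)) + 0.5 * f(b))

# f(x) = x^2 on [0, 1]: f'(x) = 2x, so Var(f') (total variation of f') = 2.
a, b, var_fprime, eps = 0.0, 1.0, 2.0, 1e-6
n = math.ceil((b - a) * math.sqrt(var_fprime / (8 * eps)))  # the ceiling term
error = abs(trapezoid(lambda x: x * x, a, b, n) - 1.0 / 3.0)
print(n, error)  # n = 500; error about 6.7e-7 <= eps
```

Here the theorem's lower bound of 500 trapezoids is also essentially sufficient, consistent with the upper bound differing only by the factors 2 and C(αh).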
Proof of Upper Bound on Computational Cost (cont'd)
So far we have
\[
N(f,\varepsilon) \le 2n^*, \qquad
n^* := \min\left\{ n \in \mathbb{N} : n \ge \left\lfloor \frac{2(b-a)}{h} \right\rfloor + 1,\ \eta(n)\operatorname{Var}(f') \le \varepsilon \right\}.
\]

For fixed \(\alpha \in (0, 1]\), we need only consider the case where
\( n^* > \left\lfloor \frac{2(b-a)}{\alpha h} \right\rfloor + 1 \),
so \( n^* - 1 \ge \left\lfloor \frac{2(b-a)}{\alpha h} \right\rfloor + 1 > \frac{2(b-a)}{\alpha h} \). Then

\[
n^* - 1 < (n^* - 1)\sqrt{\frac{\eta(n^* - 1)\operatorname{Var}(f')}{\varepsilon}}
= (n^* - 1)\sqrt{\frac{(b-a)^2\, C\bigl(2(b-a)/(n^* - 1)\bigr)\operatorname{Var}(f')}{8(n^* - 1)^2\,\varepsilon}}
\le (b-a)\sqrt{\frac{C(\alpha h)\operatorname{Var}(f')}{8\varepsilon}},
\]

which completes the proof of the upper bound on \(n^*\).
Lower Complexity Bound for Integration on C
Theorem
Let int be any (possibly adaptive) algorithm that succeeds for all integrands in C, and only uses function values. For any error tolerance ε > 0 and any arbitrary value of Var(f′), there will be some f ∈ C for which int must use at least

\[
-\frac{3}{2} + (b - a - 3h)\sqrt{\frac{[C(0)-1]\operatorname{Var}(f')}{32\varepsilon}}
\]

function values. As Var(f′)/ε → ∞, the asymptotic rate of increase is the same as the computational cost of integral, provided C(0) > 1.
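A quick numeric sketch of the claimed matching rates, with illustrative values b − a = 1, h = 0.1, and an assumed C(0) = 2 (and approximating C(αh) by C(0) in the upper bound): both bounds grow like √(Var(f′)/ε), so their ratio settles to a constant.

```python
import math

b_minus_a, h, C0 = 1.0, 0.1, 2.0   # illustrative values; C0 = C(0) > 1 is an assumption

def lower_complexity(var_fprime, eps):
    # lower bound on function values any successful algorithm needs on C
    return -1.5 + (b_minus_a - 3 * h) * math.sqrt((C0 - 1) * var_fprime / (32 * eps))

def upper_cost(var_fprime, eps):
    # dominant term of the upper bound on N(f, eps), approximating C(alpha*h) by C0
    return 2 * b_minus_a * math.sqrt(C0 * var_fprime / (8 * eps))

ratios = [lower_complexity(1.0, eps) / upper_cost(1.0, eps)
          for eps in (1e-2, 1e-4, 1e-6, 1e-8)]
print(ratios)  # increasing toward a constant (about 0.124 with these values)
```

The constant gap between the bounds means integral is optimal up to a constant factor for this function class.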
What Comes Next?
References I
Brent, R. P. 2013. Algorithms for minimization without derivatives, Dover Publications, Inc., Mineola, NY. Republication of the 1973 edition by Prentice-Hall, Inc.
Clancy, N., Y. Ding, C. Hamilton, F. J. Hickernell, and Y. Zhang. 2014. The cost of deterministic, adaptive, automatic algorithms: Cones, not balls, J. Complexity 30, 21–45.
Hale, N., L. N. Trefethen, and T. A. Driscoll. 2014. Chebfun version 5.
Hickernell, F. J., L. Jiang, Y. Liu, and A. B. Owen. 2014. Guaranteed conservative fixed width confidence intervals via Monte Carlo sampling, Monte Carlo and quasi-Monte Carlo methods 2012, pp. 105–128.
Hickernell, F. J. and Ll. A. Jimenez Rugama. 2014. Reliable adaptive cubature using digital sequences. Submitted for publication, arXiv:1410.8615 [math.NA].
Jimenez Rugama, Ll. A. and F. J. Hickernell. 2014. Adaptive multidimensional integration based on rank-1 lattices. Submitted for publication, arXiv:1411.1966.
Lyness, J. N. 1983. When not to use an automatic quadrature routine, SIAM Rev. 25, 63–87.
Moore, R. E., R. B. Kearfott, and M. J. Cloud. 2009. Introduction to interval analysis, Cambridge University Press, Cambridge.
R Development Core Team. 2014. The R Project for Statistical Computing.