Representation and Analysis of Piecewise Linear Functions in Abs-normal Form
Andreas Griewank
with thanks to Andrea Walther, Torsten Bosse, Nikolai Strogies, et al.
Sophia-Antipolis, 10 June 2013
Outline
Generalized Derivatives and Semismoothness
  Background and Motivation
  Generalized differentiation rules
  Semismooth Newton Result
Piecewise Linearization Approach
  Algorithmic piecewise linearization
  Piecewise linearization rules
  Properties of PL functions and Abs-normal form
  Computing Generalized Jacobians
Back to Abs-normal Form
  Representation and Analysis
  Iterative Equation Solving
  Equivalence to Linear Complementarity
Recent Observations on Generalized Hessians
Consequence: All compositions of smooth functions and abs() are semismooth in R^n!
Generalized Derivatives and Semismoothness
Semismooth Newton Result
Proposition (Kummer, Qi, Kunisch, et al.)
Semismoothness ensures that the generalized Newton iteration

x_{k+1} = x_k − J⁻¹ F(x_k) with J ∈ ∂_L F(x_k)

converges superlinearly to a root x* ∈ F⁻¹(0) provided

‖x_0 − x*‖ ≤ ρ and ‖J⁻¹‖ ≤ M < ∞ for J ∈ ∂_L F(x*)

Doubts concerning applicability:
- How can we calculate some J ∈ ∂_L F(x)?
- How small is the contraction radius ρ > 0?
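As a toy illustration of the iteration above, here is a minimal sketch of generalized Newton on a scalar piecewise linear function with one kink. The function F and all names are my own, not from the talk:

```python
# Toy generalized Newton: F(x) = |x| + x/2 - 1, kink at 0, one root at x* = 2/3.
def F(x):
    return abs(x) + 0.5 * x - 1.0

def J(x):
    # one element of the limiting Jacobian ∂_L F(x); at the kink x = 0
    # we simply pick the right-sided slope
    return (1.0 if x >= 0 else -1.0) + 0.5

def generalized_newton(x, tol=1e-12, kmax=50):
    for _ in range(kmax):
        step = F(x) / J(x)        # x_{k+1} = x_k - J^{-1} F(x_k)
        x = x - step
        if abs(step) < tol:
            return x
    return x

print(generalized_newton(5.0))    # converges to 2/3
```

Starting on the right of the kink, the iteration hits the root of the active linear piece in one step, which is exactly the behavior the proposition describes locally.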
Contraction radius ≤ distance to next kink
Figure: Small radius of contraction for semismooth Newton; plot of sin(x/3) for x ≤ 0 and sin(x) for x > 0, with level line y = ε.
A. Griewank, F. Dalkowski, N. Krejic, Z. Luzanin, F. Rodrigues, A. Walther, SCAN 2010
Lessons for/from generalized Newton
- Generally, Newton cannot cross kinks, but must start in a neighborhood of smoothness whose closure contains the root. In other words, the combinatorial aspect of the problem must have been sorted out beforehand by picking the initial point.
- Usually semismooth Newton is applied to piecewise smooth problems where selection occurs only at one level and generalized derivatives can be coded by hand. In general, the effect of superimposed nonsmoothness cannot be handled.
- Remedy for Jacobian calculation and convergence stabilization: piecewise linearization and piecewise linear Newton!
and s new independent variables v_i. The smooth mapping (x, v) ↦ (u, y) has Jacobian C.
Piecewise linearization Approach
Computing Generalized Jacobians
Accumulation of Jacobians
The so-called vector forward mode yields:

∇v_{i−n} = e_i                                   for i = 1 … n
∇u_i = Σ_{j≺i} c_{ij} ∇v_j,  ∇v_i = σ_i ∇u_i     for i = 1 … s
∇y_{i−s} = Σ_{j≺i} c_{ij} ∇v_j                   for i = s+1 … s+m
Proposition
If σ_i ≡ firstsign(u_i, ∇u_i) is (permuted) lexicographic, then the resulting matrix J_σ is a generalized Jacobian of the piecewise linearization at the current argument.

The simple choice σ_i = sign(u_i) fails for

y = f(x) ≡ |x + |x|| − |x| ≡ x at x = 0

Signature vectors (σ_1, σ_2) work only if (1 + σ_1)σ_2 − σ_1 = 1.
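The failure of the plain sign rule and the success of firstsign can be checked directly on this example. A sketch of the recurrence above for y = |x + |x|| − |x| at x = 0; function names are mine:

```python
# Vector forward mode on y = |x + |x|| - |x| at x = 0, where f(x) ≡ x.
def firstsign(u, du):
    # sign of the first nonzero entry of (u, ∇u)
    if u != 0.0:
        return (u > 0) - (u < 0)
    return (du > 0) - (du < 0)

def grad_at_zero(sigma_rule):
    x, dx = 0.0, 1.0
    u1, du1 = x, dx                     # first switching variable u_1 = x
    s1 = sigma_rule(u1, du1)
    v1, dv1 = abs(u1), s1 * du1         # ∇v_i = σ_i ∇u_i
    u2, du2 = x + v1, dx + dv1          # second switching variable u_2 = x + v_1
    s2 = sigma_rule(u2, du2)
    dv2 = s2 * du2
    return dv2 - dv1                    # ∇y for y = v_2 - v_1

print(grad_at_zero(firstsign))                          # 1.0 (correct: f'(0) = 1)
print(grad_at_zero(lambda u, du: (u > 0) - (u < 0)))    # 0.0 (plain sign(u) fails)
```

With firstsign one gets (σ_1, σ_2) = (1, 1), which satisfies (1 + σ_1)σ_2 − σ_1 = 1; with sign(u_i) both signatures are 0 and the resulting slope is wrong.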
Proposition (Khan & Barton; A. G.)

∂_K F(x̊) ≡ ∂_L^{Δx} ΔF(x̊; Δx)|_{Δx=0} ⊂ ∂_L F(x)|_{x=x̊}

contains those Jacobians ∂F_σ(x̊) for which the tangent cone

T_σ ≡ T_x̊ {x ∈ D : F_σ(x) = F(x)}

has a nonempty interior (i.e., F_σ and ∂F_σ are conically active).

Remark
We can find several of them at cost n·OPS(F) in the worst case. Finding all of them is likely a stretch, since there could be 2^s different ones. For most x the result is still trivial; one should allow a small shift of x along a direction v to compute two Jacobians at a nearby kink.
Back to Abs-normal Form
Representation and Analysis
Reduced Computational Graph
Figure: Reduced computational graph with switching variables
Switching depth with bound
The switching depth ν is the length of the largest chain of mutually dependent switching variables in the reduced graph. Disregarding accidental cancellations, it equals the minimal ν ≤ s for which

L^ν = 0,

i.e. the degree of nilpotency of the strictly triangular matrix L.

Proposition (Griewank)
Every PL function in n variables has an abs-normal representation of switching depth

ν ≤ ν(n) ≡ 2n − 1
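Since L is strictly triangular and hence nilpotent, the switching depth can be read off by repeated multiplication. A small sketch (function name and example data mine):

```python
import numpy as np

# Switching depth as degree of nilpotency: the smallest ν with L^ν = 0,
# for the strictly lower triangular L of the abs-normal form.
def switching_depth(L):
    s = L.shape[0]
    P = np.array(L, dtype=float)
    nu = 1
    while P.any() and nu < s:
        P = P @ L
        nu += 1
    return nu                      # strictly triangular L always has L^s = 0

L = np.array([[0., 0., 0.],
              [1., 0., 0.],
              [0., 1., 0.]])       # chain of 3 mutually dependent switches
print(switching_depth(L))          # 3
```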
Reduction to a fixed point equation in u
Provided det(J) ≠ 0, which can always be achieved using the trivial identity, we have the Schur complement

S ≡ L − U J⁻¹ V ∈ R^{s×s}

which can be used to reduce the abs-normal form

[ u ]   [ c ]   [ U  L ] [  x  ]
[ y ] = [ b ] + [ J  V ] [ |u| ]

to the equation with constant term c̄ = c − U J⁻¹ b:

u = c̄ + S |u| ≡ c̄ + S Σ u

It is simply switched and has orthants as linearity domains.
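As a sanity check of this reduction, a small numerical sketch (random data and all names mine): eliminating x from y = 0 must reproduce the simply switched equation.

```python
import numpy as np

# Check that eliminating x from y = 0 reduces the abs-normal form
# to the simply switched system u = c̄ + S|u|.
rng = np.random.default_rng(0)
n = s = 3
U = rng.standard_normal((s, n))
V = rng.standard_normal((n, s))
L = np.tril(rng.standard_normal((s, s)), -1)    # strictly lower triangular
J = np.eye(n) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
c = rng.standard_normal(s)

S = L - U @ np.linalg.solve(J, V)               # Schur complement
c_bar = c - U @ np.linalg.solve(J, b)           # reduced constant term

u = rng.standard_normal(s)
x = -np.linalg.solve(J, b + V @ np.abs(u))      # solve y = b + J x + V|u| = 0
lhs = c + U @ x + L @ np.abs(u)                 # u-row of the abs-normal form
assert np.allclose(lhs, c_bar + S @ np.abs(u))
```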
Contraction Radius and Smooth Dominance
Smooth dominance
The fixed point iteration is 'contractive' if the spectral radius ρ(|S|) of the componentwise modulus |S| of S is less than 1.

Perron–Frobenius scaling of u
Since |S| ≥ 0 there exists z ≥ 0 such that, with Z = diag(z),

|S| Z e ≡ |S| z = ρ(|S|) z ≡ ρ(|S|) Z e

where e = (1, …, 1). Provided ρ(|S|) > 0 we have z > 0, so that the similarity transform S̄ ≡ Z⁻¹ S Z satisfies

|S̄| e = ρ(|S|) e and ‖S̄‖_∞ = ρ(|S̄|) = ρ(|S|)
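The scaling can be verified numerically. A sketch with toy data of my own choosing:

```python
import numpy as np

# Perron–Frobenius scaling: rescale S so that the ∞-norm of
# S̄ = Z⁻¹ S Z equals the spectral radius ρ(|S|).
S = np.array([[0.1, -0.3],
              [0.2,  0.1]])
A = np.abs(S)
w, Vv = np.linalg.eig(A)
k = np.argmax(w.real)
rho = w.real[k]                  # ρ(|S|), the Perron eigenvalue of |S| ≥ 0
z = np.abs(Vv[:, k].real)        # Perron vector, positive since ρ(|S|) > 0
Z = np.diag(z)
S_bar = np.linalg.solve(Z, S @ Z)            # S̄ = Z⁻¹ S Z
print(np.linalg.norm(S_bar, np.inf), rho)    # both equal ρ(|S|)
```

Since Z is diagonal and positive, |S̄| = Z⁻¹|S|Z, and each row sum of Z⁻¹|S|Z equals (|S|z)_i / z_i = ρ(|S|), which is exactly the claimed ∞-norm identity.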
Iterative Equation Solving
Fixed point solver:

u_+ = c̄ + S |u|  ⇐⇒  v_+ = |c̄ + S v|

is globally and linearly convergent to the unique root if ρ(|L|) < 1. Examples show that smooth dominance is not necessary. It gets by without forming S and requires one solve in J per iteration.

Semismooth Newton:

u_+ = [I − S Σ(u)]⁻¹ c̄

is globally and finitely convergent to the unique root if ρ(|L|) ≤ 1/3. Examples show that smooth dominance is not sufficient.
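A minimal sketch of the semismooth Newton update on the simply switched system, with toy data of my own (ρ(|S|) ≈ 0.245):

```python
import numpy as np

# Semismooth Newton for u = c̄ + S|u|: each step solves [I - S Σ(u)] u₊ = c̄,
# and convergence is finite once the signature Σ(u) stabilizes.
def ss_newton(S, c_bar, u0, kmax=100):
    u = np.array(u0, dtype=float)
    I = np.eye(len(u))
    for _ in range(kmax):
        Sigma = np.diag(np.sign(u))
        u_new = np.linalg.solve(I - S @ Sigma, c_bar)
        if np.array_equal(np.sign(u_new), np.sign(u)):
            return u_new           # signature stabilized: exact PL root
        u = u_new
    return u

S = np.array([[0.0, 0.2],
              [-0.3, 0.0]])
c_bar = np.array([1.0, -1.0])
u = ss_newton(S, c_bar, [1.0, 1.0])
assert np.allclose(u, c_bar + S @ np.abs(u))   # u solves the fixed point equation
```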
Piecewise Newton:
Stopping Newton steps whenever a component of u switches its sign and continuing in the new Newton direction yields finite convergence if z = c̄ + S|z| is coherently oriented, i.e.

det(I − S Σ) det(I − S) > 0 for all diagonal Σ with entries in {−1, 0, 1}

Since ν = 1, by Clarke's inverse function theorem this implies bijectivity. Sufficient but probably not necessary for that is ρ(|S|) < 1.

Best of all worlds:
A combination of the cheap (fixed point), fast (Newton), and safe (piecewise Newton) options is under construction.
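For small s, coherent orientation can be checked by brute force over all 3^s signature matrices. A sketch with my own toy data:

```python
import numpy as np
from itertools import product

# Coherent orientation check: det(I - S Σ) det(I - S) > 0
# for every diagonal signature matrix Σ with entries in {-1, 0, 1}.
def coherently_oriented(S):
    s = S.shape[0]
    I = np.eye(s)
    d0 = np.linalg.det(I - S)
    return all(np.linalg.det(I - S @ np.diag(sig)) * d0 > 0
               for sig in product([-1.0, 0.0, 1.0], repeat=s))

S = np.array([[0.0, 0.2],
              [-0.3, 0.0]])       # ρ(|S|) < 1, hence coherently oriented
print(coherently_oriented(S))     # True
```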
Equivalence to Linear Complementarity
Reduction to LCP
Setting u = z − w with z ≥ 0, w ≥ 0 and z⊤w = 0 (i.e. z ⊥ w), we obtain v = |u| = z + w. Now the fixed point equation is

z − w = c̄ + S(z + w) with 0 ≤ z ⊥ w ≥ 0

Assuming det(I − S) ≠ 0 we may solve for z and obtain

0 ≤ z ≡ q + M w ⊥ w ≥ 0

with q ≡ (I − S)⁻¹ c̄ and M ≡ (I − S)⁻¹(I + S).

Here ρ(|S|) < 1 implies det(I − S) ≠ 0 and that M is a P-matrix, which in turn implies unique solvability for any q ∈ R^s.
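The reduction can be traced on the same toy data as above (data mine); since s is tiny, the LCP is solved here by enumerating complementary index sets rather than by a proper pivoting method:

```python
import numpy as np
from itertools import product

# LCP form 0 ≤ z = q + Mw ⊥ w ≥ 0 of the toy system u = c̄ + S|u|.
S = np.array([[0.0, 0.2],
              [-0.3, 0.0]])
c_bar = np.array([1.0, -1.0])
s = len(c_bar)
I = np.eye(s)
q = np.linalg.solve(I - S, c_bar)
M = np.linalg.solve(I - S, I + S)

u = None
for mask in product([False, True], repeat=s):
    w = np.zeros(s)
    free = [i for i in range(s) if mask[i]]   # indices with w_i free, z_i = 0
    if free:
        w[free] = np.linalg.solve(M[np.ix_(free, free)], -q[free])
    z = q + M @ w
    if (w >= -1e-12).all() and (z >= -1e-12).all():
        u = z - w                             # recover u from the LCP solution
        break

assert u is not None and np.allclose(u, c_bar + S @ np.abs(u))
```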
Recent Observations on Generalized Hessians
When are Hessians symmetric?
- Euler, Clairaut, Bernoulli, Cauchy, and others tried to prove that matrices of second derivatives are symmetric.
- Lindelöf demonstrated in 1857 that all their assertions and/or proofs were wrong: beginner's analysis errors!
- H. A. Schwarz, a student of Weierstrass, proved in 1863 that

g = ∇f ∈ C¹(D) =⇒ (g′)⊤ = g′ = ∇²f

- Peano provided a counterexample where, in 'some sense',

∇²f(0, 0) = [ 0  1 ]
            [−1  0 ]   for f(x, y) = x y (x² − y²)/(x² + y²)

- What about generalized Hessians of Lipschitzian gradients?
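Peano's counterexample can be checked numerically: the two mixed second partials at the origin come out as −1 and +1. A finite-difference sketch (step sizes mine):

```python
# Peano's counterexample: mixed second partials of f disagree at the origin.
def f(x, y):
    return 0.0 if x == 0.0 and y == 0.0 else x * y * (x * x - y * y) / (x * x + y * y)

hi, ho = 1e-7, 1e-3       # inner step << outer step: take the inner limit first
def fx(x, y):             # central difference for ∂f/∂x
    return (f(x + hi, y) - f(x - hi, y)) / (2 * hi)
def fy(x, y):             # central difference for ∂f/∂y
    return (f(x, y + hi) - f(x, y - hi)) / (2 * hi)

fxy = (fx(0.0, ho) - fx(0.0, -ho)) / (2 * ho)   # ∂/∂y of f_x at (0,0)
fyx = (fy(ho, 0.0) - fy(-ho, 0.0)) / (2 * ho)   # ∂/∂x of f_y at (0,0)
print(round(fxy, 4), round(fyx, 4))             # -1.0 1.0
```

Along the axes f_x(0, y) = −y while f_y(x, 0) = x, which is exactly the antisymmetric matrix on the slide.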
Real Hessians are always symmetric!
- The Peano Hessian is an algebraic fluke, not a Fréchet derivative: g(x + Δx) − g(x) ≠ g′(x) Δx + o(‖Δx‖)
- Dieudonné (1960) showed that derivatives of gradients are symmetric where they exist ⇐⇒ no perpetuum mobile!
- Limiting and convexification maintain symmetry: (∂_C g)⊤ = ∂_C g
- Griewank et al. (2013) are showing the converse, i.e. g ∈ C^{0,1}(D) with (∂_C g)⊤ = ∂_C g =⇒ g = ∇f
- Now, compute generalized Hessians via piecewise linearization of the gradient!
Summary and tentative conclusions
- Practical functions are semismooth, and their linearization goes further than we thought, but not quite far enough.
- Yes, we can compute generalized Jacobians! They are not only essential in the sense of Scholtes but conically active.
- But semismooth Newton only yields convergence from points where the combinatorial aspects have been resolved.
- Piecewise linearization facilitates nonsmooth equation solving, optimization, integration of Lipschitzian ODEs, ...
- Smooth dominance facilitates simple fixed point solution.
- Next on the agenda: solving algebraic and differential inclusions as well as bang-bang optimal control problems.