8/10/2019 non linier.pdf
1/94
Pemrograman Non Linier
MIT and James Orlin 2003
What is a non-linear program?
\[
\begin{aligned}
\text{maximize } & 3\sin x + xy + y^3 - 3z + \log z \\
\text{subject to } & x^2 + y^2 = 1 \\
& x + 4z \le 2 \\
& z \ge 0
\end{aligned}
\]
A non-linear program is permitted to have non-linear constraints or objectives.

A linear program is a special case of non-linear programming!
Nonlinear Programs (NLP)
Nonlinear objective function f(x) and/or nonlinear constraints g_i(x). Today we will present several types of non-linear programs.

Let \(\mathbf{x} = (x_1, x_2, \ldots, x_n)\):
\[\max f(\mathbf{x}) \quad \text{s.t.} \quad g_i(\mathbf{x}) \le b_i, \quad i = 1, 2, \ldots, m\]
Unimodal and Multimodal Functions

A unimodal function f(x) (in the range specified for x) has a single extremum (minimum or maximum).

A multimodal function f(x) has two or more extrema.

There is a distinction between the global extremum (the largest or smallest among a set of extrema) and local extrema (any extremum). Note: many numerical procedures terminate at a local extremum.

If \(f'(x) = 0\) at the extremum, the point is called a stationary point.
[Figure: a multimodal function f(x), showing a local max (stationary), a local min (stationary), a saddle point (stationary), the global min (stationary), and a global max at a boundary (not stationary).]
Multivariate Functions - Surface and Contour Plots

We shall be concerned with basic properties of a scalar function f(x) of n variables (x_1, ..., x_n), i.e. \(f: \mathbb{R}^n \to \mathbb{R}^1\).

If n = 1, f(x) is a univariate function. If n > 1, f(x) is a multivariate function.

For any multivariate function, the equation z = f(x) defines a surface in (n+1)-dimensional space.
In the case n = 2, the points z = f(x_1, x_2) represent a three-dimensional surface.

Let c be a particular value of f(x_1, x_2). Then f(x_1, x_2) = c defines a curve in x_1 and x_2 on the plane z = c.

If we consider a selection of different values of c, we obtain a family of curves which provide a contour map of the function z = f(x_1, x_2).
Contour map of \(z = e^{x_1}\left(4x_1^2 + 2x_2^2 + 4x_1x_2 + 2x_2 + 1\right)\).

[Figure: contour plot over \(-3 \le x_1 \le 2\), \(-3 \le x_2 \le 2\), with contour levels from 0.2 up to z = 20, showing a local minimum and a saddle point.]
Example: Surface and Contour Plots of the Peaks Function

\[z = 3(1 - x_1)^2 \exp\left(-x_1^2 - (x_2 + 1)^2\right) - 10\left(0.2x_1 - x_1^3 - x_2^5\right)\exp\left(-x_1^2 - x_2^2\right) - \frac{1}{3}\exp\left(-(x_1 + 1)^2 - x_2^2\right)\]
[Figure: surface and contour plots of the peaks function, which is multimodal: a global max, a global min, a local min, two local max, and a saddle point are visible.]
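As an illustrative sketch (an addition, not part of the slides, and assuming the peaks formula given above), the function can be evaluated on a fine grid to locate its extreme values numerically and confirm that it is multimodal:

```python
import numpy as np

def peaks(x1, x2):
    # MATLAB-style "peaks" function, as given above
    return (3*(1 - x1)**2 * np.exp(-x1**2 - (x2 + 1)**2)
            - 10*(0.2*x1 - x1**3 - x2**5) * np.exp(-x1**2 - x2**2)
            - (1/3) * np.exp(-(x1 + 1)**2 - x2**2))

x1, x2 = np.meshgrid(np.linspace(-3, 3, 601), np.linspace(-3, 3, 601))
z = peaks(x1, x2)

# the grid max (~8.1) and grid min (~-6.55) occur at different interior points
i_max = np.unravel_index(np.argmax(z), z.shape)
i_min = np.unravel_index(np.argmin(z), z.shape)
print("grid max ~", round(z[i_max], 2), "at", (round(x1[i_max], 2), round(x2[i_max], 2)))
print("grid min ~", round(z[i_min], 2), "at", (round(x1[i_min], 2), round(x2[i_min], 2)))
```

A grid scan like this only brackets the extrema; a local method (e.g. Newton's method on the gradient) would refine them.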
Gradient Vector

The slope of f(x) at a point \(\mathbf{x} = \bar{\mathbf{x}}\) in the direction of the i-th co-ordinate axis is:

\[\left.\frac{\partial f(\mathbf{x})}{\partial x_i}\right|_{\mathbf{x} = \bar{\mathbf{x}}}\]

The n-vector of these partial derivatives is termed the gradient vector of f, denoted by:

\[\nabla f(\mathbf{x}) = \begin{bmatrix} \partial f(\mathbf{x})/\partial x_1 \\ \vdots \\ \partial f(\mathbf{x})/\partial x_n \end{bmatrix} \quad \text{(a column vector)}\]
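A minimal sketch (an addition, not from the slides) of approximating the gradient vector numerically by central differences; the example function f here is an assumption chosen for illustration:

```python
import numpy as np

def gradient(f, x, h=1e-6):
    """Approximate the gradient (column vector) of f at x by central differences."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2*h)  # slope along the i-th axis
    return g

# Example: f(x) = x1**2 + 3*x2, so the exact gradient is (2*x1, 3)
f = lambda x: x[0]**2 + 3*x[1]
print(gradient(f, [2.0, 1.0]))  # ~ [4. 3.]
```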
The gradient vector at a point \(\mathbf{x} = \bar{\mathbf{x}}\) is normal to the contour through that point, in the direction of increasing f.

[Figure: a contour line with the gradient vector \(\nabla f(\bar{\mathbf{x}})\) pointing in the direction of increasing f.]

At a stationary point:
\[\nabla f(\bar{\mathbf{x}}) = \mathbf{0} \quad \text{(a null vector)}\]
Example

\[f(\mathbf{x}) = x_2^2\cos x_1\]

\[\nabla f(\mathbf{x}) = \begin{bmatrix} \partial f(\mathbf{x})/\partial x_1 \\ \partial f(\mathbf{x})/\partial x_2 \end{bmatrix} = \begin{bmatrix} -x_2^2\sin x_1 \\ 2x_2\cos x_1 \end{bmatrix}\]

and the stationary point (points) are given by the simultaneous solution(s) of:

\[-x_2^2\sin x_1 = 0, \qquad 2x_2\cos x_1 = 0\]
e.g. \(f(\mathbf{x}) = \mathbf{c}^T\mathbf{x}, \quad \nabla f(\mathbf{x}) = \mathbf{c}\)

Note: If \(\nabla f(\mathbf{x})\) is a constant vector, f(x) is then linear.
Hessian Matrix (Curvature Matrix)

The second derivative of an n-variable function is defined by the n^2 partial derivatives:

\[\frac{\partial}{\partial x_i}\left(\frac{\partial f(\mathbf{x})}{\partial x_j}\right), \quad i = 1, \ldots, n; \ j = 1, \ldots, n\]

written as:

\[\frac{\partial^2 f(\mathbf{x})}{\partial x_i\,\partial x_j}, \ i \ne j; \qquad \frac{\partial^2 f(\mathbf{x})}{\partial x_i^2}, \ i = j.\]
These n^2 second partial derivatives are usually represented by a square, symmetric matrix, termed the Hessian matrix, denoted by:

\[\mathbf{H}(\mathbf{x}) = \nabla^2 f(\mathbf{x}) = \begin{bmatrix} \dfrac{\partial^2 f(\mathbf{x})}{\partial x_1^2} & \cdots & \dfrac{\partial^2 f(\mathbf{x})}{\partial x_1\,\partial x_n} \\ \vdots & \ddots & \vdots \\ \dfrac{\partial^2 f(\mathbf{x})}{\partial x_n\,\partial x_1} & \cdots & \dfrac{\partial^2 f(\mathbf{x})}{\partial x_n^2} \end{bmatrix}\]
Example: For the previous example, \(f(\mathbf{x}) = x_2^2\cos x_1\):

\[\nabla^2 f(\mathbf{x}) = \begin{bmatrix} \partial^2 f/\partial x_1^2 & \partial^2 f/\partial x_1\partial x_2 \\ \partial^2 f/\partial x_2\partial x_1 & \partial^2 f/\partial x_2^2 \end{bmatrix} = \begin{bmatrix} -x_2^2\cos x_1 & -2x_2\sin x_1 \\ -2x_2\sin x_1 & 2\cos x_1 \end{bmatrix}\]

Note: If the Hessian matrix of f(x) is a constant matrix, f(x) is then quadratic, expressed as:

\[f(\mathbf{x}) = \tfrac{1}{2}\mathbf{x}^T\mathbf{H}\mathbf{x} + \mathbf{c}^T\mathbf{x}, \qquad \nabla f(\mathbf{x}) = \mathbf{H}\mathbf{x} + \mathbf{c}, \qquad \nabla^2 f(\mathbf{x}) = \mathbf{H}\]
Convex and Concave Functions

A function is called concave over a given region R if:

\[f\left(\theta\mathbf{x}^a + (1 - \theta)\mathbf{x}^b\right) \ge \theta f(\mathbf{x}^a) + (1 - \theta)f(\mathbf{x}^b)\]
where: \(0 \le \theta \le 1\) and \(\mathbf{x}^a, \mathbf{x}^b \in R\).

The function is strictly concave if \(\ge\) is replaced by \(>\).

A function is called convex (strictly convex) if \(\ge\) is replaced by \(\le\) (\(<\)).
[Figure: a concave function with \(f''(x) \le 0\) and a convex function with \(f''(x) \ge 0\), each plotted between x = a and x = b.]
If \(f''(x) \le 0\), then f(x) is concave.
If \(f''(x) \ge 0\), then f(x) is convex.

For a multivariate function f(x), the conditions in terms of the Hessian matrix H(x) are:

- strictly convex: H(x) +ve def
- convex: H(x) +ve semi def
- concave: H(x) -ve semi def
- strictly concave: H(x) -ve def
Tests for Convexity and Concavity

H is +ve def (+ve semi def) iff \(\mathbf{x}^T\mathbf{H}\mathbf{x} > 0 \ (\ge 0)\) for all \(\mathbf{x} \ne \mathbf{0}\).
H is -ve def (-ve semi def) iff \(\mathbf{x}^T\mathbf{H}\mathbf{x} < 0 \ (\le 0)\) for all \(\mathbf{x} \ne \mathbf{0}\).

Convenient tests: f(x) is strictly convex (H +ve def) (convex (H +ve semi def)) if:
1. all eigenvalues of H(x) are \(> 0 \ (\ge 0)\), or
2. all principal determinants of H(x) are \(> 0 \ (\ge 0)\).
f(x) is strictly concave (H -ve def) (concave (H -ve semi def)) if:
1. all eigenvalues of H(x) are \(< 0 \ (\le 0)\), or
2. the principal determinants of H(x) are alternating in sign:
\[\Delta_1 < 0, \ \Delta_2 > 0, \ \Delta_3 < 0, \ldots \quad (\Delta_1 \le 0, \ \Delta_2 \ge 0, \ \Delta_3 \le 0, \ldots)\]
Example: \(f(\mathbf{x}) = 2x_1^2 - 3x_1x_2 + 2x_2^2\)

\[\frac{\partial f(\mathbf{x})}{\partial x_1} = 4x_1 - 3x_2, \qquad \frac{\partial f(\mathbf{x})}{\partial x_2} = -3x_1 + 4x_2\]

\[\mathbf{H}(\mathbf{x}) = \begin{bmatrix} 4 & -3 \\ -3 & 4 \end{bmatrix}\]

Principal determinants: \(\Delta_1 = 4, \ \Delta_2 = 16 - 9 = 7\), both positive.

Eigenvalues: \(|\mathbf{H} - \lambda\mathbf{I}| = \lambda^2 - 8\lambda + 7 = 0 \ \Rightarrow\ \lambda_1 = 1, \ \lambda_2 = 7\), both positive.

Hence, f(x) is strictly convex.
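The eigenvalue test above can be checked numerically; a minimal sketch (an addition, not from the slides) for this constant Hessian:

```python
import numpy as np

# Hessian of f(x) = 2*x1**2 - 3*x1*x2 + 2*x2**2 (constant, since f is quadratic)
H = np.array([[4.0, -3.0],
              [-3.0, 4.0]])

eigenvalues = np.linalg.eigvalsh(H)  # symmetric matrix -> real eigenvalues
print(eigenvalues)                   # ~ [1. 7.]

# All eigenvalues positive => H is positive definite => f is strictly convex
assert np.all(eigenvalues > 0)
```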
Convex Region

A convex set of points exists if for any two points, \(\mathbf{x}^a\) and \(\mathbf{x}^b\), in a region, all points:

\[\mathbf{x} = \theta\mathbf{x}^a + (1 - \theta)\mathbf{x}^b, \quad 0 \le \theta \le 1\]

on the straight line joining \(\mathbf{x}^a\) and \(\mathbf{x}^b\) are in the set. If a region is completely bounded by concave functions, then the functions form a convex region.

[Figure: a non-convex region and a convex region, each showing two points \(\mathbf{x}^a\), \(\mathbf{x}^b\) and the line segment joining them.]
Necessary and Sufficient Conditions for an Extremum of an Unconstrained Function

A condition N is necessary for a result R if R can be true only if N is true. (R => N)

A condition S is sufficient for a result R if R is true if S is true. (S => R)

A condition T is necessary and sufficient for a result R iff R is true iff T is true. (T <=> R)
There are two necessary conditions and a single sufficient condition to guarantee that x* is an extremum of a function f(x) at x = x*:

1. f(x) is twice continuously differentiable at x*.
2. \(\nabla f(\mathbf{x}^*) = \mathbf{0}\), i.e. a stationary point exists at x*.
3. \(\nabla^2 f(\mathbf{x}^*) = \mathbf{H}(\mathbf{x}^*)\) is +ve def for a minimum to exist at x*, or -ve def for a maximum to exist at x*.

1 and 2 are necessary conditions; 3 is a sufficient condition. Note: an extremum may exist at x* even though it is not possible to demonstrate the fact using the three conditions.
Example: Consider:

\[f(\mathbf{x}) = 4 + 4.5x_1 - 4x_2 + x_1^2 + 2x_2^2 - 2x_1x_2 + x_1^4 - 2x_1^2x_2\]

The gradient vector is:

\[\nabla f(\mathbf{x}) = \begin{bmatrix} 4.5 + 2x_1 - 2x_2 + 4x_1^3 - 4x_1x_2 \\ -4 - 2x_1 + 4x_2 - 2x_1^2 \end{bmatrix}\]

yielding three stationary points located by setting \(\nabla f(\mathbf{x}) = \mathbf{0}\) and solving numerically:

- A. x* = (-1.05, 1.03): f(x*) = -0.51, eigenvalues of \(\nabla^2 f(\mathbf{x}^*)\) = 10.5, 3.5 - global min
- B. x* = (1.94, 3.85): f(x*) = 0.98, eigenvalues 37.0, 0.97 - local min
- C. x* = (0.61, 1.49): f(x*) = 2.83, eigenvalues 7.0, -2.56 - saddle

where:

\[\nabla^2 f(\mathbf{x}) = \begin{bmatrix} 2 + 12x_1^2 - 4x_2 & -2 - 4x_1 \\ -2 - 4x_1 & 4 \end{bmatrix}\]
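As a sketch (an addition, not part of the slides), the stationary points can be located numerically by applying Newton's method to \(\nabla f(\mathbf{x}) = \mathbf{0}\), using the gradient and Hessian given above; the starting points are chosen near each root:

```python
import numpy as np

def grad(x):
    x1, x2 = x
    return np.array([4.5 + 2*x1 - 2*x2 + 4*x1**3 - 4*x1*x2,
                     -4 - 2*x1 + 4*x2 - 2*x1**2])

def hess(x):
    x1, x2 = x
    return np.array([[2 + 12*x1**2 - 4*x2, -2 - 4*x1],
                     [-2 - 4*x1, 4.0]])

def newton(x0, tol=1e-10):
    """Newton iteration on grad f = 0; converges to the nearest stationary point."""
    x = np.array(x0, dtype=float)
    for _ in range(100):
        step = np.linalg.solve(hess(x), grad(x))
        x -= step
        if np.linalg.norm(step) < tol:
            break
    return x

for x0 in [(-1, 1), (2, 4), (0.5, 1.5)]:
    x = newton(x0)
    eig = np.linalg.eigvalsh(hess(x))  # sign pattern classifies the point
    print(np.round(x, 2), np.round(eig, 2))
```

Note that Newton's method finds any stationary point, including the saddle; the eigenvalues of the Hessian at the result distinguish minima from saddles.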
Interpretation of the Objective Function in Terms of its Quadratic Approximation

If a function of two variables can be approximated within a region of a stationary point by a quadratic function:

\[f(x_1, x_2) \approx \frac{1}{2}\begin{bmatrix} x_1 & x_2 \end{bmatrix}\begin{bmatrix} h_{11} & h_{12} \\ h_{12} & h_{22} \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} + \begin{bmatrix} c_1 & c_2 \end{bmatrix}\begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \frac{1}{2}\left(h_{11}x_1^2 + 2h_{12}x_1x_2 + h_{22}x_2^2\right) + c_1x_1 + c_2x_2\]

then the eigenvalues and eigenvectors of:

\[\mathbf{H}(x_1^*, x_2^*) = \nabla^2 f(x_1^*, x_2^*) = \begin{bmatrix} h_{11} & h_{12} \\ h_{12} & h_{22} \end{bmatrix}\]

can be used to interpret the nature of f(x_1, x_2) at \(x_1 = x_1^*, \ x_2 = x_2^*\).
They provide information on the shape of f(x_1, x_2) at \(x_1 = x_1^*, \ x_2 = x_2^*\). If \(\mathbf{H}(x_1^*, x_2^*)\) is +ve def, the eigenvectors are at right angles (orthogonal) and correspond to the principal axes of the elliptical contours of f(x_1, x_2).

A valley or ridge lies in the direction of the eigenvector associated with a relatively small eigenvalue.

These interpretations can be generalized to the multivariate quadratic approximation:

\[f(\mathbf{x}) = \tfrac{1}{2}\mathbf{x}^T\mathbf{H}\mathbf{x} + \mathbf{c}^T\mathbf{x}\]
Topics
- Convex sets and convex programming
- First-order optimality conditions
- Examples
- Problem classes
General NLP

Minimize f(x)
s.t. g_i(x) (<=, >=, =) b_i, i = 1, ..., m

- x = (x_1, ..., x_n)^T is the n-dimensional vector of decision variables
- f(x) is the objective function
- g_i(x) are the constraint functions
- b_i are fixed known constants
Convex Sets

Definition: A set \(S \subseteq \mathbb{R}^n\) is convex if every point on the line segment connecting any two points \(\mathbf{x}^1, \mathbf{x}^2 \in S\) is also in S.

Mathematically, this is equivalent to:

\[\mathbf{x}^0 = \lambda\mathbf{x}^1 + (1 - \lambda)\mathbf{x}^2 \in S \ \text{ for all } \lambda \text{ such that } 0 \le \lambda \le 1.\]

[Figure: a convex set, where the segment joining \(\mathbf{x}^1\) and \(\mathbf{x}^2\) stays inside, and a nonconvex set, where it does not.]
(Nonconvex) Feasible Region

\[S = \{(x_1, x_2) : (0.5x_1 - 0.6)x_2 \le 1;\ 2x_1^2 + 3x_2^2 \ge 27;\ x_1, x_2 \ge 0\}\]

[Figure: plot of S in the (x_1, x_2) plane.]
Convex Sets and Optimization

Let \(S = \{\mathbf{x} \in \mathbb{R}^n : g_i(\mathbf{x}) \le b_i, \ i = 1, \ldots, m\}\).

Fact: If g_i(x) is a convex function for each i = 1, ..., m, then S is a convex set.

Convex Programming Theorem: Let \(\mathbf{x} \in \mathbb{R}^n\) and let f(x) be a convex function defined over a convex constraint set S. If a finite solution exists to the problem

Minimize {f(x) : x in S}

then all local optima are global optima. If f(x) is strictly convex, the optimum is unique.
Convex Programming

Min f(x_1, ..., x_n)
s.t. g_i(x_1, ..., x_n) <= b_i, i = 1, ..., m
x_1 >= 0, ..., x_n >= 0

is a convex program if f is convex and each g_i is convex.

Max f(x_1, ..., x_n)
s.t. g_i(x_1, ..., x_n) <= b_i, i = 1, ..., m
x_1 >= 0, ..., x_n >= 0

is a convex program if f is concave and each g_i is convex.
Linearly Constrained Convex Function with Unique Global Maximum

Maximize f(x) = (x_1 - 2)^2 + (x_2 - 2)^2

subject to 3x_1 - 2x_2 <= 6
-x_1 + x_2 <= 3
x_1 + x_2 <= 7
2x_1 - 3x_2 <= 4

[Figure: the feasible polygon in the (x_1, x_2) plane with contours of f.]
(Nonconvex) Optimization Problem
First-Order Optimality Conditions

Minimize {f(x) : g_i(x) <= b_i, i = 1, ..., m}

Lagrangian:

\[L(\mathbf{x}, \boldsymbol{\mu}) = f(\mathbf{x}) + \sum_{i=1}^{m}\mu_i\left(g_i(\mathbf{x}) - b_i\right)\]

Optimality conditions:
- Stationarity: \(\nabla L(\mathbf{x}, \boldsymbol{\mu}) = \nabla f(\mathbf{x}) + \sum_{i=1}^{m}\mu_i\nabla g_i(\mathbf{x}) = \mathbf{0}\)
- Complementarity: \(\mu_i\left(g_i(\mathbf{x}) - b_i\right) = 0, \ i = 1, \ldots, m\)
- Feasibility: \(g_i(\mathbf{x}) \le b_i, \ i = 1, \ldots, m\)
- Nonnegativity: \(\mu_i \ge 0, \ i = 1, \ldots, m\)
Importance of Convex Programs

NLP algorithms try to find a point where the gradient of the Lagrangian function is zero (a stationary point) and complementary slackness holds.

Given \(L(\mathbf{x}, \mu) = f(\mathbf{x}) + \mu(g(\mathbf{x}) - b)\), we want:

\[\nabla L(\mathbf{x}, \mu) = \nabla f(\mathbf{x}) + \mu\nabla g(\mathbf{x}) = \mathbf{0}\]
\[\mu(g(\mathbf{x}) - b) = 0\]
\[g(\mathbf{x}) - b \le 0, \quad \mu \ge 0\]

For a convex program, all local solutions are global optima. Commercial optimization software cannot guarantee that a solution is globally optimal to a nonconvex program.
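These conditions can be illustrated on a small hypothetical problem (not from the slides): minimize \((x_1 - 1)^2 + (x_2 - 2)^2\) subject to \(x_1 + x_2 \le 2\). Guessing that the constraint is binding, stationarity plus feasibility reduce to a linear system:

```python
import numpy as np

# Stationarity (2*(x1 - 1) + mu = 0, 2*(x2 - 2) + mu = 0) and the binding
# constraint (x1 + x2 = 2), rearranged into A @ [x1, x2, mu] = b:
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([2.0, 4.0, 2.0])

x1, x2, mu = np.linalg.solve(A, b)
print(round(x1, 3), round(x2, 3), round(mu, 3))  # 0.5 1.5 1.0

# mu >= 0 confirms the guess: the first-order conditions hold at (0.5, 1.5),
# and since the problem is convex, this point is the global optimum.
assert mu >= 0
```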
NLP Problem Classes

- Constrained vs. unconstrained
- Convex programming problem
- Quadratic programming problem: \(f(\mathbf{x}) = a + \mathbf{c}^T\mathbf{x} + \mathbf{x}^T\mathbf{Q}\mathbf{x}\), \(\mathbf{Q}\) positive semi-definite
- Separable programming problem: \(f(\mathbf{x}) = \sum_{j=1}^{n} f_j(x_j)\)
- Geometric programming problem: \(g(\mathbf{x}) = \sum_{t=1}^{T} c_t P_t(\mathbf{x})\), \(P_t(\mathbf{x}) = x_1^{a_{t1}} \cdots x_n^{a_{tn}}\), \(x_j > 0\)
- Equality constrained problems
What You Should Know About Nonlinear Programming

- How to identify a convex program.
- How to write out the first-order optimality conditions.
- The difference between a local and global solution.
- How to classify problems.
Outline of Part 1

- Equality-Constrained Problems
- Lagrange Multipliers
- Kuhn-Tucker Conditions
- Kuhn-Tucker Theorems
Equality-Constrained Problems

GOAL: solve the problem as an unconstrained problem by explicitly eliminating K independent variables using the equality constraints.
Example 5.1
Optimisation with Equality Constraints

\[\min_{\mathbf{x}} f(\mathbf{x}); \quad \mathbf{x} \in \mathbb{R}^n\]
subject to: \(\mathbf{h}(\mathbf{x}) = \mathbf{0}\), m constraints (m <= n)

Elimination of variables, example:

\[\min_{\mathbf{x}} f(\mathbf{x}) = 4x_1^2 + 5x_2^2 \quad \text{(a)}\]
\[\text{s.t.} \quad 2x_1 + 3x_2 = 6 \quad \text{(b)}\]

Using (b) to eliminate x_1 gives:

\[x_1 = \frac{6 - 3x_2}{2} \quad \text{(c)}\]

and substituting into (a):

\[f(x_2) = (6 - 3x_2)^2 + 5x_2^2\]
At a stationary point:

\[\frac{df(x_2)}{dx_2} = -6(6 - 3x_2) + 10x_2 = 0 \ \Rightarrow\ 28x_2 = 36 \ \Rightarrow\ x_2^* = 1.286\]

Then using (c):

\[x_1^* = \frac{6 - 3x_2^*}{2} = 1.071\]

Hence, the stationary point (min) is: (1.071, 1.286).
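A quick numerical cross-check of the elimination result (an added sketch, scanning the reduced one-variable function on a fine grid):

```python
import numpy as np

# f after eliminating x1 via (c): f(x2) = (6 - 3*x2)**2 + 5*x2**2
f_reduced = lambda x2: (6 - 3*x2)**2 + 5*x2**2

x2_grid = np.linspace(0, 3, 300001)          # spacing 1e-5
x2_star = x2_grid[np.argmin(f_reduced(x2_grid))]
x1_star = (6 - 3*x2_star) / 2                # recover x1 from (c)
print(round(x1_star, 3), round(x2_star, 3))  # 1.071 1.286
```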
What if?
Lagrange Multipliers

Converting a constrained problem to an unconstrained problem with the help of certain unspecified parameters known as Lagrange multipliers.
Lagrange Multipliers
Lagrange function
Lagrange Multipliers
Lagrange multiplier
If:

\[\frac{\partial f}{\partial x_1}\,dx_1 + \frac{\partial f}{\partial x_2}\,dx_2 = 0 \quad\text{and}\quad \frac{\partial h}{\partial x_1}\,dx_1 + \frac{\partial h}{\partial x_2}\,dx_2 = 0,\]

nontrivial, nonunique solutions for dx_1 and dx_2 will exist. This is achieved by setting:

\[\frac{\partial f}{\partial x_1} + \lambda\frac{\partial h}{\partial x_1} = 0 \quad\text{with}\quad \frac{\partial f}{\partial x_2} + \lambda\frac{\partial h}{\partial x_2} = 0,\]

where \(\lambda\) is known as a Lagrange multiplier.
If an augmented objective function, called the Lagrangian, is defined as:

\[L(x_1, x_2, \lambda) = f(x_1, x_2) + \lambda h(x_1, x_2)\]

we can solve the constrained optimisation problem by solving:

\[\frac{\partial L}{\partial x_1} = \frac{\partial f}{\partial x_1} + \lambda\frac{\partial h}{\partial x_1} = 0 \quad \text{(provides equation (a))}\]
\[\frac{\partial L}{\partial x_2} = \frac{\partial f}{\partial x_2} + \lambda\frac{\partial h}{\partial x_2} = 0 \quad \text{(provides equation (b))}\]
\[\frac{\partial L}{\partial \lambda} = h(x_1, x_2) = 0 \quad \text{(re-statement of the equality constraint)}\]
Generalizing: To solve the problem:

\[\min_{\mathbf{x}} f(\mathbf{x}); \quad \mathbf{x} \in \mathbb{R}^n\]
subject to: \(\mathbf{h}(\mathbf{x}) = \mathbf{0}\), m constraints (m <= n),

define the Lagrangian:

\[L(\mathbf{x}, \boldsymbol{\lambda}) = f(\mathbf{x}) + \boldsymbol{\lambda}^T\mathbf{h}(\mathbf{x}), \quad \boldsymbol{\lambda} \in \mathbb{R}^m\]

and the stationary point (points) is obtained from:

\[\nabla_{\mathbf{x}}L(\mathbf{x}, \boldsymbol{\lambda}) = \nabla f(\mathbf{x}) + \left(\boldsymbol{\lambda}^T\nabla\mathbf{h}(\mathbf{x})\right)^T = \mathbf{0}\]
\[\nabla_{\boldsymbol{\lambda}}L(\mathbf{x}, \boldsymbol{\lambda}) = \mathbf{h}(\mathbf{x}) = \mathbf{0}\]
Example: Consider the previous example again. The Lagrangian is:

\[L = 4x_1^2 + 5x_2^2 + \lambda(2x_1 + 3x_2 - 6)\]

\[\frac{\partial L}{\partial x_1} = 8x_1 + 2\lambda = 0 \quad \text{(a)}\]
\[\frac{\partial L}{\partial x_2} = 10x_2 + 3\lambda = 0 \quad \text{(b)}\]
\[\frac{\partial L}{\partial \lambda} = 2x_1 + 3x_2 - 6 = 0 \quad \text{(c)}\]

Substituting (a) and (b) into (c) gives:

\[-\frac{\lambda}{2} - \frac{9\lambda}{10} - 6 = 0\]
\[\lambda^* = -\frac{30}{7} = -4.286\]
\[x_1^* = -\frac{\lambda^*}{4} = \frac{15}{14} = 1.071, \qquad x_2^* = -\frac{3\lambda^*}{10} = \frac{9}{7} = 1.286\]

which agrees with the previous result.

[Figure: contours of f(x) and the constraint line 2x_1 + 3x_2 = 6 in the (x_1, x_2) plane.]
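Because the stationarity conditions (a)-(c) are linear in \((x_1, x_2, \lambda)\), they can also be solved directly as a 3x3 linear system (an added sketch, not from the slides):

```python
import numpy as np

# rows: dL/dx1 = 8*x1 + 2*lam = 0; dL/dx2 = 10*x2 + 3*lam = 0;
#       dL/dlam = 2*x1 + 3*x2 - 6 = 0
A = np.array([[8.0, 0.0, 2.0],
              [0.0, 10.0, 3.0],
              [2.0, 3.0, 0.0]])
b = np.array([0.0, 0.0, 6.0])

x1, x2, lam = np.linalg.solve(A, b)
print(round(x1, 3), round(x2, 3), round(lam, 3))  # 1.071 1.286 -4.286
```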
Example 5.2
Test whether the stationary point corresponds to a minimum: positive definite.
Example 5.3
positive definite
negative definite
Limitations of Analytical Methods

The computations needed to evaluate the above conditions can be extensive and intractable. Furthermore, the resulting simultaneous equations required for solving x*, λ* and u* are often nonlinear and cannot be solved without resorting to numerical methods.

The results may be inconclusive.

For these reasons, we often have to resort to numerical methods for solving optimisation problems, using computer codes (e.g. MATLAB).
Kuhn-Tucker Conditions
NLP problem
Kuhn Tucker conditions
Kuhn-Tucker Conditions (a.k.a. the Kuhn-Tucker Problem, KTP)
Example 5.4
Kuhn-Tucker Theorems

1. Kuhn-Tucker Necessity Theorem
2. Kuhn-Tucker Sufficiency Theorem
Kuhn-Tucker Necessity Theorem

For certain special NLP problems, the constraint qualification is satisfied:
1. When all the inequality and equality constraints are linear
2. When all the inequality constraints are concave functions and the equality constraints are linear

! When the constraint qualification is not met at the optimum, there may not exist a solution to the KTP.
Example 5.5

x* = (1, 0)

The constraint qualification requires that the gradients of the binding inequality constraints and of the equality constraints, \(\nabla h_k(\mathbf{x}^*)\) for k = 1, ..., K, are linearly independent at the optimum.
No Kuhn-Tucker point exists at the optimum.
Example 5.6
Kuhn-Tucker Sufficiency Theorem

Let f(x) be convex, the inequality constraints g_j(x) for j = 1, ..., J be all concave functions, and the equality constraints h_k(x) for k = 1, ..., K be linear.

If there exists a solution (x*, u*, v*) that satisfies the KTCs, then x* is an optimal solution.
[Figure: contours of the objective (values from 10 down to -60) and the constraints h_1(x) = 0, g_1(x) = 0, and x_2 = 0, with the candidate point (1.00, 4.90) marked.]
We test each Kuhn-Tucker condition in turn:

(a) All functions are seen by inspection to be twice differentiable.

(b) We assume the Lagrange multipliers exist.

(c) Are the constraints satisfied?

- h_1: 25 - (1.00)^2 - (4.90)^2 = -0.01 ~ 0 - yes
- g_1: 10(1.00) - (1.00)^2 + 10(4.90) - (4.90)^2 - 34 = -0.01 ~ 0 - yes (binding)
- g_2: evaluates to 19.21 > 0 - yes (not active)
- g_3: x_1 = 1.00 > 0 - yes (not active)
- g_4: x_2 = 4.90 > 0 - yes (not active)
To test the rest of the conditions we need to determine the Lagrange multipliers using the stationarity conditions. First we note that from condition (e) we require:

\[u_j^*\,g_j(\mathbf{x}^*) = 0, \quad j = 1, 2, 3, 4\]

- \(u_1^*\) can have any value, because \(g_1(\mathbf{x}^*) = 0\)
- \(u_2^*\) must be zero, because \(g_2(\mathbf{x}^*) \ne 0\)
- \(u_3^*\) must be zero, because \(g_3(\mathbf{x}^*) \ne 0\)
- \(u_4^*\) must be zero, because \(g_4(\mathbf{x}^*) \ne 0\)
Now consider the stationarity condition \(\nabla L(\mathbf{x}, \lambda_1, u_1) = \mathbf{0}\), where, since \(u_2^* = u_3^* = u_4^* = 0\):

\[L = 4x_1 - x_2^2 - 12 - \lambda_1\left(25 - x_1^2 - x_2^2\right) - u_1\left(10x_1 - x_1^2 + 10x_2 - x_2^2 - 34\right)\]

Hence:

\[\frac{\partial L}{\partial x_1} = 4 + 2\lambda_1^*x_1^* - u_1^*(10 - 2x_1^*) = 4 + 2\lambda_1^* - 8u_1^* = 0\]
\[\frac{\partial L}{\partial x_2} = -2x_2^* + 2\lambda_1^*x_2^* - u_1^*(10 - 2x_2^*) = -9.8 + 9.8\lambda_1^* - 0.2u_1^* = 0\]

giving \(\lambda_1^* = 1.015\) and \(u_1^* = 0.754\).
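Assuming the coefficients reconstructed above, the two stationarity equations form a 2x2 linear system in \((\lambda_1, u_1)\) that can be cross-checked numerically (an added sketch):

```python
import numpy as np

# Stationarity equations in (lambda1, u1):
#    4 + 2*lam - 8*u   = 0
# -9.8 + 9.8*lam - 0.2*u = 0
A = np.array([[2.0, -8.0],
              [9.8, -0.2]])
b = np.array([-4.0, 9.8])

lam, u = np.linalg.solve(A, b)
print(round(lam, 3), round(u, 3))  # 1.015 0.754
```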
Now we can check the remaining conditions:

(d) Are \(u_j^* \ge 0, \ j = 1, 2, 3, 4\)?
\(u_1^* = 0.754, \ u_2^* = u_3^* = u_4^* = 0\). Hence, the answer is yes.

(e) Are \(u_j^*\,g_j(\mathbf{x}^*) = 0, \ j = 1, 2, 3, 4\)?
Yes, because we have already used this above.

(f) Is the Lagrangian function at a stationary point?
Yes, because we have already used this above.
Hence, all the Kuhn-Tucker conditions are satisfied and we can have confidence in the solution:

x* = (1.00, 4.90)
Example 5.4: f(x) is convex; the inequality constraints g_j(x) for j = 1, ..., J are all concave functions; the equality constraints h_k(x) for k = 1, ..., K are linear.
Example 5.4: f(x) is convex; its Hessian is positive semi-definite.
Example 5.4: the inequality constraints g_j(x) for j = 1, ..., J are all concave functions: g_1(x) is linear, hence both convex and concave, and the remaining inequality constraint has a negative definite Hessian, hence is concave.
Example 5.4: the equality constraints h_k(x) for k = 1, ..., K are linear.
Remarks

For practical problems, the constraint qualification will generally hold. If the functions are differentiable, a Kuhn-Tucker point is a possible candidate for the optimum. Hence, many of the NLP methods attempt to converge to a Kuhn-Tucker point.
Remarks
When the sufficiency conditions of Theorem 5.2 hold, a Kuhn-Tucker point automatically becomes the global minimum. Unfortunately, the sufficiency conditions are difficult to verify, and often practical problems may not possess these nice properties. Note that the presence of one nonlinear equality constraint is enough to violate the assumptions of Theorem 5.2.
Remarks
The sufficiency conditions of Theorem 5.2 have been generalized further to nonconvex inequality constraints, nonconvex objectives, and nonlinear equality constraints. These use generalizations of convex functions such as quasi-convex and pseudoconvex functions.