What is Optimization?
Optimization is an iterative process by which a desired solution
(max/min) of a problem is found while satisfying all of its
constraints or bounded conditions.
Optimization problems can be linear or non-linear.
Non-linear optimization is accomplished by numerical "search
methods".
Search methods are applied iteratively until a solution is reached.
The search procedure is termed an algorithm.
Figure 2: Optimum solution is found while satisfying its constraint
(derivative must be zero at optimum).
Linear problems are solved by the Simplex or graphical methods.
The solution of a linear problem lies on the boundaries of the
feasible region.
A non-linear problem's solution may lie within as well as on the
boundaries of the feasible region.
Figure 3: Solution of linear problem
Figure 4: Three-dimensional solution of non-linear problem
What is Optimization? (Cont.)
Constraints
• Inequality
• Equality
Fundamentals of Non-Linear Optimization
Single Objective function f(x)
• Maximization
• Minimization
Design variables: xi, i = 0, 1, 2, 3, …
Figure 5: Example of design variables and constraints used in
non-linear optimization.
Maximize X1 + 1.5 X2
Subject to:
X1 + X2 ≤ 150
0.25 X1 + 0.5 X2 ≤ 50
X1 ≥ 50
X2 ≥ 25
X1 ≥ 0, X2 ≥ 0
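As a check on the linear program above, the sketch below enumerates the vertices of the feasible region (each vertex is the intersection of two constraint boundaries) and picks the best one, illustrating that a linear problem's optimum lies on the boundary. This is an illustrative brute-force approach, not the Simplex method itself.

```python
import itertools
import numpy as np

# Constraints written as a_i . x <= b_i (the lower bounds are folded
# in as inequalities with negated signs):
#   x1 + x2 <= 150, 0.25 x1 + 0.5 x2 <= 50, -x1 <= -50, -x2 <= -25
A = np.array([[1.0, 1.0],
              [0.25, 0.5],
              [-1.0, 0.0],
              [0.0, -1.0]])
b = np.array([150.0, 50.0, -50.0, -25.0])
c = np.array([1.0, 1.5])  # maximize c . x

best_x, best_val = None, -np.inf
# A vertex of the 2-D feasible region is where two constraints hold
# with equality; solve each 2x2 system and keep the feasible ones.
for i, j in itertools.combinations(range(len(b)), 2):
    M = A[[i, j]]
    if abs(np.linalg.det(M)) < 1e-12:
        continue  # parallel boundaries, no intersection point
    x = np.linalg.solve(M, b[[i, j]])
    if np.all(A @ x <= b + 1e-9):  # feasible vertex?
        val = c @ x
        if val > best_val:
            best_x, best_val = x, val

print(best_x, best_val)  # optimum at x1 = 100, x2 = 50, objective 175
```

Running this finds the optimum at the vertex X1 = 100, X2 = 50 with objective value 175, where the two "≤" constraints are both active.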
Optimal points
• Local minimum/maximum: a point x* is a local optimum if no other
x in its neighborhood gives a better function value than f(x*).
• Global minimum/maximum: a point x** is a global optimum if no
other x in the entire search space gives a better function value
than f(x**).
Figure 6: Global versus local optimization.
Figure 7: Local point is equal to global point if the function is
convex.
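To make the local/global distinction concrete, here is a small sketch (the double-well function and the step size are my own choices, not from the slides): plain gradient descent converges to whichever minimum is nearest its starting point, so only one of the two starting points finds the global minimum.

```python
def f(x):
    # Double-well function: minima near x = -1 and x = +1;
    # the +0.3*x tilt makes the left well the global minimum.
    return (x**2 - 1.0)**2 + 0.3 * x

def df(x):
    # Analytic derivative of f.
    return 4.0 * x * (x**2 - 1.0) + 0.3

def gradient_descent(x, lr=0.05, steps=500):
    # Deterministic iteration: move downhill along the gradient.
    for _ in range(steps):
        x -= lr * df(x)
    return x

x_right = gradient_descent(1.5)   # lands in the local minimum near +1
x_left = gradient_descent(-1.5)   # lands in the global minimum near -1
print(x_left, x_right, f(x_left), f(x_right))
```

Both runs satisfy the local-optimum definition, but f(x_left) < f(x_right): only the left well is the global minimum, which is why non-convex problems need global (often stochastic) strategies.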
Fundamentals of Non-Linear Optimization (Cont.)
A function f is convex if, for any two points X1 and X2, the value
f(Xa) at any intermediate point Xa lies below the chord joining
f(X1) and f(X2).
Convexity condition – the Hessian (2nd-order derivative) matrix of
function f must be positive semi-definite (eigenvalues positive or
zero).
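The eigenvalue test above can be carried out numerically. The sketch below (the example function is my own choice) checks convexity of f(x, y) = x² + xy + y², whose Hessian is the constant matrix [[2, 1], [1, 2]]:

```python
import numpy as np

# Hessian of f(x, y) = x**2 + x*y + y**2 (constant, since f is quadratic):
# d2f/dx2 = 2, d2f/dxdy = d2f/dydx = 1, d2f/dy2 = 2
H = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigvals = np.linalg.eigvalsh(H)   # eigenvalues of a symmetric matrix
is_convex = np.all(eigvals >= 0)  # positive semi-definite => f convex
print(eigvals, is_convex)  # eigenvalues 1 and 3, both positive
```

Since both eigenvalues are positive, the Hessian is positive definite and f is (strictly) convex, so any local minimum is the global one.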
Fundamentals of Non-Linear Optimization (Cont.)
Figure 8: Convex and nonconvex set
Figure 9: Convex function
Mathematical Background
The slope or gradient of the objective function f represents the
direction in which the function will decrease/increase most rapidly.
Derivative (slope) of f:
  df/dx = lim_{Δx→0} [f(x + Δx) − f(x)] / Δx
Taylor series expansion of f about a point x_p:
  f(x) = f(x_p) + (df/dx)|_{x_p} (x − x_p)
         + (1/2!) (d²f/dx²)|_{x_p} (x − x_p)² + …
Jacobian – matrix of the gradients of several functions (here f and
g) with respect to several variables:
  J = | ∂f/∂x  ∂f/∂y  ∂f/∂z |
      | ∂g/∂x  ∂g/∂y  ∂g/∂z |
Hessian – matrix of the second derivatives of f with respect to
several variables.
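The derivative and Taylor definitions can be checked numerically. In the sketch below (the function and step sizes are my own choices), a forward difference approximates df/dx, and a second-order Taylor expansion of f(x) = eˣ about x_p = 1 approximates f nearby:

```python
import math

f = math.exp  # f(x) = e**x, so every derivative of f is also e**x

# Forward-difference approximation of df/dx at x = 1:
h = 1e-6
num_deriv = (f(1.0 + h) - f(1.0)) / h
print(num_deriv)  # close to e = 2.71828...

# Second-order Taylor expansion about x_p = 1, evaluated at x = 1.1:
xp, x = 1.0, 1.1
taylor = f(xp) + f(xp) * (x - xp) + 0.5 * f(xp) * (x - xp)**2
print(taylor, f(x))  # Taylor approximation vs. exact value
```

The finite difference agrees with the exact derivative e to several digits, and the truncated Taylor series is accurate to roughly the size of the neglected cubic term.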
Second order condition (SOC)
• Eigenvalues of H(X*) are all positive
• Determinants of all leading principal minors of H(X*) are positive
  H = | ∂²f/∂x²   ∂²f/∂x∂y |
      | ∂²f/∂y∂x  ∂²f/∂y²  |
First order condition (FOC)
  ∇f(X*) = 0
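The FOC and SOC can be verified numerically at a candidate point. In the sketch below (the example function is my own choice), f(x, y) = x² + 2y² has its stationary point at the origin; the gradient vanishes there (FOC) and the Hessian's eigenvalues are positive (SOC), confirming a minimum:

```python
import numpy as np

# f(x, y) = x**2 + 2*y**2; candidate point X* = (0, 0).
def grad(x, y):
    # Gradient of f: (df/dx, df/dy)
    return np.array([2.0 * x, 4.0 * y])

H = np.array([[2.0, 0.0],   # d2f/dx2,  d2f/dxdy
              [0.0, 4.0]])  # d2f/dydx, d2f/dy2

foc = np.allclose(grad(0.0, 0.0), 0.0)  # FOC: gradient is zero at X*
eigvals = np.linalg.eigvalsh(H)
soc = np.all(eigvals > 0)               # SOC: all eigenvalues positive
print(foc, soc, eigvals)  # True True [2. 4.]
```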
Mathematical Background (Cont.)
Optimization Algorithm
Deterministic – specific rules (e.g. using the gradient and Hessian)
move the search from one iteration to the next.
Stochastic – probabilistic rules are used for subsequent iterations.
Optimal Design – engineering design based on an optimization
algorithm.
Lagrangian method – the sum of the objective function and a linear
combination of the constraints.
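As a small illustration of the Lagrangian method (the problem is my own choice, not from the slides): minimize f(x, y) = x² + y² subject to x + y = 1. Setting the gradient of L = f + λ(x + y − 1) to zero, together with the constraint, gives a linear system in (x, y, λ):

```python
import numpy as np

# Minimize f(x, y) = x**2 + y**2 subject to g(x, y) = x + y - 1 = 0.
# Lagrangian: L = f + lam * g. Stationarity + feasibility:
#   dL/dx = 2x + lam = 0
#   dL/dy = 2y + lam = 0
#   g:      x + y    = 1
K = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
rhs = np.array([0.0, 0.0, 1.0])

x, y, lam = np.linalg.solve(K, rhs)
print(x, y, lam)  # x = y = 0.5, lam = -1.0
```

The stationary point of the Lagrangian, x = y = 0.5, is the constrained minimum; the multiplier λ = −1 measures the sensitivity of the optimum to the constraint level.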
Multivariable techniques (make use of single-variable techniques,
especially the Golden Section).
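Since the slides lean on the Golden Section method, here is a minimal sketch of golden-section search (the bracketing interval and test function are my own choices): it shrinks a bracket around the minimum by the golden ratio each iteration, using only function values, no derivatives.

```python
import math

def golden_section_min(f, a, b, tol=1e-6):
    """Minimize a unimodal f on [a, b] by shrinking the bracket
    by the golden ratio each iteration (derivative-free)."""
    inv_phi = (math.sqrt(5.0) - 1.0) / 2.0  # ~0.618
    c = b - inv_phi * (b - a)
    d = a + inv_phi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c                 # minimum lies in [a, d]
            c = b - inv_phi * (b - a)
        else:
            a, c = c, d                 # minimum lies in [c, b]
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

# Example: f(x) = (x - 2)**2 has its minimum at x = 2.
x_min = golden_section_min(lambda x: (x - 2.0)**2, 0.0, 5.0)
print(x_min)  # ~2.0
```

Each iteration multiplies the bracket width by ~0.618, so the method converges linearly and is a common inner "line search" inside the multivariable techniques mentioned above.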
Optimization Methods
Deterministic
• Direct Search – uses objective function values only to locate the
minimum.
• Gradient Based – uses first- or second-order derivatives of the
objective function.
• For a maximization problem, the minimization machinery is applied
to the objective with a negative sign: minimize −f(x).
Single Variable
• Newton–Raphson – gradient-based technique (FOC)
• Golden Search – step-size-reducing iterative method
• Unconstrained Optimization
a.) Powell Method – quadratic (degree 2) polynomial model of the
objective function; non-gradient based.
b.) Gradient Based – Steepest Descent (FOC) or Least Squares
Minimum (LMS)
c.) Hessian Based – Conjugate Gradient (FOC) and BFGS (SOC)
• Constrained Optimization
a.) Indirect approach – by transforming into an unconstrained
problem.
b.) Exterior Penalty Function (EPF) and Augmented Lagrange
Multiplier
c.) Direct Method – Sequential Linear Programming (SLP), SQP and