Page 1:

MAE 552 – Heuristic Optimization

Lecture 3

January 28, 2002

Page 2:

Types of Optimization and Optimization Algorithms

•There are many different types of optimization problems and even more algorithms for solving them.

•Why are there so many algorithms?

•None of the traditional algorithms is robust across all types of problems.

•For every different type of problem there is a different solution method.

•This becomes a problem when a user knows only a limited number of algorithms and applies them to problems they were not intended for.

Page 4:

Important Categories

Methods can be classified according to:

1. The type of problem being addressed

• Convex vs. Multi-Modal

• Continuous vs. Discrete

• Constrained vs. Unconstrained

• Linear, Quadratic and Nonlinear Programming

• Analytical or Numerical

Page 5:

Important Categories

Methods can be classified according to:

2. The type of technique being used

• Deterministic vs. Random

• Derivative-based vs. Derivative-free

• Global vs. Local

Page 6:

Types of Problems: Convex versus Multimodal

1. Strictly convex problems have a single optimum

• Local Optimum = Global Optimum

• The Hessian matrix has all positive eigenvalues (a numerical check is sketched below)

[Figure: strictly convex F(x) with a single optimum at x*]
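As an aside not on the original slides, the positive-eigenvalue test translates directly into a few lines of NumPy. This is a minimal sketch; the example Hessian is made up for illustration.

```python
import numpy as np

def is_strictly_convex_quadratic(H):
    """A quadratic F(x) = 0.5 x^T H x + b^T x + c is strictly convex
    exactly when every eigenvalue of its symmetric Hessian H is positive."""
    eigenvalues = np.linalg.eigvalsh(H)  # eigvalsh handles symmetric matrices
    return bool(np.all(eigenvalues > 0))

# Made-up example: Hessian of a strictly convex quadratic in two variables.
H = np.array([[2.0, 0.5],
              [0.5, 1.0]])
print(is_strictly_convex_quadratic(H))  # True: both eigenvalues are positive
```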

Page 7:

Types of Problems: Convex versus Multimodal

2. Non-convex problems may have multiple local optima

[Figure: non-convex F(x) with multiple local optima]

Page 8:

Types of Problems: Continuous vs. Discrete

1. In continuous problems x is defined over a range

• xl ≤ x ≤ xu

• Examples: length, weight, temperature, etc.

[Figure: F(x) over a continuous range of x]

Page 9:

Types of Problems: Continuous vs. Discrete

1. In discrete problems x is defined only for specific values

• xl ≤ x ≤ xu, where x can take on only specific values

• Examples: Material Type, routes taken, standardized sizing

[Figure: F(x) defined only at discrete values of x]

Page 10:

Types of Problems: Linear, Quadratic and Nonlinear F

1. Problems with Linear or Quadratic Objective Functions and constraints are much easier to solve than those with general nonlinear terms.

Examples:

Linear: F = ax1 + bx2 + c

Quadratic: F = ax1 + bx2 + cx2^2 + d

Nonlinear: F = ax1 + bx2^2 + cx2^5 + x1·x2^(1/2)

Page 11:

Types of Problems: Analytical or Numerical

• Analytical Problems: Equations available to calculate objective function and constraints

F(x) = ax1 + bx2 + c

g1(x) = x1 + x2 ≤ 0

• Numerical Problems: Analysis codes used to determine objective function and constraints

[Diagram: x → ANALYSIS → F(x), g(x)]

Page 12:

Type of Technique: Deterministic vs. Random

Deterministic Methods:

With a given set of initial conditions, these algorithms always follow the same path.

[Figure: a single repeatable search path starting from X0]

Examples: Enumerative Search, Steepest Descent, GRG, Conjugate Gradient, Powell's Method, etc.

Page 13:

Type of Technique: Deterministic vs. Random

Random Methods:

With a given set of initial conditions, these algorithms follow different paths on each run.

[Figure: several different search paths starting from the same X0]

Examples: Scatter Search, Genetic Algorithms, Simulated Annealing
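To make the deterministic/random distinction concrete, here is a hedged sketch (not from the slides): a toy random method whose path changes on every run unless the random seed is fixed. The step rule is a made-up stand-in for a real algorithm.

```python
import random

def random_search_path(steps, seed=None):
    """Toy random method: from x0 = 0, take uniformly random steps.
    With a fixed seed the path is reproducible; without one it
    differs on each run, unlike a deterministic method."""
    rng = random.Random(seed)
    x, path = 0.0, [0.0]
    for _ in range(steps):
        x += rng.uniform(-1.0, 1.0)
        path.append(x)
    return path

print(random_search_path(3))           # different path on every run
print(random_search_path(3, seed=42))  # same path on every run
```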

Page 14:

Type of Technique: Derivative-Based vs. Derivative-Free

Derivative Based Methods:

First and second derivatives of the objective function are calculated either numerically or analytically.

Examples: Steepest Descent, Newton's Method, Conjugate Gradient, Variable Metric Method, etc.

Derivative Free Methods:

Only F(x) information is used to guide the search.

Examples: Powell’s Search, Random Search, SA, GA, etc.

Page 15:

Type of Technique: Local or Global Optimizer

•Local Search Methods: Find the best design within a portion of the design space, usually a convex region (neighborhood).

•These methods include almost all of the traditional numerical techniques, both zero-order and derivative-based.

•Tend to get stuck in valleys of the design space.

[Figure: multimodal F(x) over x, showing one global optimum at x* and two local optima]

Page 16:

Type of Technique: Local or Global Optimizer

•Global Search Methods: Find the best design within the entire design space.

•Tend to be robust but inefficient and expensive to run.

[Figure: multimodal F(x) over x, showing one global optimum at x* and two local optima]

Page 17:

Choosing the Right Method

1. It is very important to match the optimization method to the problem that you are solving.

2. Know the assumptions involved with each method.

• Example: LP requires a linear objective and linear constraints.

3. Know the relative efficiency of each proposed method.

• Example: Using an Enumerative Search to solve a convex problem will be very inefficient.

4. Some problems are intractable!!!

• There are no methods that can guarantee a global optimum in finite time. Use methods that find good solutions fast.

Page 18:

Basics of Problem Solving

•Which algorithm is right for a problem is strongly dependent on how the problem is mathematically described.

•Three basic concepts are common to every algorithmic approach to problem solving:

1. A representation of the problem

2. The objective

3. The evaluation function

Page 19:

Basics of Problem Solving - Representation

•The representation encodes alternative candidate solutions for manipulation and defines the size of the search space.

•For example, in NLP the design variables are typically represented as real numbers in n dimensions. Is this the only possibility?

•In the TSP the representation we have used is a permutation of the natural numbers 1, 2, 3, …, n.

•The size of the search space is not determined by the problem; it is determined by your representation and the way you handle this encoding.

Page 20:

Basics of Problem Solving - Representation

•Your choice of representation determines how you can manipulate the potential solutions.

•Alternative representations for an NLP with bounds 0 ≤ x1 ≤ 10 and 0 ≤ x2 ≤ 10:

•Real-valued: {x1,x2} = {5, 6}; binary-coded: {x1,x2} = {101, 110}

[Figure: two-dimensional design space with 0 ≤ x1 ≤ 10 and 0 ≤ x2 ≤ 10]
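As a hedged illustration of these two encodings (the 3-bit width comes from the slide's own {101, 110} example, not from any stated standard):

```python
# Two representations of the same candidate {x1, x2} = {5, 6}
# on the design space 0 <= xi <= 10.

real_valued  = (5.0, 6.0)      # NLP-style: real numbers in n dimensions
binary_coded = ("101", "110")  # GA-style bit strings: 101 -> 5, 110 -> 6

def decode(bits):
    """Map a bit string back to the design variable it encodes."""
    return int(bits, 2)

assert tuple(decode(b) for b in binary_coded) == (5, 6)

# The representation fixes the search space: 3 bits per variable give
# only 2**3 = 8 values (0..7), so covering the full 0..10 range would
# need a wider encoding -- the encoding, not the problem, sets the size.
```

This also illustrates the previous slide's point that the size of the search space follows from the encoding you choose.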

Page 21:

Basics of Problem Solving - Objective

•The objective mathematically represents the task to be achieved.

•For the TSP the objective is typically to minimize the distance covered by the salesman, subject to the constraint of visiting each city once and only once and returning to the starting city.

min dist(legal trip)

•For NLP an example might be:

min F = x1 + x2 + x3
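A minimal sketch of the TSP objective as an evaluation routine (not code from the lecture; the city coordinates below are made up):

```python
import math

def tour_length(tour, coords):
    """TSP objective: total distance of the closed tour that visits
    each city once and returns to the starting city."""
    total = 0.0
    for i in range(len(tour)):
        a = coords[tour[i]]
        b = coords[tour[(i + 1) % len(tour)]]  # wrap back to the start
        total += math.dist(a, b)
    return total

coords = {1: (0, 0), 2: (1, 0), 3: (1, 1), 4: (0, 1)}  # made-up cities
print(tour_length([1, 2, 3, 4], coords))  # 4.0 around the unit square
```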

Page 22:

Basics of Problem Solving - Evaluation Function

•The evaluation function is typically a mapping from the space of possible candidate solutions to a set of numbers, where each candidate solution is assigned a number indicating its quality.

•The EF allows you to compare solutions in terms of quality.

•Ordinal EFs allow you to rank the candidate solutions.

Example: the current candidate is the tenth best.

•Numerical EFs allow you to rank the solutions and also quantify their degree of quality.

Example: the current candidate is 1.3 units better than the second-best candidate.

Page 23:

Basics of Problem Solving - Evaluation Function

•Since it can be computationally expensive to determine a numerical value for how good or bad a candidate is, it might be good enough to know approximately how good a solution is.

[Diagram: Candidate 1 and Candidate 2 → Eval. Function → Superior Candidate]

Page 24:

Basics of Problem Solving - Evaluation Function

•For every real-world problem the evaluation function is chosen by the designer.

•It should, of course, indicate that a solution that meets the objective is better than one that does not.

•It should also depend on factors such as the computational complexity of the problem.

•Often the objective function indicates a good evaluation function.

Objective: minimize stress → Evaluation function: stress

Page 25:

Basics of Problem Solving - Evaluation Function

•Other times you cannot derive a useful Evaluation Function from the objective:

•In the SAT problem the objective is to find a set of Boolean (TRUE/FALSE) variables that satisfies a logical statement (makes it TRUE).

•All wrong candidate solutions return FALSE, which does not tell you how to improve the solution.
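One commonly used workaround, sketched here as an assumption rather than something stated on the slide, is to grade candidates by the number of satisfied clauses so that wrong answers can still be compared; the 3-clause formula below is a made-up example.

```python
# Formula in conjunctive normal form: +i means variable i, -i means NOT i.
clauses = [[1, -2], [2, 3], [-1, -3]]  # made-up example formula

def satisfied_clauses(assignment, clauses):
    """Graded evaluation function for SAT: count satisfied clauses
    instead of returning the whole formula's bare TRUE/FALSE."""
    count = 0
    for clause in clauses:
        if any((lit > 0) == assignment[abs(lit)] for lit in clause):
            count += 1
    return count

assignment = {1: True, 2: True, 3: False}
print(satisfied_clauses(assignment, clauses))  # 3 of 3 clauses satisfied
```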

Page 26:

Basics of Problem Solving - Defining a Search Problem

•When you design an evaluation function, you need to consider that for many problems the only solutions of interest are the subset that are feasible (satisfy the constraints).

•The feasible space can be defined as F, where F ⊆ S.

•A search problem can then be defined as:

Given a search space S and its feasible part F ⊆ S, find x ∈ F such that

eval(x) ≤ eval(y)

for all y ∈ F (assuming minimization)

•Note that the objective does not appear at all in the formulation!!

•If your EF does not correspond with the objective, you will be searching for the answer to the wrong problem.
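For a finite search space this definition can be read directly as code; the brute-force sketch below (with a made-up feasibility test and evaluation function) is only practical for tiny spaces:

```python
def global_solution(S, feasible, eval_fn):
    """Literal reading of the definition: among the feasible part
    F = {x in S : feasible(x)}, return an x with eval(x) <= eval(y)
    for every y in F (assuming minimization)."""
    F = [x for x in S if feasible(x)]
    return min(F, key=eval_fn)

# Made-up example: minimize (x - 3)^2 over 0..9, feasible if x is even.
x_star = global_solution(range(10), lambda x: x % 2 == 0,
                         lambda x: (x - 3) ** 2)
print(x_star)  # 2 (ties with 4; min returns the first found)
```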

Page 27:

Basics of Problem Solving - Defining a Search Problem

• A point x that satisfies the condition is called a global solution.

•Finding a global solution can be difficult, and in some cases it is impossible to prove that a given solution is global.

•It would be easier if we could limit the search to a smaller area of S.

•This fact underlies many search techniques.

Page 28:

Basics of Problem Solving - Neighborhood Search

• If we concentrate on the area of S ‘near’ to some point in the search space, we can more easily search this ‘neighborhood’.

[Diagram: search space S with point x and its neighborhood N(x)]

•The neighborhood N(x) of x is the set of all points in the search space that are ‘close’ to the given point x:

N(x) = {y ∈ S : dist(x, y) ≤ ε}

Page 29:

Basics of Problem Solving - Neighborhood Search

• For a continuous NLP the Euclidean distance can be used to define a neighborhood.

dist(x, y) = √( Σ (xi − yi)² ),  summing over i = 1, …, n
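Put together with the definition of N(x), the distance formula gives a one-line membership test; this sketch and its tolerance eps are illustrative assumptions, not lecture code:

```python
import math

def dist(x, y):
    """Euclidean distance between two points in n dimensions."""
    return math.sqrt(sum((xi - yi) ** 2 for xi, yi in zip(x, y)))

def in_neighborhood(x, y, eps):
    """Membership test for N(x) = {y in S : dist(x, y) <= eps}."""
    return dist(x, y) <= eps

print(in_neighborhood((0.0, 0.0), (0.3, 0.4), eps=0.6))  # True: dist = 0.5
```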

•For the TSP a 2-swap neighborhood can be defined as all of the candidates that would result from swapping two cities in a given tour.

A solution x (a permutation of n = 5 cities)

1-2-3-4-5 has n(n-1)/2 = 10 neighbors, including

1-3-2-4-5 (swapping cities 2 and 3)

5-2-3-4-1 (swapping cities 1 and 5)

etc.
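A minimal sketch that enumerates this 2-swap neighborhood (matching the slide's 5-city example; not code from the lecture):

```python
from itertools import combinations

def two_swap_neighbors(tour):
    """All tours obtained by swapping the cities at two positions:
    n(n-1)/2 neighbors for a tour of n cities."""
    neighbors = []
    for i, j in combinations(range(len(tour)), 2):
        neighbor = list(tour)
        neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
        neighbors.append(neighbor)
    return neighbors

nbrs = two_swap_neighbors([1, 2, 3, 4, 5])
print(len(nbrs))  # 10 = 5*4/2 neighbors
print(nbrs[3])    # [5, 2, 3, 4, 1], the slide's "swap cities 1 and 5"
```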

Page 30:

Basics of Problem Solving - Neighborhood Search

Example: quadratic objective F(x) = x^2 + 3 with no constraints

[Figure: parabola F(x) = x^2 + 3 with current point xc]

Page 31:

Basics of Problem Solving - Neighborhood Search

Step 1: Define a neighborhood around point xc.

N(xc): xc − ε ≤ x ≤ xc + ε

[Figure: min F = x^2 + 3, with a neighborhood of width 2ε centered on xc]

Page 32:

Basics of Problem Solving - Neighborhood Search

Step 2: Sample a candidate solution x1 from the neighborhood and evaluate it.

If F(x1) > F(xc), reject the point and choose another.

[Figure: F(x) = x^2 + 3 with candidate x1 rejected because F(x1) > F(xc)]

Page 33:

Basics of Problem Solving - Neighborhood Search

If F(x1) < F(xc), accept the point and replace the current point xc with x1.

[Figure: F(x) = x^2 + 3 with candidate x1 accepted because F(x1) < F(xc)]

Page 34:

Basics of Problem Solving - Neighborhood Search

Step 3: Create a new neighborhood around the updated xc and repeat the process.

[Figure: F(x) = x^2 + 3 with a new neighborhood centered on the updated xc]
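Steps 1-3 combine into a simple loop. This is a hedged sketch on the slides' F(x) = x^2 + 3, with the neighborhood size eps, iteration budget, and seed chosen arbitrarily for illustration:

```python
import random

def F(x):
    return x ** 2 + 3  # quadratic objective from the slides

def neighborhood_search(xc, eps=0.5, iterations=1000, seed=0):
    """Steps 1-3: sample a candidate x1 from [xc - eps, xc + eps],
    accept it only if F(x1) < F(xc), then repeat around the
    (possibly updated) current point xc."""
    rng = random.Random(seed)
    for _ in range(iterations):
        x1 = xc + rng.uniform(-eps, eps)  # Steps 1-2: sample and evaluate
        if F(x1) < F(xc):                 # accept improvements only
            xc = x1                       # Step 3: recenter the neighborhood
    return xc

print(neighborhood_search(xc=4.0))  # converges toward the optimum x* = 0
```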

Page 35:

Basics of Problem Solving - Neighborhood Search

•Most realistic problems are considerably more difficult than a quadratic bowl problem.

•The evaluation function defines a response surface that describes the topography of the search space with many hills and valleys.

Page 36:

Basics of Problem Solving - Neighborhood Search

•Finding the best peak or the lowest valley is like trying to navigate a mountain range in the dark with only a small lamp.

•Your decisions must be made using local information. You can sample points in a local area and then decide where to walk next.

Page 37:

Basics of Problem Solving - Neighborhood Search

•If you decide to always go uphill then you will reach a peak but not necessarily the highest peak.

•You may need to walk downhill in order to eventually reach the highest peak in the space.