Page 1

Multi-objective Optimization

Some introductory figures from: Kalyanmoy Deb, Multi-Objective Optimization using Evolutionary Algorithms, Wiley, 2001

Implementation of Constrained GA Based on NSGA-II

Page 2

Optimization

• Optimization refers to finding one or more feasible solutions which correspond to extreme values of one or more objectives

• Find the design variables x such that:

Minimize f(x)                    (single objective)

subject to  g_j(x) ≤ 0,   j = 1, …, n_j

            h_k(x) = 0,   k = 1, …, n_k

            x_i^(L) ≤ x_i ≤ x_i^(U)
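As an illustration (not from the original slides), a problem of this general form could be written in MATLAB roughly as below; the objective, constraints, and bounds are invented for the example.

% Hypothetical constrained single-objective problem (illustration only)
f  = @(x) (x(1) - 1)^2 + (x(2) - 2)^2;   % objective f(x) to minimize
g  = @(x) x(1) + x(2) - 3;               % inequality constraint g(x) <= 0
h  = @(x) x(1) - 2*x(2);                 % equality constraint h(x) = 0
xL = [-5 -5];  xU = [5 5];               % variable bounds x_i^(L), x_i^(U)

% A design x is feasible if all constraints and bounds are satisfied
x = [2 1];
feasible = all(g(x) <= 0) && all(abs(h(x)) < 1e-9) && all(x >= xL) && all(x <= xU);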

Page 3

Optimization Model Classification

• Basic classifications are:

– Constrained or unconstrained

– Linear or non-linear

– Single objective or multi-objective

• Another classification can be made by variable type:

– Continuous, discrete, or mixed-integer

Page 4

Single and Multi-objective Optimization

• Single Objective : Only one objective function

• Multi-Objective : Two or more and often conflicting objective functions

• e.g., buying a car: minimize cost and maximize comfort

Page 5

• Mapping between feasible decision space and objective space

• Dominated solutions: design points that perform worse than some other feasible design

• Domination criterion: a feasible solution x1 dominates another feasible solution x2 (denoted as x1 < x2) if both of the following conditions are true (a MATLAB sketch of this check follows this list):
1) x1 is no worse than x2 in all objectives, i.e. fi(x1) ≤ fi(x2) for all i
2) x1 is strictly better than x2 in at least one objective, i.e. fi(x1) < fi(x2) for at least one i

• Non-dominated solutions: if two solutions are compared and neither dominates the other, the solutions are said to be non-dominated with respect to each other

• Pareto optimal front: the function space representation of all the non-dominated solutions

[Figure: Pareto optimal front in objective space]
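A minimal MATLAB sketch of the domination check described above, assuming all objectives are to be minimized (not part of the original slides):

function d = dominates(f1, f2)
% DOMINATES  True if objective vector f1 dominates f2 (all objectives minimized).
%   f1, f2 : 1 x M vectors of objective values for two feasible designs.
d = all(f1 <= f2) && any(f1 < f2);
end

For a maximized objective, the sign can be flipped before the comparison; e.g. for the car example, comfort would enter as its negative so that "better" always means "smaller".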

Page 6

Pareto Optimal Front (contd.)

Options:
• Min – Min
• Min – Max
• Max – Min
• Max – Max

Which one is which?

Page 7

Solution Methods

• Methods that try to avoid generating the Pareto front
– Generate a "utopia point"
– Define the optimum based on some measure of distance from the utopia point

• Methods that generate the entire Pareto front (a weighted-sum sketch follows this list)
– Weighted sum of objectives with variable coefficients
– Optimize one objective for a range of constraints on the others
– Niching methods with population-based algorithms
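As an illustration of the weighted-sum idea (not from the original slides), the sketch below sweeps the weight between two invented single-variable objectives and collects one point per weight using base MATLAB's fminsearch:

% Toy bi-objective problem (illustration only): minimize f1 and f2
f1 = @(x) x.^2;
f2 = @(x) (x - 2).^2;

w  = linspace(0, 1, 21);           % weights for the weighted-sum scalarization
pf = zeros(numel(w), 2);           % collected front points (f1, f2)

for k = 1:numel(w)
    F  = @(x) w(k)*f1(x) + (1 - w(k))*f2(x);   % scalarized objective
    xk = fminsearch(F, 1);                      % unconstrained minimization
    pf(k, :) = [f1(xk) f2(xk)];
end

plot(pf(:,1), pf(:,2), 'o');  xlabel('f_1');  ylabel('f_2');

A weighted sum can only reach convex portions of the Pareto front, which is one motivation for the population-based (niching) approach used in the rest of these slides.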

Page 8

Implementation of Multi-objective Constrained GA, Based on NSGA-II

Page 9

Genetic Algorithms

• Genetic algorithms imitate the natural optimization process of natural selection in evolution

• Coding: replace the design variables with a string of digits or "genes"
– Binary
– Integer
– Real

• Population: create a population of design points
• Selection: select parents based on fitness
• Crossover: create child designs
• Mutation: mutate child designs

(A skeleton of this loop is sketched below.)
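A minimal sketch of one possible generation loop in MATLAB, using the parameter names from the Optimization Settings page; the helper functions are hypothetical placeholders for the steps described on the following pages, not the original program:

% Skeleton of the GA main loop (illustration; helper names are hypothetical)
pop = initialize_population(N_pop, N_var, LB, UB, N_increments);   % initialization (Page 13)
for gen = 1:N_gen
    rank     = rank_designs(pop);                   % constrained non-domination ranking (Page 14)
    parents  = select_parents(pop, rank);           % fitness-based roulette wheel (Page 15)
    children = make_children(parents, cross, mut);  % crossover and mutation (Page 16)
    pop      = elitism([pop; children], N_pop);     % keep the best N_pop designs (Pages 17-18)
end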

Page 10

Problem Formulation

• The current program is written for two objectives (M = 2); it is possible to change this

• Inequality constraints are defined in the ≥ 0 form

Page 11

NSGA-II

• The Non-dominated Sorting Genetic Algorithm II (NSGA-II) performs better than other constrained multi-objective optimizers* (PAES, SPEA)

– Better and faster convergence to true optimal front

– Better spread on Pareto optimal front

• NSGA-II ranks designs based on non-domination

• For example: a min-max problem (minimize cost, maximize comfort)

• Design 3 is dominated by both designs A and B (and is thus undesirable), but designs A and B are non-dominated with respect to one another (and thus Pareto optimal)

Design   Cost   Comfort
A        25K    65%
B        45K    80%
3        55K    50%

* Deb, K., et al., "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II," IEEE Transactions on Evolutionary Computation, Vol. 6, No. 2, pp. 182-197, 2002


Page 12

Flow Chart *

* From presentation of Tushar Goel

Page 13

Implementation

• Initialize population (a sketch follows this list)

– Fixed population size (N_pop)

– Fixed number of variables (N_var)

– Discrete variables

• Variable upper (UB) and lower bounds (LB)

• Number of increments (N_increments)

– Randomly distributed throughout the design space
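A minimal MATLAB sketch of this random discrete initialization, under the assumption that each variable takes one of N_increments(i) evenly spaced values between LB(i) and UB(i) (an assumption for illustration, not stated verbatim on the slide):

% Random discrete initial population (illustration; assumes N_increments(i) >= 2)
pop = zeros(N_pop, N_var);
for n = 1:N_pop
    for i = 1:N_var
        level = randi(N_increments(i));                   % pick one of the allowed levels
        step  = (UB(i) - LB(i)) / (N_increments(i) - 1);  % spacing between levels
        pop(n, i) = LB(i) + (level - 1) * step;           % map the level to a variable value
    end
end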

Page 14

Ranking

• Ranks designs based on non-domination
– The Pareto front is all rank 1 designs
– If the rank 1 designs are removed, the next Pareto front will be all rank 2 designs, etc.
– The sorting method is different from the one detailed in NSGA-II*

• Constraints: handled with constraint-domination ideas (a comparison sketch follows this slide)
– If two designs are both feasible, the standard non-domination techniques are used
– If one design is feasible and the other is not, the feasible one is favored (ranked lower)
– If both designs are infeasible, the design with the smaller overall constraint violation is favored (ranked lower)

[Figure: designs in objective space grouped into Rank 1 and Rank 2 fronts]

* Deb, K., et al., "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II," IEEE Transactions on Evolutionary Computation, Vol. 6, No. 2, pp. 182-197, 2002
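A minimal MATLAB sketch of this constraint-domination comparison (illustration only; dominates is the helper sketched on Page 5, and the overall constraint violation is assumed to be a non-negative number that is zero for feasible designs):

function better = constrained_dominates(f1, v1, f2, v2)
% CONSTRAINED_DOMINATES  True if design 1 constraint-dominates design 2.
%   f1, f2 : 1 x M objective vectors (all objectives minimized)
%   v1, v2 : overall constraint violations (0 means feasible)
if v1 == 0 && v2 == 0
    better = dominates(f1, f2);    % both feasible: ordinary non-domination
elseif v1 == 0
    better = true;                 % feasible design beats infeasible design
elseif v2 == 0
    better = false;
else
    better = v1 < v2;              % both infeasible: smaller violation wins
end
end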

Page 15

Selection and Fitness

• More fit designs have higher chance of passing their genes to the next generation

• Fitness is based on rank; low-rank designs have higher fitness

• Selection (a roulette-wheel sketch follows this slide):
– Using a fitness-based roulette wheel
– Create a roulette wheel with ns segments
– Create a random number between 0 and 1
– Find the segment on the roulette wheel that contains the random number
– The segment number corresponds to the design number
– Build the parent database

* From presentation of Gerhard Venter

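A minimal MATLAB sketch of fitness-based roulette-wheel selection (illustration only; the mapping from rank to fitness, here fitness = 1/rank, is an assumption rather than the original program's formula):

% Roulette-wheel selection of one parent index (illustration only)
fitness  = 1 ./ rank;                        % assumed mapping: lower rank -> higher fitness
segments = cumsum(fitness) / sum(fitness);   % wheel segment boundaries on [0, 1]
r = rand;                                    % random number between 0 and 1
parent_idx = find(r <= segments, 1);         % first segment that contains r

Repeating this N_pop times builds the parent database mentioned above.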

Page 16

Child Population Creation

• Select two parents for each reproduction, randomly from the parent database

• Crossover (a sketch of both operators follows this list):
– Probability close to 1
– One-point crossover: randomly select the crossover point
– Child = [parent1(1:cross_pt), parent2(cross_pt+1:N_var)]

• Mutation:
– Exploration parameter
– Probability of mutation is typically small (e.g. 0.2)
– Randomly select a gene to mutate
– Randomly modify the gene
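A minimal MATLAB sketch of the two operators as described above (illustration only; LB, UB, and N_increments are the settings from Page 21, and replacing the mutated gene with a random allowed level is an assumed form of "randomly modify gene"):

% One-point crossover (probability cross) and single-gene mutation (probability mut)
p1 = parents(randi(size(parents, 1)), :);      % pick two parents at random
p2 = parents(randi(size(parents, 1)), :);

child = p1;
if rand < cross
    cross_pt = randi(N_var - 1);               % crossover point between 1 and N_var-1
    child = [p1(1:cross_pt), p2(cross_pt+1:N_var)];
end

if rand < mut
    i = randi(N_var);                          % randomly select a gene
    level = randi(N_increments(i));            % randomly modify it to an allowed level
    child(i) = LB(i) + (level - 1) * (UB(i) - LB(i)) / (N_increments(i) - 1);
end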

Page 17

Elitism

• Keeps best individuals

– Combine the child and parent populations

– Select the best individuals from the combined population (a sketch follows)

* Figure from presentation of Tushar Goel
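A minimal MATLAB sketch of this elitist survivor selection (illustration only; rank_and_crowding is a hypothetical helper that returns the rank from Page 14 and the crowding distance from Page 18 for each design):

% Elitism: keep the best N_pop designs from the combined parent + child population
combined = [pop; children];
[rank, crowd] = rank_and_crowding(combined);   % hypothetical helper (Pages 14 and 18)
[~, order] = sortrows([rank, -crowd]);         % low rank first, then large crowding distance
pop = combined(order(1:N_pop), :);             % survivors for the next generation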

Page 18

Niching

• Guides the selection process toward a uniformly spread-out Pareto front

• Uses a parameter based upon the crowding distance (c = a + b), where designs that provide the greatest spread along the Pareto front are favored (a sketch follows this slide)
– Between two solutions with differing non-domination ranks, we prefer the solution with the lower (better) rank
– If both solutions belong to the same front, we prefer the solution located in the less crowded region

* Figure from presentation of Tushar Goel
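A minimal MATLAB sketch of the crowding-distance computation for the designs of one front, following the usual NSGA-II definition (for each objective, the normalized distance between a solution's two neighbors is accumulated, and boundary solutions get an infinite distance). This is an illustration, not the original program:

% Crowding distance for one front
% F : n x M matrix of objective values for the n designs in the front
[n, M] = size(F);
crowd = zeros(n, 1);
for m = 1:M
    [vals, idx] = sort(F(:, m));     % sort the front by objective m
    crowd(idx(1))   = Inf;           % boundary solutions are always preferred
    crowd(idx(end)) = Inf;
    span = vals(end) - vals(1);
    if span > 0
        for j = 2:n-1
            crowd(idx(j)) = crowd(idx(j)) + (vals(j+1) - vals(j-1)) / span;
        end
    end
end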

Page 19

Example

Laminate Design

Page 20

Problem Formulation

• Objectives: design a symmetric laminate

– Maximize D11, maximize D22

• Design variables:

– 8 to 16 layers

– Layup orientations, 0 ≤ θi ≤ 90° (15° step)

• Constraints (an evaluation sketch follows this list):

– D12 ≥ 0.5*D11

– D12 ≥ 0.5*D22
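A sketch of how one design X could be evaluated for this problem (illustration only; decode_layup and laminate_D are hypothetical helpers, the latter standing in for a classical-lamination-theory computation of the bending stiffnesses, and the ≥ 0 constraint form follows Page 10):

% Evaluate one design X (illustration; helper functions are hypothetical)
ply_angles = decode_layup(X);                 % variable-to-angle mapping, see Page 22
[D11, D12, D22] = laminate_D(ply_angles);     % bending stiffnesses of the symmetric laminate

obj = [-D11, -D22];                           % maximize D11 and D22 by minimizing their negatives
con = [D12 - 0.5*D11, D12 - 0.5*D22];         % constraints written in the >= 0 form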

Page 21

Optimization Settings

• N_pop = 10; % size of the population

• N_gen = 30; % # of generations

• cross = 1.0; % crossover probability

• mut = 0.2; % mutation probability

• LB = [1 1 1 1 0 0 0 0];

• UB = [7 7 7 7 7 7 7 7];

• N_increments = [7 7 7 7 8 8 8 8];

Use higher values of N_pop and N_gen for real problems (the small values above are for this demonstration)

Page 22

Layup Orientations

• For the last 4 layers, if the variable is 0, the ply does not exist

% First 4 layers always exist: map a variable value of 1..7 to an angle of 0..90 deg in 15-deg steps
for i = 1:4
    ply_angles(i) = (X(i) - 1) * 15;
end

% Last 4 layers are optional: a value of 0 means the ply does not exist
count = 5;
for i = 5:8
    if X(i) > 0
        ply_angles(count) = (X(i) - 1) * 15;
        count = count + 1;
    end
end

Page 23

Pareto Front – 30 Generations

[Figure: Pareto front after 30 generations, with designs A, B, C, and D marked]

                                               Layup orientation (°)
Design   D11      D22      con1      con2      θ1  θ2  θ3  θ4  θ5  θ6  θ7  θ8
A        259.81   333.23   0.14248   0.00093   15  15  90  90  45  15   0  90
B        287.08   317.54   0.06128   0.00744   15  15  90  45  75  30   0  45
C        310.96   300.29   0.00750   0.02553   15  15  90  45   0  75  30  15
D        325.42   275.83   0.00034   0.09029   15  15  90  15   0  45   0  90

θ1 is the outermost layer