Surrogate-based constrained multi-objective optimization
Aerospace design is synonymous with the use of long-running and computationally intensive simulations, which are employed in the search for optimal designs in the presence of multiple, competing objectives and constraints. The difficulty of this search is often exacerbated by numerical ‘noise’ and inaccuracies in simulation data, and by the frailties of complex simulations, that is, they often fail to return a result. Surrogate-based optimization methods can be employed to solve, mitigate, or circumvent the problems associated with such searches. This presentation gives an overview of constrained multi-objective optimization using Gaussian process based surrogates, with an emphasis on dealing with real-world problems.
Alex Forrester, 3rd July 2009
Coming up:
• Surrogate model based optimization – the basic idea
• Gaussian process based modelling
• Probability of improvement and expected improvement
• Missing data
• Noisy data
• Constraints
• Multiple objectives
Surrogate model based optimization
• Surrogate used to expedite search for global optimum
• Global accuracy of surrogate not a priority
[Flowchart: the surrogate-based optimization loop – PRELIMINARY EXPERIMENTS and a SAMPLING PLAN yield OBSERVATIONS (design sensitivities available? multi-fidelity data?); CONSTRUCT SURROGATE(S); SEARCH INFILL CRITERION (optimization using the surrogate(s); constraints present? noise in data? multiple design objectives?); ADD NEW DESIGN(S) and repeat]
Gaussian process based modelling
Building Gaussian process models, e.g. Kriging
• Sample the function to be predicted at a set of points
• Correlate all points using a Gaussian type function
• 20 Gaussian “bumps” with appropriate widths (chosen to maximize likelihood of data) centred around sample points
• Multiply by weightings (again chosen to maximize likelihood of data)
Add together to predict function
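The construction just described can be sketched in a few lines. Below is a minimal interpolating predictor, assuming a one-variable input and a fixed width parameter theta; in the method proper, theta and the weightings are chosen to maximize the likelihood of the data:

```python
import numpy as np

def kriging_predict(x_new, X, y, theta=10.0):
    """Minimal Kriging-style predictor: Gaussian correlations between
    sample points, weights fitted to the data, and a prediction formed
    as a weighted sum of Gaussian 'bumps'. theta is fixed here for
    illustration; normally it is tuned by maximizing the likelihood."""
    n = len(X)
    # correlation matrix of the samples (Gaussian-type function),
    # with a small jitter on the diagonal for numerical conditioning
    Psi = np.exp(-theta * (X[:, None] - X[None, :]) ** 2) + 1e-10 * np.eye(n)
    one = np.ones(n)
    # constant mean estimated by generalized least squares
    mu = (one @ np.linalg.solve(Psi, y)) / (one @ np.linalg.solve(Psi, one))
    # correlations between the new point and the samples
    psi = np.exp(-theta * (x_new - X) ** 2)
    # weighted sum of bumps added to the mean
    return mu + psi @ np.linalg.solve(Psi, y - mu)
```

At the sample points the predictor interpolates the data (up to the tiny jitter added for conditioning), which is the behaviour shown in the plots.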
[Plot: Kriging prediction overlaid on the true function]
Optimization
Polynomial regression based search (as Devil’s advocate)
Gaussian process prediction based optimization
Gaussian process prediction based optimization (as Devil’s advocate)
But, we have error estimates with Gaussian processes
Error estimates used to construct improvement criteria
Probability of improvement
Expected improvement
Probability of improvement
• Useful global infill criterion
• Not a measure of improvement, just the chance there will be one
Expected improvement
• Useful metric of actual amount of improvement to be expected
• Can be extended to constrained and multi-objective problems
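Both criteria have simple closed forms. As a sketch, assuming minimization, with y_hat and s the Gaussian process prediction and its error estimate at a point, and y_min the best value observed so far:

```python
from math import erf, exp, pi, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def prob_improvement(y_hat, s, y_min):
    # chance that Y ~ N(y_hat, s^2) falls below the best observed value
    return norm_cdf((y_min - y_hat) / s)

def expected_improvement(y_hat, s, y_min):
    # first moment of the improvement max(y_min - Y, 0)
    z = (y_min - y_hat) / s
    return (y_min - y_hat) * norm_cdf(z) + s * norm_pdf(z)
```

Note how the probability of improvement says nothing about the size of the improvement, while expected improvement weighs it directly.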
Missing Data
What if design evaluations fail?
• No infill point augmented to the surrogate
– model is unchanged
– optimization stalls
• Need to add some information or perturb the model
– add a random point?
– impute a value based on the prediction at the failed point, so EI goes to zero there?
– use a penalized imputation (prediction + error estimate)?
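The imputation options above can be sketched as follows (the function name and strategy labels are illustrative, not from the slides; y_hat and s are the surrogate's prediction and error estimate at the failed design):

```python
def impute_failed(y_hat, s, strategy="penalized"):
    """Value assigned to a failed evaluation so the surrogate changes
    and expected improvement is driven to zero at that design."""
    if strategy == "prediction":
        return y_hat        # impute the model's own prediction
    if strategy == "penalized":
        return y_hat + s    # prediction + error estimate (pessimistic)
    raise ValueError(f"unknown strategy: {strategy}")
```

Penalizing by the error estimate biases the search away from regions where the simulation is prone to failure, rather than merely flattening EI at the failed point.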
Aerofoil design problem
• 2 shape functions (f1, f2) altered
• Potential flow solver (VGK) has ~35% failure rate
• 20 point optimal Latin hypercube
• max{E[I(x)]} updates until within one drag count of optimum
Results
A typical penalized imputation based optimization
Four variable problem
• f1,f2,f3,f4 varied
• 82% failure rate
A typical four variable penalized imputation based optimization
• Legend as for two variable case
• Red crosses indicate imputed update points
• Regions of infeasible geometries are shown as dark blue
• Blank regions represent flow solver failure
‘Noisy’ Data
‘Noisy’ data
• Many data sets are corrupted by noise
• We are usually interested in deterministic ‘noise’
• ‘Noise’ in aerofoil drag data due to discretization of Euler equations
Failure of interpolation based infill
• Surrogate becomes excessively snaky
• Error estimates increase
• Search becomes too global
Regression improves model
• Add regularization constant to correlation matrix
• Last plot of previous slide improved
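As a sketch of the regularization, reusing the fixed-theta Gaussian correlation from earlier: a constant lam added to the diagonal of the correlation matrix makes the model regress rather than interpolate (in the method proper, lambda is tuned by maximum likelihood alongside the other parameters):

```python
import numpy as np

def regressing_predict(x_new, X, y, theta=10.0, lam=0.1):
    """Kriging-style predictor with a regularization constant lam on the
    diagonal of the correlation matrix; lam -> 0 recovers interpolation."""
    n = len(X)
    Psi = np.exp(-theta * (X[:, None] - X[None, :]) ** 2) + lam * np.eye(n)
    one = np.ones(n)
    mu = (one @ np.linalg.solve(Psi, y)) / (one @ np.linalg.solve(Psi, one))
    psi = np.exp(-theta * (x_new - X) ** 2)
    return mu + psi @ np.linalg.solve(Psi, y - mu)
```

With a visible lam the fit no longer passes through the noisy observations, which is what smooths out the excessively snaky surrogate.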
Failure of regression based infill
• Regularization assumes error at the sample locations (brought in through lambda in the equations below)
• Leads to an expectation of improvement at points that have already been sampled
• OK for stochastic noise
• Search stalls for deterministic simulations
Use “re-interpolation”
• Error due to noise ignored using new variance formulation (equation below)
• Only modelling error
• Search proceeds as desired
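The effect of the new variance formulation can be sketched as follows; this is a hedged reading of re-interpolation in which the lambda term is simply left out of the error calculation and the process variance multiplier is taken as one:

```python
import numpy as np

def reinterp_error(x_new, X, theta=10.0):
    """Re-interpolation-style error estimate (up to the process variance
    multiplier): the regularization constant is omitted from the
    correlation matrix, so the estimate returns to zero at sampled
    locations and the infill search cannot stall there."""
    n = len(X)
    Psi = np.exp(-theta * (X[:, None] - X[None, :]) ** 2) + 1e-10 * np.eye(n)
    psi = np.exp(-theta * (x_new - X) ** 2)
    # modelling error only: zero at samples, positive between them
    return max(0.0, 1.0 - psi @ np.linalg.solve(Psi, psi))
```

Because the error is zero where data already exists, expected improvement there is zero too, and the search proceeds to unexplored basins as desired.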
Two variable aerofoil example
• Same parameterization as missing data problem
• Coarse mesh causes ‘noise’
Interpolation – very global
Regression – stalls
Re-interpolation – searches local basins, but finds global optimum
Constrained EI
Probability of constraint satisfaction
• g(x) is the constraint function
• F = G(x) − gmin is a measure of feasibility, where G(x) is a random variable
It’s just like the probability of improvement, but with a limit, not a minimum
[Plot: probability of satisfaction and the prediction of the constraint function, shown against the true constraint function and the constraint limit]
Constrained probability of improvement
• Probability of improvement conditional upon constraint satisfaction
• Simply multiply the two probabilities:
Constrained expected improvement
• Expected improvement conditional upon constraint satisfaction
• Again, a simple multiplication:
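A sketch of the product, following the slides' convention that F = G(x) − gmin > 0 indicates feasibility (flip the sign of the argument in prob_feasible for a g(x) ≤ gmin constraint):

```python
from math import erf, exp, pi, sqrt

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def norm_pdf(z):
    return exp(-0.5 * z * z) / sqrt(2.0 * pi)

def expected_improvement(y_hat, s, y_min):
    z = (y_min - y_hat) / s
    return (y_min - y_hat) * norm_cdf(z) + s * norm_pdf(z)

def prob_feasible(g_hat, s_g, g_min):
    # P[F > 0] with F = G(x) - g_min and G(x) ~ N(g_hat, s_g^2)
    return norm_cdf((g_hat - g_min) / s_g)

def constrained_ei(y_hat, s, y_min, g_hat, s_g, g_min):
    # expected improvement conditional upon constraint satisfaction:
    # a simple multiplication of the two quantities
    return expected_improvement(y_hat, s, y_min) * prob_feasible(g_hat, s_g, g_min)
```

The product drives the criterion toward zero wherever the constraint is confidently violated, while leaving it essentially equal to the plain EI where feasibility is near certain.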
A 1D example
After one infill point
A 2D example
Multi-objective EI
Pareto optimization
• We want to identify a set of non-dominated solutions
• These define the Pareto front
• We can formulate an expectation of improvement on the current non-dominated solutions
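Identifying the current non-dominated set can be sketched as follows (both objectives minimized; a brute-force filter, fine for the small sample sizes used here):

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective tuples.
    A point is dominated if some other point is no worse in every
    objective and strictly better in at least one (minimization)."""
    front = []
    for i, p in enumerate(points):
        dominated = any(
            all(qk <= pk for qk, pk in zip(q, p)) and q != p
            for j, q in enumerate(points)
            if j != i
        )
        if not dominated:
            front.append(p)
    return front
```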
Multi-dimensional Gaussian process
• Consider a 2 objective problem
• The random variables Y1 and Y2 have a 2D probability density function:
Probability of improving on one point
• Need to integrate the 2D pdf:
• Integrating under all non-dominated solutions:
• The EI is the first moment of this integral about the Pareto front (see book)
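The integral can be approximated by Monte Carlo, as a sketch assuming independent predictive normals for Y1 and Y2 (the closed-form integration, and the first moment used for the EI, are given in the book):

```python
import random

def prob_pareto_improvement(mu1, s1, mu2, s2, front, n_draws=20000, seed=0):
    """Monte Carlo estimate of the probability that a draw from the 2D
    predictive density is not dominated by any current front point
    (both objectives minimized; Y1, Y2 assumed independent)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_draws):
        y1, y2 = rng.gauss(mu1, s1), rng.gauss(mu2, s2)
        # improvement: the draw is not dominated by the existing front
        if not any(f1 <= y1 and f2 <= y2 for f1, f2 in front):
            hits += 1
    return hits / n_draws
```

With a single front point at the predictive mean, three of the four quadrants around it count as improvement, so the estimate should sit near 0.75.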
A 1D example
Matlab demo
Nowacki beam
• Fixed length steel cantilever beam under 5 kN load
• Variables:
– height
– width
• Objectives:
– minimize cross section area
– minimize bending moment
• Constraints:
– area ratio
– bending moment
– buckling
– deflection
– shear
Problem setup
• 10 point optimal Latin hypercube
• Kriging model of each objective and constraint
– Parameters tuned with GA + SQP (using adjoint of likelihood)
• 20 points added at the maximum constrained multi-objective expected improvement
Sampling plan
Initial trade off
5 updates
10 updates
15 updates
20 updates
Final trade off
Summary
• Surrogate based optimization offers answers to, or ways to get round, many problems associated with real-world optimization
• This seemingly blunt tool must, however, be used with precision, as there are many traps to fall into
• In a multi-objective context, the use of surrogates is particularly promising
• There has not been time to cover new surrogate methods (e.g. blind Kriging), multi-fidelity modelling, or enhancements to EI in terms of its exploitation/exploration trade-off properties
References
• A. I. J. Forrester, A. Sóbester, A. J. Keane, Engineering Design via Surrogate Modelling: A Practical Guide, John Wiley & Sons, Chichester, 2008, 240 pages, ISBN 978-0-470-06068-1.
• A. I. J. Forrester, A. J. Keane, Recent advances in surrogate-based optimization, Progress in Aerospace Sciences, 45, 50–79, 2009 (doi:10.1016/j.paerosci.2008.11.001).
• A. I. J. Forrester, A. Sóbester, A. J. Keane, Optimization with missing data, Proc. R. Soc. A, 462(2067), 935–945, 2006 (doi:10.1098/rspa.2005.1608).
• A. I. J. Forrester, N. W. Bressloff, A. J. Keane, Design and analysis of ‘noisy’ computer experiments, AIAA Journal, 44(10), 2331–2339, 2006 (doi:10.2514/1.20068).