Introductory Workshop on Evolutionary Computing

Transcript
Page 1: Introductory Workshop on Evolutionary Computing

Introductory Workshop on Evolutionary Computing

Dr. Daniel Tauritz, Director, Natural Computation Laboratory

Associate Professor, Department of Computer Science

Research Investigator, Intelligent Systems Center

Collaborator, Energy Research & Development Center

Part I: Introduction to Evolutionary Algorithms

Page 2: Introductory Workshop on Evolutionary Computing

Motivation

• Real-world optimization problems are typically characterized by huge, ill-behaved solution spaces
– Infeasible to exhaustively search
– Defy traditional (gradient-based) optimization algorithms because they are non-linear, non-differentiable, non-continuous, or non-convex

Page 3: Introductory Workshop on Evolutionary Computing

Real-World Example

• Electric Power Transmission Systems
• Supply is not keeping up with demand
• Expansion hampered by:
– Social, environmental, and economic constraints
• Transmission system is “stressed”
– Already carrying more than intended
– Dramatic increase in incidence reports

Page 4: Introductory Workshop on Evolutionary Computing

The Grid

Page 5: Introductory Workshop on Evolutionary Computing

The Grid: Failure

Page 6: Introductory Workshop on Evolutionary Computing

The Grid: Redistribution

Page 7: Introductory Workshop on Evolutionary Computing

The Grid: A Cascade

Page 8: Introductory Workshop on Evolutionary Computing

The Grid: Redistribution

Page 9: Introductory Workshop on Evolutionary Computing

The Grid: Unsatisfiable

Page 10: Introductory Workshop on Evolutionary Computing

The Grid: Unsatisfiable

Page 11: Introductory Workshop on Evolutionary Computing

Failure Analysis

• Failure spreads relatively quickly
– Too quickly for conventional control
• Cascade may be avoidable
– Utilize unused capacities (flow compensation)
• Unsatisfiable condition may be avoidable
– Better power flow control to reduce severity

Page 12: Introductory Workshop on Evolutionary Computing

Possible Solution

• Strategically place a number of power flow control devices

• Flexible A/C Transmission System (FACTS) devices are a promising type of high-speed power-electronics power flow control devices

• Unified Power Flow Controller (UPFC)

Page 13: Introductory Workshop on Evolutionary Computing

FACTS Interaction Laboratory

(Diagram: HIL line, UPFC, and simulation engine)

Page 14: Introductory Workshop on Evolutionary Computing

The placement optimization problem

• UPFCs are extremely expensive, so only a limited number can be placed

• Placement is a combinatorial problem
– Given 1000 high-voltage lines and 10 UPFCs, there are C(1000, 10) total possible placements (about 2.6 × 10^23)
– If each placement is evaluated in 1 minute, then it will take about 5 × 10^15 centuries to solve using exhaustive search
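As a quick sanity check of the figures quoted above, a small Python calculation (the one-minute-per-evaluation timing is the slide's own assumption):

```python
# Back-of-the-envelope check of the placement count and runtime quoted above.
import math

placements = math.comb(1000, 10)           # C(1000, 10)
print(f"{placements:.2e}")                 # ~2.63e+23 possible placements

minutes = placements                       # one minute per evaluation (assumed)
centuries = minutes / (60 * 24 * 365.25 * 100)
print(f"{centuries:.1e} centuries")        # ~5e+15 centuries of exhaustive search
```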

Page 15: Introductory Workshop on Evolutionary Computing

The placement solution space

• Placing individual UPFC devices is not a set of independent tasks

• There are complex non-linear interactions between UPFC devices

• The placement solution space is ill-behaved, so traditional optimization algorithms are not usable

Page 16: Introductory Workshop on Evolutionary Computing

Evolutionary Computing

• The field of Evolutionary Computing (EC) studies the theory and application of Evolutionary Algorithms (EAs)

• EAs can be described as a class of stochastic, population-based optimization algorithms inspired by natural evolution, genetics, and population dynamics

Page 17: Introductory Workshop on Evolutionary Computing

Very high-level EA schematic

(Schematic: the EA, configured with a fitness function, a representation, EA operators, and EA parameters, maps a problem instance to a solution)

Page 18: Introductory Workshop on Evolutionary Computing

Intuitive view of why EAs work

• Trial-and-error (aka generate-and-test)

• Graduated solution quality creates virtual gradient

• Stochastic local search of solution landscape

Page 19: Introductory Workshop on Evolutionary Computing

Evolutionary Problem Solving

(Flowchart: the Problem Description and Strategy Parameters feed Population Initialization and the problem-specific black box Fitness Evaluation; the Evolutionary Cycle then loops through Reproduction, Fitness Evaluation, and Competition until the Termination Criteria are met and a Solution is output)

Page 20: Introductory Workshop on Evolutionary Computing

(Darwinian) Evolution

• The environment contains populations of individuals of the same species which are reproductively compatible

• Natural selection

• Random variation

• Survival of the fittest

• Inheritance of traits

Page 21: Introductory Workshop on Evolutionary Computing

(Mendelian) Genetics

• Genotypes vs. phenotypes

• Pleiotropy: one gene affects multiple phenotypic traits

• Polygeny: one phenotypic trait is affected by multiple genes

• Chromosomes (haploid vs. diploid)

• Loci and alleles

Page 22: Introductory Workshop on Evolutionary Computing

Nature versus the digital realm

Environment ↔ Problem (solution space)
Fitness ↔ Fitness function
Population ↔ Set
Individual ↔ Data structure
Genes ↔ Elements
Alleles ↔ Data type

Page 23: Introductory Workshop on Evolutionary Computing

Scope

• Genotype – functional unit of inheritance

• Individual – functional unit of selection

• Population – functional unit of evolution

Page 24: Introductory Workshop on Evolutionary Computing

Solution Representation

• Structural types: linear, tree, FSM, etc.
• Data types: bit strings, integers, permutations, reals, etc.
• EA genotype encodes solution representation and attributes
• EA phenotype expresses the EA genotype in the current environment
• Encoding & Decoding

Page 25: Introductory Workshop on Evolutionary Computing

Fitness Function

• Determines individuals’ fitness-based selection chances

• Transforms objective function to linearly ordered set with higher fitness values corresponding to higher quality solutions (i.e., solutions which better satisfy the objective function)

• Knapsack Problem Example
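As an illustration of the knapsack example, a minimal Python sketch of a fitness function over a bit-string genotype; the item values, weights, capacity, and penalty scheme here are hypothetical, not taken from the slides:

```python
# Minimal sketch of a 0/1 knapsack fitness function (illustrative data).
VALUES  = [10, 40, 30, 50]
WEIGHTS = [ 5,  4,  6,  3]
CAPACITY = 10

def fitness(bits):
    """Map a bit-string genotype to a non-negative fitness value."""
    value  = sum(v for v, b in zip(VALUES, bits)  if b)
    weight = sum(w for w, b in zip(WEIGHTS, bits) if b)
    if weight <= CAPACITY:
        return value                                  # feasible: fitness = total value
    return max(0, value - 10 * (weight - CAPACITY))   # infeasible: penalize overweight

print(fitness([0, 1, 0, 1]))   # items 2 and 4 -> value 90, weight 7 -> fitness 90
```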

Page 26: Introductory Workshop on Evolutionary Computing

Initialization

• (Initial) population size

• Uniform random

• Heuristic based

• Knowledge based

• Genotypes from previous runs

• Seeding

Page 27: Introductory Workshop on Evolutionary Computing

Parent selection

• Fitness Proportional Selection (FPS)
– Roulette wheel sampling
– High risk of premature convergence
– Uneven selective pressure
– Fitness function not transposition invariant
• Fitness Rank Selection
– Mapping function (like a cooling schedule)
– Tournament selection
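A minimal Python sketch of two of the parent-selection schemes above, roulette-wheel FPS and k-tournament selection; function names and the default tournament size are illustrative:

```python
import random

def roulette_wheel(population, fitnesses):
    """Fitness proportional selection via roulette wheel sampling."""
    pick = random.uniform(0, sum(fitnesses))
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]

def tournament(population, fitnesses, k=2):
    """k-tournament selection: return the best of k uniformly drawn individuals."""
    contestants = random.sample(range(len(population)), k)
    best = max(contestants, key=lambda i: fitnesses[i])
    return population[best]
```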

Page 28: Introductory Workshop on Evolutionary Computing

Variation operators

• Mutation = Stochastic unary variation operator

• Recombination = Stochastic multi-ary variation operator

Page 29: Introductory Workshop on Evolutionary Computing

Mutation

• Bit-String Representation:
– Bit-Flip
– E[#flips] = L · p_m
• Integer Representation:
– Random Reset (cardinal attributes)
– Creep Mutation (ordinal attributes)
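A possible Python sketch of these mutation operators (bit-flip, random reset, creep); the value ranges, alphabet, and step size are illustrative assumptions:

```python
import random

def bit_flip(bits, pm):
    """Flip each bit independently with probability pm;
    the expected number of flips is L * pm."""
    return [1 - b if random.random() < pm else b for b in bits]

def random_reset(genes, pm, alphabet=range(10)):
    """Random reset for cardinal integer attributes: draw a fresh value."""
    return [random.choice(list(alphabet)) if random.random() < pm else g
            for g in genes]

def creep(genes, pm, step=1, low=0, high=9):
    """Creep mutation for ordinal integer attributes: small +/- steps, clamped."""
    out = []
    for g in genes:
        if random.random() < pm:
            g = min(high, max(low, g + random.choice([-step, step])))
        out.append(g)
    return out
```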

Page 30: Introductory Workshop on Evolutionary Computing

Mutation cont.

• Floating-Point
– Uniform
– Non-uniform from fixed distribution (Gaussian, Cauchy, Lévy, etc.)
• Permutation
– Swap
– Insert
– Scramble
– Inversion
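A minimal sketch of the four permutation mutations above, assuming a Python list representation of the permutation:

```python
import random

def swap(perm):
    """Swap two randomly chosen positions."""
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def insert(perm):
    """Remove one element and reinsert it at a random position."""
    p = perm[:]
    i, j = random.sample(range(len(p)), 2)
    p.insert(j, p.pop(i))
    return p

def scramble(perm):
    """Shuffle the elements inside a randomly chosen segment."""
    p = perm[:]
    i, j = sorted(random.sample(range(len(p)), 2))
    segment = p[i:j + 1]
    random.shuffle(segment)
    p[i:j + 1] = segment
    return p

def inversion(perm):
    """Reverse the order of a randomly chosen segment."""
    p = perm[:]
    i, j = sorted(random.sample(range(len(p)), 2))
    p[i:j + 1] = reversed(p[i:j + 1])
    return p
```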

Page 31: Introductory Workshop on Evolutionary Computing

Recombination

• Recombination rate: asexual vs. sexual
• N-Point Crossover (positional bias)
• Uniform Crossover (distributional bias)
• Discrete recombination (no new alleles)
• (Uniform) arithmetic recombination
• Simple recombination
• Single arithmetic recombination
• Whole arithmetic recombination
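Illustrative Python sketches of a few of the operators listed above (1-point crossover, uniform crossover, and whole arithmetic recombination); parameter names are assumptions:

```python
import random

def one_point(p1, p2):
    """1-point crossover (positional bias)."""
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]

def uniform(p1, p2):
    """Uniform crossover (distributional bias): each gene swapped with prob. 0.5."""
    c1, c2 = [], []
    for a, b in zip(p1, p2):
        if random.random() < 0.5:
            a, b = b, a
        c1.append(a)
        c2.append(b)
    return c1, c2

def whole_arithmetic(p1, p2, alpha=0.5):
    """Whole arithmetic recombination for real-valued genes."""
    c1 = [alpha * a + (1 - alpha) * b for a, b in zip(p1, p2)]
    c2 = [alpha * b + (1 - alpha) * a for a, b in zip(p1, p2)]
    return c1, c2
```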

Page 32: Introductory Workshop on Evolutionary Computing

Survivor selection

• (µ+λ) – plus strategy
• (µ,λ) – comma strategy (aka generational)
• Typically fitness-based
– Deterministic vs. stochastic
– Truncation
– Elitism
• Alternatives include completely stochastic and age-based
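A minimal sketch of deterministic, truncation-based plus and comma survivor selection; the maximization assumption and function names are illustrative:

```python
def plus_strategy(parents, offspring, fitness, mu):
    """(mu + lambda): parents and offspring compete; keep the best mu."""
    pool = parents + offspring
    return sorted(pool, key=fitness, reverse=True)[:mu]

def comma_strategy(offspring, fitness, mu):
    """(mu, lambda): only offspring survive (requires lambda >= mu)."""
    return sorted(offspring, key=fitness, reverse=True)[:mu]
```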

Page 33: Introductory Workshop on Evolutionary Computing

Termination

• CPU time / wall time

• Number of fitness evaluations

• Lack of fitness improvement

• Lack of genetic diversity

• Solution quality / solution found

• Combination of the above

Page 34: Introductory Workshop on Evolutionary Computing

Simple Genetic Algorithm (SGA)

• Representation: Bit-strings

• Recombination: 1-Point Crossover

• Mutation: Bit Flip

• Parent Selection: Fitness Proportional

• Survival Selection: Generational
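Putting these five choices together, a compact illustrative sketch of an SGA loop in Python (not the canonical implementation), using OneMax as a stand-in fitness function; the default parameter values are assumptions:

```python
import random

def sga(fitness, length=20, pop_size=50, pc=0.7, pm=None, generations=100):
    """Simple GA: bit-strings, fitness-proportional parent selection,
    1-point crossover, bit-flip mutation, generational replacement."""
    pm = pm if pm is not None else 1.0 / length
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]

        def select():                         # roulette wheel sampling
            pick, acc = random.uniform(0, sum(fits)), 0.0
            for ind, f in zip(pop, fits):
                acc += f
                if acc >= pick:
                    return ind
            return pop[-1]

        new_pop = []
        while len(new_pop) < pop_size:
            a, b = select()[:], select()[:]
            if random.random() < pc:          # 1-point crossover
                cut = random.randrange(1, length)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):              # bit-flip mutation
                new_pop.append([1 - g if random.random() < pm else g
                                for g in child])
        pop = new_pop[:pop_size]              # generational survivor selection
    return max(pop, key=fitness)

# Example: maximize the number of ones (OneMax)
print(sga(sum))
```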

Page 35: Introductory Workshop on Evolutionary Computing

Problem solving steps

• Collect problem knowledge (at minimum solution representation and objective function)
• Define gene representation and fitness function
• Creation of initial population
• Parent selection, mate pairing
• Define variation operators
• Survival selection
• Define termination condition
• Parameter tuning

Page 36: Introductory Workshop on Evolutionary Computing

Typical EA Strategy Parameters

• Population size
• Initialization related parameters
• Selection related parameters
• Number of offspring
• Recombination chance
• Mutation chance
• Mutation rate
• Termination related parameters

Page 37: Introductory Workshop on Evolutionary Computing

EA Pros

• More general purpose than traditional optimization algorithms; i.e., less problem-specific knowledge required
• Ability to solve “difficult” problems
• Solution availability
• Robustness
• Inherent parallelism

Page 38: Introductory Workshop on Evolutionary Computing

EA Cons

• Fitness function and genetic operators often not obvious
• Premature convergence
• Computationally intensive
• Difficult parameter optimization

Page 39: Introductory Workshop on Evolutionary Computing

Behavioral aspects

• Exploration versus exploitation
• Selective pressure
• Population diversity
– Fitness values
– Phenotypes
– Genotypes
– Alleles

• Premature convergence

Page 40: Introductory Workshop on Evolutionary Computing

Genetic Programming (GP)

• Characteristic property: variable-size hierarchical representation vs. fixed-size linear in traditional EAs

• Application domain: model optimization vs. input values in traditional EAs

• Unifying Paradigm: Program Induction

Page 41: Introductory Workshop on Evolutionary Computing

Program induction examples

• Optimal control
• Planning
• Symbolic regression
• Automatic programming
• Discovering game playing strategies
• Forecasting
• Inverse problem solving
• Decision Tree induction
• Evolution of emergent behavior
• Evolution of cellular automata

Page 42: Introductory Workshop on Evolutionary Computing

GP specification

• S-expressions

• Function set

• Terminal set

• Arity

• Correct expressions

• Closure property

• Strongly typed GP
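A minimal sketch of how a GP genotype could be grown from a function set (with arities) and a terminal set, assuming a Python list-based S-expression; the particular sets and the 0.3 terminal probability are hypothetical:

```python
import random

# Hypothetical function set (with arities) and terminal set.
FUNCTIONS = {'+': 2, '*': 2, 'sin': 1}
TERMINALS = ['x', 1.0, 3.14159]

def random_tree(max_depth):
    """Grow a random S-expression; the closure property holds because
    every function accepts any numeric subtree as an argument."""
    if max_depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    f = random.choice(list(FUNCTIONS))
    return [f] + [random_tree(max_depth - 1) for _ in range(FUNCTIONS[f])]

print(random_tree(3))   # e.g. ['+', ['sin', 'x'], 1.0]
```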

Page 43: Introductory Workshop on Evolutionary Computing

GP notes

• Mutation or recombination (not both)

• Bloat (survival of the fattest)

• Parsimony pressure

Page 44: Introductory Workshop on Evolutionary Computing

Case Study employing GP

Deriving Gas-Phase Exposure History through Computationally

Evolved Inverse Diffusion Analysis

Page 45: Introductory Workshop on Evolutionary Computing

Introduction

Unexplained Sickness → Examine Indoor Exposure History → Find Contaminants and Fix Issues

Page 46: Introductory Workshop on Evolutionary Computing

Background

• Indoor air pollution is among the top five environmental health risks
• $160 billion could be saved every year by improving indoor air quality
• Current exposure history data are inadequate
• A reliable method is needed to determine past contamination levels and times

Page 47: Introductory Workshop on Evolutionary Computing

Problem Statement

• A forward diffusion differential equation predicts concentration in materials after exposure
• An inverse diffusion equation finds the timing and intensity of previous gas contamination
• Knowledge of early exposures would greatly strengthen epidemiological conclusions

Page 48: Introductory Workshop on Evolutionary Computing

Gas-phase concentration history and material-phase concentration profile

(Figure: concentration in gas vs. elapsed time; concentration in solid vs. x, distance into solid (m))

Page 49: Introductory Workshop on Evolutionary Computing

Proposed Solution

(Figure: GP search over candidate expressions such as x^2 + sin(x)sin(x+y) + e^(x^2), 5x^2 + 12x - 4, and x^2 - sin(x), shown as expression trees)

• Use Genetic Programming (GP) as a directed search for the inverse equation
• Fitness based on the forward equation

Page 50: Introductory Workshop on Evolutionary Computing

Related Research

• It has been proven that the inverse equation exists
• Symbolic regression with GP has successfully found both differential equations and inverse functions
• Similar inverse problems in thermodynamics and geothermal research have been solved

Page 51: Introductory Workshop on Evolutionary Computing

Interdisciplinary Work

• Collaboration between Environmental Engineering, Computer Science, and Math

(Diagram: Genetic Programming Algorithm — a Population of Candidate Solutions cycles through Parent Selection, Reproduction, and Competition, with Fitness computed via the Forward Diffusion Equation)

Page 52: Introductory Workshop on Evolutionary Computing

Genetic Programming Background

(Expression tree over the nodes +, *, Sin, X, and Pi encoding Y = X^2 + Sin( X * Pi ))
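A minimal Python evaluator for such a tree, using the slide's expression Y = X^2 + Sin(X * Pi) as input; the list-based encoding is an assumption:

```python
import math

def evaluate(tree, x):
    """Recursively evaluate a list-based expression tree at a given x."""
    if tree == 'x':
        return x
    if isinstance(tree, (int, float)):
        return tree
    op, *args = tree
    vals = [evaluate(a, x) for a in args]
    if op == '+':   return vals[0] + vals[1]
    if op == '*':   return vals[0] * vals[1]
    if op == 'sin': return math.sin(vals[0])
    raise ValueError(f"unknown operator {op!r}")

tree = ['+', ['*', 'x', 'x'], ['sin', ['*', 'x', math.pi]]]
print(evaluate(tree, 2.0))   # 2^2 + sin(2*pi) ≈ 4.0
```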

Page 53: Introductory Workshop on Evolutionary Computing

Summary

•Ability to characterize exposure history will enhance ability to assess health risks of chemical exposure

Page 54: Introductory Workshop on Evolutionary Computing

Parameter Tuning

• A priori optimization of EA strategy parameters

• Start with stock parameter values

• Manually adjust based on user intuition

• Monte Carlo sampling of parameter values on a few (short) runs

• Meta-tuning algorithm (e.g., meta-EA)

Page 55: Introductory Workshop on Evolutionary Computing

Parameter Tuning drawbacks

• Exhaustive search for optimal values of parameters, even assuming independence, is infeasible

• Parameter dependencies

• Extremely time consuming

• Optimal values are very problem specific

• Different values may be optimal at different evolutionary stages

Page 56: Introductory Workshop on Evolutionary Computing

Parameter Control

• Blind
– Example: replace a parameter p with a time-varying schedule p(t), akin to the cooling schedule in Simulated Annealing
• Adaptive
– Example: Rechenberg’s 1/5 success rule
• Self-adaptive
– Example: mutation step-size control
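As an illustration of the adaptive case, a sketch of Rechenberg's 1/5 success rule for controlling the mutation step size; the constant c is a commonly used tuning value, not specified on the slide:

```python
def one_fifth_rule(sigma, success_ratio, c=0.9):
    """Adaptive control of the mutation step size sigma:
    if more than 1/5 of recent mutations improved fitness, widen the search;
    if fewer than 1/5 did, narrow it (c in (0, 1) is a tuning constant)."""
    if success_ratio > 1 / 5:
        return sigma / c     # increase step size
    if success_ratio < 1 / 5:
        return sigma * c     # decrease step size
    return sigma
```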

Page 57: Introductory Workshop on Evolutionary Computing

Evaluation Function Control

• Example 1: Parsimony Pressure in GP

• Example 2: Penalty Functions in Constraint Satisfaction Problems (aka Constrained Optimization Problems)

Page 58: Introductory Workshop on Evolutionary Computing

Penalty Function Control

eval(x) = f(x) + W · penalty(x)

• Deterministic example: W = W(t) = (C · t)^α with C, α ≥ 1
• Adaptive example
• Self-adaptive example
Note: this allows evolution to cheat!
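A minimal sketch of the deterministic penalty-control scheme above, assuming a minimization problem; the function names are illustrative:

```python
def make_eval(f, penalty, C=1.0, alpha=1.0):
    """Deterministic penalty control: eval(x) = f(x) + W(t) * penalty(x)
    with W(t) = (C * t) ** alpha, C, alpha >= 1 (minimization assumed)."""
    def evaluate(x, t):
        W = (C * t) ** alpha          # penalty weight grows with generation t
        return f(x) + W * penalty(x)
    return evaluate
```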

Page 59: Introductory Workshop on Evolutionary Computing

Parameter Control aspects

• What is changed?
– Parameters vs. operators
• What evidence informs the change?
– Absolute vs. relative
• What is the scope of the change?
– Gene vs. individual vs. population
– Ex: one-bit allele for recombination operator selection (pairwise vs. vote)

Page 60: Introductory Workshop on Evolutionary Computing

Parameter control examples

• Representation (GP: ADFs, delta coding)
• Evaluation function (objective function/…)
• Mutation (ES)
• Recombination (Davis’ adaptive operator fitness: implicit bucket brigade)
• Selection (Boltzmann)
• Population
• Multiple

Page 61: Introductory Workshop on Evolutionary Computing

Self-Adaptive Mutation Control

• Pioneered in Evolution Strategies

• Now in widespread use in many types of EAs

Page 62: Introductory Workshop on Evolutionary Computing

Uncorrelated mutation with one σ

• Chromosomes: ⟨x1,…,xn, σ⟩
• σ′ = σ · exp(τ · N(0,1))
• x′i = xi + σ′ · Ni(0,1)
• Typically the “learning rate” τ ∝ 1/√n
• And we have a boundary rule: σ′ < ε0 ⇒ σ′ = ε0
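A possible Python sketch of this single-step-size self-adaptive mutation; the ε0 value is an illustrative assumption:

```python
import math
import random

def mutate_one_sigma(x, sigma, epsilon=1e-6):
    """Uncorrelated self-adaptive mutation with a single step size:
    sigma' = sigma * exp(tau * N(0,1)),  x_i' = x_i + sigma' * N_i(0,1)."""
    n = len(x)
    tau = 1.0 / math.sqrt(n)                       # "learning rate"
    sigma_new = sigma * math.exp(tau * random.gauss(0, 1))
    sigma_new = max(sigma_new, epsilon)            # boundary rule
    x_new = [xi + sigma_new * random.gauss(0, 1) for xi in x]
    return x_new, sigma_new
```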

Page 63: Introductory Workshop on Evolutionary Computing

Mutants with equal likelihood

Circle: mutants having same chance to be created

Page 64: Introductory Workshop on Evolutionary Computing

Uncorrelated mutation with n σ’s

• Chromosomes: ⟨x1,…,xn, σ1,…,σn⟩
• σ′i = σi · exp(τ′ · N(0,1) + τ · Ni(0,1))
• x′i = xi + σ′i · Ni(0,1)
• Two learning rate parameters:
– τ′: overall learning rate
– τ: coordinate-wise learning rate
• τ′ ∝ 1/√(2n) and τ ∝ 1/√(2√n)
• τ′ and τ have individual proportionality constants which both have default values of 1
• Boundary rule: σ′i < ε0 ⇒ σ′i = ε0

Page 65: Introductory Workshop on Evolutionary Computing

Mutants with equal likelihood

Ellipse: mutants having the same chance to be created

Page 66: Introductory Workshop on Evolutionary Computing

Correlated mutations

• Chromosomes: ⟨x1,…,xn, σ1,…,σn, α1,…,αk⟩
• where k = n · (n−1)/2
• and the covariance matrix C is defined as:
– cii = σi²
– cij = 0 if i and j are not correlated
– cij = ½ · (σi² − σj²) · tan(2αij) if i and j are correlated
• Note the numbering / indices of the α’s

Page 67: Introductory Workshop on Evolutionary Computing

Correlated mutations cont’d

The mutation mechanism is then:
• σ′i = σi · exp(τ′ · N(0,1) + τ · Ni(0,1))
• α′j = αj + β · Nj(0,1)
• x′ = x + N(0, C′)
– x stands for the vector ⟨x1,…,xn⟩
– C′ is the covariance matrix C after mutation of the σ and α values
• τ ∝ 1/√(2√n), τ′ ∝ 1/√(2n), and β ≈ 5°
• σ′i < ε0 ⇒ σ′i = ε0, and
• |α′j| > π ⇒ α′j = α′j − 2π · sign(α′j)

Page 68: Introductory Workshop on Evolutionary Computing

Mutants with equal likelihood

Ellipse: mutants having the same chance to be created

Page 69: Introductory Workshop on Evolutionary Computing

Learning Classifier Systems (LCS)

• Note: LCS is technically not a type of EA, but can utilize an EA

• Condition-Action Rule Based Systems – rule format: <condition:action>

• Reinforcement Learning

• LCS rule format:
– <condition:action> → predicted payoff
– don’t-care symbols
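A minimal sketch of matching an LCS condition containing don't-care symbols against a binary state string; the rule structure shown is illustrative:

```python
def matches(condition, state):
    """Match a classifier condition against a binary state string;
    '#' is the don't-care symbol."""
    return all(c == '#' or c == s for c, s in zip(condition, state))

# A toy rule in <condition:action> form with a predicted payoff.
rule = {'condition': '1#0#', 'action': 1, 'payoff': 42.0}
print(matches(rule['condition'], '1101'))   # True
print(matches(rule['condition'], '0101'))   # False
```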

Page 70: Introductory Workshop on Evolutionary Computing
Page 71: Introductory Workshop on Evolutionary Computing

LCS specifics

• Multi-step credit allocation – Bucket Brigade algorithm
• Rule Discovery Cycle – EA
• Pitt approach: each individual represents a complete rule set
• Michigan approach: each individual represents a single rule, a population represents the complete rule set

Page 72: Introductory Workshop on Evolutionary Computing

Multimodal Problems

• Multimodal def.: multiple local optima and at least one local optimum is not globally optimal

• Basins of attraction & Niches

• Motivation for identifying a diverse set of high quality solutions:
– Allow for human judgement
– Sharp peak niches may be overfitted

Page 73: Introductory Workshop on Evolutionary Computing

Restricted Mating

• Panmictic vs. restricted mating
• Finite pop size + panmictic mating → genetic drift
• Local Adaptation (environmental niche)
• Punctuated Equilibria
– Evolutionary Stasis
– Demes
• Speciation (end result of increasingly specialized adaptation to particular environmental niches)

Page 74: Introductory Workshop on Evolutionary Computing

Implicit Diversity Maintenance (1)

• Multiple runs of standard EA
– Non-uniform basins of attraction problematic
• Island Model (coarse-grain parallel)
– Punctuated Equilibria
– Epoch, migration
– Communication characteristics
– Initialization: number of islands and respective population sizes

Page 75: Introductory Workshop on Evolutionary Computing

Implicit Diversity Maintenance (2)

• Diffusion Model EAs
– Single Population, Single Species
– Overlapping demes distributed within Algorithmic Space (e.g., grid)
– Equivalent to cellular automata
• Automatic Speciation
– Genotype/phenotype mating restrictions

Page 76: Introductory Workshop on Evolutionary Computing

Explicit Diversity Maintenance

• Fitness Sharing: individuals share fitness within their niche

• Crowding: replace similar parents
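A possible sketch of fitness sharing as described above, assuming the common triangular sharing function sh(d) = 1 − d/σ_share for d < σ_share (this particular form is an assumption, not stated on the slide); the distance function is problem-specific and left abstract:

```python
def shared_fitness(fitnesses, distance, sigma_share=1.0):
    """Fitness sharing: divide each individual's raw fitness by its niche
    count, so individuals in crowded niches are discounted.  `distance(i, j)`
    is a problem-specific distance between individuals i and j (by index)."""
    def sh(d):
        return 1 - d / sigma_share if d < sigma_share else 0.0
    n = len(fitnesses)
    return [fitnesses[i] / sum(sh(distance(i, j)) for j in range(n))
            for i in range(n)]
```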

Page 77: Introductory Workshop on Evolutionary Computing

Multi-Objective EAs (MOEAs)

• Extension of regular EA which maps multiple objective values to single fitness value

• Objectives typically conflict
• In a standard EA, an individual A is said to be better than an individual B if A has a higher fitness value than B
• In a MOEA, an individual A is said to be better than an individual B if A dominates B

Page 78: Introductory Workshop on Evolutionary Computing

Domination in MOEAs

• An individual A is said to dominate individual B iff:
– A is no worse than B in all objectives
– A is strictly better than B in at least one objective

Page 79: Introductory Workshop on Evolutionary Computing

Pareto Optimality

• Given a set of alternative allocations of, say, goods or income for a set of individuals, a movement from one allocation to another that can make at least one individual better off without making any other individual worse off is called a Pareto Improvement. An allocation is Pareto Optimal when no further Pareto Improvements can be made. This is often called a Strong Pareto Optimum (SPO).

Page 80: Introductory Workshop on Evolutionary Computing

Pareto Optimality in MOEAs

• Among a set of solutions P, the non-dominated subset of solutions P’ are those that are not dominated by any member of the set P

• The non-dominated subset of the entire feasible search space S is the globally Pareto-optimal set
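A minimal Python sketch of the dominance test and non-dominated filtering defined on these slides, assuming all objectives are to be maximized:

```python
def dominates(a, b):
    """a dominates b iff a is no worse than b in every objective and
    strictly better in at least one (maximization assumed throughout)."""
    return (all(x >= y for x, y in zip(a, b)) and
            any(x > y for x, y in zip(a, b)))

def non_dominated(solutions):
    """Return the non-dominated subset of a set of objective vectors."""
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

front = non_dominated([(1, 5), (2, 4), (2, 2), (3, 1)])
print(front)   # [(1, 5), (2, 4), (3, 1)] — (2, 2) is dominated by (2, 4)
```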

Page 81: Introductory Workshop on Evolutionary Computing

Goals of MOEAs

• Identify the Global Pareto-Optimal set of solutions (aka the Pareto Optimal Front)

• Find a sufficient coverage of that set

• Find an even distribution of solutions

Page 82: Introductory Workshop on Evolutionary Computing

MOEA metrics

• Convergence: How close is a generated solution set to the true Pareto-optimal front?
• Diversity: Are the generated solutions evenly distributed, or are they in clusters?

Page 83: Introductory Workshop on Evolutionary Computing

Deterioration in MOEAs

• Competition can result in the loss of a non-dominated solution which dominated a previously generated solution

• This loss in its turn can result in the previously generated solution being regenerated and surviving

Page 84: Introductory Workshop on Evolutionary Computing

Game-Theoretic Problems

Adversarial search: multi-agent problem with conflicting utility functions

Ultimatum Game
• Select two subjects, A and B
• Subject A gets 10 units of currency
• A has to make an offer (ultimatum) to B, anywhere from 0 to 10 of his units
• B has the option to accept or reject (no negotiation)
• If B accepts, A keeps the remaining units and B the offered units; otherwise they both lose all units

Page 85: Introductory Workshop on Evolutionary Computing

Real-World Game-Theoretic Problems

• Real-world examples: – economic & military strategy– arms control– cyber security– bargaining

• Common problem: real-world games are typically incomputable

Page 86: Introductory Workshop on Evolutionary Computing

Arms Races

• Military arms races

• Prisoner’s Dilemma

• Biological arms races

Page 87: Introductory Workshop on Evolutionary Computing

Approximating incomputable games

• Consider the space of each user’s actions

• Perform local search in these spaces
• Solution quality in one space is dependent on the search in the other spaces
• The simultaneous search of co-dependent spaces is naturally modeled as an arms race

Page 88: Introductory Workshop on Evolutionary Computing

Evolutionary arms races

• Iterated evolutionary arms races

• Biological arms races revisited

• Iterated arms race optimization is doomed!

Page 89: Introductory Workshop on Evolutionary Computing

Coevolutionary Algorithm (CoEA)

A special type of EA in which the fitness of an individual depends on other individuals (i.e., individuals are explicitly part of the environment)

• Single species vs. multiple species

• Cooperative vs. competitive coevolution

Page 90: Introductory Workshop on Evolutionary Computing

CoEA difficulties (1)

Disengagement

• Occurs when one population evolves so much faster than the other that all individuals of the other are utterly defeated; it then becomes impossible to differentiate between better and worse individuals, without which there can be no evolution

Page 91: Introductory Workshop on Evolutionary Computing

CoEA difficulties (2)

Cycling

• Occurs when populations have lost the genetic knowledge of how to defeat an earlier generation adversary and that adversary re-evolves

• Potentially this can cause an infinite loop in which the populations continue to evolve but do not improve

Page 92: Introductory Workshop on Evolutionary Computing

CoEA difficulties (3)

Suboptimal Equilibrium

(aka Mediocre Stability)

• Occurs when the system stabilizes in a suboptimal equilibrium

Page 93: Introductory Workshop on Evolutionary Computing

Case Study from Critical Infrastructure Protection

Infrastructure Hardening

• Hardenings (defenders) versus contingencies (attackers)

• Hardenings need to balance spare flow capacity with flow control

Page 94: Introductory Workshop on Evolutionary Computing

Case study from Automated Software Engineering

Coevolutionary Automated Software Correction (CASC)

Page 95: Introductory Workshop on Evolutionary Computing

Objective: Find a way to automate the process of software testing and correction.

Approach: Create Coevolutionary Automated Software Correction (CASC) system which will take a software artifact as input and produce a corrected version of the software artifact as output.

Page 96: Introductory Workshop on Evolutionary Computing
Page 97: Introductory Workshop on Evolutionary Computing

Coevolutionary Cycle

Page 98: Introductory Workshop on Evolutionary Computing

Population Initialization

Page 99: Introductory Workshop on Evolutionary Computing

Population Initialization

Page 100: Introductory Workshop on Evolutionary Computing

Population Initialization

Page 101: Introductory Workshop on Evolutionary Computing

Population Initialization

Page 102: Introductory Workshop on Evolutionary Computing

Initial Evaluation

Page 103: Introductory Workshop on Evolutionary Computing

Initial Evaluation

Page 104: Introductory Workshop on Evolutionary Computing

Reproduction Phase

Page 105: Introductory Workshop on Evolutionary Computing

Reproduction Phase

Page 106: Introductory Workshop on Evolutionary Computing

Reproduction Phase

Page 107: Introductory Workshop on Evolutionary Computing

Evaluation Phase

Page 108: Introductory Workshop on Evolutionary Computing

Evaluation Phase

Page 109: Introductory Workshop on Evolutionary Computing

Competition Phase

Page 110: Introductory Workshop on Evolutionary Computing

Competition Phase

Page 111: Introductory Workshop on Evolutionary Computing

Termination

Page 112: Introductory Workshop on Evolutionary Computing

Termination