Evolutionary Algorithms

Apr 13, 2017


Reem Alattas
Page 1: Evolutionary Algorithms

Evolutionary Algorithms

Page 2: Evolutionary Algorithms

Evolutionary Algorithms (EA)

• EAs are stochastic search and optimization heuristics derived from classic evolutionary theory that are, in the majority of cases, implemented on computers.

Page 3: Evolutionary Algorithms

Basic Idea

• If only those individuals of a population that meet a certain selection criterion reproduce, while the other individuals die, the population will converge toward the individuals that best meet the selection criterion.

• Population dynamics follow the basic rule of Darwinian evolutionary theory, which can be summarized as “survival of the fittest.”

Page 4: Evolutionary Algorithms

General EA Process
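The original slide shows the process as a diagram. As a rough sketch (all function names here are illustrative, not from the slides), the loop of evaluation, selection, crossover, and mutation might look like this, demonstrated on the toy OneMax problem:

```python
import random

def evolve(population, fitness, select, crossover, mutate, generations=50):
    """Generic EA loop: evaluate, select parents, recombine, mutate."""
    for _ in range(generations):
        parents = select(population, fitness)
        offspring = []
        while len(offspring) < len(population):
            p1, p2 = random.sample(parents, 2)
            offspring.append(mutate(crossover(p1, p2)))
        population = offspring
    return max(population, key=fitness)

# Toy demonstration: OneMax (maximize the number of 1-bits).
def onemax(bits):
    return sum(bits)

def truncation_select(pop, fit, k=4):
    # Keep only the k best individuals as parents.
    return sorted(pop, key=fit, reverse=True)[:k]

def one_point(p1, p2):
    # Cut both parents at a random position and join the parts.
    cut = random.randint(1, len(p1) - 1)
    return p1[:cut] + p2[cut:]

def flip_bits(bits, rate=0.05):
    # Flip each bit independently with a small probability.
    return [b ^ 1 if random.random() < rate else b for b in bits]

random.seed(0)
pop = [[random.randint(0, 1) for _ in range(16)] for _ in range(20)]
best = evolve(pop, onemax, truncation_select, one_point, flip_bits)
```

The concrete operator choices (truncation selection, one-point crossover, per-bit flip rate) are placeholders; the later slides discuss the variants each EA family actually uses.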

Page 5: Evolutionary Algorithms

Classification of EA/CA Methods

Genetic Algorithm (GA)

Genetic Programming (GP)

Evolutionary Strategy (ES)

Evolutionary Programming (EP)

Learning Classifier System (LCS)

Page 6: Evolutionary Algorithms

Related Search Heuristics

Hill Climbing (HC)

Particle Swarms (PS)

Simulated Annealing (SA)

Tabu Search (TS)

Ant Systems (AS)

Page 7: Evolutionary Algorithms

EA Analogy Foundations

Individual

Selection

Crossover

Mutation

Page 8: Evolutionary Algorithms

Genetic Algorithm (GA)

• A search heuristic that mimics the process of natural selection*.

* Wikipedia

Page 9: Evolutionary Algorithms

GA – Individual

• GA individuals store the solution attributes not directly in clear form but in a coded representation.

– Bit-String: consists of L bits, which are clustered into words wi. In a simple version all words are of equal length l.

– The decoded words wi are the solution attributes xi, which are to be optimized. Each attribute xi is assigned to the word wi.

Page 10: Evolutionary Algorithms

Standard Binary Coding

• A simple binary decoding method for a real-valued attribute xi, with the upper bound oi and the lower bound ui, decoded from the word wi of length l, could be:

• wi,n gives the nth bit of the word wi
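The slide's decoding formula itself is shown as an image. A plausible reconstruction of standard binary coding, assuming wi,0 is the least significant bit (the slides do not fix the bit order), is:

```python
def decode_word(bits, lower, upper):
    """Decode a word of l bits into a real value in [lower, upper]
    using standard binary coding; bits[0] is taken as the least
    significant bit (an assumption, the slide does not fix the order)."""
    l = len(bits)
    integer = sum(bit << n for n, bit in enumerate(bits))  # sum of w_{i,n} * 2^n
    return lower + (upper - lower) * integer / (2 ** l - 1)
```

The all-zero word decodes to the lower bound and the all-one word to the upper bound, so the full range of the attribute is reachable.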

Page 11: Evolutionary Algorithms

GA – Fitness

• After the fitness for every individual ai of the population A(s) has been calculated, the best individuals of the population A(s) are selected to be the parents of the next generation A(s+1).

Page 12: Evolutionary Algorithms

GA – Selection

• A common selection method is roulette wheel selection:

– An individual ai of the population A(s) is assigned a small part of a wheel of chance. The size pi of this part is proportional to the calculated fitness Φi.

– The wheel is then spun as many times as parents are needed to create the next generation, and each winning individual is copied into the parent population B(s).
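A minimal sketch of roulette wheel selection, assuming non-negative fitness values (names are illustrative):

```python
import random

def roulette_select(population, fitnesses, n_parents):
    """Spin the wheel once per needed parent; each individual's
    slice of the wheel is proportional to its fitness."""
    total = sum(fitnesses)
    parents = []
    for _ in range(n_parents):
        spin = random.uniform(0, total)
        cumulative = 0.0
        for individual, fit in zip(population, fitnesses):
            cumulative += fit
            if cumulative >= spin:
                parents.append(individual)
                break
    return parents
```

In practice the same effect can be had with `random.choices(population, weights=fitnesses, k=n_parents)` from the Python standard library.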

Page 13: Evolutionary Algorithms

GA – Crossover

• The descendants are created by performing sexual recombination (crossover) and mutation.

• Crossover exchanges genes, coding the solution attributes, between the parents to generate the descendants.

– Instead of DNA, the Bit-Strings of the parents are cut at a position chosen by chance, and the single parts are mixed and chained again to build the Bit-Strings of the descendants.
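The cut-and-exchange step can be sketched as a one-point crossover on bit lists (a hypothetical helper, not code from the slides):

```python
import random

def one_point_crossover(parent1, parent2):
    """Cut both bit-strings at the same random position and
    swap the tails to build two descendants."""
    cut = random.randint(1, len(parent1) - 1)  # cut strictly inside the string
    child1 = parent1[:cut] + parent2[cut:]
    child2 = parent2[:cut] + parent1[cut:]
    return child1, child2
```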

Page 14: Evolutionary Algorithms

GA – Mutation

• After the descendants are created by sexual recombination, the descendants are mutated.

• This resembles the naturally occurring accidents that can happen when DNA is copied. Mutation on the Bit-String occurs by flipping single bits selected by chance.
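Bit-flip mutation on such a Bit-String might look like this (the default per-bit rate is an illustrative choice, not from the slides):

```python
import random

def bit_flip_mutation(bits, rate=0.01):
    """Flip each bit independently with a small probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in bits]
```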

Page 15: Evolutionary Algorithms

GA

• After undergoing these creation and altering processes, the descendants form the next generation A(s+1).

• The generational process is repeated until a satisfying solution for the given target function is found.

Page 16: Evolutionary Algorithms

Evolutionary Strategy (ES)

• An optimization technique based on ideas of adaptation and evolution. It belongs to the general class of evolutionary computation or artificial evolution methodologies*.

* Wikipedia

Page 17: Evolutionary Algorithms

ES as a Phenotype Oriented Approach

• The ES method emphasizes the phenotype of an individual and omits any additional coding. Because of this, the crossover and mutation operators have to change the real values of the attributes and not an abstract coded representation.

• This phenotype-oriented approach allows further optimization of the altering operators.

Page 18: Evolutionary Algorithms

ES – Individual

• An ES individual consists in most cases of a vector of n real numbers, the decision parameters that are to be optimized: x = (x1, x2, …, xn).

• The strategy parameters are stored in an additional vector of n real numbers: σ = (σ1, σ2, …, σn).

Page 19: Evolutionary Algorithms

ES – Fitness

• We use one strategy parameter σi for each decision parameter xi.

• σi assigns a standard deviation to the phenotype mutation for each decision parameter, which is equal to the mean size of a mutation step.

Page 20: Evolutionary Algorithms

ES – Selection

• A deterministic selection procedure is used to select the possible parents for the next generation.

• Only the μ best individuals are selected from the offspring population A(s), which is of size λ, and used for the parent population B(s).

• This approach is called the (μ, λ) strategy.
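A minimal sketch of this truncation selection, under the standard convention that only the μ best of the λ offspring survive as parents:

```python
def comma_selection(offspring, fitness, mu):
    """(mu, lambda) selection: sort the lambda offspring by fitness
    and keep only the mu best; the old parents are always discarded."""
    return sorted(offspring, key=fitness, reverse=True)[:mu]
```

Because the old parents never survive, this scheme can temporarily lose the best solution found so far, which helps escape local optima.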

Page 21: Evolutionary Algorithms

ES – Crossover

• Two different methods are used to mix two selected parents (e1 and e2) from B(s):

– Discrete crossover: used for the decision parameters.
• A descendant c inherits the attribute xi from either one or the other parent.
• The actual parent that supplies an attribute can be chosen arbitrarily.

– Intermediate crossover: used for the strategy parameters.
• The descendant c is assigned a σi value that is the mean of the corresponding σi of both parents.
• This method has a concentration effect.
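The two crossover variants can be sketched as follows (vectors as Python lists; names are illustrative):

```python
import random

def discrete_crossover(x1, x2):
    """Each decision parameter is inherited from one parent chosen at random."""
    return [random.choice(pair) for pair in zip(x1, x2)]

def intermediate_crossover(s1, s2):
    """Each strategy parameter becomes the mean of the parents' values,
    which produces the concentration effect on the step sizes."""
    return [(a + b) / 2 for a, b in zip(s1, s2)]
```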

Page 22: Evolutionary Algorithms

ES – Mutation

• Strategy parameters are mutated according to:

σ'i = σi · exp(τ1 · N(0,1) + τ2 · Ni(0,1))

• N(0,1) is a normally distributed random number with mean 0 and standard deviation 1 that is generated once per individual for each mutation.

• Ni(0,1) is also a standard normally distributed random number, but this number is generated anew for each σi.

• τ1 and τ2 are exogenous strategy parameters.

• The new mutation step size σ'i is used to mutate the decision parameters xi:

x'i = xi + σ'i · Ni(0,1)
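Under the usual interpretation that N(0,1) denotes a standard normally distributed random number, the log-normal self-adaptation step can be sketched as follows (the τ values are illustrative defaults):

```python
import math
import random

def es_mutation(x, sigma, tau1=0.1, tau2=0.2):
    """Mutate step sizes log-normally, then mutate the decision
    parameters with the new step sizes (standard ES self-adaptation)."""
    n_global = random.gauss(0, 1)      # N(0,1): one draw per individual
    new_x, new_sigma = [], []
    for xi, si in zip(x, sigma):
        n_i = random.gauss(0, 1)       # Ni(0,1): one draw per parameter
        si_new = si * math.exp(tau1 * n_global + tau2 * n_i)
        new_sigma.append(si_new)
        new_x.append(xi + si_new * random.gauss(0, 1))
    return new_x, new_sigma
```

The exponential keeps every mutated step size strictly positive, which is the point of mutating σ multiplicatively rather than additively.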

Page 23: Evolutionary Algorithms

ES

• The descendants form the next generation A(s+1) after undergoing these altering processes.

• Again the generational process can be repeated until a satisfying solution is found.

Page 24: Evolutionary Algorithms

Genetic Programming (GP)

• An evolutionary algorithm-based methodology inspired by biological evolution to find computer programs that perform a user-defined task. Essentially, GP is a set of instructions and a fitness function to measure how well a computer has performed a task. It is a specialization of genetic algorithms (GA) where each individual is a computer program. It is a machine learning technique used to optimize a population of computer programs according to a fitness landscape determined by a program's ability to perform a given computational task*.

* Wikipedia

Page 25: Evolutionary Algorithms

GP – Individual

• Designed to store computer programs in such a way that they can be optimized using an evolutionary approach.

• The computer programming language LISP was used to code computer programs into an individual.

Page 26: Evolutionary Algorithms

GP – Fitness

• Fitness is evaluated by executing the program under varying conditions; the behavior or the output of the program can be used to estimate the fitness.

Page 27: Evolutionary Algorithms

GP – Selection

• Tournament selection:

– Parents are selected by holding multiple small tournaments between members of A(s).

– For each tournament, n arbitrarily selected individuals from the population A(s) are put into a tournament group.

– The best individual of the tournament group is then selected as a possible parent and copied into the parent population B(s).
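Tournament selection as described above can be sketched as:

```python
import random

def tournament_select(population, fitness, n_parents, n=3):
    """Each parent is the winner of a small tournament among
    n randomly drawn individuals."""
    parents = []
    for _ in range(n_parents):
        group = random.sample(population, n)
        parents.append(max(group, key=fitness))
    return parents
```

The tournament size n controls the selection pressure: a larger n makes it harder for weak individuals to win a tournament.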

Page 28: Evolutionary Algorithms

GP – Crossover

• Performed by selecting two subtrees by chance from the parents and exchanging them to create the descendants.

Page 29: Evolutionary Algorithms

GP – Mutation

• Performed by selecting a subtree of the descendant by chance and exchanging it with an arbitrarily generated new subtree.

• GP program trees tend to increase rapidly in size during the evolutionary process. This effect is called bloat, and it increases the computational effort, since the programs that are to be evaluated become more complex.

Page 30: Evolutionary Algorithms

Evolutionary Programming (EP)

• Evolutionary programming is one of the four major evolutionary algorithm paradigms. It is similar to genetic programming, but the structure of the program to be optimized is fixed, while its numerical parameters are allowed to evolve*.

* Wikipedia

Page 31: Evolutionary Algorithms

EP – Individual

• The structure of an EP individual (each representing a whole species) is determined only by the given optimization problem and is subject to virtually no restrictions regarding the data types that can be processed.

Page 32: Evolutionary Algorithms

EP – Selection

• EP uses a mixture of tournament selection and the best-of selection used in ES.

• For each individual ai, a tournament group of n individuals is selected by chance from A(s); then the number of individuals in that tournament group that are inferior to ai with regard to fitness is determined and assigned as a score to ai.

• The λ individuals with the highest scores are selected as the parents B(s).
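The EP scoring scheme described above might be sketched as follows (names are illustrative):

```python
import random

def ep_select(population, fitness, n_tournament, n_parents):
    """Score each individual by how many members of a random tournament
    group it beats, then keep the highest-scoring individuals."""
    scored = []
    for ind in population:
        group = random.sample(population, n_tournament)
        score = sum(1 for other in group if fitness(ind) > fitness(other))
        scored.append((score, ind))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [ind for _, ind in scored[:n_parents]]
```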

Page 33: Evolutionary Algorithms

EP – Mutation

• EP mutation operator is problem or data type specific, because each problem can use different attribute data types for the individuals.

Page 34: Evolutionary Algorithms

Learning Classifier Systems (LCS)

• A machine learning system with close links to reinforcement learning and genetic algorithms*.

• The LCS method is based on the GA method; only the Bit-String representation of the individuals was changed so that it can represent generalizing rules.

* Wikipedia

Page 35: Evolutionary Algorithms

LCS – Individual

• Able to learn simple “if {condition} then {action}” style rules from feedback information given by an environment or supervisor.

• The condition and action parts of the rules are coded in Bit-Strings, as in GA.

– Only the alphabet of the condition part was extended from the basic binary alphabet {0,1} to {0,1,#}.

– The ‘#’ symbol represents a “don’t care” to allow generalization of conditions.
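Matching a ternary condition against a binary input message can be sketched as:

```python
def condition_matches(condition, message):
    """A rule condition over {0, 1, #} matches a binary message when
    every position is either equal or the 'don't care' symbol '#'."""
    return all(c == m or c == '#' for c, m in zip(condition, message))
```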

Page 36: Evolutionary Algorithms

LCS – Fitness

• Individuals are evaluated in two phases:

– Performance phase, where the individuals interact with the environment.

– Reinforcement phase, where the individuals are rewarded for good performance.

Page 37: Evolutionary Algorithms

Performance and Reinforcement Phases of LCS evaluation

Page 38: Evolutionary Algorithms

LCS – Fitness

• The performance and reinforcement phases can be repeated several times to ensure that the evaluated rules are assigned correct fitness values. The accumulated fitness can then be used to perform a single GA generation step.

Page 39: Evolutionary Algorithms

Thank You

Q & A