Applying natural evolution for solving computational problems
First lecture: Introduction to Evolutionary Computation
Second lecture: Genetic Programming
Inverted CERN School of Computing
2017
Daniel Lanza - CERN
Agenda
- Introduction to Evolutionary Computation
- Introduction to Evolutionary Algorithms
- Use cases, practical examples
- Key concepts
- Representation of individuals
- Phases
- Evolutionary Computation research tool (ECJ)
Introduction to Evolutionary Algorithms

History
- The idea originated in the 1950s
- L. Fogel, 1962 (San Diego, CA): Evolutionary Programming
- I. Rechenberg & H.-P. Schwefel, 1965 (Berlin, Germany): Evolution Strategies

When to use them
- When finding the exact solution is computationally too demanding, but a near-optimal solution is sufficient

"An evolutionary algorithm (EA) is a heuristic optimization algorithm using techniques inspired by mechanisms from organic evolution such as mutation, recombination, and natural selection to find an optimal configuration for a specific system within specific constraints." [1]
Use cases, practical examples
- Aircraft wing design [8][9]
- Electronic circuit design, known as evolvable hardware [2]
- Wireless sensor/ad-hoc networks [4]
- Vehicle routing problems (traveling salesman problem) [3]
- Feature selection for machine learning algorithms [6]
- Image processing (face recognition) [5]
- … [7]
Key concepts
- A population of individuals is evolved through generations
- Each individual's genome describes a candidate solution
- The fitness function evaluates new individuals
- The evolutionary process finishes as soon as an optimal solution is found (or a generation limit is reached)
Representation of individuals
- It should be able to represent the whole search space
- But it should not be able to represent impossible solutions

Examples of solutions to different problems:
[1, 0, 1, 1, 0, 1, 1]
[4.5, 1, 100.3, 9, 21, 934, 1]
["right", "left", "up", "up", "left", "left", "down"]
[4.5, "left", false, true, 9]
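Two of these representations can be written down as Java types; a minimal illustration (the class and field names are invented for this sketch, not taken from any library):

```java
// Two of the example representations above as Java types: a fixed-length
// bit-string genome and a mixed genome grouping heterogeneous features.
public class Genomes {
    // [1, 0, 1, 1, 0, 1, 1] — a fixed-length bit string
    static final int[] BIT_GENOME = {1, 0, 1, 1, 0, 1, 1};

    // [4.5, "left", false, true, 9] — heterogeneous features in one class
    static class MixedGenome {
        double weight = 4.5;
        String direction = "left";
        boolean flagA = false;
        boolean flagB = true;
        int count = 9;
    }
}
```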
Representation of individuals: example
Choosing points of interest for face recognition [5]
Phases
Similarly to how natural evolution works…
- Initialization: randomly generated
- Evaluation: calculate fitness for each individual
- Selection: choose individuals for breeding
- Breeding: individuals are crossed over and mutations take place
- … until the optimal solution is found
The "MaxOne" problem
- Starting from randomly generated strings of 0s and 1s
- Evolve towards the optimal solution, a string of all 1s: [1, 1, 1, 1, 1, 1, 1]
- The following slides detail each phase using this example
Phases: initialization
- The first population is filled up with individuals
- The individuals are randomly generated following the problem's criteria
- The population size is determined beforehand and remains fixed

Example population:
[1, 0, 1, 1, 0, 1, 1]
[0, 0, 1, 1, 0, 1, 1]
[0, 0, 1, 1, 0, 0, 1]
[1, 0, 0, 1, 1, 1, 1]
[1, 1, 1, 1, 0, 0, 0]
[0, 0, 0, 1, 0, 0, 1]
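The initialization phase can be sketched in Java (a self-contained illustration; the class and method names are invented here, this is not ECJ code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Generate a fixed-size population of random bit-string genomes,
// as in the MaxOne example above.
public class InitializePopulation {
    public static List<int[]> randomPopulation(int popSize, int genomeLength, Random rng) {
        List<int[]> population = new ArrayList<>();
        for (int i = 0; i < popSize; i++) {
            int[] genome = new int[genomeLength];
            for (int g = 0; g < genomeLength; g++) {
                genome[g] = rng.nextInt(2); // each gene is 0 or 1, uniformly
            }
            population.add(genome);
        }
        return population;
    }
}
```

Passing in the `Random` instance mirrors ECJ's emphasis on a seedable pseudo-random generator: the same seed reproduces the same run.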
Phases: evaluation
- Practically, the fitness function defines the problem
- Get the fitness of each individual
- Fitness describes how well the individual solves the problem
- In "MaxOne", fitness is defined as the number of 1s in the individual

Example population with fitness:
[1, 0, 1, 1, 0, 1, 1] → 5
[0, 0, 1, 1, 0, 1, 1] → 4
[0, 0, 1, 1, 0, 0, 1] → 3
[1, 0, 0, 1, 1, 1, 1] → 5
[1, 1, 1, 1, 0, 0, 0] → 4
[0, 0, 0, 1, 0, 0, 1] → 2
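The MaxOne fitness function is simple enough to state directly; a minimal Java sketch (illustrative naming, not the ECJ class):

```java
// MaxOne fitness: the number of 1s in the genome.
public class MaxOneFitness {
    public static int fitness(int[] genome) {
        int ones = 0;
        for (int gene : genome) {
            if (gene == 1) ones++;
        }
        return ones;
    }
}
```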
Phases: selection
- A selection strategy is defined to choose the parents
- Selected individuals, the parents, will be used for breeding
- Fitness is taken into account (best individuals are favoured)
- But some randomness also affects the selection (simulating real life): individuals with reduced fitness could carry valuable features
- Elitism (optional): the best individual is copied to the next generation
Phases: selection (techniques)

Tournament [11]
- A fixed number of individuals are randomly picked from the population
- Among them, the best one is selected

Example, tournament of size 2:
[0, 0, 1, 1, 0, 0, 1] → fitness 3
[1, 1, 1, 1, 0, 0, 0] → fitness 4 (selected)
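Tournament selection can be sketched as follows (a self-contained illustration; names are invented, and the fitness helper is the MaxOne count of 1s):

```java
import java.util.List;
import java.util.Random;

// Tournament selection: pick `tournamentSize` individuals at random,
// return the fittest among them.
public class TournamentSelection {
    public static int[] select(List<int[]> population, int tournamentSize, Random rng) {
        int[] best = null;
        for (int i = 0; i < tournamentSize; i++) {
            int[] candidate = population.get(rng.nextInt(population.size()));
            if (best == null || fitness(candidate) > fitness(best)) {
                best = candidate;
            }
        }
        return best;
    }

    // MaxOne fitness: number of 1s in the genome.
    static int fitness(int[] genome) {
        int ones = 0;
        for (int gene : genome) if (gene == 1) ones++;
        return ones;
    }
}
```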
Phases: selection (techniques)

Roulette-wheel [12]
- Selection probability is proportional to the individual's fitness

Others… [13]
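A roulette-wheel sketch in the same style (illustrative names; note it assumes the total fitness of the population is positive):

```java
import java.util.List;
import java.util.Random;

// Roulette-wheel selection: each individual is chosen with probability
// proportional to its fitness. Assumes total fitness > 0.
public class RouletteWheelSelection {
    public static int[] select(List<int[]> population, Random rng) {
        int totalFitness = 0;
        for (int[] ind : population) totalFitness += fitness(ind);

        // Spin the wheel: subtract each individual's fitness from the random
        // pick until it falls inside that individual's slice of the wheel.
        int pick = rng.nextInt(totalFitness);
        for (int[] ind : population) {
            pick -= fitness(ind);
            if (pick < 0) return ind;
        }
        return population.get(population.size() - 1); // not reached when totals are consistent
    }

    // MaxOne fitness: number of 1s in the genome.
    static int fitness(int[] genome) {
        int ones = 0;
        for (int gene : genome) if (gene == 1) ones++;
        return ones;
    }
}
```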
Phases: breeding
- Selected individuals are crossed over
- New individuals fill up the next generation
- The selection and breeding phases are repeated until the next population is filled

Parents:
[1, 0, 1, 1, 0, 1, 1]
[1, 0, 0, 1, 1, 1, 1]
Offspring:
[1, 0, 0, 1, 0, 1, 1]
[1, 0, 1, 1, 1, 1, 1]
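The offspring above correspond to a one-point crossover with the cut after the fourth gene; a minimal sketch (illustrative names):

```java
// One-point crossover: cut both parents at the same position and swap tails.
public class OnePointCrossover {
    public static int[][] crossover(int[] parentA, int[] parentB, int cutPoint) {
        int n = parentA.length;
        int[] childA = new int[n];
        int[] childB = new int[n];
        for (int i = 0; i < n; i++) {
            // Genes before the cut come from one parent, the rest from the other.
            childA[i] = (i < cutPoint) ? parentA[i] : parentB[i];
            childB[i] = (i < cutPoint) ? parentB[i] : parentA[i];
        }
        return new int[][]{childA, childB};
    }
}
```

In a real run the cut point would itself be drawn at random for each pair of parents.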
Phases: breeding (techniques)
- Sometimes features cannot simply be mixed; smarter operations need to be applied
- The choice depends on the problem, for example:
  - Numbers: average, max, min, sum, …
  - Booleans: and, or, xor, …
  - Strings: concatenate, replace, remove, split, …

Parents:
[4.5, "left", false, true, 9]
[1.2, "right", true, true, 9]
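A problem-specific crossover for the mixed genome above might look like this (an illustrative sketch; the particular operator chosen per feature is an example, not a standard):

```java
// Crossing over heterogeneous features with problem-specific operators:
// here numbers are averaged or maxed, booleans are OR-ed, and the string
// is taken from one parent.
public class MixedCrossover {
    public static Object[] crossover(Object[] a, Object[] b) {
        return new Object[] {
            ((double) a[0] + (double) b[0]) / 2.0,  // number: average
            (String) a[1],                          // string: take from first parent
            (boolean) a[2] || (boolean) b[2],       // boolean: or
            (boolean) a[3] || (boolean) b[3],       // boolean: or
            Math.max((int) a[4], (int) b[4])        // number: max
        };
    }
}
```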
Phases: mutation
- With a very small likelihood, a random modification is applied

Example (one bit flipped):
[1, 0, 0, 1, 0, 1, 1] → [1, 0, 0, 0, 0, 1, 1]
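Bit-flip mutation on a MaxOne genome can be sketched as (illustrative names; the mutation probability is a tunable parameter, as in the ECJ configuration later):

```java
import java.util.Random;

// Bit-flip mutation: with a small probability, flip one randomly chosen gene.
public class BitFlipMutation {
    public static int[] mutate(int[] genome, double mutationProb, Random rng) {
        int[] mutated = genome.clone();
        if (rng.nextDouble() < mutationProb) {
            int pos = rng.nextInt(mutated.length); // pick a random gene...
            mutated[pos] = 1 - mutated[pos];       // ...and flip it (0↔1)
        }
        return mutated;
    }
}
```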
Phases: mutation (techniques)
- Any modification can be considered a mutation
- More complex genomes could have their features modified to any possible value

Examples:
[1, 0, 0, 1, 0, 1, 1] → [1, 0, 0, 0, 1, 1, 1]
[1, 0, 0, 1, 0, 1, 1] → [1, 1, 0, 1, 0, 0, 1]

Starting from [4.5, "left", false, true, 9], possible mutations include:
→ [4.5, "right", false, true, 9]
→ [0.5, "right", false, true, 9]
→ [4.5, "right", false, false, 9]
Phases: evaluation, selection, breeding, …
- The loop keeps going until the optimal individual is found
- Initialization: randomly generated
- Evaluation: calculate fitness for each individual
- Selection: choose individuals for breeding
- Breeding: individuals are crossed over and mutations take place

Optimal individual: [1, 1, 1, 1, 1, 1, 1] → fitness 7
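The whole loop can be tied together in one self-contained sketch (this is plain Java, not ECJ; all names and parameter values are illustrative, and it uses tournament selection of size 2, one-point crossover, bit-flip mutation, and elitism as described above):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Complete generational loop for MaxOne: initialize, then
// evaluate / select / breed until all genes are 1 or the budget runs out.
public class MaxOneEA {
    static int fitness(int[] genome) {
        int ones = 0;
        for (int gene : genome) if (gene == 1) ones++;
        return ones;
    }

    // Tournament of size 2: the fitter of two random individuals wins.
    static int[] tournament(List<int[]> pop, Random rng) {
        int[] a = pop.get(rng.nextInt(pop.size()));
        int[] b = pop.get(rng.nextInt(pop.size()));
        return fitness(a) >= fitness(b) ? a : b;
    }

    public static int[] evolve(int popSize, int genomeLength, int maxGenerations, Random rng) {
        // Initialization: random bit strings.
        List<int[]> pop = new ArrayList<>();
        for (int i = 0; i < popSize; i++) {
            int[] g = new int[genomeLength];
            for (int j = 0; j < genomeLength; j++) g[j] = rng.nextInt(2);
            pop.add(g);
        }

        int[] best = pop.get(0);
        for (int gen = 0; gen < maxGenerations; gen++) {
            // Evaluation: track the best individual seen so far.
            for (int[] ind : pop) if (fitness(ind) > fitness(best)) best = ind;
            if (fitness(best) == genomeLength) return best; // optimum reached

            // Selection + breeding fill the next generation.
            List<int[]> next = new ArrayList<>();
            next.add(best.clone()); // elitism: keep the best unchanged
            while (next.size() < popSize) {
                int[] p1 = tournament(pop, rng);
                int[] p2 = tournament(pop, rng);
                int cut = rng.nextInt(genomeLength); // one-point crossover
                int[] child = new int[genomeLength];
                for (int i = 0; i < genomeLength; i++)
                    child[i] = i < cut ? p1[i] : p2[i];
                if (rng.nextDouble() < 0.2) {        // occasional bit-flip mutation
                    int pos = rng.nextInt(genomeLength);
                    child[pos] = 1 - child[pos];
                }
                next.add(child);
            }
            pop = next;
        }
        for (int[] ind : pop) if (fitness(ind) > fitness(best)) best = ind;
        return best; // best found within the generation budget
    }
}
```

For a 7-gene MaxOne with a modest population this typically converges to the all-ones string within a few generations.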
Evolutionary Computation research tool (ECJ)
- Developed at George Mason University [10]
- Eliminates the need to implement the evolutionary process yourself
- Widely used in the community
- Main features:
  - Multi-platform: Java
  - Flexibility: easy to implement many kinds of problems
  - Configuration files
  - Checkpoints
  - Multi-threading
  - Pseudo-random number generator: reproducible results
- Code architecture allows pluggable and customized components
- Several built-in implementations for every component
Configuring the MaxOne problem:

breedthreads = 1
evalthreads = 1
seed.0 = 4357
state = ec.simple.SimpleEvolutionState
pop = ec.Population
init = ec.simple.SimpleInitializer
finish = ec.simple.SimpleFinisher
breed = ec.simple.SimpleBreeder
eval = ec.simple.SimpleEvaluator
stat = ec.simple.SimpleStatistics
exch = ec.simple.SimpleExchanger
generations = 200
pop.subpops = 1
pop.subpop.0 = ec.Subpopulation
pop.subpop.0.size = 10
pop.subpop.0.species = ec.vector.BitVectorSpecies
pop.subpop.0.species.fitness = ec.simple.SimpleFitness
pop.subpop.0.species.ind = ec.vector.BitVectorIndividual
pop.subpop.0.species.genome-size = 20
pop.subpop.0.species.mutation-type = flip
pop.subpop.0.species.mutation-prob = 0.01
pop.subpop.0.species.pipe = ec.vector.breed.VectorMutationPipeline
pop.subpop.0.species.pipe.source.0 = ec.vector.breed.VectorCrossoverPipeline
pop.subpop.0.species.pipe.source.0.source.0 = ec.select.TournamentSelection
pop.subpop.0.species.pipe.source.0.source.1 = ec.select.TournamentSelection
eval.problem = ec.app.tutorial1.MaxOnes
Implementing the MaxOne fitness function: ec.app.tutorial1.MaxOnes
Execution
Questions?
Applying natural evolution for solving computational problems
First lecture: Introduction to Evolutionary Computation
Second lecture: Genetic Programming
Tomorrow at 11:30
References
[1] https://en.wikipedia.org/wiki/Evolutionary_algorithm
[2] Greenwood, Garrison W.; Tyrrell, Andrew M. (2006). Introduction to Evolvable Hardware: A Practical Guide for Designing Self-Adaptive Systems (1st ed.). Wiley-IEEE Press. ISBN 978-0471719779.
[3] Maimon, Oded; Braha, Dan (1998). "A genetic algorithm approach to scheduling PCBs on a single machine". International Journal of Production Research.
[4] BiSNET/e – Distributed Software Systems Group, University of Massachusetts, Boston.
[5] D. Lanza, F. Chavez, F. Fernandez, C. Benavides-Alvarez and J. Villegase. Speeding up Evolutionary Approaches to Face Recognition by Means of Hadoop. EVO 2016.
[6] Haleh Vafaie and Kenneth De Jong. Genetic Algorithms as a Tool for Feature Selection in Machine Learning. Center for Artificial Intelligence, George Mason University.
[7] https://en.wikipedia.org/wiki/List_of_genetic_algorithm_applications#cite_note-56
[8] Andre C. Marta. Parametric Study of a Genetic Algorithm using an Aircraft Design Optimization Problem. Stanford University, U.S.A.
[9] I. Kroo. Aeronautical Applications of Evolutionary Design. Stanford University, U.S.A.
[10] https://cs.gmu.edu/~eclab/projects/ecj/
[11] Miller, Brad; Goldberg, David (1995). "Genetic Algorithms, Tournament Selection, and the Effects of Noise". Complex Systems. 9: 193–212.
[12] A. Lipowski. Roulette-wheel selection via stochastic acceptance. arXiv:1109.3627.
[13] http://www.geatbx.com/docu/algindex-02.html