Memetic Algorithms
Dr. N. Krasnogor
Interdisciplinary Optimisation Laboratory
Automated Scheduling, Optimisation and Planning Research Group
School of Computer Science & Information Technology
University of Nottingham
www.cs.nott.ac.uk/~nxk
Ben Gurion University of the Negev – 2nd May 2010
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific
Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Based on:
o N. Krasnogor. Handbook of Natural Computation, chapter Memetic Algorithms. Natural Computing. Springer Berlin
/ Heidelberg, 2009.
o J. Bacardit and N. Krasnogor. Performance and efficiency of memetic pittsburgh learning classifier systems.
Evolutionary Computation, 17(3), 2009.
o Q.H. Quang, Y.S. Ong, M.H. Lim, and N. Krasnogor. Adaptive cellular memetic algorithm. Evolutionary
Computation, 17(3), 2009.
o N. Krasnogor and J.E. Smith. Memetic algorithms: The polynomial local search complexity theory perspective.
Journal of Mathematical Modelling and Algorithms, 7:3-24, 2008.
o M. Tabacman, J. Bacardit, I. Loiseau, and N. Krasnogor. Learning classifier systems in optimisation problems: a case
study on fractal travelling salesman problems. In Proceedings of the International Workshop on Learning Classifier
Systems, volume (to appear) of Lecture Notes in Computer Science. Springer, 2008.
o N. Krasnogor and J.E. Smith. A tutorial for competent memetic algorithms: model, taxonomy and design issues.
IEEE Transactions on Evolutionary Computation, 9(5):474-488, 2005.
o W.E. Hart, N. Krasnogor, and J.E. Smith, editors. Recent advances in memetic algorithms, volume 166 of Studies in
Fuzziness and Soft Computing. Springer Berlin Heidelberg New York, 2004. ISBN 3-540-22904-3.
o N. Krasnogor. Self-generating metaheuristics in bioinformatics: the protein structure comparison case. Genetic
Programming and Evolvable Machines, 5(2):181-201, 2004.
o N. Krasnogor and S. Gustafson. A study on the use of “self-generation” in memetic algorithms. Natural Computing,
3(1):53-76, 2004.
o M. Lozano, F. Herrera, N. Krasnogor, and D. Molina. Real-coded memetic algorithms with crossover hill-climbing.
Evolutionary Computation, 12(3):273-302, 2004.
All material available at www.cs.nott.ac.uk/~nxk/publications.html
Evolutionary Computation: Most Important Metaphors
Evolution → Problem Solving
• Environment → Problem
• Individual → Candidate/feasible solution
• Fitness → Solution quality, i.e. objective value
• Natural selection → Simulated pruning of bad solutions
• Fitness → survival and reproduction likelihood
• Objective value → chances of generating related (but not necessarily identical) solutions
A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing
Hybridisation with other techniques: Memetic Algorithms
Simulated Evolution Consists of:
• The population contains a diverse set of individuals (i.e. solutions
to an optimisation problem)
• Those features that make solutions good under a specific
objective function tend to proliferate
• Variation is introduced into the population by means of mutation,
crossover and, for MAs, local search.
• Selection culls the population throwing out bad solutions and
keeping the most promising ones
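The four components above fit in a few lines. The following minimal sketch is illustrative only: the toy objective, the operator choices and all parameter values are my own assumptions, not a prescribed design.

```python
import random

def evolve(fitness, init, mutate, pop_size=20, generations=50):
    """Minimal simulated-evolution loop: variation introduces new
    candidates, selection culls the pool down to the most promising."""
    pop = [init() for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(p) for p in pop]                      # variation
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]  # selection
    return max(pop, key=fitness)

# Toy usage: maximise f(x) = -(x - 7)^2 over the integers.
f = lambda x: -(x - 7) ** 2
best = evolve(f, init=lambda: random.randint(-20, 20),
              mutate=lambda x: x + random.choice([-1, 1]))
```

Selection here is elitist (parents and offspring compete in one pool); other replacement schemes are equally valid instantiations of the same cycle.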
The Metaphor of “Adaptive landscape”
(Wright, 1932) (1)
• Solutions (i.e. individuals) are represented by n properties + its
quality value.
• One can imagine the space of all solutions and their quality values
represented as a graph (i.e. landscape) in n+1 dimensions.
• In this metaphor, a specific individual is seen as a point in the
landscape.
• Each point in the landscape has neighbouring points: those solutions related to it under some neighbourhood relation.
• It is called an “adaptive” landscape because the quality value function, F(), depends on the features and properties of the individual; hence the better those are, the higher (for maximisation) or lower (for minimisation) the value.
An example: Maximisation Problem with Hill-Climber (2)
(Figure: the objective value plotted over two solution features, Prop 1 and Prop 2.)
An example: Maximisation Problem with EA (3)
(Figure: an EA population exploring the same two-feature landscape.)
Evolutionary Algorithms in Context
• There are several opinions about the use of metaheuristics in optimisation
• For the majority of problems a problem-specific algorithm:
– may work better than a generic algorithm on a large set of instances,
– but can be very limited in a different domain,
– and may not work well on some instances.
• One important research challenge is to build frameworks that are robust across a variety of problems and instances, delivering good enough/cheap enough/soon enough solutions.
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific
Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
EAs as problem solvers: The pre-90s view
(Figure: method performance plotted against the scale of all problems. The problem-specific method, “the Rolls Royce for problem P”, peaks sharply at P; the metaheuristic method, “the Ford T for problem solving”, performs moderately across all problems; random search performs uniformly poorly.)
A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing
Hybridisation with other techniques: Memetic Algorithms
Evolutionary Algorithms and domain knowledge
• Fashionable after the 90s: to add problem-specific information into EAs by means of specialised crossover, mutation, representations and local search
• Result: the performance curve deforms and
– makes EAs better on some problems,
– worse on other problems,
– depending on the amount of problem-specific knowledge added.
A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing
Hybridisation with other techniques: Memetic Algorithms
Michalewicz’s Interpretation
(Figure: method performance against the scale of all problems for four EAs, EA 1 to EA 4, hybridised with different amounts of domain knowledge around problem P.)
A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing
Hybridisation with other techniques: Memetic Algorithms
So What Are Memetic Algorithms?
MAs are a carefully orchestrated interplay between
(stochastic) global search and (stochastic) local
search algorithms
N. Krasnogor. Handbook of Natural Computation, chapter Memetic Algorithms. Natural
Computing. Springer Berlin / Heidelberg, 2009
So What Are Memetic Algorithms?
Adding Domain Knowledge to EAs
Memetic Algorithms (MAs) were originally inspired by:
• Models of adaptation in natural systems that combine the evolutionary adaptation of a population of individuals (GAs)
WITH
• Individual learning (LS) within a lifetime (others consider the LS a development stage). Learning takes the form of (problem-specific) local search
PLUS
• R. Dawkins’ concept of a meme, which represents a unit of cultural evolution that can exhibit refinement; hence the local search can be adaptive.
BUT WHY?
An example: Maximisation Problem with EA (3)
(Figure: an EA population exploring the two-feature landscape.)
The Canonical MA: at design time lots of issues arise
From Eiben & Smith, “Introduction to Evolutionary Computing”
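The canonical cycle (initialise, then repeat: vary, local search, replace) can be sketched as below, with a hill-climber standing in for the local search stage. The toy OneMax objective, operators and parameter values are illustrative assumptions.

```python
import random

def hill_climb(x, fitness, neighbours, max_steps=20):
    """First-improvement local search: the 'meme' applied to individuals."""
    for _ in range(max_steps):
        for n in neighbours(x):
            if fitness(n) > fitness(x):
                x = n
                break
        else:
            return x          # no improving neighbour: local optimum reached
    return x

def memetic(fitness, init, mutate, neighbours, pop_size=10, generations=30):
    """Canonical MA: every new individual is refined by local search
    before it competes for survival."""
    pop = [hill_climb(init(), fitness, neighbours) for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [hill_climb(mutate(p), fitness, neighbours) for p in pop]
        # Replacement: keep the best pop_size of parents + offspring.
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
    return max(pop, key=fitness)

# Toy usage on OneMax (maximise the number of 1s in a bit string).
def init():
    return [random.randint(0, 1) for _ in range(12)]

def mutate(x):
    y = list(x)
    i = random.randrange(len(y))
    y[i] = 1 - y[i]
    return y

def neighbours(x):
    for i in range(len(x)):
        y = list(x)
        y[i] = 1 - y[i]
        yield y

best = memetic(sum, init, mutate, neighbours)
```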
Memetic Algorithms: the issues involved
Motivation
There are several reasons why hybridisation is worthwhile:
• Complex problems can be partially decomposable; different subproblems may be better solved by different methods
• EAs can be used as pre/post-processors
• Subproblem-specific information can be placed into variation operators or into local searchers
• In some cases there are exact/approximate methods for subproblems
• Well-established theoretical results: generally good black-box optimisers do not exist. This is why successful EAs are usually found in “hybridised form”, with domain knowledge added in
• EAs are good at exploring the search space but find it difficult to zoom in on good solutions
• Problems have constraints associated with solutions, and heuristics/local search are used to repair solutions found by the EA
• If heuristic/local search strategies in MAs are “first class citizens” then one can raise the level of generality of the algorithm without sacrificing performance by letting the MA choose which local search to use
A conservation of competence principle applies: the better an algorithm is at solving one specific instance (class), the worse it is at solving a different instance (class) [Wolpert et al.]
It cannot be expected that a black-box metaheuristic will suit all problem classes and instances all the time; that is, it is theoretically impossible to have ready-made, off-the-shelf solvers that are both general and good for all problems.
MAs and hyperheuristics are good algorithmic templates that aid in the balancing act of successfully and cheaply combining general, off-the-shelf, reusable solvers (EAs) with add-on instance (class) specific features.
What happens inside an MA?
(Cartoon: a population of agents in the evolutionary algorithm; each agent announces “this is my solution to your problem” and “I used strategy X” (memetics), while an offspring notes “when I grow up I’ll need to decide whose problem-solving strategy to use”.)
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific
Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Memetic Algorithms: the issues involved
Baldwinism vs Lamarckianism
• Lamarckian
– traits acquired by an individual during its lifetime can be transmitted to its offspring
– e.g. replace the individual with a fitter neighbour
• Baldwinian
– traits acquired by an individual cannot be transmitted to its offspring
– e.g. the individual receives the fitness (but not the genotype) of a fitter neighbour
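The two schemes differ only in what is written back after local search. A minimal sketch (the single-bit-flip searcher and the OneMax fitness are illustrative assumptions):

```python
def local_search(x, fitness):
    """Return the best single-bit-flip neighbour of x (or x itself)."""
    best = x
    for i in range(len(x)):
        y = list(x)
        y[i] = 1 - y[i]
        if fitness(y) > fitness(best):
            best = y
    return best

def evaluate(x, fitness, scheme):
    """Return (genotype to keep, fitness to credit) under each scheme."""
    improved = local_search(x, fitness)
    if scheme == "lamarckian":
        # Acquired traits are written back: offspring inherit the improvement.
        return improved, fitness(improved)
    if scheme == "baldwinian":
        # Only the fitness of the improved neighbour is credited;
        # the genotype itself is left untouched.
        return x, fitness(improved)
    raise ValueError(scheme)

x = [0, 1, 0, 1]
lam_geno, lam_fit = evaluate(x, sum, "lamarckian")
bal_geno, bal_fit = evaluate(x, sum, "baldwinian")
```

Both schemes credit the same fitness; only the Lamarckian one changes the genotype that selection will propagate.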
Baldwin’s “filter”
(Figure: local search acts as a filter between the raw fitness and the fitness the selection process actually sees.)
Memetic Algorithms: the issues involved
Diversity
The loss of diversity is especially problematic in MAs, as the LS tends to focus excessively on a few good solutions.
If the MA runs LS all the way to local optima, it becomes important to constantly identify new local optima.
If the MA uses partial LS, it may still be navigating around the basins of attraction of only a few solutions.
Memetic Algorithms: the issues involved
Diversity
There are various ways to improve diversity (assuming that is what one wants!):
• if the population is seeded, only do so partially
• instead of applying LS to every individual, choose whom to apply it to
• use variation operators that ensure diversity (e.g. assortative mating)
• include a diversity weight in the local search strategy
• modify the selection operator to prevent duplicates
• use archives
• modify the acceptance criterion in the local search
Memetic Algorithms: the issues involved
Diversity
The following modified Monte Carlo acceptance exploits solutions (zooms in) when the population is diverse, and explores (zooms out) when it has converged.
The temperature T of the MC is recomputed every generation from the population’s fitness spread, and a new solution is accepted with Boltzmann probability at that temperature:
• when the population is diverse, T tends to 0, so only improvements are accepted
• when the population has converged, T goes to infinity, so both better and worse solutions are accepted (explores)
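The slide specifies only the limiting behaviour of T, not its formula, so the temperature definition below (T = k / (f_max - f_avg)) is an ASSUMED stand-in that reproduces that behaviour:

```python
import math

def acceptance_probability(f_new, f_old, pop_fitnesses, k=1.0):
    """Boltzmann acceptance for the local search, with a temperature
    tied to population convergence. The formula for T is an assumption;
    only its limits (diverse -> 0, converged -> infinity) come from the
    slide."""
    if f_new >= f_old:
        return 1.0                      # improvements are always accepted
    f_max = max(pop_fitnesses)
    f_avg = sum(pop_fitnesses) / len(pop_fitnesses)
    T = k / (f_max - f_avg + 1e-9)      # converged population -> huge T
    return math.exp((f_new - f_old) / T)

# Diverse population: low T, worse moves are almost never accepted.
p_diverse = acceptance_probability(8.0, 9.0, [1.0, 5.0, 9.0])
# Converged population: huge T, worse moves pass through (exploration).
p_converged = acceptance_probability(8.0, 9.0, [9.0, 9.0, 9.0])
```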
Memetic Algorithms: the issues involved
Operators Choice
The choice of LS/heuristic is one of the most important steps in the design of an MA:
1. Local searchers induce search landscapes, and there have been various attempts to characterise these. Kallel et al. and Merz et al. have shown that the choice of LS can have a dramatic impact on the efficiency and effectiveness of the MA.
2. Krasnogor formally proved that, to reduce the worst-case run time of MAs, LS move operators must induce search graphs complementary to (or disjoint from) those of crossover and mutation.
3. Krasnogor and Smith have also shown that the optimal choice of LS operator is not only problem and instance dependent but also dependent on the state of the overall search carried out by the underlying EA.
The obvious way to implement 2 & 3 is to use multiple local searchers within an MA (multimeme algorithms), and we will see that the obvious way of including feedback like that suggested by 1 is to use self-generated multiple local searchers (self-generating MAs, aka co-evolving MAs).
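The multimeme idea, each individual carrying its own local-search meme, can be sketched via a simple inheritance-plus-innovation rule. The meme names and dict-based individuals below are illustrative assumptions:

```python
import random

# Hypothetical meme names; in a real multimeme MA each would be a
# distinct local-search move operator.
MEMES = ["flip_one", "swap_two", "greedy_pass"]

def inherit_meme(parent_a, parent_b, fitness, innovation_rate=0.1):
    """The offspring adopts the meme of its fitter parent (inheritance);
    with a small probability it picks a random meme instead (innovation),
    which keeps memetic diversity alive."""
    if random.random() < innovation_rate:
        return random.choice(MEMES)
    fitter = max(parent_a, parent_b, key=lambda p: fitness(p["genome"]))
    return fitter["meme"]

a = {"genome": [1, 1, 1], "meme": "flip_one"}
b = {"genome": [0, 0, 1], "meme": "swap_two"}
meme = inherit_meme(a, b, sum, innovation_rate=0.0)
```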
Memetic Algorithms: the issues involved
Fitness Landscapes
Thanks to P. Merz!
Multiple Local Searchers
Memetic Algorithms: the issues involved
Use of Knowledge
The use of knowledge is essential for the success of a search method
There are essentially two stages when knowledge is used:
• At design time: eg, in the form of local searchers/heuristics, specific variation
operators, initialization biases, etc.
• At run time:
• using tabu-like mechanisms to avoid revisiting points (implicit)
• using adaptive operators that bias search towards unseen/promising regions of
search space (implicit)
• creating new operators on-the-fly, eg., self-generating or co-evolving MAs
(explicit)
With appropriate data-mining techniques we can turn implicit knowledge into explicit knowledge and feed it back into the design process!
(Deb calls this “innovization”)
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific
Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Showcase Applications
The Maximum Diversity Problem
Katayama & Narihisa solve the MDP by means of a sophisticated
MA. The MDP:
The problem consists of selecting, out of a set of N elements, the M which maximise a certain diversity measure Dij
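To make the objective concrete, here is a brute-force sketch. The tiny instance and the enumeration-based solver are illustrative only; realistic MDP instances need the metaheuristic machinery discussed in the talk.

```python
from itertools import combinations

def diversity(subset, D):
    """MDP objective: total pairwise diversity of the chosen elements."""
    return sum(D[i][j] for i, j in combinations(sorted(subset), 2))

def mdp_brute_force(N, M, D):
    """Exact solver by enumeration: only viable for tiny N, shown here
    purely to pin down what the objective measures."""
    return max(combinations(range(N), M), key=lambda s: diversity(s, D))

# Toy instance: 4 elements, choose 2, with D[i][j] = |i - j|.
D = [[abs(i - j) for j in range(4)] for i in range(4)]
best = mdp_brute_force(4, 2, D)
```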
Showcase Applications
The Maximum Diversity Problem
This problem is at the core of various important real-world
applications:
• Immigration and admission policies
• Committee formation
• Curriculum design
• Portfolio selection
• Combinatorial chemical libraries
• etc
Showcase Applications
Protein Structure Prediction (our own work)
Primary Structure
Tertiary Structure
Showcase Applications
Protein Structure Prediction
Krasnogor, Krasnogor & Smith, Krasnogor & Pelta, and Smith have used MAs to study the fundamentals of the algorithmics behind PSP in simplified models.
Showcase Applications
Protein Structure Prediction
Standard MA template except that Multiple Memes which promote
diversity by means of fuzzy rules are used
Showcase Applications
Protein Structure Prediction
Membership function for “acceptable” solutions
Two distinct “acceptability” concepts
Promotes
Diversity
Promotes
improvements
Showcase Applications
Protein Structure Prediction
New optimal solutions
Showcase Applications
Optimal Engine Calibration
The OEC problem is paradigmatic of many industrial problems.
Within it, many combinatorial optimisation problems occur:
1. Optimal Design of Experiments
2. Optimal Test Bed Schedule
3. Look-up Table Calculation
Showcase Applications
Optimal Engine Calibration
By P.Merz:
Showcase Applications
Optimal Engine Calibration
Standard MA template
Showcase Applications
Circuit Partitioning
CP is the task of dividing a circuit into smaller parts. It is an important component of the VLSI layout problem:
1. The division permits the physical fabrication of circuits in distinct components (the slide’s formula marks one term as a minimisation objective and another as a constraint)
2. By dividing we conquer: the resulting circuits can fit fabrication norms, and complexity is reduced
3. It can reduce heat dissipation, energy consumption, etc.
Showcase Applications
Circuit Partitioning
From S.Areibi’s chapter:
A graphical example
Showcase Applications
The Maximum Diversity Problem
Various features: distinct repair & LS, GRASP
for init, diversification phase, accelerated LS.
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific
Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Related Methodologies
Teams of Heuristics
Variable Neighbourhood Search: under this approach a number of different neighbourhood structures are systematically explored; VNS tries to improve the current solution while escaping poor local optima.
A-teams of Heuristics: in A-Teams a set of constructive, improvement and
destructive heuristics are asynchronously used to improve solutions.
Hyperheuristics: the main concept behind the hyperheuristic is that of managing the
application of other heuristics adaptively with the purpose of improving solutions.
Methodologies
Cooperative Local Search (Landa Silva & Burke)
(Figure: the search cycle of each individual in the population begins; when an individual gets stuck, a cooperation mechanism, e.g. sharing moves or solution parts under centralised control, helps it find something to do and get unstuck.)
Note that this differs from teams of heuristics in that here the cooperation is made explicit.
Methodologies
Off-line & In-line operators discovery
All the previous methodologies clearly benefit the end user, as they have been shown to provide improvements in robustness, quality, etc.
But what do we do if we do not have, or do not know of, good heuristics which could be used by, e.g., A-teams, VNS or CLS?
Also, why not use the information the algorithm produces to better understand the search landscape and make new knowledge about it explicit, capturing this knowledge in new operators?
Methodologies
On-the-fly operators discovery
Two alternatives:
1. Off-line: Whitley and Watson did it successfully for TS, Kallel et al. for other methods, and Tabacman et al. demonstrated it for TSP
2. In-line: Krasnogor, Krasnogor & Gustafson, J.E. Smith and others for MAs on PSP, PSC and other problems
Objectives
• Strategic Goals:
– To move away from a closed-system optimisation scenario to an open-and-embodied optimisation scenario
– To integrate data mining and classification techniques within the life cycle of optimisation problem solvers
• Tactical Goals:
– To use Learning Classifier Systems / Genetic Programming / etc. to learn features from instances of an optimisation problem
– To provide a TSP solver as a proof of concept:
• The Fractal TSP family of problems is used
• GAssist is trained to distinguish between different members of the family
• The predictions are fed into a naive TSP solving heuristic
Off-line pattern discovery with
Learning Classifier Systems
• A machine learning technique
• Classifies a set of (previously unseen) input data based on (previously learnt) classification rules
• “IF condition THEN action” rules (other semantics are possible)
• Uses (some flavour of) evolutionary algorithms to learn the rules
Traveling Salesman Problem(TSP)
• Goal: given a set of cities’ coordinates, find the Hamiltonian cycle of minimum cost (a search on the complete graph Kn, n = number of cities)
• One of the most studied combinatorial optimization
problems
• NP-hard problem
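For reference, the classic nearest-neighbour constructive heuristic (one of the solvers named later for the fractal families) can be sketched as follows; the four-city instance is an illustrative toy.

```python
import math

def tour_length(tour, cities):
    """Total length of the closed tour (returns to the start city)."""
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def nearest_neighbour(cities, start=0):
    """Classic constructive heuristic: repeatedly hop to the closest
    unvisited city. Fast, but carries no optimality guarantee."""
    unvisited = [c for c in range(len(cities)) if c != start]
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda c: math.dist(cities[tour[-1]], cities[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Four cities on a unit square: here the heuristic happens to recover
# the optimal tour, whose length is 4.
cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
tour = nearest_neighbour(cities)
```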
Fractal TSP
• A family of TSP 2D euclidean instances described by
L-Systems rules
• Coordinates are generated while the L-System rules are
being applied
• Sequence of coordinates describes the optimal solution
for the underlying TSP
• Instances are built by iterating the rewriting rules of an L-system, starting from its axiom
• The tour of order n is obtained by iterating the L-system n times
• Higher orders give rise to larger and more intricate city distributions: #cities = f(n)
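The construction can be sketched with a generic L-system expander and a turtle interpreter. The Koch-style rule set below is purely illustrative, not the actual Mpeano/MNPeano/Koch rule sets used in the cited work.

```python
import math

def expand(axiom, rules, order):
    """Iterate the L-system rewriting rules `order` times."""
    s = axiom
    for _ in range(order):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

def turtle_points(s, angle_deg):
    """Interpret the expanded string with a turtle: F = one step forward
    (emitting a 'city'), + / - = turn left / right."""
    x, y, heading = 0.0, 0.0, 0.0
    pts = [(x, y)]
    for ch in s:
        if ch == "F":
            x += math.cos(math.radians(heading))
            y += math.sin(math.radians(heading))
            pts.append((x, y))
        elif ch == "+":
            heading += angle_deg
        elif ch == "-":
            heading -= angle_deg
    return pts

# Koch-style rule, illustrative: 5 segments per segment per iteration,
# so #cities grows as 5^n with the order n, as the slide describes.
rules = {"F": "F+F-F-F+F"}
cities = turtle_points(expand("F", rules, 2), 90)
```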
Fractal TSP
David Tour
Koch
Mpeano
MNpeano
No Free Lunch Theorem
• In general: no algorithm is unsurpassed in efficiency for all possible inputs
• For Fractal TSP: no single heuristic gives the optimal solution for all families of FTSPs
• Fractal TSP solvers (each family theoretically proved to be optimally solved by the listed heuristic):
– Mpeano → Nearest Neighbour & Multiple Fragment
– MNPeano → Multiple Fragment
– Koch → Furthest Addition From Minimal Convex Hull
Fractal TSP
• By construction the optimal tour is known
• Hence fractal L-systems are an ideal test bed for TSP methods, as:
– they scale to as large instances as desired
– instances have features at all scales, e.g. clusters of cities, bridges, etc.
– for each instance the optimal tour is known
– it is possible to mathematically demonstrate (in some cases) whether a family of instances (for all n) is solvable by a given heuristic
An LCS Approach to FTSP
Optimisation
• Described by two interacting algorithms:
– an Instance Classifier (IC) (point 2.1, previous slide)
– a Greedy Randomised Algorithm (GRA) (point 2.2, previous slide)
Greedy Randomised Algorithm
Assuming we have a good classifier for deciding edge inclusion in the optimal tour, the GRA:
• iterates over all of the edges of the complete graph of the TSP instance
• classifies each edge: if it belongs to the optimal tour it is included, otherwise rejected
• does not ensure optimality
• obtains quick solutions
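The GRA skeleton is just a filtered pass over all edges. The `short_edge` classifier below is a hypothetical stand-in for the learnt GAssist rule set, which is out of scope here:

```python
from itertools import combinations
import math

def greedy_tour_edges(cities, classify):
    """Iterate over every edge of the complete graph and keep it if the
    classifier predicts it belongs to the optimal tour."""
    return [e for e in combinations(range(len(cities)), 2)
            if classify(e, cities)]

# Hypothetical stand-in classifier: keep only short edges.
def short_edge(e, cities, limit=1.2):
    a, b = cities[e[0]], cities[e[1]]
    return math.dist(a, b) <= limit

# On a unit square, the short edges are exactly the optimal tour's edges.
cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
edges = greedy_tour_edges(cities, short_edge)
```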
FTSP Instance Classifier
• Requires:
– C: a set of coordinate samples derived from FTSP instances
– F: global statistics features to be computed for C
• Returns:
– R: a rule set that classifies a set of global statistics features into one of the fractal TSP tours
– the prediction made by R from the data F
Global Statistics Model
• Features:
– number of cities in the sample
– maximum X and Y components
– average spread from the X and Y axis centres (the spread of a coordinate from a given axis centre measures “how far” the corresponding component is)
– average distance to nearest neighbours
• Together these express a general signature of the shape of the curve in 2-dimensional space
• Features are computed after normalising the coordinates, with the curve made to fit a 1x1 unit square
Triplets Model
• Accuracy increases for higher orders
(Figure slides: classification accuracy plots for the David Tour, Koch Tour and MPeano instances.)
Observations
• Accuracy over a random coin-flip classification increases as the number of coordinates grows
• As the problem becomes more complex, the proposed system improves its performance over a random solution
Observations
• Simple algorithm structure
• Rule sets were obtained with basic features, in fact the most obvious ones; there are plenty of geometrical features that could still be tried out
• A more complex algorithm coupled with refined predictions could guide the creation of high-performing solutions for the TSP
Observations
• We can correctly classify, using an LCS, a subset of cities into one of four fractal TSP families
– this helps decide which heuristic to use for solving them
• An LCS might be able to learn how to distinguish between parts of an optimal tour
• The GAssist LCS provides insight into why it chooses one edge over another for inclusion in the optimal tour, as its rules are human-readable
Methodologies
In-line operators discovery
Canonical MA cycle
Adapted from Durham,
W.: Coevolution: Genes, Culture and Human Diversity.
Stanford University Press (1991)
Methodologies
On-the-fly operators discovery
Self-Generating/Co-evolving Mas
Adapted from Durham,
W.: Coevolution: Genes, Culture and Human Diversity.
Stanford University Press (1991)
Methodologies
On-the-fly operators discovery
There are various processes that guide an agent’s cultural evolution of local search strategies:
• Inheritance: an agent inherits the meme of the most successful of its parents
• Imitation: an agent imitates a successful non-genetically-related individual
• Innovation: an agent blindly (i.e. randomly) changes its meme
• Mental simulation: an agent purposely (e.g. by hill-climbing) improves its meme
An Example with The MAX-CMO Problem
The goal is to maximise the number of cycles of size 4 without crossing edges.
Random Instance Generator
(Figure: generator parameters: number of vertices, density of random edges, number of regular edges, edge patterns, e.g. 1-7 → 1-7 and 3-7-11 → 3-7-11, number of patterns, and each pattern’s probability.)
A Real World Instance
Chunks & Templates → Patterns & Grammars
(Figure: example chunk/template patterns and the grammars that generate them.)
From Krasnogor & Gustafson chapter
From Smith chapter
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific
Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
So What Are Memetic Algorithms?
MAs are a carefully orchestrated interplay between
(stochastic) global search and (stochastic) local
search algorithms
N. Krasnogor. Handbook of Natural Computation, chapter Memetic Algorithms. Natural
Computing. Springer Berlin / Heidelberg, 2009
• MAs are one of the key methodologies
behind successful discrete/continuous
optimisation
– Nature Inspired Methodology?
– A Research Paradigm
• Key Design Issues underpinning MAs
• A Pattern Language for MAs design
The Canonical MA: at design time lots of issues arise
From Eiben & Smith, “Introduction to Evolutionary Computing”
Design Patterns and Pattern
Languages
In Alexander, C., Ishikawa, S., Silverstein, M., Jacobson, M., Fiksdahl-King, I.,
Angel, S.: A Pattern Language - Towns, Buildings, Construction. Oxford
University Press (1977):
“In this book, we present one possible pattern language,... The elements of this
language are entities called patterns. Each pattern describes a problem which
occurs over and over again in our environment, and then describes the core of
the solution to that problem, in such a way that you can use this solution a
million times over, without ever doing it the same way twice. “
How are Patterns Described?
High level:
• Pattern name
• Problem statement
• The solution
• The consequences
• Examples
A collection of well-defined patterns, i.e. a rich pattern language, substantially enhances our ability to communicate solutions to recurring problems without the need to discuss specific implementation details.
Invariants and Decorations
A Compact “Memetic”
Algorithm by Merz
(2003)
Invariants and Decorations
A “Memetic” Particles
Swarm Optimisation by
Petalas et al (2007)
Invariants and Decorations
A “Memetic” Artificial
Immune System by
Yanga et al (2008)
Invariants and Decorations
A “Memetic” Learning
Classifier System by
Bacardit & Krasnogor
(2009)
Invariants and Decorations
• Many more based on Ant Colony Optimisation, NN, Tabu Search, SA, DE, etc.
• Key invariants:
– Global search mode
– Local search mode
• Many decorations, e.g.:
– Crossover/mutations (EA-based MAs)
– Pheromone updates (ACO-based MAs)
– Clonal selection/hypermutations (AIS-based MAs)
– etc.
Memetic Algorithm Pattern
(MAP)
• Problem: how to successfully orchestrate local and global search
• Solution: for a given domain, find exploration and exploitation mechanisms that work in synergy
• Consequences: increased CPU cost? Loss of diversity?
• Examples: too many to list!
A Pattern Language for Memetic Algorithms
Memetic Algorithms by N. Krasnogor. Handbook of Natural Computation (chapter) in Natural Computing. Springer Berlin / Heidelberg, 2009.
www.cs.nott.ac.uk/~nxk/publications.html
Outline of the Talk
– Evolutionary Algorithms Revisited
– MAs’ Motivation (General Versus Problem Specific
Solvers)
– MAs design issues
– A Few Exemplar Applications
– Related Methods and Advanced Topics
– Putting it all Together
– Conclusions, Q&A
Conclusions (I)
• There is much more to MAs than meets the eye. It is not a simple matter of placing LS somewhere in the EA cycle in an ad hoc fashion.
• Only a small part of the architectural space of MAs has been explored, and we do not yet know why a given architecture performs well or badly on a specific problem (see my PhD thesis).
• People usually use one “silver bullet” LS. That is fine if such an SB exists. When it does not, use multimeme algorithms or other heuristic teams/cooperative algorithms, as lots of simple heuristics can synergistically do the trick.
Conclusions (II)
• ADAPT: the search process is dynamic and your method should detect and adapt to changing circumstances. Adaptation is not too expensive or complex to code!
• Carefully consider how your variation operators interact with LS.
• Consider whether a Baldwinian or Lamarckian scheme is better.
• Understand that the fitness landscape explored by your MA is not a one-operator landscape but the result of the superposition, with interference, of various landscapes.
Conclusions (III)
• Use more expressive acceptance criteria in your local search, e.g. fuzzy criteria.
• If you don’t know what operators to apply, let the MA find them for you by some self-generating mechanism, e.g. co-evolution.
• Self-generating mechanisms are a great niche for GPers!
FINALLY: check out the literature; almost surely you will find MAs among the best success stories in applications to real-world problems!
The Future of MAs
Software Growth
• Software should be “seeded” and grown, very much
like a plant or animal (including humans)
• Software should be embryonic and develop when situated in a production environment
• What would a software “incubation” machine look
like?
• What would a software “germinator” look like?
(Figure: the biological hierarchy DNA/RNA → cells → tissue → organs → individual, as a metaphor for grown software.)
Production Environment
(Figure: a pluripotential solver “DNA” seeds software cells (SC) that grow into a Euclidean TSP organ and a graphical TSP organ; together they form a TSP-solver software organism processing input in the production environment.)
Questions?!?
Thank you:
Professor Dr. Moshe Sipper for inviting me to
give this talk