[go: nahoru, domu]

Jump to content

Edit filter log

Details for log entry 17,162,989

12:26, 28 November 2016: 113.203.206.50 (talk) triggered filter 636, performing the action "edit" on Memetic algorithm. Actions taken: Warn; Filter description: Unexplained removal of sourced content (examine)

Changes made in edit

'''end for'''
'''end for'''
'''end while'''
'''end while'''

===2nd generation===
ite journal|author=Krasnogor N.|title=Coevolution of genes and memes in memetic algorithms|journal=Graduate Student Workshop|year=1999|pages=371}}</ref> [[Hyper-heuristic]]<ref name=kendall2002cfa>{{cite conference|author=Kendall G. and Soubeiga E. and Cowling P.|title=Choice function and random hyperheuristics|conference=4th Asia-Pacific Conference on Simulated Evolution and Learning. SEAL 2002|pages=667–671}}</ref><ref name=burke2013>{{cite journal|author=Burke E. K. and Gendreau M. and Hyde M. and Kendall G. and Ochoa G. and &Ouml;zcan E. and Qu R.|title=Hyper-heuristics: A Survey of the State of the Art|journal=Journal of the Operational Research Society|year=2013|volume=64|pages=1695–1724|issue=12}}</ref>
and Meta-Lamarckian MA<ref name=ong2004mll>{{cite journal|author1=Ong Y. S. |author2=Keane A. J. |lastauthoramp=yes |title=Meta-Lamarckian learning in memetic algorithms|journal=IEEE Transactions on Evolutionary Computation|year=2004|volume=8|pages=99–110|doi=10.1109/TEVC.2003.819944|issue=2}}</ref> are referred to as second generation MA exhibiting the principles of memetic transmission selection in their design. In Multi-meme MA, the memetic material is encoded as part of the [[genotype]]. Subsequently, the decoded meme of each respective individual/[[chromosome]] is then used to perform a local refinement. The memetic material is then transmitted through a simple inheritance mechanism from parent to offspring(s). On the other hand, in hyper-heuristic and meta-Lamarckian MA, the pool
of candidate memes considered will compete, based on their past merits in generating local improvements through a reward mechanism, deciding on which meme to be selected to proceed for future local refinements. Memes with a higher reward have a greater chance of being replicated or copied. For a review on second generation MA; i.e., MA considering multiple individual learning methods within
an evolutionary system, the reader is referred to.<ref name=ong2006cam>{{cite journal|author=Ong Y. S. and Lim M. H. and Zhu N. and Wong K. W.|title=Classification of Adaptive Memetic Algorithms: A Comparative Study|journal=IEEE Transactions on Systems Man and Cybernetics -- Part B.|year=2006|volume=36|pages=141|doi=10.1109/TSMCB.2005.856143|issue=1}}</ref>


===3rd generation===
===3rd generation===

Action parameters

VariableValue
Whether or not the edit is marked as minor (no longer in use) (minor_edit)
false
Name of the user account (user_name)
'113.203.206.50'
Whether or not a user is editing through the mobile interface (user_mobile)
false
user_wpzero
false
Page ID (page_id)
3989208
Page namespace (page_namespace)
0
Page title without namespace (page_title)
'Memetic algorithm'
Full page title (page_prefixedtitle)
'Memetic algorithm'
Action (action)
'edit'
Edit summary/reason (summary)
'/* 2nd generation */ '
Old content model (old_content_model)
'wikitext'
New content model (new_content_model)
'wikitext'
Old page wikitext, before the edit (old_wikitext)
'{{Context|date=February 2011}} '''Memetic algorithms''' (MA) represent one of the recent growing areas of research in [[evolutionary computation]]. The term MA is now widely used as a synergy of evolutionary or any population-based approach with separate individual learning or local improvement procedures for problem search. Quite often, MA are also referred to in the literature as Baldwinian [[evolutionary algorithm]]s (EA), Lamarckian EAs, cultural algorithms, or genetic local search. ==Introduction== Inspired by both Darwinian principles of natural evolution and [[Richard Dawkins|Dawkins']] notion of a [[meme]], the term “Memetic Algorithm” (MA) was introduced by Moscato in his technical report<ref name=martial_arts>{{cite journal|last=Moscato|first=P.|year=1989|title=On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms|journal=Caltech Concurrent Computation Program|volume=|issue=report 826}}</ref> in 1989 where he viewed MA as being close to a form of population-based hybrid [[genetic algorithm]] (GA) coupled with an individual learning procedure capable of performing local refinements. The metaphorical parallels, on the one hand, to Darwinian evolution and, on the other hand, between memes and domain specific (local search) [[heuristic]]s are captured within memetic algorithms thus rendering a methodology that balances well between generality and problem specificity. This two-stage nature makes them a special case of [[Dual-phase evolution]]. In a more diverse context, memetic algorithms are now used under various names including Hybrid Evolutionary Algorithms, Baldwinian Evolutionary Algorithms, Lamarckian Evolutionary Algorithms, Cultural Algorithms, or Genetic Local Search. In the context of complex optimization, many different instantiations of memetic algorithms have been reported across a wide range of [[#Applications|application domains]], in general, converging to high-quality solutions more efficiently than their conventional evolutionary counterparts. In general, using the ideas of memetics within a computational framework is called "Memetic Computing or Memetic Computation" (MC).<ref name=MC2011>{{cite journal|last=Chen|first=X. S. |last2=Ong|first2=Y. S.|last3=Lim|first3=M. H.|last4=Tan|first4=K. C.|year=2011|title=A Multi-Facet Survey on Memetic Computation|journal=[[IEEE Transactions on Evolutionary Computation]]|volume=15|issue=5|pages=591–607|doi=10.1109/tevc.2011.2132725}}</ref><ref name=MC2010>{{cite journal|last=Chen|first=X. S. |last2=Ong|first2=Y. S.|last3=Lim|first3=M. H.|year=2010|title=Research Frontier: Memetic Computation - Past, Present & Future|journal=[[IEEE Computational Intelligence Society#Publications|IEEE Computational Intelligence Magazine]]|volume=5|issue=2|pages=24–36|doi=10.1109/mci.2010.936309}}</ref> With MC, the traits of Universal Darwinism are more appropriately captured. Viewed in this perspective, MA is a more constrained notion of MC. More specifically, MA covers one area of MC, in particular dealing with areas of evolutionary algorithms that marry other deterministic refinement techniques for solving optimization problems. MC extends the notion of memes to cover conceptual entities of knowledge-enhanced procedures or representations. ==The development of MAs== ===1st generation=== The first generation of MA refers to hybrid [[algorithm]]s, a marriage between a population-based global search (often in the form of an evolutionary algorithm) coupled with a cultural evolutionary stage. This first generation of MA although encompasses characteristics of cultural evolution (in the form of local refinement) in the search cycle, it may not qualify as a true evolving system according to [[Universal Darwinism]], since all the core principles of inheritance/memetic transmission, variation, and selection are missing. This suggests why the term MA stirred up criticisms and controversies among researchers when first introduced.<ref name=martial_arts/> ;Pseudo code: '''Procedure''' Memetic Algorithm '''Initialize:''' Generate an initial population; '''while''' Stopping conditions are not satisfied '''do''' ''Evaluate'' all individuals in the population. ''Evolve'' a new population using stochastic search operators. {{nowrap|''Select'' the subset of individuals, <math>\Omega_{il}</math>}}, that should undergo the individual improvement procedure. {{nowrap|'''for''' each individual in <math>\Omega_{il}</math> '''do'''}} ''Perform'' individual learning using meme(s) {{nowrap|with frequency or probability of <math>f_{il}</math>}}, {{nowrap|for a period of <math>t_{il}</math>.}} ''Proceed'' with Lamarckian or Baldwinian learning. '''end for''' '''end while''' ===2nd generation=== ite journal|author=Krasnogor N.|title=Coevolution of genes and memes in memetic algorithms|journal=Graduate Student Workshop|year=1999|pages=371}}</ref> [[Hyper-heuristic]]<ref name=kendall2002cfa>{{cite conference|author=Kendall G. and Soubeiga E. and Cowling P.|title=Choice function and random hyperheuristics|conference=4th Asia-Pacific Conference on Simulated Evolution and Learning. SEAL 2002|pages=667–671}}</ref><ref name=burke2013>{{cite journal|author=Burke E. K. and Gendreau M. and Hyde M. and Kendall G. and Ochoa G. and &Ouml;zcan E. and Qu R.|title=Hyper-heuristics: A Survey of the State of the Art|journal=Journal of the Operational Research Society|year=2013|volume=64|pages=1695–1724|issue=12}}</ref> and Meta-Lamarckian MA<ref name=ong2004mll>{{cite journal|author1=Ong Y. S. |author2=Keane A. J. |lastauthoramp=yes |title=Meta-Lamarckian learning in memetic algorithms|journal=IEEE Transactions on Evolutionary Computation|year=2004|volume=8|pages=99–110|doi=10.1109/TEVC.2003.819944|issue=2}}</ref> are referred to as second generation MA exhibiting the principles of memetic transmission selection in their design. In Multi-meme MA, the memetic material is encoded as part of the [[genotype]]. Subsequently, the decoded meme of each respective individual/[[chromosome]] is then used to perform a local refinement. The memetic material is then transmitted through a simple inheritance mechanism from parent to offspring(s). On the other hand, in hyper-heuristic and meta-Lamarckian MA, the pool of candidate memes considered will compete, based on their past merits in generating local improvements through a reward mechanism, deciding on which meme to be selected to proceed for future local refinements. Memes with a higher reward have a greater chance of being replicated or copied. For a review on second generation MA; i.e., MA considering multiple individual learning methods within an evolutionary system, the reader is referred to.<ref name=ong2006cam>{{cite journal|author=Ong Y. S. and Lim M. H. and Zhu N. and Wong K. W.|title=Classification of Adaptive Memetic Algorithms: A Comparative Study|journal=IEEE Transactions on Systems Man and Cybernetics -- Part B.|year=2006|volume=36|pages=141|doi=10.1109/TSMCB.2005.856143|issue=1}}</ref> ===3rd generation=== Co-evolution<ref name=smith2007cma>{{cite journal|author=Smith J. E.|title=Coevolving Memetic Algorithms: A Review and Progress Report|journal=IEEE Transactions on Systems Man and Cybernetics - Part B|year=2007|volume=37|pages=6–17|doi=10.1109/TSMCB.2006.883273|issue=1}}</ref> and self-generating MAs<ref name=krasnogor2002ttm>{{cite journal|author1=Krasnogor N. |author2=Gustafson S. |lastauthoramp=yes |title=Toward truly "memetic" memetic algorithms: discussion and proof of concepts|journal=Advances in Nature-Inspired Computation: the PPSN VII Workshops. PEDAL (Parallel Emergent and Distributed Architectures Lab). University of Reading|year=2002}}</ref> may be regarded as 3rd generation MA where all three principles satisfying the definitions of a basic evolving system have been considered. In contrast to 2nd generation MA which assumes that the memes to be used are known a priori, 3rd generation MA utilizes a rule-based local search to supplement candidate solutions within the evolutionary system, thus capturing regularly repeated features or patterns in the problem space. ==Some design notes== The frequency and intensity of individual learning directly define the degree of evolution (exploration) against individual learning (exploitation) in the MA search, for a given fixed limited computational budget. Clearly, a more intense individual learning provides greater chance of convergence to the local optima but limits the amount of evolution that may be expended without incurring excessive computational resources. Therefore, care should be taken when setting these two parameters to balance the computational budget available in achieving maximum search performance. When only a portion of the population individuals undergo learning, the issue of which subset of individuals to improve need to be considered to maximize the utility of MA search. Last but not least, the individual learning procedure/meme used also favors a different neighborhood structure, hence the need to decide which meme or memes to use for a given optimization problem at hand would be required. ===How often should individual learning be applied?=== One of the first issues pertinent to memetic algorithm design is to consider how often the individual learning should be applied; i.e., individual learning frequency. In one case,<ref name=hart1994ago>{{cite journal|author=Hart W. E.|title=Adaptive Global Optimization with Local Search|publisher=University of California|year=1994}}</ref> the effect of individual learning frequency on MA search performance was considered where various configurations of the individual learning frequency at different stages of the MA search were investigated. Conversely, it was shown elsewhere<ref name=ku2000sle>{{cite journal|author=Ku K. W. C. and Mak M. W. and Siu W. C.|title=A study of the Lamarckian evolution of recurrent neural networks|journal=IEEE Transactions on Evolutionary Computation|year=2000|volume=4|pages=31–42|doi=10.1109/4235.843493|issue=1}}</ref> that it may be worthwhile to apply individual learning on every individual if the computational complexity of the individual learning is relatively low. ===On which solutions should individual learning be used?=== On the issue of selecting appropriate individuals among the EA population that should undergo individual learning, fitness-based and distribution-based strategies were studied for adapting the probability of applying individual learning on the population of chromosomes in continuous parametric search problems with Land<ref name=land1998eal>{{cite journal|author=Land M. W. S.|title=Evolutionary Algorithms with Local Search for Combinatorial Optimization|publisher=UNIVERSITY OF CALIFORNIA|year=1998}}</ref> extending the work to combinatorial optimization problems. Bambha et al. introduced a simulated heating technique for systematically integrating parameterized individual learning into evolutionary algorithms to achieve maximum solution quality.<ref name=bambha2004sip>{{cite journal|author=Bambha N. K. and Bhattacharyya S. S. and Teich J. and Zitzler E.|title=Systematic integration of parameterized local search into evolutionary algorithms|journal=IEEE Transactions on Evolutionary Computation|year=2004|volume=8|pages=137–155|doi=10.1109/TEVC.2004.823471|issue=2}}</ref> ===How long should individual learning be run?=== Individual learning intensity, <math>t_{il}</math>, is the amount of computational budget allocated to an iteration of individual learning; i.e., the maximum computational budget allowable for individual learning to expend on improving a single solution. ===What individual learning method or meme should be used for a particular problem or individual?=== In the context of continuous optimization, individual learning exists in the form of local heuristics or conventional exact enumerative methods.<ref name=schwefel1995eao>{{cite book|title=Evolution and optimum seeking|publisher=Wiley New York|year=1995|author=Schwefel H. P.}}</ref> Examples of individual learning strategies include the hill climbing, Simplex method, Newton/Quasi-Newton method, interior point methods, conjugate gradient method, line search, and other local heuristics. Note that most of common individual learninger are deterministic. In combinatorial optimization, on the other hand, individual learning methods commonly exist in the form of heuristics (which can be deterministic or stochastic) that are tailored to a specific problem of interest. Typical heuristic procedures and schemes include the k-gene exchange, edge exchange, first-improvement, and many others. ==Applications== Memetic algorithms have been successfully applied to a multitude of real-world problems. Although many people employ techniques closely related to memetic algorithms, alternative names such as ''hybrid genetic algorithms'' are also employed. Furthermore, many people term their memetic techniques as ''genetic algorithms''.{{Citation needed|date=September 2014}} Researchers have used memetic algorithms to tackle many classical [[NP (complexity)|NP]] problems. To cite some of them: [[graph partition]]ing, [[knapsack problem|multidimensional knapsack]], [[travelling salesman problem]], [[quadratic assignment problem]], [[set cover problem]], [[graph coloring#Algorithms|minimal graph coloring]], [[independent set problem|max independent set problem]], [[bin packing problem]], and [[Generalized Assignment Problem|generalized assignment problem]]. More recent applications include (but are not limited to) training of [[artificial neural network]]s,<ref name=training_ANN>{{cite conference |author1=Ichimura, T. |author2=Kuriyama, Y. |title=Learning of neural networks with parallel hybrid GA using a royal road function|conference=IEEE International Joint Conference on Neural Networks|volume=2|pages=1131–1136|year=1998|location=New York, NY }}</ref> [[pattern recognition]],<ref name=pattern_recognition>{{cite journal|author1=Aguilar, J. |author2=Colmenares, A. |year=1998|title=Resolution of pattern recognition problems using a hybrid genetic/random neural network learning algorithm|journal=Pattern Analysis and Applications|volume=1|issue=1|pages=52–61|doi=10.1007/BF01238026}}</ref> robotic [[motion planning]],<ref name=motion_planning>{{cite journal|author1=Ridao, M. |author2=Riquelme, J. |author3=Camacho, E. |author4=Toro, M. | year=1998|title=An evolutionary and local search algorithm for planning two manipulators motion|journal=Lecture Notes in Computer Science|volume=1416| pages=105–114|publisher=Springer-Verlag|doi=10.1007/3-540-64574-8_396|series=Lecture Notes in Computer Science|isbn=3-540-64574-8}}</ref> [[charged particle beam|beam]] orientation,<ref name=beam_orientation>{{cite journal|author1=Haas, O. |author2=Burnham, K. |author3=Mills, J. |year=1998|title=Optimization of beam orientation in radiotherapy using planar geometry|journal=Physics in Medicine and Biology|volume=43|issue=8|pages=2179–2193|doi=10.1088/0031-9155/43/8/013|pmid=9725597}}</ref> [[circuit design]],<ref name=circuit_design>{{cite journal|author1=Harris, S. |author2=Ifeachor, E. |year=1998|title=Automatic design of frequency sampling filters by hybrid genetic algorithm techniques|journal=IEEE Transactions on Signal Processing|volume=46|issue=12|pages=3304–3314|doi=10.1109/78.735305}}</ref> electric service restoration,<ref name=service_restoration>{{cite journal|author1=Augugliaro, A. |author2=Dusonchet, L. |author3=Riva-Sanseverino, E. |year=1998|title=Service restoration in compensated distribution networks using a hybrid genetic algorithm|journal=Electric Power Systems Research|volume=46|issue=1|pages=59–66|doi=10.1016/S0378-7796(98)00025-X}}</ref> medical [[expert system]]s,<ref name=medical_expert_system>{{cite journal|author1=Wehrens, R. |author2=Lucasius, C. |author3=Buydens, L. |author4=Kateman, G. |year=1993|title=HIPS, A hybrid self-adapting expert system for nuclear magnetic resonance spectrum interpretation using genetic algorithms|journal=Analytica Chimica Acta|volume=277|issue=2|pages=313–324|doi=10.1016/0003-2670(93)80444-P}}</ref> [[single machine scheduling]],<ref name=single_machine_sched>{{cite conference|author1=França, P. |author2=Mendes, A. |author3=Moscato, P. |title=Memetic algorithms to minimize tardiness on a single machine with sequence-dependent setup times|conference=Proceedings of the 5th International Conference of the Decision Sciences Institute|pages=1708–1710|year=1999|location=Athens, Greece}}</ref> automatic timetabling (notably, the timetable for the [[NHL]]),<ref name=nhl_timetabling>{{cite journal|author=Costa, D.|title=An evolutionary tabu search algorithm and the NHL scheduling problem|journal=Infor 33|year=1995|pages=161–178}}</ref> [[Schedule (workplace)|manpower scheduling]],<ref name=nurse_rostering>{{cite conference|author=Aickelin, U.|title=Nurse rostering with genetic algorithms|conference=Proceedings of young operational research conference 1998|year=1998|location=Guildford, UK}}</ref> [[nurse rostering problem|nurse rostering optimisation]],<ref name=nurse_rostering_function_opt>{{cite journal| author = Ozcan, E.|year=2007|title=Memes, Self-generation and Nurse Rostering|journal=Lecture Notes in Computer Science|volume=3867|pages=85–104|publisher=Springer-Verlag|doi=10.1007/978-3-540-77345-0_6|series=Lecture Notes in Computer Science|isbn=978-3-540-77344-3}}</ref> [[processor allocation]],<ref name=proc_alloc>{{cite journal|author1=Ozcan, E. |author2=Onbasioglu, E. |year=2006|title=Memetic Algorithms for Parallel Code Optimization|journal=International Journal of Parallel Programming|volume=35|issue=1|pages=33–61|doi=10.1007/s10766-006-0026-x}}</ref> maintenance scheduling (for example, of an electric distribution network),<ref name=planned_maintenance>{{cite journal|author1=Burke, E. |author2=Smith, A. |year=1999|title=A memetic algorithm to schedule planned maintenance for the national grid| journal=Journal of Experimental Algorithmics|issue=4|pages=1–13|doi=10.1145/347792.347801|volume=4}}</ref> multidimensional knapsack problem,<ref name=mkp_ma>{{cite journal|author1=Ozcan, E. |author2=Basaran, C. |year=2009|title=A Case Study of Memetic Algorithms for Constraint Optimization|journal=Soft Computing: A Fusion of Foundations, Methodologies and Applications|volume=13|issue=8–9|pages=871–882|doi=10.1007/s00500-008-0354-4}}</ref> [[VLSI]] design,<ref name=vlsi_design>{{cite journal|author=Areibi, S., Yang, Z.|year=2004|title=Effective memetic algorithms for VLSI design automation = genetic algorithms + local search + multi-level clustering|journal=Evolutionary Computation|volume=12|issue=3|pages=327–353|publisher=MIT Press|doi=10.1162/1063656041774947|pmid=15355604}}</ref> [[cluster analysis|clustering]] of [[expression profiling|gene expression profiles]],<ref name=clustering_gene_expression >{{cite book|author1=Merz, P. |author2=Zell, A. |title = Parallel Problem Solving from Nature — PPSN VII|year=2002|publisher=[[Springer Science+Business Media|Springer]]|doi=10.1007/3-540-45712-7_78|pages=811–820| chapter=Clustering Gene Expression Profiles with Memetic Algorithms}}</ref> feature/gene selection,<ref name=gene_selection1>{{cite journal|author=Zexuan Zhu, Y. S. Ong and M. Dash|title=Markov Blanket-Embedded Genetic Algorithm for Gene Selection|year=2007|journal=Pattern Recognition|volume=49|issue=11|pages=3236–3248|doi=10.1016/j.patcog.2007.02.007}}</ref><ref name=gene_selection2>{{cite journal|author=Zexuan Zhu, Y. S. Ong and M. Dash|title=Wrapper-Filter Feature Selection Algorithm Using A Memetic Framework|year=2007|journal=IEEE Transactions on Systems, Man and Cybernetics - Part B|volume=37|issue=1|pages=70–76|doi=10.1109/TSMCB.2006.883267}}</ref> and multi-class, multi-objective [[feature selection]].<ref name=feature_selection>{{cite journal|author=Zexuan Zhu, Y. S. Ong and M. Zurada|title=Simultaneous Identification of Full Class Relevant and Partial Class Relevant Genes|year=2008|journal=IEEE/ACM Transactions on Computational Biology and Bioinformatics}}</ref><ref name=feature_selection2>{{cite journal|author1=G. Karkavitsas |author2=G. Tsihrintzis |lastauthoramp=yes |title=Automatic Music Genre Classification Using Hybrid Genetic Algorithms|year=2011|journal=Intelligent Interactive Multimedia Systems and Services|volume=11|pages=323–335|publisher=Springer|doi=10.1007/978-3-642-22158-3_32}}</ref> ==Recent Activities in Memetic Algorithms== *IEEE Workshop on Memetic Algorithms (WOMA 2009). Program Chairs: Jim Smith, University of the West of England, U.K.; Yew-Soon Ong, Nanyang Technological University, Singapore; Gustafson Steven, University of Nottingham; U.K.; Meng Hiot Lim, Nanyang Technological University, Singapore; Natalio Krasnogor, University of Nottingham, U.K. *[http://www.springer.com/journal/12293 Memetic Computing Journal], first issue appeared in January 2009. *[http://www.wcci2008.org/ 2008 IEEE World Congress on Computational Intelligence (WCCI 2008)], Hong Kong, [http://users.jyu.fi/~neferran/MA2008/MA2008.htm Special Session on Memetic Algorithms]. *[http://www.ntu.edu.sg/home/asysong/SC/Special-Issue-MA.htm Special Issue on 'Emerging Trends in Soft Computing - Memetic Algorithm'], Soft Computing Journal, Completed & In Press, 2008. *[http://www.ntu.edu.sg/home/asysong/ETTC/ETTC%20Task%20Force%20-%20Memetic%20Computing.htm IEEE Computational Intelligence Society Emergent Technologies Task Force on Memetic Computing] * [http://cec2007.nus.edu.sg/ IEEE Congress on Evolutionary Computation (CEC 2007)], Singapore, [http://ntu-cg.ntu.edu.sg/ysong/MA-SS/MA.htm Special Session on Memetic Algorithms]. * [http://www.esi-topics.com/erf/2007/august07-Ong_Keane.html 'Memetic Computing'] by Thomson Scientific's Essential Science Indicators as an Emerging Front Research Area. * [http://ieeexplore.ieee.org/Xplore/login.jsp?url=/iel5/3477/4067063/04067075.pdf?tp=&isnumber=&arnumber=4067075 Special Issue on Memetic Algorithms], IEEE Transactions on Systems, Man and Cybernetics - Part B, Vol. 37, No. 1, February 2007. * [http://www.springeronline.com/sgw/cda/frontpage/0,11855,5-40356-72-34233226-0,00.html Recent Advances in Memetic Algorithms], Series: Studies in Fuzziness and Soft Computing, Vol. 166, ISBN 978-3-540-22904-9, 2005. * [http://www.mitpressjournals.org/doi/abs/10.1162/1063656041775009?prevSearch=allfield%3A%28memetic+algorithm%29 Special Issue on Memetic Algorithms], Evolutionary Computation Fall 2004, Vol. 12, No. 3: v-vi. ==References== {{reflist|30em}} [[Category:Evolutionary algorithms]]'
New page wikitext, after the edit (new_wikitext)
'{{Context|date=February 2011}} '''Memetic algorithms''' (MA) represent one of the recent growing areas of research in [[evolutionary computation]]. The term MA is now widely used as a synergy of evolutionary or any population-based approach with separate individual learning or local improvement procedures for problem search. Quite often, MA are also referred to in the literature as Baldwinian [[evolutionary algorithm]]s (EA), Lamarckian EAs, cultural algorithms, or genetic local search. ==Introduction== Inspired by both Darwinian principles of natural evolution and [[Richard Dawkins|Dawkins']] notion of a [[meme]], the term “Memetic Algorithm” (MA) was introduced by Moscato in his technical report<ref name=martial_arts>{{cite journal|last=Moscato|first=P.|year=1989|title=On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms|journal=Caltech Concurrent Computation Program|volume=|issue=report 826}}</ref> in 1989 where he viewed MA as being close to a form of population-based hybrid [[genetic algorithm]] (GA) coupled with an individual learning procedure capable of performing local refinements. The metaphorical parallels, on the one hand, to Darwinian evolution and, on the other hand, between memes and domain specific (local search) [[heuristic]]s are captured within memetic algorithms thus rendering a methodology that balances well between generality and problem specificity. This two-stage nature makes them a special case of [[Dual-phase evolution]]. In a more diverse context, memetic algorithms are now used under various names including Hybrid Evolutionary Algorithms, Baldwinian Evolutionary Algorithms, Lamarckian Evolutionary Algorithms, Cultural Algorithms, or Genetic Local Search. In the context of complex optimization, many different instantiations of memetic algorithms have been reported across a wide range of [[#Applications|application domains]], in general, converging to high-quality solutions more efficiently than their conventional evolutionary counterparts. In general, using the ideas of memetics within a computational framework is called "Memetic Computing or Memetic Computation" (MC).<ref name=MC2011>{{cite journal|last=Chen|first=X. S. |last2=Ong|first2=Y. S.|last3=Lim|first3=M. H.|last4=Tan|first4=K. C.|year=2011|title=A Multi-Facet Survey on Memetic Computation|journal=[[IEEE Transactions on Evolutionary Computation]]|volume=15|issue=5|pages=591–607|doi=10.1109/tevc.2011.2132725}}</ref><ref name=MC2010>{{cite journal|last=Chen|first=X. S. |last2=Ong|first2=Y. S.|last3=Lim|first3=M. H.|year=2010|title=Research Frontier: Memetic Computation - Past, Present & Future|journal=[[IEEE Computational Intelligence Society#Publications|IEEE Computational Intelligence Magazine]]|volume=5|issue=2|pages=24–36|doi=10.1109/mci.2010.936309}}</ref> With MC, the traits of Universal Darwinism are more appropriately captured. Viewed in this perspective, MA is a more constrained notion of MC. More specifically, MA covers one area of MC, in particular dealing with areas of evolutionary algorithms that marry other deterministic refinement techniques for solving optimization problems. MC extends the notion of memes to cover conceptual entities of knowledge-enhanced procedures or representations. ==The development of MAs== ===1st generation=== The first generation of MA refers to hybrid [[algorithm]]s, a marriage between a population-based global search (often in the form of an evolutionary algorithm) coupled with a cultural evolutionary stage. This first generation of MA although encompasses characteristics of cultural evolution (in the form of local refinement) in the search cycle, it may not qualify as a true evolving system according to [[Universal Darwinism]], since all the core principles of inheritance/memetic transmission, variation, and selection are missing. This suggests why the term MA stirred up criticisms and controversies among researchers when first introduced.<ref name=martial_arts/> ;Pseudo code: '''Procedure''' Memetic Algorithm '''Initialize:''' Generate an initial population; '''while''' Stopping conditions are not satisfied '''do''' ''Evaluate'' all individuals in the population. ''Evolve'' a new population using stochastic search operators. {{nowrap|''Select'' the subset of individuals, <math>\Omega_{il}</math>}}, that should undergo the individual improvement procedure. {{nowrap|'''for''' each individual in <math>\Omega_{il}</math> '''do'''}} ''Perform'' individual learning using meme(s) {{nowrap|with frequency or probability of <math>f_{il}</math>}}, {{nowrap|for a period of <math>t_{il}</math>.}} ''Proceed'' with Lamarckian or Baldwinian learning. '''end for''' '''end while''' ===3rd generation=== Co-evolution<ref name=smith2007cma>{{cite journal|author=Smith J. E.|title=Coevolving Memetic Algorithms: A Review and Progress Report|journal=IEEE Transactions on Systems Man and Cybernetics - Part B|year=2007|volume=37|pages=6–17|doi=10.1109/TSMCB.2006.883273|issue=1}}</ref> and self-generating MAs<ref name=krasnogor2002ttm>{{cite journal|author1=Krasnogor N. |author2=Gustafson S. |lastauthoramp=yes |title=Toward truly "memetic" memetic algorithms: discussion and proof of concepts|journal=Advances in Nature-Inspired Computation: the PPSN VII Workshops. PEDAL (Parallel Emergent and Distributed Architectures Lab). University of Reading|year=2002}}</ref> may be regarded as 3rd generation MA where all three principles satisfying the definitions of a basic evolving system have been considered. In contrast to 2nd generation MA which assumes that the memes to be used are known a priori, 3rd generation MA utilizes a rule-based local search to supplement candidate solutions within the evolutionary system, thus capturing regularly repeated features or patterns in the problem space. ==Some design notes== The frequency and intensity of individual learning directly define the degree of evolution (exploration) against individual learning (exploitation) in the MA search, for a given fixed limited computational budget. Clearly, a more intense individual learning provides greater chance of convergence to the local optima but limits the amount of evolution that may be expended without incurring excessive computational resources. Therefore, care should be taken when setting these two parameters to balance the computational budget available in achieving maximum search performance. When only a portion of the population individuals undergo learning, the issue of which subset of individuals to improve need to be considered to maximize the utility of MA search. Last but not least, the individual learning procedure/meme used also favors a different neighborhood structure, hence the need to decide which meme or memes to use for a given optimization problem at hand would be required. ===How often should individual learning be applied?=== One of the first issues pertinent to memetic algorithm design is to consider how often the individual learning should be applied; i.e., individual learning frequency. In one case,<ref name=hart1994ago>{{cite journal|author=Hart W. E.|title=Adaptive Global Optimization with Local Search|publisher=University of California|year=1994}}</ref> the effect of individual learning frequency on MA search performance was considered where various configurations of the individual learning frequency at different stages of the MA search were investigated. Conversely, it was shown elsewhere<ref name=ku2000sle>{{cite journal|author=Ku K. W. C. and Mak M. W. and Siu W. C.|title=A study of the Lamarckian evolution of recurrent neural networks|journal=IEEE Transactions on Evolutionary Computation|year=2000|volume=4|pages=31–42|doi=10.1109/4235.843493|issue=1}}</ref> that it may be worthwhile to apply individual learning on every individual if the computational complexity of the individual learning is relatively low. ===On which solutions should individual learning be used?=== On the issue of selecting appropriate individuals among the EA population that should undergo individual learning, fitness-based and distribution-based strategies were studied for adapting the probability of applying individual learning on the population of chromosomes in continuous parametric search problems with Land<ref name=land1998eal>{{cite journal|author=Land M. W. S.|title=Evolutionary Algorithms with Local Search for Combinatorial Optimization|publisher=UNIVERSITY OF CALIFORNIA|year=1998}}</ref> extending the work to combinatorial optimization problems. Bambha et al. introduced a simulated heating technique for systematically integrating parameterized individual learning into evolutionary algorithms to achieve maximum solution quality.<ref name=bambha2004sip>{{cite journal|author=Bambha N. K. and Bhattacharyya S. S. and Teich J. and Zitzler E.|title=Systematic integration of parameterized local search into evolutionary algorithms|journal=IEEE Transactions on Evolutionary Computation|year=2004|volume=8|pages=137–155|doi=10.1109/TEVC.2004.823471|issue=2}}</ref> ===How long should individual learning be run?=== Individual learning intensity, <math>t_{il}</math>, is the amount of computational budget allocated to an iteration of individual learning; i.e., the maximum computational budget allowable for individual learning to expend on improving a single solution. ===What individual learning method or meme should be used for a particular problem or individual?=== In the context of continuous optimization, individual learning exists in the form of local heuristics or conventional exact enumerative methods.<ref name=schwefel1995eao>{{cite book|title=Evolution and optimum seeking|publisher=Wiley New York|year=1995|author=Schwefel H. P.}}</ref> Examples of individual learning strategies include the hill climbing, Simplex method, Newton/Quasi-Newton method, interior point methods, conjugate gradient method, line search, and other local heuristics. Note that most of common individual learninger are deterministic. In combinatorial optimization, on the other hand, individual learning methods commonly exist in the form of heuristics (which can be deterministic or stochastic) that are tailored to a specific problem of interest. Typical heuristic procedures and schemes include the k-gene exchange, edge exchange, first-improvement, and many others. ==Applications== Memetic algorithms have been successfully applied to a multitude of real-world problems. Although many people employ techniques closely related to memetic algorithms, alternative names such as ''hybrid genetic algorithms'' are also employed. Furthermore, many people term their memetic techniques as ''genetic algorithms''.{{Citation needed|date=September 2014}} Researchers have used memetic algorithms to tackle many classical [[NP (complexity)|NP]] problems. To cite some of them: [[graph partition]]ing, [[knapsack problem|multidimensional knapsack]], [[travelling salesman problem]], [[quadratic assignment problem]], [[set cover problem]], [[graph coloring#Algorithms|minimal graph coloring]], [[independent set problem|max independent set problem]], [[bin packing problem]], and [[Generalized Assignment Problem|generalized assignment problem]]. More recent applications include (but are not limited to) training of [[artificial neural network]]s,<ref name=training_ANN>{{cite conference |author1=Ichimura, T. |author2=Kuriyama, Y. |title=Learning of neural networks with parallel hybrid GA using a royal road function|conference=IEEE International Joint Conference on Neural Networks|volume=2|pages=1131–1136|year=1998|location=New York, NY }}</ref> [[pattern recognition]],<ref name=pattern_recognition>{{cite journal|author1=Aguilar, J. |author2=Colmenares, A. |year=1998|title=Resolution of pattern recognition problems using a hybrid genetic/random neural network learning algorithm|journal=Pattern Analysis and Applications|volume=1|issue=1|pages=52–61|doi=10.1007/BF01238026}}</ref> robotic [[motion planning]],<ref name=motion_planning>{{cite journal|author1=Ridao, M. |author2=Riquelme, J. |author3=Camacho, E. |author4=Toro, M. | year=1998|title=An evolutionary and local search algorithm for planning two manipulators motion|journal=Lecture Notes in Computer Science|volume=1416| pages=105–114|publisher=Springer-Verlag|doi=10.1007/3-540-64574-8_396|series=Lecture Notes in Computer Science|isbn=3-540-64574-8}}</ref> [[charged particle beam|beam]] orientation,<ref name=beam_orientation>{{cite journal|author1=Haas, O. |author2=Burnham, K. |author3=Mills, J. |year=1998|title=Optimization of beam orientation in radiotherapy using planar geometry|journal=Physics in Medicine and Biology|volume=43|issue=8|pages=2179–2193|doi=10.1088/0031-9155/43/8/013|pmid=9725597}}</ref> [[circuit design]],<ref name=circuit_design>{{cite journal|author1=Harris, S. |author2=Ifeachor, E. |year=1998|title=Automatic design of frequency sampling filters by hybrid genetic algorithm techniques|journal=IEEE Transactions on Signal Processing|volume=46|issue=12|pages=3304–3314|doi=10.1109/78.735305}}</ref> electric service restoration,<ref name=service_restoration>{{cite journal|author1=Augugliaro, A. |author2=Dusonchet, L. |author3=Riva-Sanseverino, E. |year=1998|title=Service restoration in compensated distribution networks using a hybrid genetic algorithm|journal=Electric Power Systems Research|volume=46|issue=1|pages=59–66|doi=10.1016/S0378-7796(98)00025-X}}</ref> medical [[expert system]]s,<ref name=medical_expert_system>{{cite journal|author1=Wehrens, R. |author2=Lucasius, C. |author3=Buydens, L. |author4=Kateman, G. |year=1993|title=HIPS, A hybrid self-adapting expert system for nuclear magnetic resonance spectrum interpretation using genetic algorithms|journal=Analytica Chimica Acta|volume=277|issue=2|pages=313–324|doi=10.1016/0003-2670(93)80444-P}}</ref> [[single machine scheduling]],<ref name=single_machine_sched>{{cite conference|author1=França, P. |author2=Mendes, A. |author3=Moscato, P. |title=Memetic algorithms to minimize tardiness on a single machine with sequence-dependent setup times|conference=Proceedings of the 5th International Conference of the Decision Sciences Institute|pages=1708–1710|year=1999|location=Athens, Greece}}</ref> automatic timetabling (notably, the timetable for the [[NHL]]),<ref name=nhl_timetabling>{{cite journal|author=Costa, D.|title=An evolutionary tabu search algorithm and the NHL scheduling problem|journal=Infor 33|year=1995|pages=161–178}}</ref> [[Schedule (workplace)|manpower scheduling]],<ref name=nurse_rostering>{{cite conference|author=Aickelin, U.|title=Nurse rostering with genetic algorithms|conference=Proceedings of young operational research conference 1998|year=1998|location=Guildford, UK}}</ref> [[nurse rostering problem|nurse rostering optimisation]],<ref name=nurse_rostering_function_opt>{{cite journal| author = Ozcan, E.|year=2007|title=Memes, Self-generation and Nurse Rostering|journal=Lecture Notes in Computer Science|volume=3867|pages=85–104|publisher=Springer-Verlag|doi=10.1007/978-3-540-77345-0_6|series=Lecture Notes in Computer Science|isbn=978-3-540-77344-3}}</ref> [[processor allocation]],<ref name=proc_alloc>{{cite journal|author1=Ozcan, E. |author2=Onbasioglu, E. |year=2006|title=Memetic Algorithms for Parallel Code Optimization|journal=International Journal of Parallel Programming|volume=35|issue=1|pages=33–61|doi=10.1007/s10766-006-0026-x}}</ref> maintenance scheduling (for example, of an electric distribution network),<ref name=planned_maintenance>{{cite journal|author1=Burke, E. |author2=Smith, A. |year=1999|title=A memetic algorithm to schedule planned maintenance for the national grid| journal=Journal of Experimental Algorithmics|issue=4|pages=1–13|doi=10.1145/347792.347801|volume=4}}</ref> multidimensional knapsack problem,<ref name=mkp_ma>{{cite journal|author1=Ozcan, E. |author2=Basaran, C. |year=2009|title=A Case Study of Memetic Algorithms for Constraint Optimization|journal=Soft Computing: A Fusion of Foundations, Methodologies and Applications|volume=13|issue=8–9|pages=871–882|doi=10.1007/s00500-008-0354-4}}</ref> [[VLSI]] design,<ref name=vlsi_design>{{cite journal|author=Areibi, S., Yang, Z.|year=2004|title=Effective memetic algorithms for VLSI design automation = genetic algorithms + local search + multi-level clustering|journal=Evolutionary Computation|volume=12|issue=3|pages=327–353|publisher=MIT Press|doi=10.1162/1063656041774947|pmid=15355604}}</ref> [[cluster analysis|clustering]] of [[expression profiling|gene expression profiles]],<ref name=clustering_gene_expression >{{cite book|author1=Merz, P. |author2=Zell, A. |title = Parallel Problem Solving from Nature — PPSN VII|year=2002|publisher=[[Springer Science+Business Media|Springer]]|doi=10.1007/3-540-45712-7_78|pages=811–820| chapter=Clustering Gene Expression Profiles with Memetic Algorithms}}</ref> feature/gene selection,<ref name=gene_selection1>{{cite journal|author=Zexuan Zhu, Y. S. Ong and M. Dash|title=Markov Blanket-Embedded Genetic Algorithm for Gene Selection|year=2007|journal=Pattern Recognition|volume=49|issue=11|pages=3236–3248|doi=10.1016/j.patcog.2007.02.007}}</ref><ref name=gene_selection2>{{cite journal|author=Zexuan Zhu, Y. S. Ong and M. Dash|title=Wrapper-Filter Feature Selection Algorithm Using A Memetic Framework|year=2007|journal=IEEE Transactions on Systems, Man and Cybernetics - Part B|volume=37|issue=1|pages=70–76|doi=10.1109/TSMCB.2006.883267}}</ref> and multi-class, multi-objective [[feature selection]].<ref name=feature_selection>{{cite journal|author=Zexuan Zhu, Y. S. Ong and M. Zurada|title=Simultaneous Identification of Full Class Relevant and Partial Class Relevant Genes|year=2008|journal=IEEE/ACM Transactions on Computational Biology and Bioinformatics}}</ref><ref name=feature_selection2>{{cite journal|author1=G. Karkavitsas |author2=G. Tsihrintzis |lastauthoramp=yes |title=Automatic Music Genre Classification Using Hybrid Genetic Algorithms|year=2011|journal=Intelligent Interactive Multimedia Systems and Services|volume=11|pages=323–335|publisher=Springer|doi=10.1007/978-3-642-22158-3_32}}</ref> ==Recent Activities in Memetic Algorithms== *IEEE Workshop on Memetic Algorithms (WOMA 2009). Program Chairs: Jim Smith, University of the West of England, U.K.; Yew-Soon Ong, Nanyang Technological University, Singapore; Gustafson Steven, University of Nottingham; U.K.; Meng Hiot Lim, Nanyang Technological University, Singapore; Natalio Krasnogor, University of Nottingham, U.K. *[http://www.springer.com/journal/12293 Memetic Computing Journal], first issue appeared in January 2009. *[http://www.wcci2008.org/ 2008 IEEE World Congress on Computational Intelligence (WCCI 2008)], Hong Kong, [http://users.jyu.fi/~neferran/MA2008/MA2008.htm Special Session on Memetic Algorithms]. *[http://www.ntu.edu.sg/home/asysong/SC/Special-Issue-MA.htm Special Issue on 'Emerging Trends in Soft Computing - Memetic Algorithm'], Soft Computing Journal, Completed & In Press, 2008. *[http://www.ntu.edu.sg/home/asysong/ETTC/ETTC%20Task%20Force%20-%20Memetic%20Computing.htm IEEE Computational Intelligence Society Emergent Technologies Task Force on Memetic Computing] * [http://cec2007.nus.edu.sg/ IEEE Congress on Evolutionary Computation (CEC 2007)], Singapore, [http://ntu-cg.ntu.edu.sg/ysong/MA-SS/MA.htm Special Session on Memetic Algorithms]. * [http://www.esi-topics.com/erf/2007/august07-Ong_Keane.html 'Memetic Computing'] by Thomson Scientific's Essential Science Indicators as an Emerging Front Research Area. * [http://ieeexplore.ieee.org/Xplore/login.jsp?url=/iel5/3477/4067063/04067075.pdf?tp=&isnumber=&arnumber=4067075 Special Issue on Memetic Algorithms], IEEE Transactions on Systems, Man and Cybernetics - Part B, Vol. 37, No. 1, February 2007. * [http://www.springeronline.com/sgw/cda/frontpage/0,11855,5-40356-72-34233226-0,00.html Recent Advances in Memetic Algorithms], Series: Studies in Fuzziness and Soft Computing, Vol. 166, ISBN 978-3-540-22904-9, 2005. * [http://www.mitpressjournals.org/doi/abs/10.1162/1063656041775009?prevSearch=allfield%3A%28memetic+algorithm%29 Special Issue on Memetic Algorithms], Evolutionary Computation Fall 2004, Vol. 12, No. 3: v-vi. ==References== {{reflist|30em}} [[Category:Evolutionary algorithms]]'
Whether or not the change was made through a Tor exit node (tor_exit_node)
0
Unix timestamp of change (timestamp)
1480335984