Dagstuhl Seminar 13271
Theory of Evolutionary Algorithms
(Jun 30 – Jul 05, 2013)
Organizers
- Benjamin Doerr (MPI für Informatik - Saarbrücken, DE)
- Nikolaus Hansen (University of Paris South XI, FR)
- Jonathan L. Shapiro (University of Manchester, GB)
- Darrell Whitley (Colorado State University, US)
Impacts
- A Fixed Budget Analysis of Randomized Search Heuristics for the Traveling Salesperson Problem : article in GECCO '14, Proceedings of the 2014 Conference on Genetic and Evolutionary Computation : pp. 807-814 - Nallaperuma, Samadhi; Neumann, Frank; Sudholt, Dirk - New York : ACM, 2014.
- Analysis of runtime of optimization algorithms for noisy functions over discrete codomains : article : pp. 42-50 - Akimoto, Youhei; Astete-Morales, Sandra; Teytaud, Olivier - Amsterdam : Elsevier, 2015 - (Theoretical computer science : 605. 2015, 9).
- Clustering Problems for More Useful Benchmarking of Optimization Algorithms : article in SEAL 2014, LNCS 8886 : pp. 131-142 - Gallagher, Marcus - Berlin : Springer, 2014 - (Lecture notes in computer science : 8886 ; article).
- Design and Analysis of Adaptive Migration Intervals in Parallel Evolutionary Algorithms : article in GECCO '14, Proceedings of the 2014 Conference on Genetic and Evolutionary Computation : pp. 1047-1054 - Mambrini, Andrea; Sudholt, Dirk - New York : ACM, 2014.
- Expected Fitness Gains of Randomized Search Heuristics for the Traveling Salesperson Problem : article : pp. 673-705 - Nallaperuma, Samadhi; Neumann, Frank; Sudholt, Dirk - Cambridge : MIT Press, 2017 - (Evolutionary computation : 25. 2017, 4).
- Fixed Budget Performance of the (1+1) EA on Linear Functions : article in FOGA '15, Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII : pp. 52-61 - Lengler, Johannes; Spooner, Nick - New York : ACM, 2015.
- Level-Based Analysis of Genetic Algorithms and Other Search Processes : article in LNCS 8672 : pp. 912-921 - Corus, Dogan; Dang, Duc-Cuong; Eremeev, Anton V.; Lehre, Per Kristian - Berlin : Springer, 2014 - (Lecture notes in computer science : 8672 ; article).
- On the Easiest and Hardest Fitness Functions : article : pp. 295-305 - He, Jun; Chen, Tianshi; Yao, Xin - Los Alamitos : IEEE, 2015 - (IEEE transactions on evolutionary computation : 19. 2015, 2).
- Run-Time Analysis of Population-Based Evolutionary Algorithms in Noisy Environments : article in FOGA '15, Proceedings of the 2015 ACM Conference on Foundations of Genetic Algorithms XIII : pp. 69-75 - Prügel-Bennett, Adam; Rowe, Jonathan E.; Shapiro, Jonathan L. - New York : ACM, 2015.
- The Dynamics of Cumulative Step Size Adaptation on the Ellipsoid Model : article : pp. 25-57 - Beyer, Hans-Georg; Hellwig, Michael - Cambridge : MIT Press, 2016 - (Evolutionary computation : 24. 2016, 2).
- Towards improved benchmarking of black-box optimization algorithms using clustering problems : article : 15 pp. - Gallagher, Marcus - Berlin : Springer, 2016 - (Soft Computing ; 2016).
Motivation
Evolutionary algorithms (EAs) are stochastic optimization methods based on principles derived from natural evolution. Mutation, recombination, and selection are iterated with the goal of driving a population of candidate solutions toward better and better regions of the search space. The goal of the theory of evolutionary algorithms is to develop methods that make these algorithms work more effectively, to propose new principles for their design, and to build better insight into and understanding of how they work.
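As a minimal illustration of this mutate-recombine-select loop, the following Python sketch evolves bit strings on the toy OneMax problem (maximize the number of ones). The population size, uniform crossover, and binary tournament selection are illustrative choices only, not a method prescribed by the seminar.

```python
import random

def one_max(x):
    """Toy fitness function: the number of ones in the bit string."""
    return sum(x)

def evolve(n=50, pop_size=20, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]

    def select():
        # Selection: a binary tournament keeps the fitter of two random members.
        a, b = rng.sample(pop, 2)
        return a if one_max(a) >= one_max(b) else b

    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            p1, p2 = select(), select()
            # Recombination: uniform crossover mixes the two parents.
            child = [rng.choice(pair) for pair in zip(p1, p2)]
            # Mutation: flip each bit independently with probability 1/n.
            child = [1 - b if rng.random() < 1.0 / n else b for b in child]
            offspring.append(child)
        pop = offspring
    return max(pop, key=one_max)

best = evolve()
print(one_max(best))  # fitness of the best individual found
```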
We will leave scope for new topics to be brought forward by the participants (many of whom are young researchers who we anticipate will inject new ideas into the field). However, we expect the following key directions to play a central role at this seminar.
We need advances in some of the existing approaches to make further progress. Although there has been great progress in applying classical analysis of randomized algorithms, the existence of a population and of the recombination operator make the analysis difficult, and new methods need to be developed and disseminated. Complexity theory is another example: it is a cornerstone of computer science, and applied to randomized search heuristics it takes the form of black-box complexity. However, as currently defined it gives paradoxical results, such as assigning a low complexity to some notoriously hard problems. New definitions have recently been proposed to fix this, but finding the right definition remains an open problem.
We also expect to explore new methods for designing algorithms. Recently, a new principle for devising evolutionary algorithms was proposed, based on the idea of conducting a natural gradient descent in the space of probability distributions, a sample from which would represent the population in an EA. This gives a principled new way of devising search algorithms, but it also poses a new challenge to theory: to understand these algorithms and establish their properties.
The field needs to advance the complexity of the problems it can tackle. One of the most explosive areas of growth is multi-objective optimization, which requires the simultaneous optimization of several objectives representing the trade-offs between different sources of cost or benefit. Evolutionary algorithms are the method of choice for multi-objective optimization; however, many fundamental questions about their working principles remain open.
Many of the advances in the theory of evolutionary algorithms come from studying realistic algorithms on toy problems; applying the theory to realistic problems is much more difficult. One promising approach is based on landscape analysis: computing features of a search space that can be used to guide search. However, it usually takes exponential time to compute useful metrics exactly. Recent work has shown that some NP-hard problems can be decomposed so that statistical moments of the search space can be computed in polynomial time. Can these be used to guide search and devise more effective search algorithms?
EA theory has gained much momentum over the last few years and has made numerous valuable contributions to the field. Much of this momentum is due to the Dagstuhl Seminars on "Theory of Evolutionary Algorithms", which have become one of the leading meetings for EA theorists in the world. We wish to sustain this momentum, but also to explore new avenues for future research.
Summary
Evolutionary algorithms (EAs) are stochastic optimization methods based on principles derived from natural evolution. Mutation, recombination, and selection are iterated with the goal of driving a population of candidate solutions toward better and better regions of the search space.
In recent years, new methods have been developed at a rapid pace. Some of the advances in continuous optimization have been enabled by focusing on how evolutionary algorithms compare and contrast with more traditional gradient-based methods. Arguably, evolutionary algorithms are among the best methods now available for derivative-free optimization (DFO) on higher-dimensional problems.
Another area of rapid recent advancement is the run-time analysis of evolutionary algorithms applied to discrete optimization problems. Here, some techniques could be successfully borrowed from traditional algorithm analysis, but many new techniques were necessary to understand the more complicated stochastic processes arising from nature-inspired algorithms.
EA theory has gained much momentum over the last few years and has made numerous valuable contributions to the field of evolutionary computation. Much of this momentum is due to the Dagstuhl seminars on "Theory of Evolutionary Algorithms", which have become the leading meeting for EA theorists in the world.
Specific Topics
This year, the following topics received particular attention from the organizers, from the speakers of both overview and specialized talks, and from the participants of the breakout sessions (also called "working parties" or "working groups" in other Dagstuhl seminars). A brief summary of the breakout sessions can be found in Section 4.
Advanced Runtime Analysis Methods. One difficulty common to the analysis of most randomized search heuristics is that, while in principle they are nothing more than randomized algorithms, their particular nature rules out many methods from the classical analysis of randomized algorithms. The particular difficulties include dealing with populations (instead of a single search point, as in other local optimizers) and with recombination (instead of mutation only, which creates a search point close to the parent). Both the fitness level method and several variants of the drift analysis method were greatly improved in the last three years to cope with these difficulties. In addition, the fixed-budget view on runtime analysis was recognized as an alternative way of analyzing the performance of randomized search heuristics, and it may better reflect the performance indicators used by practitioners.
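To make the fixed-budget perspective concrete, here is a small Python sketch (an illustrative setup, not an analysis from the seminar): it runs the (1+1) EA with standard bit mutation on OneMax and reports the best fitness reached after a fixed number of fitness evaluations, rather than the time needed to find the optimum.

```python
import random

def one_max(x):
    """Toy fitness function: the number of ones in the bit string."""
    return sum(x)

def fixed_budget_run(n=100, budget=1000, seed=0):
    """(1+1) EA with standard bit mutation on OneMax, reporting the
    best fitness reached after exactly `budget` fitness evaluations
    (the fixed-budget view), not the time to reach the optimum."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    fx = one_max(x)                      # evaluation 1
    for _ in range(budget - 1):
        # Flip each bit independently with probability 1/n.
        y = [1 - b if rng.random() < 1.0 / n else b for b in x]
        fy = one_max(y)
        if fy >= fx:                     # elitist (1+1) selection
            x, fx = y, fy
    return fx

# Average the fixed-budget fitness over several independent runs.
results = [fixed_budget_run(seed=s) for s in range(10)]
print(sum(results) / len(results))
```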
Complexity Theory for Randomized Search Heuristics. Complexity theory is one of the cornerstones of classical computer science. Informally speaking, the black-box complexity of a problem is the number of fitness evaluations needed to find its solution. Unfortunately, it turns out that some notoriously hard problems, like the clique problem in graphs, have a ridiculously small black-box complexity. In their award-winning GECCO 2010 paper, Lehre and Witt presented a promising way out of this dilemma: they introduced a restricted version of black-box complexity that on the one hand still covers most known evolutionary approaches, but on the other hand forbids the counter-intuitive tricks that led to the undesired results of the first approach. Following up on this work, several variants of black-box complexity have been suggested. During the seminar, in particular during the breakout session on this topic, these were intensively discussed, new variants were proposed from both the theory and the practitioner perspectives, and a new approach was presented explaining how to obtain new and better evolutionary algorithms from black-box complexity results.
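The following Python sketch illustrates the query-counting viewpoint behind black-box complexity (the `BlackBox` wrapper and `random_search` are my illustrative constructions): the algorithm may only learn about the problem through fitness evaluations, and the only cost measured is the number of such queries. Black-box complexity is then the best worst-case query count achievable by any algorithm; the random search below is just one deliberately naive algorithm.

```python
import random

class BlackBox:
    """Wraps a fitness function so that the only observable cost
    is the number of queries (fitness evaluations)."""
    def __init__(self, f):
        self._f = f
        self.queries = 0
    def __call__(self, x):
        self.queries += 1
        return self._f(x)

def random_search(box, n, target, rng):
    """Query uniformly random bit strings until the target fitness
    is seen; return the number of evaluations spent."""
    while True:
        x = tuple(rng.randint(0, 1) for _ in range(n))
        if box(x) == target:
            return box.queries

n = 10
box = BlackBox(sum)  # OneMax on 10 bits: fitness is the number of ones
print(random_search(box, n, target=n, rng=random.Random(42)))
```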
Theory of Natural Evolutionary Algorithms. Recently, the idea of conducting a natural gradient descent in the space of sampling probability distributions has been introduced in evolution strategies. The idea offers a very principled design technique for search algorithms that sample from a parameterized distribution. Comparable to classical deterministic optimization, an iterated gradient descent is performed on the distribution parameters; the remarkable difference is that the curvature of this space is known a priori. A natural gradient descent, based on the inner product induced by the Fisher information matrix, exploits this curvature and is comparable to a Newton method. This new and promising idea is still little known and largely unexploited in evolutionary computation. It is a completely new topic for this seminar series, but it is related to previous work on Covariance Matrix Adaptation.
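As a compact sketch of this idea, the Python code below implements a separable natural evolution strategy in the spirit of SNES; the learning rates and rank-based utilities are common textbook choices, not specifics from the seminar. For a Gaussian search distribution with independent coordinates, premultiplying the vanilla gradient by the inverse Fisher information matrix collapses to the simple coordinate-wise updates used here.

```python
import math, random

def sphere(x):
    """Objective to minimize: a simple quadratic bowl."""
    return sum(v * v for v in x)

def snes(f, dim=5, pop=10, iters=300, seed=0):
    """Separable natural evolution strategy (sketch). The search
    distribution is N(mu, diag(sigma^2)); for this Gaussian family,
    the natural gradient reduces to coordinate-wise expressions."""
    rng = random.Random(seed)
    mu = [rng.uniform(-1, 1) for _ in range(dim)]
    sigma = [1.0] * dim
    eta_mu = 1.0
    eta_sigma = (3 + math.log(dim)) / (5 * math.sqrt(dim))
    for _ in range(iters):
        noise = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(pop)]
        samples = [[m + s * z for m, s, z in zip(mu, sigma, zs)] for zs in noise]
        # Rank-based utilities: the best sample gets the largest weight.
        order = sorted(range(pop), key=lambda k: f(samples[k]))
        util = [0.0] * pop
        for rank, k in enumerate(order):
            util[k] = (pop - 1 - rank) / (pop - 1) - 0.5   # in [-0.5, 0.5]
        # Natural-gradient updates for the mean and the step sizes.
        for i in range(dim):
            g_mu = sum(util[k] * noise[k][i] for k in range(pop))
            g_logsigma = sum(util[k] * (noise[k][i] ** 2 - 1) for k in range(pop))
            mu[i] += eta_mu * sigma[i] * g_mu / pop
            sigma[i] *= math.exp(eta_sigma * g_logsigma / (2 * pop))
    return mu

print(sphere(snes(sphere)))  # typically close to 0
```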
Theory for Multi-Objective Optimization. One of the most explosive areas of growth, both within evolutionary algorithms and in derivative-free optimization, is multi-objective optimization. This is because good evolutionary algorithms now exist that can cover complex Pareto fronts for 2 to 12 objectives. This gives practitioners a much more informative view of the trade-offs that are possible when facing a multi-objective decision, and it can also reveal trade-offs that would otherwise never be seen: for example, when minimizing cost and maximizing quality, there can be "knees" at specific locations on the Pareto front where one might dramatically improve quality while incurring only a slight increase in cost. This is why multi-objective optimization methods that "map" the Pareto front are exciting. Yet there is not a great deal of work on the theory of multi-objective optimization. Evolutionary algorithms are the method of choice for derivative-free multi-objective optimization, and there is a great need to bring together theoreticians who are interested in evolutionary algorithms and practitioners who are developing multi-objective optimization methods. This was another new topic for this seminar series.
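As a minimal illustration (a toy example of mine, not a result from the seminar), the following Python sketch runs a SEMO-style multi-objective EA on the classic bi-objective LOTZ benchmark. The archive keeps only mutually non-dominated points and thus approximates the Pareto front, which for LOTZ consists of the objective vectors (i, n-i).

```python
import random

def lotz(x):
    """Bi-objective LOTZ: maximize (number of leading ones,
    number of trailing zeros)."""
    n = len(x)
    lead = next((i for i, b in enumerate(x) if b == 0), n)
    trail = next((i for i, b in enumerate(reversed(x)) if b == 1), n)
    return (lead, trail)

def dominates(a, b):
    """a Pareto-dominates b (maximization): at least as good everywhere,
    strictly better somewhere."""
    return all(u >= v for u, v in zip(a, b)) and a != b

def semo(n=10, steps=20000, seed=0):
    """SEMO-style algorithm: mutate one bit of a random archive member
    and keep the archive free of dominated points."""
    rng = random.Random(seed)
    x = tuple(rng.randint(0, 1) for _ in range(n))
    archive = {x: lotz(x)}
    for _ in range(steps):
        parent = list(rng.choice(list(archive)))
        i = rng.randrange(n)
        parent[i] = 1 - parent[i]            # one-bit mutation
        child = tuple(parent)
        fc = lotz(child)
        if any(dominates(fv, fc) or fv == fc for fv in archive.values()):
            continue                         # child adds nothing new
        archive = {p: fv for p, fv in archive.items() if not dominates(fc, fv)}
        archive[child] = fc
    return sorted(archive.values())

print(semo())  # ideally the whole front: (0,10), (1,9), ..., (10,0)
```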
Landscape Analysis. Landscape analysis is an old idea: one should be able to compute features of a search space that can be used to guide search. One of the problems is that the kinds of metrics one might wish to know about usually take exponential time to compute exactly. However, recent work has shown that some NP-hard problems (TSP, Graph Coloring, MAXSAT) can be decomposed to the point that Fourier methods can be used to compute statistical moments of the search space (and of subspaces of the search space) exactly in polynomial time; these computations normally require exponential time for arbitrary problems. How can this information be used to guide the search, and potentially to replace heuristics with more exact information? New results in this area opened new opportunities for exploration at this seminar.
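As an elementary illustration of polynomial-time moment computation (using direct clause probabilities rather than the Fourier/Walsh machinery referred to above; the instance is made up), the Python sketch below computes the exact mean and variance of the MAX-SAT fitness, i.e., the number of satisfied clauses, over all 2^n assignments without enumerating them, and verifies the result by brute force on a small instance.

```python
from itertools import product, combinations

# A clause is a tuple of nonzero ints: v means "variable v is true",
# -v means "variable v is false". A hypothetical toy instance:
clauses = [(1, 2, 3), (-1, 2, 4), (1, -3, -4), (-2, 3, 4)]
n = 4

def p_all_false(*cls):
    """Probability, under a uniform random assignment, that every listed
    clause is falsified: all of their literals must be false at once."""
    lits = set().union(*map(set, cls))
    if any(-l in lits for l in lits):        # contradictory requirements
        return 0.0
    return 2.0 ** -len({abs(l) for l in lits})

# Exact moments of the fitness (number of satisfied clauses) over all
# 2^n assignments, in time polynomial in the number of clauses.
mean = sum(1 - p_all_false(c) for c in clauses)
var = sum(p_all_false(c) * (1 - p_all_false(c)) for c in clauses)
for c, d in combinations(clauses, 2):
    p_both = 1 - p_all_false(c) - p_all_false(d) + p_all_false(c, d)
    var += 2 * (p_both - (1 - p_all_false(c)) * (1 - p_all_false(d)))

# Brute-force verification, exponential in n (feasible only because n = 4).
counts = [sum(any((l > 0) == bits[abs(l) - 1] for l in c) for c in clauses)
          for bits in product([False, True], repeat=n)]
bf_mean = sum(counts) / len(counts)
bf_var = sum((c - bf_mean) ** 2 for c in counts) / len(counts)
print(mean, var)        # closed-form moments
print(bf_mean, bf_var)  # matches the closed form
```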
Participants
- Youhei Akimoto (Shinshu University, JP) [dblp]
- Khulood Alyahya (University of Birmingham, GB) [dblp]
- Dirk Arnold (Dalhousie University, CA) [dblp]
- Anne Auger (University of Paris South XI, FR) [dblp]
- Karl Bringmann (MPI für Informatik - Saarbrücken, DE) [dblp]
- Dimo Brockhoff (INRIA - University of Lille 1, FR) [dblp]
- Francisco Chicano (University of Malaga, ES) [dblp]
- Kenneth A. De Jong (George Mason University - Fairfax, US) [dblp]
- Benjamin Doerr (MPI für Informatik - Saarbrücken, DE) [dblp]
- Carola Doerr (University of Paris VII, FR) [dblp]
- Anton V. Eremeev (Sobolev Institute of Mathematics - Omsk, RU) [dblp]
- Tobias Friedrich (Universität Jena, DE) [dblp]
- Marcus Gallagher (The University of Queensland - Brisbane, AU) [dblp]
- Tobias Glasmachers (Ruhr-Universität Bochum, DE) [dblp]
- Nikolaus Hansen (University of Paris South XI, FR) [dblp]
- Michael Hellwig (Fachhochschule Vorarlberg, AT) [dblp]
- Christian Igel (University of Copenhagen, DK) [dblp]
- Thomas Jansen (Aberystwyth University, GB) [dblp]
- Daniel Johannsen (Potsdam, DE) [dblp]
- Joshua D. Knowles (University of Manchester, GB) [dblp]
- Timo Kötzing (Universität Jena, DE) [dblp]
- Marvin Künnemann (MPI für Informatik - Saarbrücken, DE) [dblp]
- William B. Langdon (University College London, GB) [dblp]
- Jörg Lässig (Hochschule Zittau/Görlitz, DE) [dblp]
- Per Kristian Lehre (University of Nottingham, GB) [dblp]
- Johannes Lengler (ETH Zürich, CH) [dblp]
- Andrei Lissovoi (Technical University of Denmark - Lyngby, DK) [dblp]
- Luigi Malagò (University of Milan, IT) [dblp]
- Rachael Morgan (The University of Queensland - Brisbane, AU) [dblp]
- Samadhi Nethmini Nallaperuma (University of Adelaide, AU) [dblp]
- Frank Neumann (University of Adelaide, AU) [dblp]
- Pietro S. Oliveto (University of Birmingham, GB) [dblp]
- Adam Prugel-Bennett (University of Southampton, GB) [dblp]
- Jonathan Rowe (University of Birmingham, GB) [dblp]
- Günter Rudolph (TU Dortmund, DE) [dblp]
- Tom Schaul (New York University, US) [dblp]
- Manuel Schmitt (Universität Erlangen-Nürnberg, DE) [dblp]
- Jonathan L. Shapiro (University of Manchester, GB) [dblp]
- Peter F. Stadler (Universität Leipzig, DE) [dblp]
- Sebastian U. Stich (ETH Zürich, CH) [dblp]
- Dirk Sudholt (University of Sheffield, GB) [dblp]
- Andrew M. Sutton (Colorado State University, US) [dblp]
- Olivier Teytaud (University of Paris South XI, FR) [dblp]
- Lothar Thiele (ETH Zürich, CH) [dblp]
- Darrell Whitley (Colorado State University, US) [dblp]
- Christine Zarges (University of Birmingham, GB) [dblp]
Related Seminars
- Dagstuhl Seminar 00071: Theory of Evolutionary Algorithms (2000-02-13 - 2000-02-18) (Details)
- Dagstuhl Seminar 02031: Theory of Evolutionary Algorithms (2002-01-13 - 2002-01-18) (Details)
- Dagstuhl Seminar 04081: Theory of Evolutionary Algorithms (2004-02-15 - 2004-02-20) (Details)
- Dagstuhl Seminar 06061: Theory of Evolutionary Algorithms (2006-02-05 - 2006-02-10) (Details)
- Dagstuhl Seminar 08051: Theory of Evolutionary Algorithms (2008-01-27 - 2008-02-01) (Details)
- Dagstuhl Seminar 10361: Theory of Evolutionary Algorithms (2010-09-05 - 2010-09-10) (Details)
- Dagstuhl Seminar 15211: Theory of Evolutionary Algorithms (2015-05-17 - 2015-05-22) (Details)
- Dagstuhl Seminar 17191: Theory of Randomized Optimization Heuristics (2017-05-07 - 2017-05-12) (Details)
- Dagstuhl Seminar 19431: Theory of Randomized Optimization Heuristics (2019-10-20 - 2019-10-25) (Details)
- Dagstuhl Seminar 22081: Theory of Randomized Optimization Heuristics (2022-02-20 - 2022-02-25) (Details)
- Dagstuhl Seminar 24271: Theory of Randomized Optimization Heuristics (2024-06-30 - 2024-07-05) (Details)
Classification
- Evolutionary algorithms
- optimization
- search heuristics
- algorithms
Keywords
- Evolutionary algorithms
- bio-inspired search heuristics
- theoretical analysis
- complexity analysis