Dagstuhl Seminar 19431
Theory of Randomized Optimization Heuristics
(October 20 – 25, 2019)
Organizers
- Carola Doerr (Sorbonne University - Paris, FR)
- Carlos M. Fonseca (University of Coimbra, PT)
- Tobias Friedrich (Hasso-Plattner-Institut - Potsdam, DE)
- Xin Yao (Southern Univ. of Science and Technology - Shenzhen, CN)
Contact
- Michael Gerke (for scientific questions)
- Susanne Bach-Bernhard (for administrative questions)
Impacts
- A tight lower bound on the expected runtime of standard steady state genetic algorithms : article in GECCO '20 : Proceedings of the 2020 Genetic and Evolutionary Computation Conference, June 2020 - Oliveto, Pietro S.; Sudholt, Dirk; Witt, Carsten - New York : ACM, 2020. - pp. 1323-1331.
- Benchmarking in Optimization : Best Practice and Open Issues : article - Weise, Thomas; Wagner, Markus; Volz, Vanessa; Naujoks, Boris; Moore, Jason H.; López-Ibáñez, Manuel; Doerr, Carola; Bartz-Beielstein, Thomas; Eftimov, Tome; Fischbach, Andreas; Kerschke, Pascal; La Cava, William; Orzechowski, Patryk; van den Berg, Daan - Cornell University : arXiv.org, 2020. - 50 pp.
- Improved Fixed-Budget Results via Drift Analysis : article in International Conference on Parallel Problem Solving from Nature, PPSN 2020 : Parallel Problem Solving from Nature, PPSN XVI - Kötzing, Timo; Witt, Carsten - Berlin : Springer, 2020. - pp. 648-660.
- Variance Reduction for Better Sampling in Continuous Domains : article in International Conference on Parallel Problem Solving from Nature, PPSN 2020 : Parallel Problem Solving from Nature, PPSN XVI - Meunier, Laurent; Doerr, Carola; Rapin, Jeremy; Teytaud, Olivier - Berlin : Springer, 2020. - pp. 154-168.
Program
Efficient optimization techniques affect our personal, industrial, and academic environments through the supply of well-designed processes that enable a best-possible use of our limited resources. Despite significant research efforts, most real-world problems remain too complex to admit exact analytical or computational solutions. Therefore, heuristic approaches that trade the accuracy of a solution for a simple algorithmic structure, fast running times, or an otherwise efficient use of computational resources are required. Randomized optimization heuristics form a highly successful and thus frequently applied class of such problem solvers. Among the best-known representatives of this class are stochastic local search methods, Monte Carlo techniques, genetic and evolutionary algorithms, and swarm intelligence techniques.
The theory of randomized optimization heuristics strives to set heuristic approaches on firm ground by providing a sound mathematical foundation for this important class of algorithms. Key challenges in this research area comprise optimization under uncertainty, parameter selection (most randomized optimization heuristics are parametrized), the role and usefulness of so-called crossover operations (i.e., the idea of creating high-quality solution candidates by recombining previously evaluated ones) and, more generally, performance guarantees for advanced heuristics such as population-based techniques, estimation-of-distribution algorithms, differential evolution, and others.
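To make the "simple algorithmic structure" of such heuristics concrete, the following sketch implements a minimal (1+1) evolutionary algorithm on the classical OneMax benchmark (maximizing the number of ones in a bit string). This is an illustrative toy example only, not an algorithm discussed at the seminar; all function names and parameter choices (mutation rate 1/n, evaluation budget) are our own.

```python
import random

def one_max(bits):
    # Fitness: number of ones in the bit string.
    return sum(bits)

def one_plus_one_ea(n, max_evals=100_000, seed=0):
    # Minimal (1+1) EA: flip each bit independently with probability 1/n,
    # accept the offspring if it is at least as good as the parent.
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    fitness = one_max(parent)
    for evals in range(1, max_evals + 1):
        child = [b ^ (rng.random() < 1 / n) for b in parent]
        child_fitness = one_max(child)
        if child_fitness >= fitness:
            parent, fitness = child, child_fitness
        if fitness == n:
            return evals  # evaluations until the optimum was found
    return max_evals

print(one_plus_one_ea(50))
```

For this simple setting, runtime theory predicts an expected optimization time of order n log n, which a quick run of the sketch is consistent with.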
The present Dagstuhl Seminar is a continuation of the seminar series on "Theory of Evolutionary Algorithms". Today the field extends far beyond evolutionary algorithms – a development that previous Dagstuhl Seminars have significantly influenced. This seminar aims at continuing this highly beneficial exchange by addressing, in particular, two strongly linked research communities not previously represented in the seminar series: stochastic control theory and empirical benchmarking of randomized optimization heuristics.
Recent work has shown that there is a very close link between the theory of randomized optimization heuristics and stochastic control theory, both regarding the nature of the "systems" of interest and the analytical techniques that have been developed in the two communities. Exploring these affinities and how the two communities can work together to mutual benefit is one of the focus topics to be addressed with this seminar. The second focus topic is benchmarking, which plays a central role in empirical performance assessment. Benchmarking can be an essential tool for theoreticians to develop their mathematically-derived ideas into practical algorithms. It thereby encourages a principled discussion between empirically-driven and theoretically-driven researchers.
Discussing the mathematical challenges arising in the performance analysis of randomized heuristics has always been a central topic in this Dagstuhl Seminar series. Among other achievements, important connections between continuous and discrete optimization have been established. Theoretical performance guarantees will continue to play an important role, fostering innovative ideas and the systematic transfer of knowledge between the research communities represented in the seminar.
Press Reviews
- Nevergrad, an evolutionary optimization platform, adds new key features
Facebook Artificial Intelligence blog, April 13, 2020
Dagstuhl Seminar 19431 on "Theory of Randomized Optimization Heuristics" was a continuation of the seminar series originally entitled "Theory of Evolutionary Algorithms". Today the field extends far beyond evolutionary algorithms – a development that previous Dagstuhl Seminars have significantly influenced.
While the previous Seminar 17191 had a very strong focus on methodological questions and techniques needed to analyze stochastic optimization heuristics, the present seminar devoted two of its three main focus topics to fostering interaction with two strongly linked research communities that were not previously represented in the seminar series: stochastic control theory and empirical benchmarking of randomized optimization heuristics.
Recent work has shown that there is a very close link between the theory of randomized optimization heuristics and stochastic control theory, both regarding the nature of the "systems" of interest and the analytical techniques that have been developed in the two communities. At the seminar, we explored these affinities through the two invited presentations of Luc Pronzato and Vivek Borkar, through contributed talks highlighting different aspects studied in both communities (e.g., the presentation on one-shot optimization by Olivier Teytaud), and through focused breakout sessions, in particular the one fully dedicated to the connection between the analysis of evolution strategies and estimation-of-distribution algorithms on the one hand and the analysis of stochastic approximation and ordinary differential equations on the other, in which interesting similarities and differences between the two fields were identified.
The second focus topic of Dagstuhl Seminar 19431 was benchmarking of optimization heuristics. Benchmarking plays a central role in empirical performance assessment. However, it can also be an essential tool for theoreticians to develop their mathematically derived ideas into practical algorithms, thereby encouraging a principled discussion between empirically driven and theoretically driven researchers. Benchmarking was a central topic in several breakout sessions, for example those on Competitions and Benchmarking and on Algorithm Selection and Configuration, but also in the breakout session on Multi-Objective Optimization. A survey of best practices in empirical benchmarking was kick-started in the breakout session on Benchmarking: Best Practices and Open Issues.
Discussing the mathematical challenges arising in the performance analysis of randomized heuristics has always been a central topic in this Dagstuhl seminar series. Among other achievements, important connections between continuous and discrete optimization have been established, most notably in the form of drift theorems, which are typically applicable regardless of the nature of the search space. Apart from such methodological advances, we also observed two other trends bridging discrete and continuous optimization: (i) an increased interest in analyzing parameter-dependent performance guarantees, and (ii) the recent advances in the study of estimation-of-distribution algorithms, which borrow techniques from both discrete and continuous optimization theory. These topics were discussed in the invited talk of Youhei Akimoto, in several contributed presentations, and in the breakout sessions on Measuring Optimization Progress in an Invariant Way for Comparison-Based Algorithms and on Mixed-Integer Optimization.
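The drift theorems mentioned above can be illustrated with a toy computation. The additive drift theorem states, roughly, that if a non-negative process decreases in expectation by at least delta per step, its expected hitting time of zero is at most the initial value divided by delta. The sketch below, with parameters chosen purely for illustration, empirically checks this bound on a simple random walk whose drift toward zero is exactly delta (so the bound holds with equality).

```python
import random

def empirical_hitting_time(x0, delta, trials=2000, seed=1):
    # Random walk on {0, 1, ..., x0} that decreases by 1 with
    # probability delta in each step; the drift toward 0 is exactly delta.
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        x, steps = x0, 0
        while x > 0:
            if rng.random() < delta:
                x -= 1
            steps += 1
        total += steps
    return total / trials

x0, delta = 20, 0.25
# Additive drift theorem: E[T] <= x0 / delta (here with equality,
# since the drift is exactly delta in every non-zero state).
print(empirical_hitting_time(x0, delta), x0 / delta)
```

With these parameters the theorem predicts an expected hitting time of 20 / 0.25 = 80 steps, and the empirical average over 2000 trials lands close to that value.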
Apart from these focus topics, we discussed a large number of different aspects related to the theoretical analysis of optimization heuristics, including brainstorming sessions on doing "good" research, organizing a repository to share lecture materials, and discussing the role of uncertainty in heuristic optimization, the connections between experimental design and one-shot optimization, the importance of neutral representations, and differences between stochastic gradient descent methods and evolution strategies, to give but a few examples.
Organization
The seminar hosted the following types of events:
- Five invited talks of 30 minutes each:
- Youhei Akimoto on Expected Runtime Bound for the (1+1)-Evolution Strategy
- Vivek Borkar on Overview of Stochastic Approximation and Related Schemes
- Pietro S. Oliveto on What is Hot in Evolutionary Computation (Part 2)
- Luc Pronzato on Dynamical Search
- Carsten Witt on What is Hot in Evolutionary Computation (Part 1)
- Twenty contributed talks of around 15–20 minutes each
- Four "flash talks" of about 10 minutes
- Eleven parallel breakout sessions in various formats, ranging from brainstorming on the purpose and future of theory research, through actual problem solving on one-shot optimization, to kick-starting a survey on best practices in benchmarking optimization heuristics.
All presentations were plenary, i.e., in a single session, while the breakouts were organized in parallel working groups to allow for focused and specialized discussions. As in previous years, the breakout sessions were very well received and can be considered a well-established format of this seminar series. As a result of these discussions, we are planning a workshop and a survey on benchmarking best practices. Several open problems were proposed and discussed at the seminar, and we are confident that the seminar has helped to establish new collaborations.
Our traditional hike on Wednesday was a good opportunity to discuss in a less formal setting and to get to know each other. On Thursday evening, we had the special opportunity to hear Jonathan Rowe present activities of the Alan Turing Institute (https://www.turing.ac.uk/), where he serves as Programme Director for Data Science for Science. Last, but not least, the wine-and-cheese party complemented the scientific activities with a relaxed social event.
We would like to thank the Dagstuhl team and all participants for making seminar 19431 a great success and a great pleasure to organize.
Participants
- Youhei Akimoto (University of Tsukuba, JP) [dblp]
- Denis Antipov (ITMO University - St. Petersburg, RU) [dblp]
- Anne Auger (INRIA Saclay - Palaiseau, FR) [dblp]
- Thomas Bäck (Leiden University, NL) [dblp]
- Thomas Bartz-Beielstein (TH Köln, DE) [dblp]
- Hans-Georg Beyer (Fachhochschule Vorarlberg - Dornbirn, AT) [dblp]
- Vivek Shripad Borkar (Indian Institute of Technology Bombay, IN) [dblp]
- Dimo Brockhoff (INRIA Saclay - Palaiseau, FR) [dblp]
- Maxim Buzdalov (ITMO University - St. Petersburg, RU) [dblp]
- Arina Buzdalova (ITMO University - St. Petersburg, RU) [dblp]
- Francisco Chicano (University of Málaga, ES) [dblp]
- Alexandre Chotard (Calais University, FR) [dblp]
- Benjamin Doerr (Ecole Polytechnique - Palaiseau, FR) [dblp]
- Carola Doerr (Sorbonne University - Paris, FR) [dblp]
- Anton V. Eremeev (Institute of Scientific Information for Social Sciences RAS - Moscow, RU) [dblp]
- Carlos M. Fonseca (University of Coimbra, PT) [dblp]
- Tobias Friedrich (Hasso-Plattner-Institut - Potsdam, DE) [dblp]
- Tobias Glasmachers (Ruhr-Universität Bochum, DE) [dblp]
- Nikolaus Hansen (INRIA Saclay - Palaiseau, FR) [dblp]
- Thomas Jansen (Aberystwyth University, GB) [dblp]
- Timo Kötzing (Hasso-Plattner-Institut - Potsdam, DE) [dblp]
- Martin S. Krejca (Hasso-Plattner-Institut - Potsdam, DE) [dblp]
- Per Kristian Lehre (University of Birmingham, GB) [dblp]
- Johannes Lengler (ETH Zürich, CH) [dblp]
- Andrei Lissovoi (University of Sheffield, GB) [dblp]
- Manuel López-Ibáñez (University of Manchester, GB) [dblp]
- Rolf H. Möhring (TU Berlin, DE) [dblp]
- Frank Neumann (University of Adelaide, AU) [dblp]
- Pietro S. Oliveto (University of Sheffield, GB) [dblp]
- Luc Pronzato (Laboratoire I3S - Sophia Antipolis, FR) [dblp]
- Jonathan Rowe (University of Birmingham, GB) [dblp]
- Günter Rudolph (TU Dortmund, DE) [dblp]
- Ofer M. Shir (Tel-Hai College - Upper Galilee, IL) [dblp]
- Patrick Spettel (Fachhochschule Vorarlberg - Dornbirn, AT) [dblp]
- Dirk Sudholt (University of Sheffield, GB) [dblp]
- Andrew M. Sutton (University of Minnesota - Duluth, US) [dblp]
- Olivier Teytaud (Facebook - Paris, FR) [dblp]
- Dirk Thierens (Utrecht University, NL) [dblp]
- Vida Vukašinovic (Jozef Stefan Institut - Ljubljana, SI) [dblp]
- Markus Wagner (University of Adelaide, AU) [dblp]
- Elizabeth Wanner (Aston University - Birmingham, GB) [dblp]
- Thomas Weise (Hefei University, CN) [dblp]
- Carsten Witt (Technical University of Denmark - Lyngby, DK) [dblp]
- Xin Yao (Southern Univ. of Science and Technology - Shenzhen, CN) [dblp]
- Christine Zarges (Aberystwyth University, GB) [dblp]
- Anatoly Zhigljavsky (Cardiff University, GB) [dblp]
Related Seminars
- Dagstuhl Seminar 00071: Theory of Evolutionary Algorithms (2000-02-13 - 2000-02-18)
- Dagstuhl Seminar 02031: Theory of Evolutionary Algorithms (2002-01-13 - 2002-01-18)
- Dagstuhl Seminar 04081: Theory of Evolutionary Algorithms (2004-02-15 - 2004-02-20)
- Dagstuhl Seminar 06061: Theory of Evolutionary Algorithms (2006-02-05 - 2006-02-10)
- Dagstuhl Seminar 08051: Theory of Evolutionary Algorithms (2008-01-27 - 2008-02-01)
- Dagstuhl Seminar 10361: Theory of Evolutionary Algorithms (2010-09-05 - 2010-09-10)
- Dagstuhl Seminar 13271: Theory of Evolutionary Algorithms (2013-06-30 - 2013-07-05)
- Dagstuhl Seminar 15211: Theory of Evolutionary Algorithms (2015-05-17 - 2015-05-22)
- Dagstuhl Seminar 17191: Theory of Randomized Optimization Heuristics (2017-05-07 - 2017-05-12)
- Dagstuhl Seminar 22081: Theory of Randomized Optimization Heuristics (2022-02-20 - 2022-02-25)
- Dagstuhl Seminar 24271: Theory of Randomized Optimization Heuristics (2024-06-30 - 2024-07-05)
Classification
- soft computing / evolutionary algorithms
Keywords
- heuristic optimization
- black-box optimization
- evolutionary algorithms
- randomized search algorithms
- theoretical computer science
- control theory
- benchmarking