

Dagstuhl Seminar 22081

Theory of Randomized Optimization Heuristics

(February 20 – 25, 2022)


Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/22081

Organizers
  • Anne Auger
  • Carlos M. Fonseca
  • Tobias Friedrich
  • Johannes Lengler

Summary

This seminar is part of a biennial seminar series. This year, we focused on connections between classical topics of the community, such as Evolutionary Algorithms and Evolution Strategies (EA, ES), Estimation-of-Distribution Algorithms (EDA), and Evolutionary Multi-Objective Optimization (EMO), and related fields like Stochastic Gradient Descent (SGD) and Bayesian Optimization (BO). The mixture proved to be extremely successful. Already the first talk turned into a two-hour-long, vivid, and productive plenary discussion. Participants like Peter Richtarik and Sebastian Stich brought valuable new perspectives from the SGD community, and Mickaël Binois contributed the BO perspective. This yielded new approaches to long-standing open problems, in particular towards a convergence proof of the CMA-ES algorithm on quadratic functions.

Another interesting and fruitful aspect of the seminar was a shift of perspective to search spaces that are under-represented in the community. Traditionally, the search spaces are product spaces, either discrete (especially the n-dimensional hypercube) or continuous (the d-dimensional Euclidean space). This year we had some intense discussions, in the plenum and in working groups, on other search spaces, triggered especially by Ekhine Irurozki's presentation on permutation spaces.
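As a toy illustration of why permutation spaces call for different variation operators than the hypercube (a hypothetical Python sketch, not material from the presentation), a mutation can, for instance, exchange two uniformly random positions so that the offspring is again a permutation:

    import random

    def swap_mutation(perm, rng=random):
        # Exchange two uniformly random positions; unlike bit flips on the
        # hypercube, this keeps the offspring a valid permutation.
        child = list(perm)
        i, j = rng.sample(range(len(child)), 2)
        child[i], child[j] = child[j], child[i]
        return child

    print(swap_mutation(list(range(8))))  # e.g. [0, 5, 2, 3, 4, 1, 6, 7]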

Naturally, a big part of the seminar was also devoted to classical topics of the community. Highlights included talks by Benjamin Doerr on the first runtime result for the Non-Dominated Sorting Genetic Algorithm (NSGA-II) and by Tobias Glasmachers on the convergence analysis of the Hessian Estimation Evolution Strategy (HE-ES). The latter is the first convergence proof for a covariance matrix algorithm that does not truncate the condition number of the estimated covariance matrix. Some interesting new topics were also identified in traditional fields, such as whether we can better understand in which situations adaptivity is necessary for efficient optimization by considering the k-adaptive query complexity of optimization benchmarks.

Overall, as organizers we were extremely happy with the mix of core community members and researchers from related fields. The connections with the latter were close enough that scientific discussions could (also) happen at a technical level, which is particularly useful since such exchanges offer some low-hanging fruit. Importantly, the exchange happened between people who would probably not have met each other outside of the Dagstuhl Seminar.

The seminar took place during the peak of the Omicron wave of Covid-19, which made planning very difficult. The key step during the preparation phase was a survey among the participants a few weeks before the seminar. We asked how likely it was that they could participate in person, and under which circumstances they would prefer which format (in-person or hybrid). The participants signalled very clearly that they wanted this event to happen, and that they wanted it to happen in person. We want to thank all participants for their support! Other seminars in the weeks before and after ours had to be cancelled altogether, and this might also have happened to our seminar if not for the determination of our participants.

The seminar was smaller than previous editions due to Covid-19 regulations. Moreover, some participants had to cancel at the last moment because they had tested positive for Covid-19 or because they had no reliable child care. Especially the latter point can be frustrating, and we hope that Dagstuhl will be able to resume its support for on-site child care in the future. On the positive side, the intensity of the seminar more than made up for the smaller size, and may even have benefited from the smaller number of participants.

Finally, we want to thank Dagstuhl for their great support, both financially and through their excellent staff. We could always feel that it was their top priority to help us, and we are greatly indebted for the support!

The organizers,
Anne Auger, Carlos M. Fonseca, Tobias Friedrich, Johannes Lengler

Copyright Anne Auger, Carlos M. Fonseca, Tobias Friedrich, and Johannes Lengler

Motivation

Optimization is at the core of many applications in academia and industry. At the heart of an optimization problem are a search space and one or more objective functions to be optimized. In practical applications, objective functions are often not explicitly known and therefore cannot be written analytically. Instead, they are available to the optimizer only as a black box. In those situations, exact optimization methods such as (integer) linear programming, semi-definite programming, etc., cannot be applied. Instead, one needs to resort to black-box or gray-box methods. Among those, randomized optimization heuristics (ROHs) are often the methods of choice when objective function gradients are not available, when the objective functions are not convex, when they are subject to noise or uncertainty, or when one needs to optimize several conflicting objectives. In this last case, one talks about multiobjective optimization, a field where ROHs are successful because they allow the Pareto front to be approximated as a whole, in contrast to more classical approaches based on objective aggregation that find a single solution at a time.
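To make the black-box setting concrete, here is a minimal Python sketch of one of the simplest ROHs, a (1+1) evolutionary algorithm, which only ever accesses the objective through function evaluations. The objective onemax and all parameter values are illustrative assumptions, not part of the seminar material.

    import random

    def onemax(x):
        # Illustrative black-box objective: number of ones in the bit string.
        return sum(x)

    def one_plus_one_ea(f, n, max_evals=10000, seed=0):
        # (1+1) EA: keep one parent, flip each bit independently with
        # probability 1/n, accept the offspring if it is at least as good.
        rng = random.Random(seed)
        parent = [rng.randint(0, 1) for _ in range(n)]
        fitness = f(parent)
        for _ in range(max_evals):
            offspring = [1 - b if rng.random() < 1.0 / n else b for b in parent]
            f_off = f(offspring)
            if f_off >= fitness:   # elitist acceptance, uses only f-values
                parent, fitness = offspring, f_off
        return parent, fitness

    best, value = one_plus_one_ea(onemax, n=50)
    print(value)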

The field of ROHs, including evolutionary algorithms (EAs), shares some common ground with the related field of Machine Learning (ML). Learning underlies many important algorithms such as Estimation-of-Distribution Algorithms (EDAs), the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), and Linkage-Tree Genetic Algorithms, with several components of those algorithms strongly connected to (or using) machine learning techniques such as hierarchical clustering and expectation-maximization algorithms. Conversely, optimization is an important component of machine learning, for training neural networks or for hyperparameter tuning, which is a typical black-box problem.
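As a minimal illustration of this learning component (a sketch under simplifying assumptions, not one of the algorithms named above), a univariate EDA in the style of UMDA repeatedly fits a product distribution over bit strings to the best samples and then resamples from it; the population sizes and the clamping are illustrative choices.

    import random

    def univariate_eda(f, n, pop_size=100, parents=50, iters=200, seed=0):
        rng = random.Random(seed)
        p = [0.5] * n  # marginal probability of sampling a 1 at each position
        for _ in range(iters):
            # Sample a population from the current product distribution.
            pop = [[1 if rng.random() < p[i] else 0 for i in range(n)]
                   for _ in range(pop_size)]
            pop.sort(key=f, reverse=True)          # maximization
            selected = pop[:parents]
            # Learning step: re-estimate each marginal from the selected
            # individuals, clamped so the distribution cannot collapse.
            for i in range(n):
                freq = sum(x[i] for x in selected) / parents
                p[i] = min(max(freq, 1.0 / n), 1.0 - 1.0 / n)
        return p

    marginals = univariate_eda(lambda x: sum(x), n=30)
    print([round(q, 2) for q in marginals])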

Theoretical analysis of ROHs helps to quantify the performance of different methods on different fitness landscapes and classes of functions in order to improve the understanding of the algorithms, compare them, and gain more insight into the influence of different components and parameters. Important progress has been made in the past years towards understanding more and more complex algorithms that are increasingly close to those used in practice. Yet, crucial aspects are still missing, such as a better understanding of the dynamics of set-based optimization methods in the context of multiobjective optimization, the ability to analyze algorithms that learn associations between variables, in both continuous and discrete domains, and the corresponding parameter adaptation mechanisms. Those share some common ground with stochastic gradient descent (SGD) algorithms and will be a focus topic of the Dagstuhl Seminar.
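As a textbook example of the kind of runtime analysis meant here (a standard argument, not a result of this seminar), the fitness-level method bounds the expected optimization time of the (1+1) EA on OneMax(x) = x_1 + ... + x_n over {0,1}^n: from a parent with i ones, a strictly better offspring is produced whenever exactly one of the n - i zero-bits flips and all other bits are kept, so

    \Pr[\text{improve from } i \text{ ones}]
      \;\ge\; \frac{n-i}{n}\left(1-\frac{1}{n}\right)^{n-1}
      \;\ge\; \frac{n-i}{e\,n},
    \qquad
    \mathbb{E}[T]
      \;\le\; \sum_{i=0}^{n-1} \frac{e\,n}{n-i}
      \;=\; e\,n \sum_{k=1}^{n} \frac{1}{k}
      \;\le\; e\,n\,(\ln n + 1).

Analyses in this spirit, for far more complex algorithms, adaptation mechanisms, and function classes, are what the paragraph above refers to.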

In this context, the overall goal of this Dagstuhl Seminar is to drive research in the theory of ROHs towards analyzing more and more complex methods that are closer to state-of-the-art ROHs and, specifically, methods that have some impact outside the Evolutionary Computation domain. In addition to specialists in the theory of ROHs, we invite researchers with expertise in the design of EDAs, adaptive Evolution Strategies, and evolutionary multiobjective optimization (EMO) algorithms, as well as researchers from derivative-free optimization, Bayesian optimization, and optimization for machine learning, including adaptive stochastic gradient descent techniques.

Copyright Anne Auger, Carlos M. Fonseca, Tobias Friedrich, and Johannes Lengler

Participants
  • Mickaël Binois (INRIA - Sophia Antipolis, FR) [dblp]
  • Alexandre Chotard (Calais University, FR) [dblp]
  • Duc-Cuong Dang (University of Southampton, GB) [dblp]
  • Benjamin Doerr (Ecole Polytechnique - Palaiseau, FR) [dblp]
  • Carola Doerr (Sorbonne University - Paris, FR) [dblp]
  • Carlos M. Fonseca (University of Coimbra, PT) [dblp]
  • Armand Gissler (Ecole Polytechnique - Palaiseau, FR) [dblp]
  • Tobias Glasmachers (Ruhr-Universität Bochum, DE) [dblp]
  • Andreia P. Guerreiro (INESC-ID - Lisboa, PT) [dblp]
  • Nikolaus Hansen (INRIA Saclay - Palaiseau, FR) [dblp]
  • Ekhine Irurozki (Telecom Paris, FR) [dblp]
  • Martin S. Krejca (Sorbonne University - Paris, FR) [dblp]
  • Johannes Lengler (ETH Zürich, CH) [dblp]
  • Xiaoyue Li (Hasso-Plattner-Institut, Universität Potsdam, DE)
  • Peter Richtarik (KAUST - Thuwal, SA) [dblp]
  • Günter Rudolph (TU Dortmund, DE) [dblp]
  • Sebastian U. Stich (CISPA - Saarbrücken, DE) [dblp]
  • Dirk Sudholt (Universität Passau, DE) [dblp]
  • Andrew M. Sutton (University of Minnesota - Duluth, US) [dblp]
  • Ricardo Takahashi (Federal University of Minas Gerais - Belo Horizonte, BR)
  • Elizabeth Wanner (CEFET - Belo Horizonte, BR) [dblp]

Related Seminars
  • Dagstuhl Seminar 00071: Theory of Evolutionary Algorithms (2000-02-13 - 2000-02-18)
  • Dagstuhl Seminar 02031: Theory of Evolutionary Algorithms (2002-01-13 - 2002-01-18)
  • Dagstuhl Seminar 04081: Theory of Evolutionary Algorithms (2004-02-15 - 2004-02-20)
  • Dagstuhl Seminar 06061: Theory of Evolutionary Algorithms (2006-02-05 - 2006-02-10)
  • Dagstuhl Seminar 08051: Theory of Evolutionary Algorithms (2008-01-27 - 2008-02-01)
  • Dagstuhl Seminar 10361: Theory of Evolutionary Algorithms (2010-09-05 - 2010-09-10)
  • Dagstuhl Seminar 13271: Theory of Evolutionary Algorithms (2013-06-30 - 2013-07-05)
  • Dagstuhl Seminar 15211: Theory of Evolutionary Algorithms (2015-05-17 - 2015-05-22)
  • Dagstuhl Seminar 17191: Theory of Randomized Optimization Heuristics (2017-05-07 - 2017-05-12)
  • Dagstuhl Seminar 19431: Theory of Randomized Optimization Heuristics (2019-10-20 - 2019-10-25)
  • Dagstuhl Seminar 24271: Theory of Randomized Optimization Heuristics (2024-06-30 - 2024-07-05)

Classification
  • Neural and Evolutionary Computing

Keywords
  • black-box optimization
  • evolutionary and genetic algorithms
  • stochastic gradient descent
  • randomized search algorithms
  • theoretical computer science
  • derivative-free optimization