

Dagstuhl-Seminar 24271

Theory of Randomized Optimization Heuristics

(June 30 – July 5, 2024)


Permalink
Please use the following short URL to link to this page: https://www.dagstuhl.de/24271



Summary

This seminar is part of a biennial seminar series. This year, we envisioned focusing on constraint handling, multivariate estimation-of-distribution algorithms (EDAs), and stochastic approximation and Markov chain stability analysis. This vision worked well for constraint handling but less so for the other two topics, since several key invitees declined our invitations. Nonetheless, the seminar quickly and organically refocused, and we had plenty of other important topics to discuss instead, as we detail below.

The previous iteration of the seminar had taken place during the peak of the COVID-19 Omicron wave, with many restrictions in place. We were glad to see this seminar happen again in the usual format. It was still a bit smaller than usual due to unfortunate last-minute cancellations, but we partially made up for that by inviting more young researchers. In any case, the group size of about 40 worked very well.

We used morning sessions for talks and group discussions, and afternoon sessions for breakout formats, with the exception of Monday. Several time slots were deliberately left open, providing free time for individual discussions and leisure activities. We kept talks reasonably short, usually at a maximum of 20 minutes, leaving sufficient time for discussion. We only very rarely had to cut discussions short in order to respect Dagstuhl's meal schedule. Otherwise, we gave sufficient space to each topic by flexibly rescheduling a few topics on the fly.

We used the first day for group forming and to get everyone on the same page. We started with a general introduction, a small ice-breaker game, and a series of overview talks. We had talks on optimization heuristics in discrete and continuous search spaces, which represent a classic divide in our community, as well as introductions to multi-objective optimization and constraint handling. Furthermore, we had presentations by Thomas Sauerwald on Balls into Bins and by Vanessa Volz on Democratising Real-World Problem Tailored Optimisation, two contributions somewhat removed from the core topic of the seminar, intended to build bridges to neighboring fields. Kathrin Klamroth and Oswin Krause continued this series later on with presentations on A Multiobjective Perspective on Constraint Handling and on Evolution in the Quantum world: on tuning quantum dot devices.

We invited, and actively nudged, all young researchers to introduce themselves and their research with brief presentations. We had a total of seven such talks, taking place in the morning slots of the following days. This format was generally perceived as useful and fruitful.

Besides junior and outreach talks, there was of course also a lot of activity on core topics of the community. Two selected highlights were Armand Gissler's presentation of his proof of linear convergence of the CMA-ES algorithm on convex quadratic problems by means of Markov chain analysis, and Per Kristian Lehre's presentation of his SLO hierarchy, a categorization of complexity classes for black-box optimization.

There was a total of 10 breakout sessions, all of which are summarized in Section 4 of the full Dagstuhl Report. They covered a broad spectrum of diverse and important topics, and they easily replaced the initial focus topics we had in mind. This time, a fair number of breakout sessions garnered the attention of both the discrete and the continuous subcommunity in roughly equal parts. As such, we had sessions on negative drift (a tool applicable in either domain), on mixed-integer problems (which require expertise from both domains), on the general structure of discrete problems, with the aim of classifying them similarly to how continuous problems can be classified, and on an abstract framework that can be used by practitioners and analyzed by theoreticians. Moreover, we discussed rather recent hot topics such as multi-objective optimization, co-evolution, and rich surrogate models, assessing which directions to take in the future. This was complemented by a session on permutation problems, highlighting a domain that has received only little attention so far, as well as a session on how to place the theoretical analysis of randomized search heuristics within the grander spectrum of artificial intelligence. All of these sessions had large numbers of participants and vibrant discussions, showing the community's interest in these topics.

We established a new format for collecting topics and for scheduling breakout sessions. It was inspired by a workshop that had taken place at the Lorentz Center in Leiden (Netherlands) earlier this year. Instead of collecting topics asynchronously upfront and over lunch breaks, we dedicated a short session to it. To this end, we replaced the usual classroom arrangement with a fully symmetrical setup, asking participants to step up, briefly explain their proposal, write down a title on a sheet of colored paper, and put it up on a wall. Moreover, instead of trying to reach consensus, we left it to session organizers to schedule their sessions so as to minimize (perceived) overlap. The new process was received as an improvement.

For the first time in this seminar series, we offered a trip to Trier as a social activity on Wednesday afternoon, with a hike taking place in parallel (and in the rain). This worked very smoothly, and it was a great experience, especially for participants from far away, even if they had been to Dagstuhl before.

We are very grateful for the opportunity to organize a seminar at Dagstuhl. We thank Schloss Dagstuhl for the financial support, the comfort of rooms directly at the venue, four meals a day, and the wonderful facilities, all of it including full service by very friendly, responsive, and always helpful staff. Thank you!

Copyright Anne Auger, Tobias Glasmachers, Martin S. Krejca, and Johannes Lengler

Motivation

Randomized optimization heuristics (ROHs) are general-purpose solvers that are frequently applied for discrete or continuous optimization when other classes of solvers fail or are not applicable. Typical examples of ROHs include genetic algorithms, evolution strategies, and estimation-of-distribution algorithms (EDAs). Since ROHs are not designed with a proof or analysis in mind, their theoretical analysis is a particularly challenging endeavor. Yet, their practical success, extremely wide applicability, and growing relevance in many domains in industry and academic research push for progress in their understanding. This Dagstuhl Seminar is part of a very successful Dagstuhl Seminar series with the goal of driving this research area forward.
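To make the notion of an ROH concrete, the following is a minimal, hypothetical sketch of the classic (1+1) evolutionary algorithm on the OneMax benchmark (maximize the number of ones in a bit string). It illustrates the algorithm family in general; it is not code discussed at the seminar, and all parameter choices are for illustration only.

```python
import random

def one_max(x):
    """OneMax benchmark: count the number of ones in a bit string."""
    return sum(x)

def one_plus_one_ea(n, max_evals=100_000, seed=0):
    """Minimal (1+1) EA: keep a single parent, flip each bit independently
    with probability 1/n, and accept the offspring if it is at least as good."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    fitness = one_max(parent)
    for evals in range(max_evals):
        child = [1 - b if rng.random() < 1 / n else b for b in parent]
        child_fitness = one_max(child)
        if child_fitness >= fitness:  # elitist selection
            parent, fitness = child, child_fitness
        if fitness == n:
            return evals + 1  # evaluations until the optimum was found
    return max_evals

print(one_plus_one_ea(50))
```

Despite its simplicity, this algorithm is a central object of runtime analysis; its expected runtime on OneMax is of order n log n.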

Over the span of more than two decades, the theoretical analysis of ROHs has made significant progress, going from simple settings to ever more complex scenarios. This advance allows us to tackle highly relevant topics that seemed out of reach before. In this seminar, we will focus on three such topics for which our understanding is still limited, namely (1) constraint handling, (2) multivariate EDAs, and (3) stochastic approximation and Markov chain stability analysis. In order to approach these problems from multiple angles, we aim to bring together top researchers from inside the ROH community and from related fields. This should result in a vibrant mixture of different expertise, allowing each participant of the seminar to learn about new topics, to participate in insightful discussions, and to potentially start promising collaborations.

Constraint handling is a classic topic in optimization with a theory-practice gap: Many real-world problems have constraints, while many algorithms and nearly all existing theoretical analyses are limited to unconstrained search spaces. In ROHs, the main route to optimization in the presence of constraints is to equip proven algorithms with additional constraint handling mechanisms. This is analogous to turning a gradient-based optimization algorithm into an interior-point solver by means of the barrier method. However, other communities like gradient-based and derivative-free optimization have different approaches to constraint handling.
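As a hypothetical illustration of the route described above, equipping a proven algorithm with an additional constraint-handling mechanism, the following sketch adds a static penalty function to a simple (1+1) evolution strategy with a fixed step size. The objective (sphere), the single linear constraint, and the penalty weight are invented for illustration; static penalties are only one of several mechanisms studied in the field.

```python
import random

def sphere(x):
    """Objective to minimize."""
    return sum(v * v for v in x)

def constraint_violation(x):
    """Single linear constraint x[0] >= 1; returns 0 for feasible points."""
    return max(0.0, 1.0 - x[0])

def penalized(x, weight=1000.0):
    """Static penalty: infeasible points pay a cost proportional to violation."""
    return sphere(x) + weight * constraint_violation(x)

def one_plus_one_es(dim=5, sigma=0.5, iters=2000, seed=1):
    """(1+1)-ES with fixed step size, optimizing the penalized objective."""
    rng = random.Random(seed)
    x = [2.0] * dim  # feasible starting point
    fx = penalized(x)
    for _ in range(iters):
        y = [v + sigma * rng.gauss(0, 1) for v in x]
        fy = penalized(y)
        if fy <= fx:  # elitist acceptance
            x, fx = y, fy
    return x

x = one_plus_one_es()
print(x[0])  # the constrained optimum lies on the boundary x[0] = 1
```

The constrained optimum of this toy problem sits on the constraint boundary, which is exactly the regime where fixed penalty weights and step-size adaptation are known to interact in nontrivial ways.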

EDAs constitute an important class of ROHs. In contrast to classical ROHs, which maintain an explicit multi-set of solutions, EDAs evolve a probabilistic model of the search space. Over the past years, EDAs have received growing interest from the theoretical ROH community. Still, to date, all theoretical insights for EDAs refer to algorithms that build their probabilistic model on the assumption that the problem variables are independent of each other (so-called univariate models). This is far from how EDAs are applied in practice, where these algorithms are capable of modeling complex (multivariate) interactions among the problem variables.
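A univariate EDA of the kind covered by existing theory can be sketched as follows: a minimal version of the univariate marginal distribution algorithm (UMDA) on OneMax. The model is just one independent Bernoulli marginal per bit, which is exactly the independence assumption discussed above; all parameter choices here are picked for illustration.

```python
import random

def umda_one_max(n=40, pop=100, mu=50, iters=200, seed=2):
    """Univariate EDA (UMDA) on OneMax: sample `pop` bit strings from
    independent per-bit marginals, select the best `mu`, and set the new
    marginals to the empirical bit frequencies of the selected strings."""
    rng = random.Random(seed)
    p = [0.5] * n  # univariate model: one independent marginal per bit
    for _ in range(iters):
        samples = [[1 if rng.random() < p[i] else 0 for i in range(n)]
                   for _ in range(pop)]
        samples.sort(key=sum, reverse=True)  # OneMax fitness = number of ones
        elite = samples[:mu]
        # New marginals = empirical frequencies in the selected set,
        # clamped to [1/n, 1 - 1/n] to avoid premature fixation.
        p = [min(1 - 1 / n, max(1 / n, sum(x[i] for x in elite) / mu))
             for i in range(n)]
    return p

p = umda_one_max()
print(sum(p) / len(p))  # average marginal after convergence
```

A multivariate EDA would instead learn dependencies between bits (e.g., a Bayesian network over the variables), which is precisely the setting where theoretical results are still missing.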

In the continuous domain, the main challenge is to analyze adaptive algorithms, in which the distribution for sampling candidate solutions is adapted in each iteration using advanced mechanisms. In the past few years, tremendous theoretical progress has been made in analyzing step-size adaptive algorithms. These results were proven using different approaches: finding a Lyapunov function in the state variables of the algorithm that satisfies a drift condition, using an ordinary differential equation (ODE) method, and analyzing the stability of underlying Markov chains. This progress was possible thanks to the development of new theoretical tools that take into account the specificities of adaptive evolution strategies.
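The kind of step-size adaptive algorithm referred to above can be illustrated with the classic 1/5th success rule, here in a minimal (1+1)-ES sketch on the sphere function. This is an illustration of the mechanism, not an implementation from any of the cited analyses, and the update factors are one common textbook choice.

```python
import math
import random

def one_fifth_es(dim=10, iters=3000, seed=3):
    """(1+1)-ES with the 1/5th success rule: enlarge the step size sigma
    after a success, shrink it after a failure, so that the stationary
    success rate is roughly one in five offspring."""
    rng = random.Random(seed)
    x = [1.0] * dim
    fx = sum(v * v for v in x)  # sphere function
    sigma = 1.0
    for _ in range(iters):
        y = [v + sigma * rng.gauss(0, 1) for v in x]
        fy = sum(v * v for v in y)
        if fy <= fx:
            x, fx = y, fy
            sigma *= 1.5           # success: enlarge the step size
        else:
            sigma *= 1.5 ** -0.25  # failure: shrink the step size
    return fx, sigma

fx, sigma = one_fifth_es()
print(fx, sigma)
```

With the factors 1.5 and 1.5^(-1/4), sigma is stationary exactly at a 20% success rate. The coupled stochastic dynamics of (x, sigma) is what the Lyapunov/drift, ODE, and Markov chain approaches mentioned above analyze: sigma must shrink proportionally to the distance to the optimum for linear convergence to hold.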

As with each seminar of this series, we are looking forward to exciting exchanges among researchers from the various areas invited, which have been instrumental in the past for substantial progress and unification in our field and beyond. With this foundation, this seminar offers a great opportunity to initiate collaborations connecting ROHs to neighboring fields that could drastically reduce the gap between theory and practice.

Copyright Anne Auger, Tobias Glasmachers, Martin S. Krejca, and Johannes Lengler

Participants


  • Sumit Adak (Technical University of Denmark - Lyngby, DK) [dblp]
  • Denis Antipov (University of Adelaide, AU) [dblp]
  • Dirk Arnold (Dalhousie University - Halifax, CA) [dblp]
  • Asma Atamna (Ruhr-Universität Bochum, DE) [dblp]
  • Anne Auger (INRIA Saclay - Palaiseau, FR) [dblp]
  • Dimo Brockhoff (INRIA Saclay - Palaiseau, FR) [dblp]
  • Alexandre Chotard (Calais University, FR) [dblp]
  • Benjamin Doerr (Ecole Polytechnique - Palaiseau, FR) [dblp]
  • Carlos M. Fonseca (University of Coimbra, PT) [dblp]
  • Stephan Frank (Ruhr-Universität Bochum, DE)
  • Oskar Girardin (Ecole Polytechnique - Palaiseau, FR)
  • Armand Gissler (Ecole Polytechnique - Palaiseau, FR) [dblp]
  • Tobias Glasmachers (Ruhr-Universität Bochum, DE) [dblp]
  • Andreia P. Guerreiro (INESC-ID - Lisbon, PT) [dblp]
  • Nikolaus Hansen (Inria & Institut Polytechnique de Paris - Palaiseau, FR) [dblp]
  • Mario Alejandro Hevia Fajardo (University of Birmingham, GB) [dblp]
  • Thomas Jansen (Aberystwyth University, GB) [dblp]
  • Alexander Jungeilges (Ruhr-Universität Bochum, DE)
  • Kathrin Klamroth (Universität Wuppertal, DE) [dblp]
  • Timo Kötzing (Hasso-Plattner-Institut, Universität Potsdam, DE) [dblp]
  • Oswin Krause (University of Copenhagen, DK) [dblp]
  • Martin S. Krejca (Ecole Polytechnique - Palaiseau, FR) [dblp]
  • Per Kristian Lehre (University of Birmingham, GB) [dblp]
  • Tristan Marty (Ecole Polytechnique - Palaiseau, FR)
  • Daiki Morinaga (University of Tsukuba, JP) [dblp]
  • Frank Neumann (University of Adelaide, AU) [dblp]
  • Andre Opris (Universität Passau, DE) [dblp]
  • Aishwarya Radhakrishnan (Hasso-Plattner-Institut, Universität Potsdam, DE) [dblp]
  • Jonathan Rowe (University of Birmingham, GB) [dblp]
  • Günter Rudolph (TU Dortmund, DE) [dblp]
  • Thomas Sauerwald (University of Cambridge, GB) [dblp]
  • Jonathan L. Shapiro (University of Manchester, GB) [dblp]
  • Dirk Sudholt (Universität Passau, DE) [dblp]
  • Andrew M. Sutton (University of Minnesota - Duluth, US) [dblp]
  • Kento Uchida (Yokohama National University, JP) [dblp]
  • Vanessa Volz (CWI - Amsterdam, NL) [dblp]
  • Carsten Witt (Technical University of Denmark - Lyngby, DK) [dblp]
  • Christine Zarges (Aberystwyth University, GB) [dblp]
  • Weijie Zheng (Harbin Institute of Technology - Shenzhen, CN) [dblp]

Related Seminars
  • Dagstuhl-Seminar 00071: Theory of Evolutionary Algorithms (2000-02-13 - 2000-02-18) (Details)
  • Dagstuhl-Seminar 02031: Theory of Evolutionary Algorithms (2002-01-13 - 2002-01-18) (Details)
  • Dagstuhl-Seminar 04081: Theory of Evolutionary Algorithms (2004-02-15 - 2004-02-20) (Details)
  • Dagstuhl-Seminar 06061: Theory of Evolutionary Algorithms (2006-02-05 - 2006-02-10) (Details)
  • Dagstuhl-Seminar 08051: Theory of Evolutionary Algorithms (2008-01-27 - 2008-02-01) (Details)
  • Dagstuhl-Seminar 10361: Theory of Evolutionary Algorithms (2010-09-05 - 2010-09-10) (Details)
  • Dagstuhl-Seminar 13271: Theory of Evolutionary Algorithms (2013-06-30 - 2013-07-05) (Details)
  • Dagstuhl-Seminar 15211: Theory of Evolutionary Algorithms (2015-05-17 - 2015-05-22) (Details)
  • Dagstuhl-Seminar 17191: Theory of Randomized Optimization Heuristics (2017-05-07 - 2017-05-12) (Details)
  • Dagstuhl-Seminar 19431: Theory of Randomized Optimization Heuristics (2019-10-20 - 2019-10-25) (Details)
  • Dagstuhl-Seminar 22081: Theory of Randomized Optimization Heuristics (2022-02-20 - 2022-02-25) (Details)

Classification
  • Neural and Evolutionary Computing

Keywords
  • black-box optimization heuristics
  • evolution strategies
  • genetic and evolutionary algorithms
  • runtime and convergence analysis
  • stochastic processes