Dagstuhl Seminar 25432

Deep Continual Learning in the Foundation Model Era

(Oct 19 – Oct 24, 2025)

Permalink
Please use the following short url to reference this page: https://www.dagstuhl.de/25432

Organizers

  • Christopher Kanan
  • Martin Mundt
  • Tinne Tuytelaars
  • Joost van de Weijer

Motivation

Foundation models are gigantic deep neural networks trained with self-supervised learning. They are revolutionizing AI, have had a significant socio-economic impact, and excel at many downstream applications. Deep continual learning studies how to accumulate knowledge from non-stationary data streams, a highly desirable capability for future AI systems. Continual learning offers a range of tools, theories, and methods that can effectively address some of the primary challenges in the use of foundation models.

Research on foundation models and continual learning converges on numerous topics. There is a pressing need for theory and methodologies for the continual learning of foundation models that bypass costly retraining from scratch when new data arrive, while ensuring that models remain relevant. Foundation models have also sparked societal concerns, particularly around intellectual property (e.g., image generators) and undesired skills (e.g., nudity generation). Continual learning has the potential to address these concerns, for example through the development of "unlearning" theory to remove such skills. Further, evaluating forgetting in foundation models necessitates new methodologies and metrics. Due to the vast size of these models, parameter-efficient and compute-efficient methods for continual learning need to be devised. In summary, the rise of foundation models and their interaction with continual learning research present a range of urgent, impactful, and societally relevant research topics.
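
To make concrete what such metrics typically measure, here is a minimal sketch of two standard continual-learning measures, average accuracy and average forgetting, computed from a toy accuracy matrix. The matrix values, function names, and three-task setup are illustrative assumptions, not material from the seminar.

```python
# Minimal sketch of standard continual-learning metrics (illustrative only).
# acc[i][j] = accuracy on task j after training on tasks 0..i (toy numbers).

def average_accuracy(acc):
    """Mean accuracy over all tasks after training on the final task."""
    final = acc[-1]
    return sum(final) / len(final)

def average_forgetting(acc):
    """Average drop from each earlier task's best accuracy to its final accuracy."""
    T = len(acc)
    drops = [max(acc[i][j] for i in range(j, T - 1)) - acc[-1][j]
             for j in range(T - 1)]
    return sum(drops) / len(drops)

acc = [[0.90, 0.00, 0.00],
       [0.80, 0.85, 0.00],
       [0.70, 0.75, 0.88]]    # toy accuracy matrix for three sequential tasks

print(average_accuracy(acc))    # (0.70 + 0.75 + 0.88) / 3 ≈ 0.777
print(average_forgetting(acc))  # ((0.90 - 0.70) + (0.85 - 0.75)) / 2 = 0.15
```

Task-matrix metrics like these assume clear task boundaries and full re-evaluation after every update, both of which become problematic at foundation-model scale, which is precisely why new methodologies are needed.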

In this Dagstuhl Seminar, we plan to discuss the following topics:

  • How can continual learning contribute to the sustainable, data- and energy-efficient updating of foundation models?
  • How can continual learning be combined with the parameter-efficient adaptation methods that the gigantic size of foundation models demands? (A minimal adapter sketch follows this list.)
  • How can continual adaptation aid in aligning foundation models with human values, in selectively remediating biases, and in personalization?
  • What new benchmarks and evaluation metrics are needed to propel continual foundation model learning research, and how do we measure knowledge accumulation and loss in foundation models?
  • How can we exploit continual learning theory to address the unlearning of undesired skills and the removal of private information in foundation models?
  • How can we integrate new continual learning opportunities offered by foundation models, such as in-context learning or retrieval-based schemes?
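
As a concrete illustration of the parameter-efficient adaptation question above, the following is a minimal, hypothetical sketch of adapting a frozen model layer with a low-rank (LoRA-style) adapter in PyTorch. The class, dimensions, rank, and one-step training loop are illustrative assumptions, not a method endorsed by the seminar.

```python
# Hypothetical sketch: continually adapting a frozen foundation model with a
# low-rank (LoRA-style) adapter, so each update trains only a tiny fraction
# of the parameters while the backbone weights stay untouched.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wraps a frozen linear layer with a trainable low-rank update B @ A."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # freeze the foundation-model weights
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))  # zero init: no drift at start
        self.scale = alpha / rank

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

# Continual usage: train a fresh (or shared) adapter on each incoming task;
# the loss below is a placeholder standing in for a real task objective.
layer = LoRALinear(nn.Linear(768, 768), rank=8)
opt = torch.optim.AdamW([layer.A, layer.B], lr=1e-4)
x = torch.randn(4, 768)
opt.zero_grad()
loss = layer(x).pow(2).mean()
loss.backward()
opt.step()
```

Because only A and B are trained, adapters for successive tasks can be stored, swapped, or merged cheaply, which is one reason this family of methods is attractive for continual updates of very large models.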

By discussing these questions with a group of world-class researchers, we aim to set the research agenda in this area for the years to come.

Copyright Christopher Kanan, Martin Mundt, Tinne Tuytelaars, and Joost van de Weijer

Related Seminars
  • Dagstuhl Seminar 23122: Deep Continual Learning (2023-03-19 - 2023-03-24)

Classification
  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Machine Learning

Keywords
  • Continual Learning
  • Foundation Models
  • Deep Learning