Dagstuhl Seminar 25432
Deep Continual Learning in the Foundation Model Era
(October 19–24, 2025)
Organizers
- Christopher Kanan (University of Rochester, US)
- Martin Mundt (Universität Bremen, DE)
- Tinne Tuytelaars (KU Leuven, BE)
- Joost van de Weijer (Computer Vision Center - Barcelona, ES)
Contact
- Andreas Dolzmann (for scientific matters)
- Jutka Gasiorowski (for administrative matters)
Foundation models are very large deep neural networks trained with self-supervised learning. They are revolutionizing AI, have had significant socio-economic impact, and excel at many downstream applications. Deep continual learning studies how to accumulate knowledge from non-stationary data streams, a highly desirable capability for future AI systems. It offers a range of tools, theories, and methods that can effectively address some of the primary challenges in the use of foundation models.
Research on foundation models and continual learning converges on numerous topics. There is a pressing need for theory and methodologies for the continual learning of foundation models that bypass costly retraining from scratch when new data arrives, while ensuring the ongoing relevance of models. Foundation models have also sparked societal concerns, particularly around intellectual property (e.g., in image generators) and undesired skills (e.g., nudity generation). The potential of continual learning to address these concerns, for instance through the development of "unlearning" theory to remove such skills, makes it a promising research direction. Further, evaluating forgetting in foundation models necessitates new methodologies and metrics, and, given the vast size of these models, parameter- and compute-efficient methods for continual learning need to be devised. In summary, the rise of foundation models and their interaction with continual learning research present a range of urgent, impactful, and societally relevant research topics.
In this Dagstuhl Seminar, we plan to discuss the following topics:
- How can continual learning contribute to the sustainable, data- and energy-efficient updating of foundation models?
- How can continual learning be combined with the parameter-efficient adaptation methods that the enormous size of foundation models requires? (See the first sketch after this list.)
- How can continual adaptation aid in aligning foundation models with human values, in selectively remediating biases, and in personalization?
- What new benchmarks and evaluation metrics are needed to propel continual foundation model learning research, and how do we measure knowledge accumulation and loss in foundation models?
- How can we exploit continual learning theory to address the unlearning of undesired skills and the removal of private information in foundation models? (See the second sketch after this list.)
- How can we leverage new continual learning opportunities offered by foundation models, such as in-context learning or retrieval-based schemes?
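To make the parameter-efficiency question concrete, below is a minimal sketch of one common family of approaches, a LoRA-style adapter in PyTorch: the pretrained weights stay frozen, and each incoming task trains only a small low-rank update. The class, names, and toy training loop are illustrative assumptions of ours, not methods proposed by the seminar.

```python
# Minimal sketch of LoRA-style parameter-efficient continual adaptation.
# Assumption: the pretrained weight stays frozen; each task trains only
# a small low-rank update B @ A on top of it.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen pretrained linear layer plus a trainable low-rank update."""
    def __init__(self, base: nn.Linear, rank: int = 4, alpha: float = 8.0):
        super().__init__()
        self.base = base
        self.base.weight.requires_grad_(False)   # freeze pretrained weights
        if self.base.bias is not None:
            self.base.bias.requires_grad_(False)
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen path plus low-rank residual; the residual is zero at init,
        # so the model starts exactly at the pretrained solution.
        return self.base(x) + self.scale * (x @ self.lora_a.T @ self.lora_b.T)

# Toy stand-in for a pretrained foundation-model layer.
backbone = nn.Linear(32, 32)
model = LoRALinear(backbone, rank=4)

# Continual stream: adapt to each incoming task by training only the
# low-rank parameters (a tiny fraction of the full parameter count).
for task_id in range(3):
    x, y = torch.randn(64, 32), torch.randn(64, 32)   # synthetic task data
    opt = torch.optim.Adam([model.lora_a, model.lora_b], lr=1e-3)
    for _ in range(10):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    print(f"task {task_id}: loss {loss.item():.4f}")
```

Because `lora_b` is zero-initialized, adaptation starts exactly from the pretrained solution, and the per-task trainable parameters are a small fraction of the full model, which is what makes such schemes attractive for continual updating at foundation-model scale.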
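Similarly, for the unlearning question, the second sketch shows one simple baseline from the unlearning literature: gradient ascent on a "forget" set paired with ordinary descent on a "retain" set. The model, data, and hyperparameters are hypothetical placeholders.

```python
# Minimal sketch of unlearning by gradient ascent (a simple baseline,
# not a seminar-endorsed method): raise the loss on data to be forgotten
# while a standard descent step on retained data limits collateral damage.
import torch
import torch.nn as nn

model = nn.Linear(16, 2)                      # stands in for a trained model
opt = torch.optim.SGD(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Hypothetical forget/retain splits with synthetic data.
forget_x, forget_y = torch.randn(32, 16), torch.randint(0, 2, (32,))
retain_x, retain_y = torch.randn(32, 16), torch.randint(0, 2, (32,))

for step in range(20):
    opt.zero_grad()
    # Ascend on the forget set (negated loss) to erase its influence...
    forget_loss = -loss_fn(model(forget_x), forget_y)
    # ...while descending on the retain set to preserve other knowledge.
    retain_loss = loss_fn(model(retain_x), retain_y)
    (forget_loss + retain_loss).backward()
    opt.step()
```

Verifying that the targeted skill or information is actually removed, without collateral forgetting, is itself an open problem, which connects directly to the benchmark and metrics question above.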
By discussing these questions with a group of world-class researchers, we aim to set the research agenda in this area for the years to come.
Related Seminars
- Dagstuhl Seminar 23122: Deep Continual Learning (March 19–24, 2023)
Classification
- Artificial Intelligence
- Computer Vision and Pattern Recognition
- Machine Learning
Keywords
- Continual Learning
- Foundation Models
- Deep Learning