
Scalable Continual Learning for Lifelong Foundation Models Workshop at NeurIPS 2024

The Scalable Continual Learning for Lifelong Foundation Models Workshop will take place at NeurIPS 2024. The workshop aims to bring together experts and researchers across language, vision, speech, and multimodal machine learning to discuss recent advances in scalable continual learning (CL) that could replace static training of foundation models (FMs), enabling models to track dynamic real-world information and fostering collaboration across these communities.

The call for papers is now open, and submissions are welcome on topics related to scaling the continual learning of foundation models. Key areas of interest include mitigating catastrophic forgetting, handling real-world challenges such as domain shifts and long-tailed data distributions, and combining FMs with structured knowledge sources. The submission deadline is September 09, 23:59 AoE.

For any questions, please email continual-fomo at googlegroups.com.
