Continual Adaptation at Scale: Towards Sustainable AI
Ghada Sokar ⋅ Gintare Karolina Dziugaite
Abstract
Training Foundation Models (FMs) is currently so costly that only a few organizations can afford it. The immense data, compute, and energy demands are increasingly unsustainable. Continual adaptation offers a viable alternative, in which AI models learn quickly and continually through everyday interactions, much as humans and animals do. Unfortunately, FMs lack this rapid adaptability: new behavior can be induced by prompting or fine-tuning, but there is no easy way to quickly shape their behavior, for instance, to permanently add, remove, or modify skills in a sustainable way. This workshop aims to discuss new research directions that will enable fast continual adaptation at scale and drive more sustainable AI.