Workshop
Tiny Titans: The next wave of On-Device Learning for Foundation Models (TTODLer-FM)
Stefanos Laskaridis · Samuel Horvath · Berivan Isik · Peter Kairouz · Bilge Acun · Christina Giannoula · Angelos Katharopoulos · Martin Takac · Nic Lane
The rapid evolution of Deep Learning, propelled by transformer-based architectures and significant hardware advancements, has unlocked unprecedented capabilities across diverse domains, from the biological sciences to autonomous systems. As foundation models continue to scale, they introduce new challenges in resource management, particularly in data centers, and in data availability, prompting us to broaden our exploration of leveraging distributed and on-device resources for training and inference. Small Language Models (SLMs) are emerging as a compelling alternative for generative AI, particularly at the edge, offering a sustainable balance between efficiency and user privacy. This workshop aims to bring together algorithms and systems experts to discuss the opportunities and challenges of on-device machine learning. We hope to explore to what extent SLMs can compete with or complement LLMs, and to identify methods for enhancing their quality and efficiency. Addressing this shift requires innovation in algorithm and system co-design, underscoring the importance of interdisciplinary approaches for future applications.