Combining Pre-trained LoRA Modules Improves Few-shot Adaptation of Foundation Models to New Tasks

Nader Asadi · Mahdi Beitollahi · Yasser Khalil · Yinchuan Li · Guojun Zhang · Xi Chen

Abstract

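The abstract text is not included in this copy. As general background only (an illustrative sketch, not this paper's actual method), "combining pre-trained LoRA modules" is commonly realized as a weighted sum of each module's low-rank update added to the frozen base weight; the module count, rank, and weights below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, r = 8, 8, 2            # weight shape (d x k), LoRA rank r (illustrative values)
W = rng.normal(size=(d, k))  # frozen base weight of the foundation model

# Two hypothetical pre-trained LoRA modules; each contributes a
# low-rank update delta_i = B_i @ A_i of rank at most r.
loras = []
for _ in range(2):
    A = rng.normal(size=(r, k))
    B = rng.normal(size=(d, r))
    loras.append((B, A))

# One simple combination scheme: a convex combination of the updates,
# W_adapted = W + sum_i alpha_i * (B_i @ A_i).
alphas = [0.6, 0.4]
delta = sum(a * (B @ A) for a, (B, A) in zip(alphas, loras))
W_adapted = W + delta
```

In a few-shot setting, the combination weights (here fixed at 0.6/0.4) would typically be tuned on the small labeled set for the new task.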