

Virtual invited talk in Workshop: Dynamic Neural Networks

Incorporating Dynamic Structures into Pre-trained Language Models

Xuanjing Huang


Abstract:

Recent years have witnessed the great success of large-scale pre-trained language models. However, running the entire language model on every sample can be computationally wasteful. Hence, dynamic networks, which can adapt their structures or parameters to input samples during inference, are attracting growing attention in the NLP community. In contrast to static language models, dynamic ones enjoy favorable properties such as efficiency, adaptiveness, and accuracy. In this talk, I will review recent advances in dynamic networks in NLP and discuss the prospects and challenges of applying dynamic structures to pre-trained language models.
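One common instance of such input-adaptive dynamic structure is early exiting, where lightweight classifiers attached to intermediate layers let confident samples skip the remaining computation (the idea behind models such as DeeBERT and FastBERT). The PyTorch snippet below is a minimal sketch of that mechanism, not the specific method presented in the talk; the EarlyExitEncoder class, its layer sizes, and the confidence threshold are hypothetical choices for illustration.

```python
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    """Toy encoder with an exit classifier after every layer (hypothetical).

    Real systems attach such exit heads to a pre-trained transformer
    and train them before using them for early exiting at inference.
    """

    def __init__(self, dim=64, num_layers=4, num_classes=2, threshold=0.9):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True)
            for _ in range(num_layers)
        )
        self.exits = nn.ModuleList(
            nn.Linear(dim, num_classes) for _ in range(num_layers)
        )
        self.threshold = threshold  # confidence required to stop early

    @torch.no_grad()
    def forward(self, x):
        # x: (batch=1, seq_len, dim); the exit decision is made per sample
        for layer, exit_head in zip(self.layers, self.exits):
            x = layer(x)
            # Mean-pool the sequence, classify, and check confidence
            probs = exit_head(x.mean(dim=1)).softmax(dim=-1)
            if probs.max() >= self.threshold:  # confident: skip later layers
                return probs
        return probs  # fell through: the full network was used

model = EarlyExitEncoder()
probs = model(torch.randn(1, 16, 64))
print(probs)
```

Easy inputs exit after a shallow layer while hard inputs traverse the whole stack, so the average inference cost drops without changing the model's capacity on difficult samples.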
