
Contributed Talk: Continual Adaptation for Efficient Machine Communication
Minae Kwon

Sat Jun 15 04:45 PM -- 05:00 PM (PDT)

To communicate with new partners in new contexts, humans rapidly form new linguistic conventions. Recent language models trained with deep neural networks can comprehend and produce the conventions present in their training data, but cannot flexibly and interactively adapt those conventions on the fly as humans do. We introduce a repeated reference task as a benchmark for models of adaptation in communication and propose a regularized continual learning framework that allows an artificial agent, initialized with a generic language model, to understand its partner more accurately and efficiently over time. We evaluate this framework through simulations on COCO and in real-time reference game experiments with human partners.
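The core idea of regularized continual learning, as described here, is to fine-tune the agent's parameters on each new partner interaction while penalizing drift away from the generic pretrained initialization. The sketch below illustrates one common form of this idea with an L2 penalty toward the initial parameters; the function names, the scalar toy loss, and the specific penalty are illustrative assumptions, not the talk's actual method.

```python
import numpy as np

def adapt_step(theta, theta0, grad_task, lr=0.1, lam=1.0):
    """One regularized continual-adaptation update.

    Descends the task loss (fitting the current partner's usage)
    while an L2 penalty lam * ||theta - theta0||^2 / 2 keeps the
    parameters close to the generic initialization theta0.
    """
    grad = grad_task(theta) + lam * (theta - theta0)
    return theta - lr * grad

# Toy demonstration: the partner-specific loss 0.5*(theta - 2)^2
# pulls theta toward 2, while the regularizer pulls it back toward
# theta0 = 0; the fixed point balances the two at theta = 1.
theta0 = np.array([0.0])
theta = theta0.copy()
for _ in range(200):
    theta = adapt_step(theta, theta0, grad_task=lambda t: t - 2.0)
```

With `lam=1.0` the update converges to the midpoint between the task optimum and the initialization, showing how the regularizer trades partner-specific accuracy against retention of generic knowledge.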

Author Information

Minae Kwon (Stanford University)