No man is an island. Humans develop the ability to communicate with a large community by coordinating with different interlocutors within short conversations. This ability is largely understudied in research on building neural communicative agents. We study the task of few-shot language coordination: agents quickly adapting to their conversational partners' language abilities. In contrast to current communicative agents trained with self-play, we investigate this more general paradigm by requiring the lead agent to coordinate with a population of agents, each of whom has different linguistic abilities. This leads to a general agent able to quickly adapt to communicating with unseen agents in the population. Unlike prior work, success here requires the ability to model the partner's beliefs, a vital component of human communication. Drawing inspiration from the study of theory of mind (ToM; Premack & Woodruff, 1978), we study the effect of the speaker explicitly modeling the listener's mental state. As shown in our experiments, by learning to communicate with a population, speakers acquire the ability to predict their partner's reactions to candidate messages on the fly. These predictions help the speaker generate the instructions that best achieve the communicative goal while accounting for message costs. To examine our hypothesis that instructions generated with ToM modeling yield better communicative performance, we employ our agents in both a referential game and a language navigation task. Positive results from our experiments also hint at the importance of explicitly modeling language acquisition as a socio-pragmatic process.
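To make the core idea concrete, here is a minimal, hypothetical sketch (not the paper's implementation) of a ToM-style speaker: it scores each candidate message by a predicted probability that the listener will act correctly, subtracts a message cost, and sends the highest-utility message. All names here (tom_select_message, listener_model, message_cost, cost_weight, the toy listener) are illustrative assumptions.

    from typing import Callable, List

    def tom_select_message(
        candidate_messages: List[str],
        listener_model: Callable[[str], float],  # predicted P(listener acts correctly | message)
        message_cost: Callable[[str], float],    # e.g., a length-based cost of sending the message
        cost_weight: float = 0.1,
    ) -> str:
        """Pick the message whose predicted communicative success,
        minus its weighted sending cost, is highest."""
        def utility(m: str) -> float:
            return listener_model(m) - cost_weight * message_cost(m)
        return max(candidate_messages, key=utility)

    # Toy usage: a dummy listener model that only succeeds on keyword-bearing messages.
    if __name__ == "__main__":
        def dummy_listener(msg: str) -> float:
            return 1.0 if "red" in msg else 0.2

        msgs = [
            "pick the block",
            "pick the red block",
            "pick the small red block on the far left",
        ]
        best = tom_select_message(
            msgs, dummy_listener, message_cost=lambda m: float(len(m.split()))
        )
        print(best)  # "pick the red block": high predicted success at modest cost

In the paper's setting, the listener model would itself be learned and updated within a few conversational turns, which is what allows fast adaptation to an unseen partner.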
Author Information
Hao Zhu (Carnegie Mellon University)
Graham Neubig (Carnegie Mellon University)
Yonatan Bisk (Carnegie Mellon University)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Poster: Few-shot Language Coordination by Modeling Theory of Mind »
  Fri. Jul 23rd, 04:00 – 06:00 AM, Room: Virtual
More from the Same Authors
- 2023: Adapting to Gradual Distribution Shifts with Continual Weight Averaging »
  Jared Fernandez · Saujas Vaduguru · Sanket Vaibhav Mehta · Yonatan Bisk · Emma Strubell
- 2023 Workshop: Workshop on Theory of Mind in Communicating Agents »
  Hao Zhu · Jennifer Hu · Hyunwoo Kim · Alane Suhr · Saujas Vaduguru · Chenghao Yang · Pei Zhou · Xuhui Zhou
- 2023 Oral: Cross-Modal Fine-Tuning: Align then Refine »
  Junhong Shen · Liam Li · Lucio Dery · Corey Staten · Mikhail Khodak · Graham Neubig · Ameet Talwalkar
- 2023 Poster: Cross-Modal Fine-Tuning: Align then Refine »
  Junhong Shen · Liam Li · Lucio Dery · Corey Staten · Mikhail Khodak · Graham Neubig · Ameet Talwalkar
- 2023 Poster: PAL: Program-aided Language Models »
  Luyu Gao · Aman Madaan · Shuyan Zhou · Uri Alon · Pengfei Liu · Yiming Yang · Jamie Callan · Graham Neubig
- 2023 Poster: Why do Nearest Neighbor Language Models Work? »
  Frank Xu · Uri Alon · Graham Neubig
- 2022 Poster: Neuro-Symbolic Language Modeling with Automaton-augmented Retrieval »
  Uri Alon · Frank Xu · Junxian He · Sudipta Sengupta · Dan Roth · Graham Neubig
- 2022 Poster: A Framework for Learning to Request Rich and Contextually Useful Information from Humans »
  Khanh Nguyen · Yonatan Bisk · Hal Daumé III
- 2022 Spotlight: A Framework for Learning to Request Rich and Contextually Useful Information from Humans »
  Khanh Nguyen · Yonatan Bisk · Hal Daumé III
- 2022 Spotlight: Neuro-Symbolic Language Modeling with Automaton-augmented Retrieval »
  Uri Alon · Frank Xu · Junxian He · Sudipta Sengupta · Dan Roth · Graham Neubig
- 2022 Poster: Symmetric Machine Theory of Mind »
  Melanie Sclar · Graham Neubig · Yonatan Bisk
- 2022 Spotlight: Symmetric Machine Theory of Mind »
  Melanie Sclar · Graham Neubig · Yonatan Bisk
- 2021 Poster: Examining and Combating Spurious Features under Distribution Shift »
  Chunting Zhou · Xuezhe Ma · Paul Michel · Graham Neubig
- 2021 Spotlight: Examining and Combating Spurious Features under Distribution Shift »
  Chunting Zhou · Xuezhe Ma · Paul Michel · Graham Neubig
- 2020 Poster: Optimizing Data Usage via Differentiable Rewards »
  Xinyi Wang · Hieu Pham · Paul Michel · Antonios Anastasopoulos · Jaime Carbonell · Graham Neubig
- 2020 Poster: XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalisation »
  Junjie Hu · Sebastian Ruder · Aditya Siddhant · Graham Neubig · Orhan Firat · Melvin Johnson