

Keynote Talk in Workshop: Neural Conversational AI Workshop - What’s left to TEACH (Trustworthy, Enhanced, Adaptable, Capable and Human-centric) chatbots?

Invited Talk: LLMs with long-term memory and better factuality by Zhou Yu


Abstract:

Seamlessly communicating with machines has always been the ultimate goal of artificial intelligence. This talk addresses two key milestones toward general intelligence: how to effectively track an infinitely long history, and how to improve the factuality of generated content. Specifically, we will discuss a stateful transformer architecture that can perform effective memory reads and writes. We will also address how to use reinforcement learning with human feedback to improve the faithfulness of generated content.
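To illustrate the general idea of a transformer augmented with readable and writable memory, the sketch below shows one possible design: tokens attend over a set of memory slots (read), and the slots are then updated with a learned gate (write) so state can be carried across segments of an arbitrarily long conversation. This is a minimal assumption-laden sketch, not the speaker's actual architecture; all class and parameter names here are hypothetical.

```python
# Minimal sketch (NOT the speaker's architecture) of a transformer block with an
# external memory supporting attention-based reads and gated writes.
import torch
import torch.nn as nn

class MemoryAugmentedBlock(nn.Module):
    def __init__(self, d_model=256, n_heads=4, n_slots=64):
        super().__init__()
        # Learnable initial memory slots, updated as segments are processed.
        self.init_memory = nn.Parameter(torch.randn(n_slots, d_model) * 0.02)
        self.read_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.write_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.write_gate = nn.Linear(2 * d_model, d_model)

    def forward(self, x, memory=None):
        # x: (batch, seq_len, d_model); memory: (batch, n_slots, d_model)
        if memory is None:
            memory = self.init_memory.unsqueeze(0).expand(x.size(0), -1, -1)
        # Read: tokens attend over memory slots to pull in long-range context.
        read, _ = self.read_attn(query=x, key=memory, value=memory)
        x = x + read
        # Write: slots attend over the new tokens; a sigmoid gate decides how
        # much of each slot to overwrite, so older state can persist.
        update, _ = self.write_attn(query=memory, key=x, value=x)
        gate = torch.sigmoid(self.write_gate(torch.cat([memory, update], dim=-1)))
        memory = gate * update + (1 - gate) * memory
        return x, memory

# Usage: carry `memory` across successive segments of a long dialogue.
block = MemoryAugmentedBlock()
memory = None
for segment in torch.randn(3, 2, 16, 256):  # 3 segments, batch of 2
    hidden, memory = block(segment, memory)
```

The gated update is one common way to keep memory stable over many segments; other designs (e.g., slot replacement or recurrent compression) would serve the same purpose.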
