
Structure-informed Language Models Are Protein Designers
Zaixiang Zheng · Yifan Deng · Dongyu Xue · Yi Zhou · Fei YE · Quanquan Gu

Tue Jul 25 05:00 PM -- 06:30 PM (PDT) @ Exhibit Hall 1 #329
Event URL: https://github.com/BytedProtein/ByProt

This paper demonstrates that language models are strong structure-based protein designers. We present LM-Design, a generic approach to reprogramming sequence-based protein language models (pLMs), which have learned massive sequential evolutionary knowledge from the universe of natural protein sequences, to acquire an immediate capability to design preferable protein sequences for given folds. We conduct a structural surgery on pLMs, in which a lightweight structural adapter is implanted into a pLM and endows it with structural awareness. During inference, iterative refinement is performed to effectively optimize the generated protein sequences. Experiments show that LM-Design improves the state-of-the-art results by a large margin, leading to 4% to 12% accuracy gains in sequence recovery (e.g., 55.65%/56.63% on CATH 4.2/4.3 single-chain benchmarks, and >60% when designing protein complexes). We provide extensive and in-depth analyses, which verify that LM-Design can (1) indeed leverage both structural and sequential knowledge to accurately handle structurally non-deterministic regions, (2) benefit from scaling data and model size, and (3) generalize to other proteins (e.g., antibodies and de novo proteins).
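The iterative refinement described above can be illustrated with a minimal toy sketch: start from a fully masked sequence, let a structure-conditioned predictor fill in tokens with confidences, then repeatedly re-mask the least-confident positions and re-predict. Note that `toy_plm`, `refine`, and all parameter names below are illustrative assumptions, not the actual LM-Design implementation (which uses a pLM with a structural adapter; see the repository above).

```python
# Toy sketch of LM-Design-style iterative refinement decoding.
# `toy_plm` is a stand-in for a structure-conditioned protein language model:
# it returns a (token, confidence) prediction per position. All names here
# are hypothetical, for illustration only.
import random

def toy_plm(seq, target):
    """Stand-in predictor: 'predicts' the target sequence with random
    confidences, just to make the refinement loop runnable."""
    return [(target[i], random.random()) for i in range(len(seq))]

def refine(target, n_iters=5, remask_frac=0.3):
    """Iterative refinement: begin fully masked, then repeatedly
    re-mask the least-confident positions and re-predict them."""
    seq = ["<mask>"] * len(target)
    for _ in range(n_iters):
        preds = toy_plm(seq, target)
        seq = [tok for tok, _ in preds]          # accept current predictions
        k = int(remask_frac * len(seq))
        worst = sorted(range(len(seq)), key=lambda i: preds[i][1])[:k]
        for i in worst:                          # re-mask low-confidence sites
            seq[i] = "<mask>"
    seq = [tok for tok, _ in toy_plm(seq, target)]  # final pass fills masks
    return "".join(seq)

random.seed(0)
designed = refine("MKTAYIAKQR")
print(designed)
```

In the real method the re-prediction step is conditioned on the target backbone structure through the implanted adapter, so each pass can revise residues that conflict with the fold; the toy predictor here only mimics the control flow of that loop.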

Author Information

Zaixiang Zheng (ByteDance Research)
Yifan Deng (University of Wisconsin-Madison)
Dongyu Xue (ByteDance AI Lab)
Yi Zhou (ByteDance AI Lab)
Fei YE (ByteDance AI Lab)
Quanquan Gu (University of California, Los Angeles)
