

Poster
in
Workshop: AI for Science: Scaling in AI for Scientific Discovery

Efficient Evolutionary Search over Chemical Space with Large Language Models

Haorui Wang · Marta Skreta · Yuanqi Du · Wenhao Gao · Lingkai Kong · Cher-Tian Ser · Felix Strieth-Kalthoff · Chenru Duan · Yuchen Zhuang · Yue Yu · Yanqiao Zhu · Alan Aspuru-Guzik · Kirill Neklyudov · Chao Zhang

Keywords: [ Large Language Models ] [ Molecule Optimization ] [ Evolutionary Search ] [ AI for Science ] [ Molecular Generation ]


Abstract:

Molecular discovery, when formulated as an optimization problem, presents significant computational challenges because the optimization objectives can be non-differentiable. Evolutionary Algorithms (EAs), often used to optimize black-box objectives in molecular discovery, traverse chemical space by performing random mutations and crossovers, leading to a large number of expensive objective evaluations. In this work, we ameliorate this shortcoming by incorporating chemistry-aware Large Language Models (LLMs) into EAs. Specifically, we use both commercial and open-source LLMs trained on large corpora of chemical information as crossover and mutation operators in EAs. We perform an extensive empirical study on multiple tasks involving property optimization and molecular similarity, demonstrating that the joint use of LLMs with EAs outperforms all baseline models across single- and multi-objective settings. We show that our algorithm improves both the quality of the final solution and the convergence speed, thereby reducing the number of required objective evaluations.
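The core idea of the abstract (replacing an EA's random crossover and mutation with LLM-proposed edits) can be illustrated with a minimal sketch. Everything below is hypothetical: the toy string objective, the `llm_crossover` and `llm_mutate` stubs (which here fall back to random edits where the paper would prompt a chemistry-aware LLM), and the truncation-selection loop are illustrative assumptions, not the authors' implementation.

```python
import random

random.seed(0)

# Toy black-box objective: per-position similarity of a candidate string to a
# hypothetical target "molecule". Stands in for an expensive property evaluator.
TARGET = "CCO"
ALPHABET = "CNO"

def objective(candidate: str) -> float:
    # Fraction of positions matching the target; score in [0, 1].
    matches = sum(a == b for a, b in zip(candidate, TARGET))
    return matches / max(len(TARGET), len(candidate))

def llm_crossover(parent_a: str, parent_b: str) -> str:
    # Hypothetical stand-in for an LLM crossover operator: a random one-point
    # crossover. In the paper's setting, an LLM would be prompted with both
    # parents and asked for a chemically plausible recombination.
    point = random.randint(0, min(len(parent_a), len(parent_b)))
    return parent_a[:point] + parent_b[point:]

def llm_mutate(candidate: str) -> str:
    # Hypothetical stand-in for an LLM mutation operator: a random
    # single-character edit instead of a prompted, chemistry-aware change.
    i = random.randrange(len(candidate))
    return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

def evolve(pop_size: int = 8, generations: int = 20) -> str:
    # Standard EA loop: score, select top half, refill with LLM-style children.
    population = ["".join(random.choice(ALPHABET) for _ in range(3))
                  for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(population, key=objective, reverse=True)
        parents = scored[: pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append(llm_mutate(llm_crossover(a, b)))
        population = parents + children
    return max(population, key=objective)

best = evolve()
print(best, objective(best))
```

The claimed efficiency gain corresponds to the child-generation step: informed LLM proposals should need fewer calls to `objective` than random edits to reach the same score, since each proposal is biased toward chemically sensible candidates.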
