A well-known limitation of existing molecular generative models is that the generated molecules highly resemble those in the training set. To generate truly novel molecules that may have even better properties for de novo drug discovery, more powerful exploration of the chemical space is necessary. To this end, we propose Molecular Out-Of-distribution Diffusion (MOOD), a score-based diffusion scheme that incorporates out-of-distribution (OOD) control into the generative stochastic differential equation (SDE) through the simple control of a hyperparameter, and thus requires no additional cost. Since some novel molecules may not meet the basic requirements of real-world drugs, MOOD performs conditional generation by utilizing the gradients of a property predictor that guides the reverse-time diffusion process toward high-scoring regions of target properties such as protein-ligand interactions, drug-likeness, and synthesizability. This allows MOOD to search for novel and meaningful molecules rather than generating unseen yet trivial ones. We experimentally validate that MOOD is able to explore the chemical space beyond the training distribution, generating molecules that outscore those found by existing methods, and even the top 0.01% of the original training pool. Our code is available at https://github.com/SeulLee05/MOOD.
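For intuition only, below is a minimal, self-contained sketch of the two ingredients the abstract describes: reverse-time SDE sampling guided by the gradient of a property predictor, and a single hyperparameter that pushes sampling out of distribution. It is not the authors' implementation (see the repository above); `score_model`, `property_predictor`, `beta`, `lambda_ood`, and `guidance_scale` are illustrative placeholders, and the OOD knob here simply inflates the prior noise rather than reproducing MOOD's exact formulation.

```python
import torch

# Hypothetical stand-ins for the trained networks. In practice these would be the
# trained score network s_theta(x, t) and a differentiable property predictor.
score_model = lambda x, t: -x                       # placeholder score estimate
property_predictor = lambda x: -(x ** 2).sum(-1)    # placeholder property score


def beta(t, beta_min=0.1, beta_max=20.0):
    """Linear noise schedule of a VP-type SDE (an assumed, common choice)."""
    return beta_min + t * (beta_max - beta_min)


def sample(num_steps=1000, dim=16, lambda_ood=0.04, guidance_scale=1.0):
    """Euler-Maruyama reverse-time sampling with property-gradient guidance.

    lambda_ood:     rough stand-in for an OOD-control knob; here it merely inflates
                    the prior noise so sampling starts farther from the data modes.
    guidance_scale: weight on the property-predictor gradient (classifier guidance).
    """
    x = torch.randn(1, dim) * (1.0 + lambda_ood)   # (inflated) Gaussian prior sample
    dt = -1.0 / num_steps                          # negative: integrating backward in time
    for i in range(num_steps, 0, -1):
        t = torch.tensor(i / num_steps)
        x = x.detach().requires_grad_(True)
        # Gradient of the property predictor steers sampling toward high-scoring
        # regions (e.g. protein-ligand affinity, drug-likeness, synthesizability).
        guidance = torch.autograd.grad(property_predictor(x).sum(), x)[0]
        with torch.no_grad():
            b = beta(t)
            score = score_model(x, t) + guidance_scale * guidance
            drift = -0.5 * b * x - b * score       # reverse-time VP-SDE drift
            x = x + drift * dt + torch.sqrt(b) * ((-dt) ** 0.5) * torch.randn_like(x)
    return x.detach()


print(sample().shape)  # torch.Size([1, 16])
```

In this toy setup the guidance term plays the role of the property predictor's gradient in the abstract, while the actual method applies OOD control inside the generative SDE and operates on molecular graphs rather than flat vectors.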
Author Information
Seul Lee (KAIST)
Jaehyeong Jo (KAIST)
Sung Ju Hwang (KAIST)
More from the Same Authors
- 2021: Entropy Weighted Adversarial Training (Minseon Kim · Jihoon Tack · Jinwoo Shin · Sung Ju Hwang)
- 2021: Consistency Regularization for Adversarial Robustness (Jihoon Tack · Sihyun Yu · Jongheon Jeong · Minseon Kim · Sung Ju Hwang · Jinwoo Shin)
- 2023: Generalizable Lightweight Proxy for Robust NAS against Diverse Perturbations (Hyeonjeong Ha · Minseon Kim · Sung Ju Hwang)
- 2023 Poster: Personalized Subgraph Federated Learning (Jinheon Baek · Wonyong Jeong · Jiongdao Jin · Jaehong Yoon · Sung Ju Hwang)
- 2023 Poster: Continual Learners are Incremental Model Generalizers (Jaehong Yoon · Sung Ju Hwang · Yue Cao)
- 2023 Poster: Scalable Set Encoding with Universal Mini-Batch Consistency and Unbiased Full Set Gradient Approximation (Jeffrey Willette · Seanie Lee · Bruno Andreis · Kenji Kawaguchi · Juho Lee · Sung Ju Hwang)
- 2023 Poster: Margin-based Neural Network Watermarking (Byungjoo Kim · Suyoung Lee · Seanie Lee · Son · Sung Ju Hwang)
- 2022 Poster: Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations (Jaehyeong Jo · Seul Lee · Sung Ju Hwang)
- 2022 Spotlight: Score-based Generative Modeling of Graphs via the System of Stochastic Differential Equations (Jaehyeong Jo · Seul Lee · Sung Ju Hwang)