Poster in Workshop: AI for Science: Scaling in AI for Scientific Discovery
Physics-Informed Weakly Supervised Learning for Interatomic Potentials
Makoto Takamoto · Viktor Zaverkin · Mathias Niepert
Keywords: [ weakly supervised method ] [ physics informed method ] [ Molecule ] [ machine learning interatomic potential ]
Machine learning plays an increasingly important role in computational chemistry and materials science, complementing computationally intensive ab initio and first-principles methods. Despite their utility, machine-learning models often lack generalization capability and robustness during atomistic simulations, yielding unphysical energy and force predictions that hinder their real-world applications. We address this challenge by introducing a physics-informed, weakly supervised approach for training machine-learned interatomic potentials (MLIPs). We propose two novel loss functions: one based on the concept of conservative forces and one that extrapolates the total energy via a Taylor expansion. Our approach improves the accuracy of MLIPs on learning tasks with small training data sets and reduces the need for pre-training computationally demanding models. In particular, we perform extensive experiments demonstrating reduced energy and force errors, often lower by a factor of two, for various baseline models and benchmark data sets. We also improve the accuracy of energy and force predictions compared to previous methods that employ data augmentation via a Taylor expansion. Finally, we show that our approach facilitates the training of MLIPs in settings where computing reference forces is infeasible, for example when the reference level employs complete-basis-set extrapolation.
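To make the two loss terms concrete, the sketch below shows one way such physics-informed, weakly supervised terms could be written in PyTorch. It is a minimal illustration under stated assumptions, not the authors' implementation: the names ToyPotential, weak_losses, and delta are hypothetical; the conservative-force term is assumed to penalize the mismatch between a directly predicted force and the negative gradient of the predicted energy; and the Taylor term is assumed to compare the energy of a slightly displaced geometry against a first-order expansion around the original one. Neither term requires reference labels, which is what makes the supervision "weak".

```python
import torch

# Hypothetical toy model (not the paper's architecture): an energy head and a
# direct force head over flattened Cartesian coordinates, used only to
# illustrate the two weakly supervised consistency terms.
class ToyPotential(torch.nn.Module):
    def __init__(self, n_atoms: int):
        super().__init__()
        self.energy_head = torch.nn.Sequential(
            torch.nn.Linear(3 * n_atoms, 64), torch.nn.SiLU(), torch.nn.Linear(64, 1)
        )
        self.force_head = torch.nn.Sequential(
            torch.nn.Linear(3 * n_atoms, 64), torch.nn.SiLU(),
            torch.nn.Linear(64, 3 * n_atoms)
        )

    def forward(self, positions: torch.Tensor):
        flat = positions.reshape(1, -1)
        energy = self.energy_head(flat).squeeze()
        forces = self.force_head(flat).reshape(positions.shape)
        return energy, forces


def weak_losses(model, positions, delta=0.01):
    """Two label-free consistency terms, sketched under the stated assumptions."""
    positions = positions.clone().requires_grad_(True)
    energy, forces_direct = model(positions)

    # (1) Conservative-force consistency: directly predicted forces should
    # agree with the negative gradient of the predicted energy.
    (grad,) = torch.autograd.grad(energy, positions, create_graph=True)
    loss_conservative = (forces_direct + grad).pow(2).mean()

    # (2) Taylor-expansion consistency: the predicted energy at a slightly
    # displaced geometry should match the first-order expansion
    # E(r + d) ~= E(r) - F(r) . d.
    displacement = delta * torch.randn_like(positions)
    energy_displaced, _ = model(positions + displacement)
    taylor_target = energy - (forces_direct * displacement).sum()
    loss_taylor = (energy_displaced - taylor_target).pow(2)

    return loss_conservative, loss_taylor


if __name__ == "__main__":
    torch.manual_seed(0)
    model = ToyPotential(n_atoms=5)
    pos = torch.randn(5, 3)
    l_cons, l_taylor = weak_losses(model, pos)
    print(float(l_cons), float(l_taylor))
```

In practice, terms of this kind would be added with small weights to the usual supervised energy and force losses; because they only relate the model's own predictions to each other, they can be evaluated on configurations for which no reference forces (or no reference labels at all) are available.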