Derivative Informed Learning of Exchange-Correlation Functionals
Eike S. Eberhard ⋅ Luca Anthony Thiede ⋅ Abdulrahman Aldossary ⋅ Andreas Burger ⋅ Nicholas Gao ⋅ Vignesh Bhethanabotla ⋅ Alán Aspuru-Guzik ⋅ Stephan Günnemann
Abstract
Machine-learned (ML) exchange-correlation (XC) functionals promise improved accuracy, but without proper regularization they overfit to training energies and basis sets. We introduce the Derivative Informed XC-Loss (DI-Loss), a loss that regularizes ML-XC training by supervising energy gradients on the Grassmannian of density matrices. Crucially, rather than merely matching the self-consistent-field (SCF) fixed point, DI-Loss forces the dynamics of the SCF process to align with those of the target functional. Across all evaluated architectures, this improves basis-set generalization and electron densities. Distilling hybrid ($\mathcal{O}(N^4)$-scaling) functionals into $\mathcal{O}(N^3)$-scaling ML-XC functionals, we observe a $>60\%$ reduction in energy MAE compared to energy and density supervision alone, while simultaneously reducing the density-dipole error by 65\%. We show that initializing hybrid SCF calculations from these distilled functionals can reduce the number of SCF iterations by up to 55\%. Furthermore, DI-Loss improves TDDFT excited-state predictions by approximately 30\%.
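For concreteness, the sketch below illustrates the idea of gradient supervision on the Grassmannian in JAX. It is a minimal, hypothetical example, not the paper's implementation: the callables `ml_xc` and `target_xc` (mapping a density matrix to an XC energy), the weight `lam`, and the function names are all assumptions introduced here for illustration.

```python
import jax
import jax.numpy as jnp


def grassmann_project(P, G):
    """Project a Euclidean gradient G onto the tangent space of the
    Grassmannian at a symmetric, idempotent density matrix P (P @ P == P).
    Tangent directions mix only occupied and virtual subspaces."""
    I = jnp.eye(P.shape[0])
    return P @ G @ (I - P) + (I - P) @ G @ P


def di_loss(params, P, ml_xc, target_xc, lam=1.0):
    """Hypothetical derivative-informed loss: energy error plus a
    gradient-matching term restricted to Grassmannian tangent directions."""
    # Energies and gradients w.r.t. the density matrix via autodiff.
    e_ml, g_ml = jax.value_and_grad(lambda Q: ml_xc(params, Q))(P)
    e_tgt, g_tgt = jax.value_and_grad(target_xc)(P)
    # Compare gradients only along admissible (idempotency-preserving)
    # directions; components normal to the manifold are irrelevant to SCF.
    dg = grassmann_project(P, g_ml - g_tgt)
    return (e_ml - e_tgt) ** 2 + lam * jnp.sum(dg ** 2)
```

Because SCF updates move the density matrix along exactly these tangent directions, matching the projected gradients constrains the dynamics of the SCF iteration rather than only the converged energy, which is the intuition behind supervising derivatives in addition to energies and densities.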