Equivariant Covariance Tensors: Guaranteed SPD Uncertainty for Tensor-Valued Geometric Learning
Ruihan Liu ⋅ Yu Ji ⋅ Jianbo Yu ⋅ Shifu Yan ⋅ Qingchao Jiang
Abstract
Tensor-valued prediction is fundamental to geometric deep learning, yet uncertainty quantification (UQ) for such outputs remains an open challenge. While E(3)-equivariant neural networks excel at point estimates, they lack rigorous confidence measures. We introduce a general framework for E(3)-equivariant UQ that models the full predictive distribution, with both the mean and the covariance preserving rotational symmetry. Our approach decomposes the covariance into irreducible representations, $\mathrm{Sym}^2(\rho_c) \cong 2\times(l=0) \oplus 2\times(l=2) \oplus 1\times(l=4)$. By mapping from the flat Lie algebra of symmetric matrices, $\mathfrak{sym}(6)$, to the curved SPD manifold via the matrix exponential, we guarantee positive-definite covariances while maintaining exact equivariance. Furthermore, we formulate a Log-Euclidean Equivariant Scoring Objective (LE-ESO), a robust surrogate loss based on the multivariate Laplace distribution that provides robustness to heavy-tailed errors and guaranteed training stability. Extensive validation on ModelNet40 (inertia tensors) and large-scale materials-science benchmarks (dielectric tensors) demonstrates that our method achieves competitive accuracy while providing physically consistent, symmetry-preserving uncertainty estimates and reliable out-of-distribution (OOD) detection.
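The abstract's central guarantee rests on a standard property of the matrix exponential: exponentiating any symmetric matrix yields an SPD matrix, and the map commutes with orthogonal conjugation, so equivariance is preserved exactly. A minimal NumPy sketch of this idea (not the authors' implementation; `spd_from_symmetric` is a hypothetical helper name):

```python
import numpy as np

def spd_from_symmetric(S: np.ndarray) -> np.ndarray:
    """Map a symmetric matrix to the SPD manifold via the matrix exponential."""
    # Eigendecompose the symmetric input and exponentiate its spectrum;
    # the result is symmetric with strictly positive eigenvalues, hence SPD.
    w, V = np.linalg.eigh(S)
    return (V * np.exp(w)) @ V.T

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
S = 0.5 * (A + A.T)              # arbitrary symmetric 6x6 "logarithm"
Sigma = spd_from_symmetric(S)

# SPD check: all eigenvalues of the output are strictly positive.
assert np.all(np.linalg.eigvalsh(Sigma) > 0)

# Equivariance check: exp(Q S Q^T) = Q exp(S) Q^T for any orthogonal Q,
# so a rotation acting on the predicted symmetric matrix transforms the
# covariance covariantly.
Q, _ = np.linalg.qr(rng.standard_normal((6, 6)))
assert np.allclose(spd_from_symmetric(Q @ S @ Q.T), Q @ Sigma @ Q.T)
```

In a network this map would be applied to a predicted symmetric 6×6 output head (matching the 21 free parameters of $\mathrm{Sym}^2(\rho_c)$), so positive-definiteness holds by construction rather than by penalty or projection.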