Conservation Laws for Modern Neural Architectures
Viet Hoang Tran ⋅ Vinh Khanh Bui ⋅ Ngoc Tan Lai ⋅ Nam Nguyen ⋅ Tuan Dam ⋅ Tan Nguyen
Abstract
Understanding gradient descent dynamics is key to explaining the success of over-parameterized models, where implicit bias manifests through conservation laws in gradient flow. While such laws are well understood for linear and ReLU networks, they remain largely unexplored for modern architectures. This work develops a unified framework to characterize conservation laws for contemporary models, including feedforward networks with GELU, SiLU, and SwiGLU activations; multi-head attention with sinusoidal and rotary positional encodings; and Mixture-of-Experts architectures under diverse gating designs. Our theoretical findings are supported by experiments that validate the predicted invariants.
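To make the notion of a conservation law concrete, here is a minimal sketch (not from the paper) of the classic invariant for two-layer ReLU networks: under gradient flow, each hidden unit j conserves ||w_j||^2 - a_j^2, where w_j are its incoming weights and a_j its outgoing weight. The network size, data, and step size below are arbitrary illustrative choices; small-step gradient descent only approximates the flow, so the invariant drifts at the order of the step size.

```python
# Sketch: numerically check the per-neuron invariant ||w_j||^2 - a_j^2
# for a two-layer ReLU network trained by small-step gradient descent
# (an Euler approximation of gradient flow). All sizes are illustrative.
import torch

torch.manual_seed(0)
n, d, h = 64, 5, 8                          # samples, input dim, hidden width
X = torch.randn(n, d)
y = torch.randn(n, 1)

W = torch.randn(h, d, requires_grad=True)   # input-to-hidden weights
a = torch.randn(h, 1, requires_grad=True)   # hidden-to-output weights

def invariant():
    # Per-neuron conserved quantity: ||w_j||^2 - a_j^2
    return (W.detach() ** 2).sum(dim=1) - (a.detach() ** 2).squeeze()

before = invariant()
lr = 1e-4                                   # small step to mimic gradient flow
for _ in range(2000):
    loss = ((torch.relu(X @ W.T) @ a - y) ** 2).mean()
    loss.backward()
    with torch.no_grad():
        W -= lr * W.grad
        a -= lr * a.grad
        W.grad.zero_()
        a.grad.zero_()

after = invariant()
print("max drift of ||w_j||^2 - a_j^2:", (after - before).abs().max().item())
```

The invariant follows from the 1-homogeneity of ReLU: for each hidden unit, the inner product of the loss gradient with w_j equals the product of the loss gradient with a_j, so the two squared norms change at the same rate. The paper's contribution, per the abstract, is extending such laws to non-homogeneous activations (GELU, SiLU, SwiGLU), attention, and Mixture-of-Experts, which this sketch does not cover.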