Balancing Fidelity and Diversity in Diffusion Models via Symmetric Attention Decomposition: Hopfield Perspective
Hyunmin Cho ⋅ Woo Kyoung Han ⋅ Kyong Hwan Jin
Abstract
We characterize the pre-softmax attention matrix $\mathbf{QK^\top}$ in transformers as an associative memory matrix encoding pairwise associations between input features. By decomposing this matrix into its symmetric and skew-symmetric parts, we interpret the symmetric component as governing the structure of the *energy landscape*, and the skew-symmetric component as driving *circulation* on that landscape. Leveraging the energy formulation induced by the symmetric component, we derive Hopfield-style stability measures that quantify the stability of retrieved features. Empirically, we observe meaningful correlations between these stability measures and the fidelity–diversity trade-off in generation. Finally, we propose a controllable knob to modulate this trade-off by directly modifying the circulation of the underlying dynamics.
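The decomposition at the heart of the abstract can be sketched concretely. The following is a minimal illustration (not the paper's code; shapes, seeds, and variable names are assumptions): any square matrix $\mathbf{A} = \mathbf{QK^\top}$ splits uniquely as $\mathbf{A} = \tfrac{1}{2}(\mathbf{A} + \mathbf{A}^\top) + \tfrac{1}{2}(\mathbf{A} - \mathbf{A}^\top)$, with the first term symmetric and the second skew-symmetric.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's implementation):
# decompose the pre-softmax attention logits A = Q K^T into a
# symmetric part (interpreted as shaping the energy landscape) and
# a skew-symmetric part (interpreted as driving circulation).
rng = np.random.default_rng(0)
n, d = 6, 4                      # assumed sequence length and head dimension
Q = rng.standard_normal((n, d))
K = rng.standard_normal((n, d))

A = Q @ K.T                      # pre-softmax attention logits
A_sym = 0.5 * (A + A.T)         # symmetric part
A_skew = 0.5 * (A - A.T)        # skew-symmetric part

# The decomposition is exact and unique: A = A_sym + A_skew.
assert np.allclose(A, A_sym + A_skew)
assert np.allclose(A_sym, A_sym.T)     # symmetric
assert np.allclose(A_skew, -A_skew.T)  # skew-symmetric
```

The same split applies per attention head; the abstract's proposed "knob" amounts to rescaling the skew-symmetric term while leaving the symmetric (energy) term intact.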