Biologically plausible heavy-tailed connectivity enhances generalization on cognitive tasks in recurrent neural networks
Abstract
While heavy-tailed synaptic weight distributions are pervasive in biological neural networks, their computational role---particularly in relation to generalization---remains poorly understood. To address this, we develop a novel optimal-transport-based optimization algorithm that incorporates key biological constraints, including Dale’s principle and heavy-tailed synaptic statistics, to train recurrent neural networks (RNNs) on a wide range of cognitive tasks. We show that these biologically constrained, heavy-tailed RNNs exhibit substantially improved generalization, which we further characterize within a PAC-Bayes framework. Our theoretical analysis and numerical experiments reveal two complementary mechanisms underlying this generalization enhancement. Topologically, heavy-tailed connectivity induces an effectively low-rank structure, which in turn yields low-dimensional neural dynamics. Geometrically, heavy-tailed connectivity intrinsically shapes task-variable representations to lie near a linear manifold, thereby improving generalization under a linear readout strategy. Together, these results identify heavy-tailed connectivity as a biologically grounded intrinsic mechanism that promotes low-rank structure and favorable representational geometry, leading to improved generalization on flexible cognitive tasks.