Kronecker Generative Networks: A General Neural Architecture for Parameter-Efficient Learning Across Classification Tasks
Abstract
Modern neural networks derive much of their effectiveness from rich connectivity patterns. Existing architectures, however, typically fix their topology at one of two extremes, sparse or dense, which limits structural flexibility and hinders systematic analysis. We propose Kronecker Generative Networks (KGNs), an algebraic framework that constructs neural network topologies via recursive generation rules, treating topology as a first-class design object. KGNs generate families of directed acyclic graphs with controllable connectivity complexity, enabling systematic interpolation between sparse and dense aggregation regimes. Under this formulation, architectures such as FractalNet and DenseNet emerge as specific instantiations of particular generation rules. We provide theoretical analyses of acyclicity, connectivity scaling, and expressiveness, and demonstrate experimentally that KGN instantiations achieve favorable accuracy-efficiency trade-offs across multiple domains.
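To make the generation mechanism concrete, the sketch below shows one plausible reading of a recursive generation rule, assuming it takes the form of a Kronecker power of a small initiator adjacency matrix (the initiator values and the `kronecker_power` helper are illustrative assumptions, not the paper's exact rule). An upper-triangular initiator keeps every Kronecker power upper triangular, so the generated graph is acyclic by construction.

```python
import numpy as np

def kronecker_power(initiator: np.ndarray, k: int) -> np.ndarray:
    """k-th Kronecker power of a small initiator adjacency matrix."""
    A = initiator.copy()
    for _ in range(k - 1):
        A = np.kron(A, initiator)
    return A

# Illustrative 2x2 upper-triangular initiator (an assumption, not the
# paper's rule); denser initiators yield denser DenseNet-like
# aggregation, sparser ones yield sparser fractal-like patterns.
initiator = np.array([[1, 1],
                      [0, 1]])

A = kronecker_power(initiator, 3)  # 8x8 connectivity pattern
np.fill_diagonal(A, 0)             # remove self-loops
# Upper-triangular adjacency => edges only run from earlier to later
# nodes, so the generated topology is a DAG.
assert (A == np.triu(A)).all()
print(A.sum(), "edges among", A.shape[0], "nodes")
```

In this reading, varying the initiator's density is what interpolates between sparse, fractal-like connectivity and dense aggregation, consistent with the interpolation claim in the abstract.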