Poster in Workshop: Next Generation of Sequence Modeling Architectures
Reparameterized Multi-Resolution Convolutions for Long Sequence Modelling
Jake Cunningham · Giorgio Giannone · Mingtian Zhang · Marc Deisenroth
Abstract:
Global convolutions have shown increasing promise as powerful general-purpose sequence models. However, training long convolutions is challenging, and kernel parameterizations must be able to learn long-range dependencies without overfitting. This work introduces reparameterized multi-resolution convolutions ($\texttt{MRConv}$), a novel approach to parameterizing global convolutional kernels for long-sequence modelling. By leveraging multi-resolution convolutions, incorporating structural reparameterization and introducing learnable kernel decay, $\texttt{MRConv}$ learns expressive long-range kernels that perform well across various data modalities. Our experiments demonstrate state-of-the-art performance on Long Range Arena and we report improved performance on ImageNet classification by replacing 2D convolutions with 1D $\texttt{MRConv}$ layers.
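To make the abstract's core ideas concrete, the sketch below illustrates one plausible reading of a multi-resolution global kernel with learnable decay, applied via FFT convolution. All function names, the dilation-by-powers-of-two scheme, and the exponential decay form are illustrative assumptions for exposition; the paper's exact reparameterization may differ.

```python
import numpy as np

def mrconv_kernel(sub_kernels, decays, length):
    """Assemble a global kernel from multi-resolution components (illustrative).

    Sub-kernel i is spread out by zero-insertion with dilation 2**i, so later
    components cover longer spans with the same parameter count, and each is
    modulated by a learnable exponential decay exp(-alpha * t) that suppresses
    far-past weights. This is a sketch, not the paper's exact parameterization.
    """
    t = np.arange(length)
    kernel = np.zeros(length)
    for i, (k, alpha) in enumerate(zip(sub_kernels, decays)):
        dilation = 2 ** i
        idx = np.arange(len(k)) * dilation
        idx = idx[idx < length]
        up = np.zeros(length)
        up[idx] = k[: len(idx)]          # zero-inserted (dilated) sub-kernel
        kernel += up * np.exp(-alpha * t)  # learnable decay over lag t
    return kernel

def causal_conv_fft(x, kernel):
    """Causal global convolution in O(L log L) via zero-padded FFTs."""
    L = len(x)
    n = 2 * L  # pad so linear convolution does not wrap around circularly
    y = np.fft.irfft(np.fft.rfft(x, n) * np.fft.rfft(kernel, n), n)
    return y[:L]
```

With a single length-1 sub-kernel equal to a unit impulse and zero decay, the assembled kernel is an identity filter, so `causal_conv_fft` returns the input unchanged; larger decay values shrink the effective kernel support, which is one way to trade off long-range context against overfitting.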