

Poster
in
Workshop: Neural Compression: From Information Theory to Applications

MLIC$^{++}$: Linear Complexity Multi-Reference Entropy Modeling for Learned Image Compression

Wei Jiang · Ronggang Wang


Abstract: Recently, multi-reference entropy models have been proposed, which capture channel-wise, local spatial, and global spatial correlations. Previous works adopt attention to capture global correlations; however, its quadratic complexity limits its potential for high-resolution image coding. In this paper, we propose linear-complexity global correlation capturing via a decomposition of the softmax operation. Based on this, we propose MLIC$^{++}$, a learned image compression model with linear-complexity multi-reference entropy modeling. MLIC$^{++}$ is more efficient, and it reduces BD-rate by $12.44\%$ on the Kodak dataset compared to VTM-17.0 when measured in PSNR.
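The core idea, replacing quadratic softmax attention with a linear-complexity form, can be sketched as follows. This is a generic linear-attention sketch, not the authors' exact decomposition: the feature map `phi` (here elu(x)+1) and the shapes are illustrative assumptions. By reassociating the matrix products, $\phi(Q)(\phi(K)^\top V)$ costs $O(Nd^2)$ in the number of positions $N$ instead of the $O(N^2 d)$ of $\mathrm{softmax}(QK^\top)V$:

```python
import numpy as np

def linear_attention(Q, K, V, eps=1e-6):
    """Linear-complexity attention via a kernel feature map.

    Standard attention computes softmax(Q K^T) V, which is O(N^2) in the
    number of positions N. Replacing exp(q . k) with phi(q) . phi(k) for a
    positive feature map phi lets us reassociate the products:
        phi(Q) @ (phi(K).T @ V), which is O(N d^2).
    phi(x) = elu(x) + 1 is one common positive choice (an assumption here).
    """
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1 > 0
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                # (d, d_v): key-value summary, built once
    Z = Qp @ Kp.sum(axis=0)      # (N,): per-query normalization terms
    return (Qp @ KV) / (Z[:, None] + eps)
```

Because each output row is a convex combination of the rows of `V`, the result matches softmax attention in spirit while the cost grows linearly with the number of spatial positions, which is what makes high-resolution entropy modeling tractable.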
