Non-local interactions play a vital role in boosting performance for image restoration. However, the local window Transformer has been preferred for its efficiency in processing high-resolution images, an efficiency gained at the cost of the ability to model non-local interactions. In this paper, we show that the local window Transformer can also model non-local interactions. This counterintuitive capability rests on the permutation equivariance of self-attention. The basic principle is simple: by randomly shuffling the input, local self-attention gains the potential to model non-local interactions without introducing extra parameters. The random shuffle strategy enjoys theoretical guarantees on extending the local scope. The resulting Transformer, dubbed ShuffleFormer, processes high-resolution images efficiently while modeling non-local interactions. Extensive experiments demonstrate the effectiveness of ShuffleFormer across a variety of image restoration tasks, including image denoising, deraining, and deblurring. Code is available at https://github.com/jiexiaou/ShuffleFormer.
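To make the shuffle, attend, unshuffle idea in the abstract concrete, the following is a minimal PyTorch-style sketch. The class name ShuffledWindowAttention, the window_partition/window_reverse helpers, and the use of nn.MultiheadAttention for the per-window attention are illustrative assumptions for this sketch, not the released ShuffleFormer implementation (see the linked repository for the actual code).

```python
import torch
import torch.nn as nn

def window_partition(x, ws):
    # (B, H, W, C) -> (num_windows*B, ws*ws, C); assumes H and W are divisible by ws
    B, H, W, C = x.shape
    x = x.view(B, H // ws, ws, W // ws, ws, C)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(-1, ws * ws, C)

def window_reverse(windows, ws, H, W):
    # inverse of window_partition: (num_windows*B, ws*ws, C) -> (B, H, W, C)
    B = windows.shape[0] // ((H // ws) * (W // ws))
    x = windows.view(B, H // ws, W // ws, ws, ws, -1)
    return x.permute(0, 1, 3, 2, 4, 5).reshape(B, H, W, -1)

class ShuffledWindowAttention(nn.Module):
    """Illustrative sketch: random spatial shuffle + local window self-attention."""
    def __init__(self, dim, ws=8, heads=4):
        super().__init__()
        self.ws = ws
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        # x: (B, H, W, C)
        B, H, W, C = x.shape
        # 1) Randomly permute spatial positions so each local window can
        #    gather tokens from arbitrary, possibly distant, locations.
        perm = torch.randperm(H * W, device=x.device)
        inv = torch.argsort(perm)
        x = x.reshape(B, H * W, C)[:, perm].reshape(B, H, W, C)
        # 2) Plain local window self-attention on the shuffled feature map.
        win = window_partition(x, self.ws)
        win, _ = self.attn(win, win, win)
        x = window_reverse(win, self.ws, H, W)
        # 3) Undo the shuffle so outputs return to their original positions.
        x = x.reshape(B, H * W, C)[:, inv].reshape(B, H, W, C)
        return x
```

A quick shape check (assuming H and W are multiples of the window size): ShuffledWindowAttention(dim=32)(torch.randn(2, 64, 64, 32)) returns a tensor of shape (2, 64, 64, 32), i.e. the module is a drop-in replacement for ordinary window attention with no extra parameters.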
Author Information
Jie Xiao (University of Science and Technology of China)
Xueyang Fu (University of Science and Technology of China)
Man Zhou
Hongjian Liu (University of Science and Technology of China)
Zheng-Jun Zha (University of Science and Technology of China)
More from the Same Authors
- 2023 Poster: Fourmer: An Efficient Global Modeling Paradigm for Image Restoration
  Man Zhou · Jie Huang · Chunle Guo · Chongyi Li
- 2023 Oral: Fourmer: An Efficient Global Modeling Paradigm for Image Restoration
  Man Zhou · Jie Huang · Chunle Guo · Chongyi Li
- 2022 Poster: Principled Knowledge Extrapolation with GANs
  Ruili Feng · Jie Xiao · Kecheng Zheng · Deli Zhao · Jingren Zhou · Qibin Sun · Zheng-Jun Zha
- 2022 Spotlight: Principled Knowledge Extrapolation with GANs
  Ruili Feng · Jie Xiao · Kecheng Zheng · Deli Zhao · Jingren Zhou · Qibin Sun · Zheng-Jun Zha
- 2021 Poster: Understanding Noise Injection in GANs
  Ruili Feng · Deli Zhao · Zheng-Jun Zha
- 2021 Spotlight: Understanding Noise Injection in GANs
  Ruili Feng · Deli Zhao · Zheng-Jun Zha
- 2021 Poster: Uncertainty Principles of Encoding GANs
  Ruili Feng · Zhouchen Lin · Jiapeng Zhu · Deli Zhao · Jingren Zhou · Zheng-Jun Zha
- 2021 Spotlight: Uncertainty Principles of Encoding GANs
  Ruili Feng · Zhouchen Lin · Jiapeng Zhu · Deli Zhao · Jingren Zhou · Zheng-Jun Zha