GNOT: A General Neural Operator Transformer for Operator Learning
Zhongkai Hao · Zhengyi Wang · Hang Su · Chengyang Ying · Yinpeng Dong · Songming Liu · Ze Cheng · Jian Song · Jun Zhu

Tue Jul 25 02:00 PM -- 04:30 PM (PDT) @ Exhibit Hall 1 #228

Learning solution operators of partial differential equations (PDEs) is an essential problem in machine learning. However, learning operators in practical applications poses several challenges, such as irregular meshes, multiple input functions, and the complexity of PDE solutions. To address these challenges, we propose the General Neural Operator Transformer (GNOT), a scalable and effective transformer-based framework for learning operators. A novel heterogeneous normalized attention layer makes our model highly flexible in handling multiple input functions and irregular meshes. In addition, we introduce a geometric gating mechanism, which can be viewed as a soft domain decomposition, to solve multi-scale problems. The large capacity of the transformer architecture allows our model to scale to large datasets and practical problems. We conduct extensive experiments on multiple challenging datasets from different domains and achieve remarkable improvements over alternative methods. Our code and data are publicly available at https://github.com/thu-ml/GNOT.
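To make the two mechanisms named in the abstract concrete, below is a minimal NumPy sketch, not the authors' implementation. It shows (a) one common softmax-free "normalized attention" pattern, where queries and keys are each normalized with a softmax over the feature axis so that attention can be computed in linear rather than quadratic time in the number of mesh points, and (b) a coordinate-conditioned gating over expert outputs in the spirit of a soft domain decomposition. All function names, shapes, and the gating parameterization here are illustrative assumptions; consult the GNOT repository for the actual architecture.

```python
import numpy as np

def feature_softmax(x):
    """Softmax over the last (feature) axis, numerically stabilized."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def normalized_linear_attention(q, k, v):
    """Softmax-free attention sketch (an assumption, not GNOT's exact layer).

    q, k: (n, d) queries/keys; v: (n, dv) values.
    Normalizing q and k separately over the feature axis lets us use
    associativity, (Q K^T) V = Q (K^T V), so the cost is O(n*d*dv)
    instead of the O(n^2) cost of materializing the attention matrix.
    """
    qn, kn = feature_softmax(q), feature_softmax(k)
    kv = kn.T @ v                     # (d, dv): keys/values aggregated once
    z = qn @ kn.sum(axis=0)           # (n,): per-query normalizer
    return (qn @ kv) / z[:, None]     # (n, dv)

def geometric_gated_mixture(coords, expert_outputs, w, b):
    """Soft domain decomposition sketch: gate experts by point coordinates.

    coords: (n, c) point coordinates; expert_outputs: (E, n, dv);
    w: (c, E), b: (E,) parameterize a linear gating network (assumed form).
    Each point gets a convex combination of the experts' predictions,
    with weights depending only on where the point sits in the domain.
    """
    gates = feature_softmax(coords @ w + b)          # (n, E), rows sum to 1
    return np.einsum('ne,end->nd', gates, expert_outputs)
```

Because the feature-axis softmax already normalizes q and k, the linear form above is algebraically identical to first forming the full attention matrix `A = qn @ kn.T` and computing `(A @ v) / A.sum(axis=1, keepdims=True)`; the linear version just reorders the matrix products.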

Author Information

Zhongkai Hao (Tsinghua University)
Zhengyi Wang (Tsinghua University)
Hang Su (Tsinghua University)
Chengyang Ying (Tsinghua University)
Yinpeng Dong (Tsinghua University)
Songming Liu (Tsinghua University)
Ze Cheng
Jian Song (Tsinghua University)
Jun Zhu (Tsinghua University)