

Spotlight

Large-Scale Graph Neural Architecture Search

Chaoyu Guan · Xin Wang · Hong Chen · Ziwei Zhang · Wenwu Zhu

Room 327 - 329

Abstract:

Graph Neural Architecture Search (GNAS) has become a powerful method in automatically discovering suitable Graph Neural Network (GNN) architectures for different tasks. However, existing approaches fail to handle large-scale graphs because current performance estimation strategies in GNAS are computationally expensive for large-scale graphs and suffer from consistency collapse issues. To tackle these problems, we propose the Graph ArchitectUre Search at Scale (GAUSS) method that can handle large-scale graphs by designing an efficient light-weight supernet and the joint architecture-graph sampling. In particular, a graph sampling-based single-path one-shot supernet is proposed to reduce the computation burden. To address the consistency collapse issues, we further explicitly consider the joint architecture-graph sampling through a novel architecture peer learning mechanism on the sampled sub-graphs and an architecture importance sampling algorithm. Our proposed framework is able to smooth the highly non-convex optimization objective and stabilize the architecture sampling process. We provide theoretical analyses on GAUSS and empirically evaluate it on five datasets whose vertex sizes range from 10^4 to 10^8. The experimental results demonstrate substantial improvements of GAUSS over other GNAS baselines on all datasets. To the best of our knowledge, the proposed GAUSS method is the first graph neural architecture search framework that can handle graphs with billions of edges within 1 GPU day.
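The core idea of joint architecture-graph sampling described in the abstract can be illustrated with a minimal sketch: at each supernet training step, one sub-graph and one single path through the search space are drawn together, so only that path's operations are exercised on that batch. This is not the authors' implementation; the search space, sampler, and function names below are illustrative assumptions.

```python
import random

# Hypothetical search space: one candidate GNN operation per layer slot.
SEARCH_SPACE = [
    ["gcn", "gat", "sage"],  # layer 1 choices (illustrative)
    ["gcn", "gat", "sage"],  # layer 2 choices (illustrative)
]

def sample_architecture(space):
    """Single-path one-shot sampling: pick exactly one op per layer."""
    return [random.choice(ops) for ops in space]

def sample_subgraph(num_nodes, sample_size):
    """Uniform node sampling as a simple stand-in for a graph sampler."""
    return random.sample(range(num_nodes), sample_size)

def supernet_training_step(num_nodes=10_000, sample_size=256):
    """One step of joint architecture-graph sampling: a sub-graph and a
    single architecture path are drawn together, so the supernet only
    pays for one path's computation per batch."""
    arch = sample_architecture(SEARCH_SPACE)
    nodes = sample_subgraph(num_nodes, sample_size)
    # ... forward/backward on `nodes` using only the ops in `arch` ...
    return arch, nodes

arch, nodes = supernet_training_step()
print(arch, len(nodes))
```

Because each step touches only a sampled sub-graph and a single path, the per-step cost stays bounded regardless of the full graph's size, which is what makes the one-shot supernet tractable at scale.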
