Deep and Flexible Graph Neural Architecture Search

Wentao Zhang · Zheyu Lin · Yu Shen · Yang Li · Zhi Yang · Bin Cui

Hall E #420

Keywords: [ MISC: Unsupervised and Semi-supervised Learning ] [ MISC: Representation Learning ] [ OPT: Bilevel optimization ] [ DL: Graph Neural Networks ]

Tue 19 Jul 3:30 p.m. PDT — 5:30 p.m. PDT
Spotlight presentation: DL: Graph Neural Networks
Tue 19 Jul 10:30 a.m. PDT — noon PDT


Graph neural networks (GNNs) have been intensively applied to various graph-based applications. Despite their success, designing good GNN architectures is non-trivial and relies heavily on human effort and domain knowledge. Although several attempts have been made at graph neural architecture search, they suffer from the following limitations: 1) a fixed pipeline pattern of propagation (P) and transformation (T) operations; 2) a restricted pipeline depth of GNN architectures. This paper proposes DFG-NAS, a novel method that searches for deep and flexible GNN architectures. Unlike most existing methods that focus on micro-architectures, DFG-NAS highlights another level of design: the search for macro-architectures, i.e., how atomic P and T operations are integrated and organized into a GNN. Concretely, DFG-NAS designs a novel search space over P-T permutations and combinations based on message-passing disaggregation, defines various mutation strategies, and employs an evolutionary algorithm to conduct an efficient and effective search. Empirical studies on four benchmark datasets demonstrate that DFG-NAS finds architectures that are more powerful than state-of-the-art manual designs and more efficient than current graph neural architecture search approaches.
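The macro-architecture idea in the abstract, representing a GNN as a free-form sequence of propagation (P) and transformation (T) operations and evolving it via mutations, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names (`apply_architecture`, `mutate`), the specific mutation set, and the use of random weights in T are all assumptions made for the example.

```python
import numpy as np

# A macro-architecture is a sequence over {"P", "T"}:
#   P = feature propagation along edges (here: a normalized-adjacency multiply),
#   T = feature transformation (here: a random linear map + ReLU, standing in
#       for a learned layer).
# All names and the mutation set are illustrative assumptions, not the paper's code.

def apply_architecture(arch, X, A_norm, rng):
    """Run a P/T op sequence on node features X with normalized adjacency A_norm."""
    H = X
    for op in arch:
        if op == "P":
            H = A_norm @ H                      # propagate features over the graph
        else:  # "T"
            W = rng.standard_normal((H.shape[1], H.shape[1])) * 0.1
            H = np.maximum(H @ W, 0.0)          # transform + ReLU nonlinearity
    return H

def mutate(arch, rng):
    """Apply one of four simple mutations: insert P, insert T, flip an op, drop an op."""
    arch = list(arch)
    choice = rng.choice(["add_p", "add_t", "flip", "drop"])
    if choice in ("add_p", "add_t"):
        pos = int(rng.integers(0, len(arch) + 1))
        arch.insert(pos, "P" if choice == "add_p" else "T")
    else:
        pos = int(rng.integers(0, len(arch)))
        if choice == "flip":
            arch[pos] = "T" if arch[pos] == "P" else "P"
        elif len(arch) > 1:                     # keep at least one op when dropping
            arch.pop(pos)
    return arch

# Toy usage: evaluate a parent sequence and one mutated child on a 3-node graph.
rng = np.random.default_rng(0)
A_norm = np.eye(3)                              # trivial normalized adjacency
X = np.ones((3, 2))                             # 3 nodes, 2 features each
parent = ["P", "T", "P"]
H = apply_architecture(parent, X, A_norm, rng)
child = mutate(parent, rng)
```

An evolutionary search would wrap this in a loop: score each candidate sequence by validation accuracy, keep the fittest, and repeatedly mutate them; the point is that P and T counts and their ordering are free, rather than locked into the fixed P-then-T pattern of a standard GNN layer.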
