Poster
Wed Jul 11 09:15 AM -- 12:00 PM (PDT) @ Hall B #57
Representation Learning on Graphs with Jumping Knowledge Networks
Keyulu Xu · Chengtao Li · Yonglong Tian · Tomohiro Sonobe · Ken-ichi Kawarabayashi · Stefanie Jegelka
Recent deep learning approaches for representation learning on graphs follow a neighborhood aggregation procedure. We analyze some important properties of these models and propose a strategy to overcome the limitations they imply. In particular, the range of "neighboring" nodes that a node's representation draws from strongly depends on the graph structure, analogous to the spread of a random walk. To adapt to local neighborhood properties and tasks, we explore an architecture -- jumping knowledge (JK) networks -- that flexibly leverages, for each node, different neighborhood ranges to enable better structure-aware representation. In a number of experiments on social, bioinformatics and citation networks, we demonstrate that our model achieves state-of-the-art performance. Furthermore, combining the JK framework with models such as Graph Convolutional Networks, GraphSAGE and Graph Attention Networks consistently improves those models' performance.
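
The sketch below is a minimal illustration (not the authors' code) of the layer-aggregation idea the abstract describes: each GCN-style layer widens a node's neighborhood range by one hop, every layer's output "jumps" to a final aggregation step, and each node selects among ranges via element-wise max-pooling over layers (one of several aggregators; concatenation is another option). The base layer, dimensions, and the toy graph are illustrative assumptions, written in PyTorch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGCNLayer(nn.Module):
    """One neighborhood-aggregation step: H' = relu(A_hat H W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, h):
        return F.relu(self.lin(a_hat @ h))

class JKNet(nn.Module):
    """Stack of aggregation layers whose per-layer outputs all 'jump' to the
    final aggregation; here element-wise max over layers is used."""
    def __init__(self, in_dim, hidden_dim, out_dim, num_layers=4):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * num_layers
        self.layers = nn.ModuleList(
            SimpleGCNLayer(dims[i], dims[i + 1]) for i in range(num_layers)
        )
        self.out = nn.Linear(hidden_dim, out_dim)

    def forward(self, a_hat, x):
        h, layer_outputs = x, []
        for layer in self.layers:
            h = layer(a_hat, h)
            layer_outputs.append(h)  # keep every neighborhood range
        # Max-pool aggregation: per node and per feature, pick the most
        # informative layer, i.e. the most informative neighborhood range.
        h_jk = torch.stack(layer_outputs, dim=0).max(dim=0).values
        return self.out(h_jk)

# Toy usage: 5-node ring-like graph, symmetric-normalized adjacency with self-loops.
adj = torch.tensor([[0, 1, 0, 0, 1],
                    [1, 0, 1, 0, 0],
                    [0, 1, 0, 1, 0],
                    [0, 0, 1, 0, 1],
                    [1, 0, 0, 1, 0]], dtype=torch.float)
a_hat = adj + torch.eye(5)
d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
a_hat = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]

model = JKNet(in_dim=8, hidden_dim=16, out_dim=3)
logits = model(a_hat, torch.randn(5, 8))
print(logits.shape)  # torch.Size([5, 3])
```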