Poster Teaser in Workshop: Graph Representation Learning and Beyond (GRL+)

(#24 / Sess. 2) Degree-Quant: Quantization-Aware Training for Graph Neural Networks

Shyam Tailor


Abstract:

Graph neural networks (GNNs) have demonstrated strong performance when modelling non-uniformly structured data. However, little research has explored methods to make them more efficient at inference time. In this work, we explore the viability of training quantized GNN models, enabling the use of low-precision integer arithmetic for inference. We propose a method, Degree-Quant, to improve performance over existing quantization-aware training baselines commonly used for other architectures, such as CNNs. Our work demonstrates that it is possible to train models that use 8-bit integer arithmetic at inference time with accuracy similar to that of their full-precision counterparts.
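The abstract does not spell out the Degree-Quant procedure itself, so the following is only a minimal sketch of generic quantization-aware training applied to a GNN layer: weights and activations are "fake-quantized" (rounded to an int8 grid in the forward pass) while gradients flow through unchanged via a straight-through estimator. The names fake_quantize and QuantizedGraphConv, the mean-aggregation layer, and all hyperparameters are illustrative assumptions, not the authors' method.

```python
import torch

def fake_quantize(x: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Simulate signed integer quantization in the forward pass while
    letting gradients pass through unchanged (straight-through estimator).
    This is a generic QAT building block, not the Degree-Quant algorithm."""
    qmax = 2 ** (num_bits - 1) - 1                 # e.g. 127 for int8
    qmin = -(2 ** (num_bits - 1))                  # e.g. -128 for int8
    scale = x.detach().abs().max().clamp(min=1e-8) / qmax
    x_q = torch.clamp(torch.round(x / scale), qmin, qmax) * scale
    # Forward uses the quantized value; backward treats the op as identity.
    return x + (x_q - x).detach()

class QuantizedGraphConv(torch.nn.Module):
    """Hypothetical mean-aggregation GNN layer with fake-quantized
    inputs, weights, and outputs, mimicking int8 inference arithmetic."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.randn(in_dim, out_dim) * 0.1)

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        x_q = fake_quantize(x)                     # quantize activations
        w_q = fake_quantize(self.weight)           # quantize weights
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        agg = (adj @ x_q) / deg                    # mean over neighbours
        return fake_quantize(agg @ w_q)            # quantize layer output

# Usage: training remains differentiable despite the rounding.
x = torch.randn(5, 16)                             # 5 nodes, 16 features
adj = (torch.rand(5, 5) > 0.5).float()             # dense adjacency (toy)
out = QuantizedGraphConv(16, 8)(x, adj)
out.sum().backward()                               # gradients flow to weights
```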
