Poster Teaser in Workshop: Graph Representation Learning and Beyond (GRL+)

(#94 / Sess. 2) Are Hyperbolic Representations in Graphs Created Equal?

Maxim Kochurov


Abstract:

Recently there has been increasing interest in applying graph neural networks in non-Euclidean geometry; however, are non-Euclidean representations always useful for graph learning tasks? For different problems, such as node classification and link prediction, we compute hyperbolic embeddings and conclude that non-Euclidean embeddings can be useful for tasks that require global prediction consistency, while for other tasks Euclidean models are superior. To do so, we first fix an issue of existing models associated with the optimization process at zero curvature. Current hyperbolic models handle gradients at the origin in an ad-hoc manner, which is inefficient and can lead to numerical instabilities. We resolve the instabilities of the kappa-stereographic model in the zero-curvature case and evaluate the approach of embedding graphs into the manifold on several graph representation learning tasks.
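
The zero-curvature issue above can be illustrated with a minimal sketch (assuming PyTorch; function names such as tan_k and expmap0, and the switching tolerance, are illustrative and not the paper's implementation). In the kappa-stereographic model the exponential map at the origin is expmap0(v) = tan_kappa(||v||) v / ||v||, where tan_kappa divides by sqrt(|kappa|); one standard way to keep the map and its gradients finite as the curvature passes through zero, rather than special-casing the origin ad hoc, is a Taylor-series branch near kappa = 0:

    import torch

    def tan_k(x: torch.Tensor, k: torch.Tensor) -> torch.Tensor:
        # Curvature-dependent tangent:
        #   tan_k(x) = tan(sqrt(k) * x) / sqrt(k)     for k > 0 (spherical)
        #   tan_k(x) = tanh(sqrt(-k) * x) / sqrt(-k)  for k < 0 (hyperbolic)
        #   tan_k(x) = x                              for k = 0 (Euclidean)
        sqrt_abs_k = k.abs().clamp_min(1e-15).sqrt()
        exact = torch.where(
            k > 0,
            torch.tan(sqrt_abs_k * x) / sqrt_abs_k,
            torch.tanh(sqrt_abs_k * x) / sqrt_abs_k,
        )
        # Near k = 0, switch to the shared Taylor expansion x + k * x^3 / 3,
        # which is finite and differentiable at exactly zero curvature.
        taylor = x + k * x.pow(3) / 3
        return torch.where(k.abs() < 1e-4, taylor, exact)

    def expmap0(v: torch.Tensor, k: torch.Tensor) -> torch.Tensor:
        # Exponential map at the origin of the kappa-stereographic model:
        # expmap0(v) = tan_k(||v||) * v / ||v||; reduces to the identity at k = 0.
        norm = v.norm(dim=-1, keepdim=True).clamp_min(1e-15)
        return tan_k(norm, k) * v / norm

    # Toy usage: map Euclidean node features onto a manifold of learnable curvature.
    k = torch.tensor(0.0, requires_grad=True)  # curvature may pass through zero
    v = torch.randn(5, 8)                      # hypothetical node feature vectors
    x = expmap0(v, k)
    x.sum().backward()                         # gradients stay finite at k = 0

With the Taylor branch, the same code covers hyperbolic, Euclidean, and spherical curvature without a separate code path for the flat case, which is what allows curvature to be treated as a learnable parameter.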
