

Poster

Hyperbolic Entailment Cones for Learning Hierarchical Embeddings

Octavian-Eugen Ganea · Gary Becigneul · Thomas Hofmann

Hall B #100

Abstract:

Learning graph representations via low-dimensional embeddings that preserve relevant network properties is an important class of problems in machine learning. We present a novel method to embed directed acyclic graphs. Following prior work, we first advocate for using hyperbolic spaces, which provably model tree-like structures better than Euclidean geometry. Second, we view hierarchical relations as partial orders defined using a family of nested geodesically convex cones. We prove that these entailment cones admit an optimal shape with a closed-form expression in both Euclidean and hyperbolic spaces, and that they canonically define the embedding learning process. Experiments show significant improvements of our method over strong recent baselines in terms of both representational capacity and generalization.
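
The cone condition referred to in the abstract has a closed form in the Poincaré ball model. Below is a minimal NumPy sketch (my own illustration, not the authors' released code) that checks whether a point y falls inside the entailment cone rooted at x by comparing an angle Ξ(x, y) against the aperture ψ(x) = arcsin(K(1 − ‖x‖²)/‖x‖), following the closed-form expressions derived in the paper as I read them. The constant K = 0.1, the EPS guard, and all function names are assumptions made for this sketch.

import numpy as np

EPS = 1e-9

def cone_aperture(x, K=0.1):
    # Half-aperture psi(x) = arcsin(K * (1 - ||x||^2) / ||x||) of the cone rooted at x.
    # K controls how wide the cones are; the value 0.1 is an assumption for this sketch.
    nx = np.linalg.norm(x)
    return np.arcsin(np.clip(K * (1.0 - nx ** 2) / (nx + EPS), -1.0, 1.0))

def angle_to_radial(x, y):
    # Angle Xi(x, y) at x between the geodesic towards y and the radial direction
    # pointing away from the origin, in closed form for the Poincare ball.
    nx2, ny2, xy = np.dot(x, x), np.dot(y, y), np.dot(x, y)
    num = xy * (1.0 + nx2) - nx2 * (1.0 + ny2)
    den = np.linalg.norm(x) * np.linalg.norm(x - y) * np.sqrt(max(1.0 + nx2 * ny2 - 2.0 * xy, EPS))
    return np.arccos(np.clip(num / (den + EPS), -1.0, 1.0))

def entailment_energy(x, y, K=0.1):
    # Zero when y lies inside the entailment cone of x (x is predicted to entail y),
    # positive otherwise; such a penalty can be plugged into a max-margin training loss.
    return max(0.0, angle_to_radial(x, y) - cone_aperture(x, K))

# Toy check: a child placed roughly "outward" of its parent incurs zero energy,
# while the reversed pair does not, reflecting the asymmetry of the partial order.
parent = np.array([0.3, 0.0])
child = np.array([0.6, 0.05])
print(entailment_energy(parent, child), entailment_energy(child, parent))

The asymmetry in the toy check is the point of the construction: cones open outward toward the boundary of the ball, so entailment is a directed relation rather than a symmetric similarity.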
