

Poster

Lorentzian Distance Learning for Hyperbolic Representations

Marc Law · Renjie Liao · Jake Snell · Richard Zemel

Pacific Ballroom #30

Keywords: [ Metric Learning ] [ Non-convex Optimization ] [ Representation Learning ]


Abstract:

We introduce an approach to learn representations based on the Lorentzian distance in hyperbolic geometry. Hyperbolic geometry is especially suited to hierarchically-structured datasets, which are prevalent in the real world. Current hyperbolic representation learning methods compare examples with the Poincaré distance. They try to minimize the distance between each node in a hierarchy and its descendants while maximizing its distance to other nodes. This formulation produces node representations close to the centroid of their descendants. To obtain efficient and interpretable algorithms, we exploit the fact that the centroid w.r.t. the squared Lorentzian distance can be written in closed form. We show that the Euclidean norm of such a centroid decreases as the curvature of the hyperbolic space decreases. This property makes the squared Lorentzian distance appropriate for representing hierarchies in which parent nodes minimize the distances to their descendants and have a smaller Euclidean norm than their children. Our approach obtains state-of-the-art results in retrieval and classification tasks on several datasets.
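For readers unfamiliar with the quantities the abstract refers to, the sketch below illustrates the squared Lorentzian distance on the hyperboloid model and a closed-form centroid of the kind the abstract describes. It is a minimal, hedged illustration: the function names, the curvature parameter `beta`, and the exact normalization are our own reading of the standard hyperboloid setup, not code released with the paper.

```python
import numpy as np

# Hedged sketch of the Lorentzian quantities mentioned in the abstract.
# Names (lorentz_inner, sq_lorentz_dist, lift, lorentz_centroid) and the
# curvature parameter beta are illustrative assumptions, not the paper's code.

def lorentz_inner(a, b):
    # Lorentzian inner product: <a, b>_L = -a_0*b_0 + sum_{i>=1} a_i*b_i
    return -a[0] * b[0] + np.dot(a[1:], b[1:])

def sq_lorentz_dist(a, b, beta=1.0):
    # Squared Lorentzian distance between two points on the hyperboloid
    # {x : <x, x>_L = -beta}: ||a - b||_L^2 = -2*beta - 2*<a, b>_L
    return -2.0 * beta - 2.0 * lorentz_inner(a, b)

def lift(x, beta=1.0):
    # Map a Euclidean vector x onto the hyperboloid by solving
    # <p, p>_L = -beta for the time-like coordinate p_0 > 0.
    x = np.asarray(x, dtype=float)
    x0 = np.sqrt(beta + np.dot(x, x))
    return np.concatenate(([x0], x))

def lorentz_centroid(points, weights=None, beta=1.0):
    # Closed-form minimizer of the weighted sum of squared Lorentzian
    # distances: mu = sqrt(beta) * s / sqrt(|<s, s>_L|), with s = sum_i w_i x_i.
    points = np.asarray(points, dtype=float)
    if weights is None:
        weights = np.ones(len(points))
    weights = np.asarray(weights, dtype=float)
    s = np.sum(weights[:, None] * points, axis=0)
    return np.sqrt(beta) * s / np.sqrt(np.abs(lorentz_inner(s, s)))

# Example: the centroid of a few lifted points stays on the hyperboloid.
pts = np.stack([lift(v) for v in ([0.1, 0.2], [0.3, -0.1], [-0.2, 0.05])])
mu = lorentz_centroid(pts)
print(lorentz_inner(mu, mu))  # approximately -beta = -1.0
```

Under these assumptions, the centroid is a simple rescaled weighted sum, which is what makes it cheap to compute and easy to analyze, in contrast to Poincaré-distance centroids, which generally require iterative optimization.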
