Klein Hyperbolic Metric Learning
Abstract
Hyperbolic metric learning is highly effective for embedding hierarchical data structures. However, past work has predominantly focused on the conformal Poincaré model, leaving other geometries, such as the Klein model, largely under-explored. Moreover, the curved geodesics of the Poincaré model present a fundamental geometric misalignment with the linear projections that dominate the feature transformation steps in modern neural network backbones. In this paper, we investigate the Klein model for hyperbolic metric learning: a projective model of hyperbolic geometry whose straight-line geodesics offer a structurally aligned alternative for linear encoders. By formalizing a framework based on Einstein gyrovector operations, we derive a numerically stable metric learning approach that mitigates the inherent optimization challenges of the Klein model. Extensive experiments on multiple image datasets for the fine-grained image classification task show that the Klein model not only serves as a viable alternative to the Poincaré model but also achieves highly competitive performance by leveraging its unique geometric properties, without increasing parameter complexity. Our empirical findings establish the Klein model as an efficient geometric prior for hyperbolic metric learning.
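To make the abstract's ingredients concrete, the sketch below illustrates (under the standard curvature -1 convention, not necessarily the paper's exact formulation) the two primitives it names: Einstein gyrovector addition on the open unit ball and the closed-form geodesic distance between points of the Klein model. The function names and the clipping used for numerical stability are our own illustrative choices.

```python
import numpy as np

def lorentz_gamma(u):
    # Lorentz factor gamma_u = 1 / sqrt(1 - ||u||^2), defined for ||u|| < 1.
    return 1.0 / np.sqrt(1.0 - np.dot(u, u))

def einstein_add(u, v):
    # Einstein gyrovector addition of two points in the open unit ball:
    # u (+) v = (1 / (1 + <u,v>)) * ( u + v/gamma_u + (gamma_u/(1+gamma_u)) <u,v> u )
    gamma_u = lorentz_gamma(u)
    uv = np.dot(u, v)
    return (u + v / gamma_u + (gamma_u / (1.0 + gamma_u)) * uv * u) / (1.0 + uv)

def klein_distance(u, v):
    # Hyperbolic distance between Klein-model points (curvature -1):
    # d(u, v) = arccosh( (1 - <u,v>) / sqrt((1 - ||u||^2)(1 - ||v||^2)) )
    num = 1.0 - np.dot(u, v)
    den = np.sqrt((1.0 - np.dot(u, u)) * (1.0 - np.dot(v, v)))
    # Clip to the domain of arccosh to guard against round-off below 1.
    return np.arccosh(np.clip(num / den, 1.0, None))
```

Note that the Klein distance is linked to the gyrovector formalism by d(u, v) = artanh(||(-u) (+) v||), so gyro-operations and the projective distance describe the same geometry.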