

Poster in Workshop: Geometry-grounded Representation Learning and Generative Modeling

Lorentzian Residual Neural Networks

Neil He · Yang · Zhitao Ying

Keywords: [ Neural Network ] [ Hyperbolic Geometry ] [ Residual Network ]


Abstract:

Hyperbolic neural networks have emerged as a powerful tool for modeling the hierarchical structures prevalent in real-world data. Residual connections, which facilitate the flow of information across layers, have been instrumental in the success of deep neural networks. However, current methods for constructing hyperbolic residual layers suffer from limitations such as increased model complexity, numerical instability, and error accumulated by repeatedly mapping to and from the tangent space. To address these limitations, we introduce LRN, a novel hyperbolic residual neural network based on the weighted Lorentzian centroid in the Lorentz model of hyperbolic space. Extensive experiments show that LRN outperforms state-of-the-art Euclidean and hyperbolic alternatives, highlighting its potential for building more expressive neural networks in hyperbolic space as a generally applicable method across multiple architectures, including GNNs and graph Transformers.
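The key idea above — replacing Euclidean addition in a residual connection with a weighted Lorentzian centroid, which stays on the hyperboloid without tangent-space round trips — can be sketched as follows. This is a minimal illustration, not the authors' implementation: curvature is fixed to -1, and the weights `w_x`, `w_f` are illustrative placeholders.

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: <x, y>_L = -x0*y0 + sum_i xi*yi
    return -x[..., 0] * y[..., 0] + np.sum(x[..., 1:] * y[..., 1:], axis=-1)

def lift(v):
    # Lift a Euclidean vector onto the hyperboloid (curvature -1):
    # x0 = sqrt(1 + ||v||^2), so that <x, x>_L = -1.
    x0 = np.sqrt(1.0 + np.sum(v * v))
    return np.concatenate([[x0], v])

def lorentz_centroid(points, weights):
    # Weighted Lorentzian centroid: take the weighted sum, then rescale
    # by the Lorentzian norm so the result lies back on the hyperboloid.
    s = np.sum(weights[:, None] * points, axis=0)
    return s / np.sqrt(np.abs(lorentz_inner(s, s)))

def lorentz_residual(x, fx, w_x=1.0, w_f=1.0):
    # Residual connection as the weighted centroid of the layer input x
    # and the layer output f(x) -- no log/exp maps to the tangent space.
    pts = np.stack([x, fx])
    return lorentz_centroid(pts, np.array([w_x, w_f]))

# Example: combine a point and a (hypothetical) layer output.
x = lift(np.array([0.3, -0.2]))
fx = lift(np.array([0.1, 0.5]))
y = lorentz_residual(x, fx)
# y satisfies <y, y>_L = -1, i.e. it remains on the hyperboloid.
```

Because the centroid of future-pointing timelike vectors is itself timelike, the normalization is always well defined, which is one source of the numerical stability claimed above.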
