Hyperbolic RQ-VAE Enhanced Generative Recommendation with a Differential-Length Codebook Strategy
Abstract
Recently, the integration of large language models (LLMs) with generative recommendation (GR) has shown promising potential. However, most existing GR methods adopt residual quantization that only implicitly models hierarchical relationships across codebook layers in Euclidean space, which distorts the intrinsic tree-like hierarchy and leads to low codebook utilization. To address these issues, we propose a Hyperbolic RQ-VAE enhanced Generative Recommendation framework, termed HG-Rec. Specifically, HG-Rec enhances the residual quantization mechanism by embedding the latent discrete representations in hyperbolic space, explicitly modeling hierarchical relationships across codebook layers. Motivated by the exponential volume growth of hyperbolic space, we further design a differential-length codebook strategy, i.e., the codebook sizes follow a pyramidal structure, which aligns with the tree-like hierarchy and effectively compresses the overall codebook size. Benefiting from this alignment between hyperbolic geometry and the codebook hierarchy, HG-Rec achieves lower collision rates, more uniform codebook usage, and shorter training time than existing methods. Extensive experiments on multiple benchmark datasets demonstrate that HG-Rec consistently achieves state-of-the-art performance. The code is available in the Supplementary Material.
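To make the mechanism concrete, the following is a minimal sketch of residual quantization with a differential-length (pyramidal) codebook layout, where each successive layer quantizes the residual left by the previous one against a different-sized codebook. The codebook sizes, dimensionality, and greedy nearest-code rule here are illustrative assumptions; in particular, this sketch uses plain Euclidean distance and does not show the hyperbolic embedding and distance that HG-Rec employs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pyramidal codebook sizes, one per quantization layer
# (the actual sizes used by HG-Rec are not specified in the abstract).
codebook_sizes = [64, 32, 16]
dim = 8
codebooks = [rng.normal(size=(k, dim)) for k in codebook_sizes]

def residual_quantize(x, codebooks):
    """Greedy residual quantization: at each layer, pick the nearest
    code, subtract it, and pass the residual to the next layer."""
    residual = x.copy()
    codes = []
    for cb in codebooks:
        idx = int(np.argmin(np.linalg.norm(cb - residual, axis=1)))
        codes.append(idx)
        residual = residual - cb[idx]
    return codes, residual

x = rng.normal(size=dim)
codes, residual = residual_quantize(x, codebooks)
print(codes)  # one discrete code index per layer
```

The resulting code tuple serves as the item's semantic ID; shrinking the codebooks layer by layer mirrors a tree whose branching factor narrows with depth, which is the intuition behind the pyramidal design.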