Poster

GALS: Generalizable Alternating Least Squares for Recommender System

Yuanhao Pu · Xiaolong Chen · Xu Huang · Jin Chen · Defu Lian · Enhong Chen


Abstract:

The implicit Alternating Least Squares algorithm (iALS) is widely recognized as an efficient approach for recommender systems, consistently delivering performance competitive with recent approaches. However, a notable challenge arises from the fact that iALS optimizes a quadratic regression loss that has no clear connection to ranking objectives such as DCG. This discrepancy makes it fundamentally difficult to explain the algorithm's exceptional ranking performance.

In this work, we establish a connection between the quadratic regression loss and ranking metrics through a Taylor expansion of the DCG-consistent surrogate loss, softmax. Building on this connection, we derive a new quadratic surrogate loss and conduct a thorough theoretical analysis of its DCG-consistency and generalization properties. These analyses provide solid theoretical foundations and enhance the reliability and applicability of our approach. Moreover, we generalize the original ALS method to optimize this novel loss, resulting in a more efficient and effective ranking algorithm.

Experimental results on three public datasets demonstrate the effectiveness of the proposed method, GALS: it achieves ranking performance comparable to softmax while converging faster, owing to optimization with closed-form solutions. This advancement offers a practical alternative to the widely used softmax function and represents a substantial step forward in our understanding of objective functions in recommender systems.
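As a rough illustration of the kind of connection the abstract describes (a sketch, not the paper's actual derivation; the expansion point at zero scores is an illustrative assumption), a second-order Taylor expansion of the softmax cross-entropy loss is already quadratic in the score vector:

$$
\ell(\mathbf{s}) \;=\; -\,s_i + \log\sum_{j=1}^{n} e^{s_j}
\;\approx\; \log n \;-\; s_i \;+\; \frac{1}{n}\sum_{j=1}^{n} s_j
\;+\; \frac{1}{2}\,\mathbf{s}^{\top}\!\Big(\tfrac{1}{n}\mathbf{I} - \tfrac{1}{n^{2}}\mathbf{1}\mathbf{1}^{\top}\Big)\mathbf{s},
$$

where $\mathbf{s}$ is the vector of scores over $n$ items and $i$ is the observed item. A quadratic objective of this form is exactly the kind that admits closed-form alternating updates, which is what makes iALS-style optimization fast.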
