Poster

StochasticRank: Global Optimization of Scale-Free Discrete Functions

Aleksei Ustimenko · Liudmila Prokhorenkova

Keywords: [ Information Retrieval ] [ Non-convex Optimization ] [ Ranking and Preference Learning ] [ Supervised Learning ] [ Boosting / Ensemble Methods ]

Tue 14 Jul 1 p.m. PDT — 1:45 p.m. PDT
Wed 15 Jul 2 a.m. PDT — 2:45 a.m. PDT

Abstract:

In this paper, we introduce a powerful and efficient framework for the direct optimization of ranking metrics. The problem is ill-posed due to the discrete structure of the loss, and to deal with this, we introduce two key techniques: stochastic smoothing and a novel gradient estimate based on partial integration. We also address the problem of smoothing bias and present a universal solution for proper debiasing. To guarantee the global convergence of our method, we adopt the recently proposed Stochastic Gradient Langevin Boosting algorithm. Our algorithm is implemented as part of the CatBoost gradient boosting library and outperforms existing approaches on several learning-to-rank datasets. Beyond ranking metrics, our framework applies to any scale-free discrete loss function.
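The abstract does not spell out the estimator, so below is a minimal sketch of the core idea of stochastic smoothing only: replace the discrete loss L(z) with its Gaussian-smoothed version E[L(z + sigma * eps)] and estimate its gradient by Monte Carlo via the score-function identity. This is an assumption-laden illustration, not the paper's partial-integration estimator, its debiasing, or any CatBoost API; the names dcg_loss, smoothed_grad, and sigma are purely illustrative.

import numpy as np

def dcg_loss(scores, relevance):
    # Negative DCG of documents ranked by descending score.
    # Discrete in `scores`: infinitesimal score changes that do not
    # alter the ordering leave the loss unchanged (zero gradient a.e.).
    order = np.argsort(-scores)
    gains = (2.0 ** relevance[order] - 1.0) / np.log2(np.arange(2, len(scores) + 2))
    return -gains.sum()

def smoothed_grad(scores, relevance, sigma=0.1, n_samples=256, seed=None):
    # Monte Carlo gradient of the smoothed loss E_eps[L(scores + sigma*eps)],
    # eps ~ N(0, I), using the Gaussian score-function identity:
    #   grad = E[L(scores + sigma*eps) * eps] / sigma.
    # A plain estimator for illustration; the paper uses a lower-variance
    # gradient estimate based on partial integration instead.
    rng = np.random.default_rng(seed)
    grad = np.zeros_like(scores)
    for _ in range(n_samples):
        eps = rng.standard_normal(scores.shape)
        grad += dcg_loss(scores + sigma * eps, relevance) * eps
    return grad / (n_samples * sigma)

scores = np.array([0.2, 1.5, -0.3, 0.9])
relevance = np.array([0.0, 2.0, 1.0, 0.0])
print(smoothed_grad(scores, relevance, seed=0))

Note that the smoothed objective is biased relative to the original discrete loss (its minimizer can shift with sigma), which is exactly the smoothing-bias issue the paper's debiasing technique addresses.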
