Poster

SGLB: Stochastic Gradient Langevin Boosting

Aleksei Ustimenko · Liudmila Prokhorenkova

Keywords: [ Algorithms ] [ Boosting and Ensemble Methods ]


Abstract:

This paper introduces Stochastic Gradient Langevin Boosting (SGLB): a powerful and efficient machine learning framework that can handle a wide range of loss functions and has provable generalization guarantees. The method is based on a special form of the Langevin diffusion equation specifically designed for gradient boosting. This allows us to theoretically guarantee global convergence even for multimodal loss functions, while standard gradient boosting algorithms can guarantee only convergence to a local optimum. We also empirically show that SGLB outperforms classic gradient boosting when applied to classification tasks with the 0-1 loss function, which is known to be multimodal.
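To make the idea concrete, below is a minimal, illustrative Python sketch of Langevin-style gradient boosting, not the paper's exact algorithm. It assumes decision-tree weak learners, a logistic loss, and hypothetical parameter names (`lr`, `beta`, `gamma`); the Gaussian noise added to the negative gradients and the per-step shrinkage of the ensemble are simplified stand-ins for the paper's diffusion-based update.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def sglb_fit(X, y, n_iters=200, lr=0.1, beta=1e4, gamma=1e-3, max_depth=3, seed=0):
    """Illustrative Langevin-style boosting sketch (not the paper's exact method).

    Each step fits a tree to the negative gradient of the logistic loss
    perturbed by Gaussian noise, then shrinks the existing ensemble slightly;
    the noise is what lets the iterate escape local optima.
    """
    rng = np.random.default_rng(seed)
    F = np.zeros(len(y))  # current ensemble predictions (logits)
    trees, weights = [], []
    for _ in range(n_iters):
        p = 1.0 / (1.0 + np.exp(-F))      # sigmoid of current logits
        residual = y - p                  # negative gradient of the log loss
        noise = rng.normal(0.0, np.sqrt(2.0 / (beta * lr)), size=len(y))
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residual + noise)     # fit weak learner to the noisy gradient
        # shrink earlier contributions, then take a step along the new tree
        weights = [w * (1.0 - gamma * lr) for w in weights]
        weights.append(lr)
        F = (1.0 - gamma * lr) * F + lr * tree.predict(X)
        trees.append(tree)
    return trees, weights

def sglb_predict(X, trees, weights):
    F = sum(w * t.predict(X) for w, t in zip(weights, trees))
    return (F > 0).astype(int)

# Toy usage on synthetic binary data
X = np.random.default_rng(0).normal(size=(500, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)
trees, weights = sglb_fit(X, y)
print("train accuracy:", (sglb_predict(X, trees, weights) == y).mean())
```

The key design choice the sketch tries to convey is that, unlike classic gradient boosting, the weak learner targets are stochastically perturbed at every step, which is what underlies the global-convergence guarantee for multimodal losses described in the abstract.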
