Surrogate Maximization/Minimization Algorithms for AdaBoost and the Logistic Regression Model
Zhihua Zhang - Hong Kong University of Science and Technology
James T. Kwok - Hong Kong University of Science and Technology
Dit-Yan Yeung - Hong Kong University of Science and Technology
Surrogate maximization (or minimization) (SM) algorithms are a family of algorithms that can be regarded as a generalization of expectation-maximization (EM) algorithms. There are three major approaches to constructing surrogate functions, all relying on the convexity of some function. In this paper, we apply SM algorithms to the optimization problems underlying boosting. Specifically, for AdaBoost, we derive an SM algorithm that can be shown to be identical to the algorithm proposed by Collins (2002) based on Bregman distances. More importantly, for LogitBoost (or logistic boosting), we use several methods to construct different surrogate functions, which result in different SM algorithms. By combining multiple methods, we are able to derive an SM algorithm that is also the same as an algorithm derived by Collins (2002). Our approach based on SM algorithms is much simpler, and convergence results follow naturally.
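To make the SM idea concrete, the following is a minimal sketch (not the paper's own derivation) of one standard surrogate construction for the logistic regression model: the log-likelihood is minorized by a quadratic whose curvature uses the uniform bound p(1-p) <= 1/4, so each iteration maximizes the surrogate in closed form and the log-likelihood ascends monotonically. The function name `mm_logistic` and all numerical details are illustrative assumptions.

```python
import numpy as np

def mm_logistic(X, y, n_iter=200):
    """Fit a logistic regression model by surrogate maximization (SM/MM).

    At each step the log-likelihood l(beta) is minorized by a quadratic
    surrogate with fixed curvature -(1/4) X^T X, which dominates the true
    Hessian because p(1-p) <= 1/4.  Maximizing the surrogate gives the
    closed-form update  beta <- beta + 4 (X^T X)^{-1} X^T (y - p).
    """
    n, d = X.shape
    beta = np.zeros(d)
    # Fixed surrogate curvature (a small ridge keeps the inverse stable;
    # extra curvature only makes the surrogate a looser, still-valid minorizer).
    H = X.T @ X / 4.0 + 1e-8 * np.eye(d)
    H_inv = np.linalg.inv(H)
    log_liks = []
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        log_liks.append(y @ np.log(p + 1e-12) + (1 - y) @ np.log(1 - p + 1e-12))
        grad = X.T @ (y - p)          # gradient of the log-likelihood
        beta = beta + H_inv @ grad    # maximizer of the quadratic surrogate
    return beta, log_liks
```

Because the surrogate touches the log-likelihood at the current iterate and lies below it everywhere, each update can only increase the log-likelihood; this monotone-ascent property is the convergence argument that SM algorithms inherit for free.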