The Bayesian Backfitting Relevance Vector Machine
Aaron D'Souza - University of Southern California
Sethu Vijayakumar - University of Edinburgh
Stefan Schaal - University of Southern California & ATR Computational Neuroscience Laboratory
Traditional non-parametric statistical learning techniques are often computationally attractive, but lack the same generalization and model selection abilities as state-of-the-art Bayesian algorithms which, however, are usually computationally prohibitive. This paper makes several important contributions that allow Bayesian learning to scale to more complex, real-world learning scenarios. Firstly, we show that backfitting -- a traditional non-parametric, yet highly efficient regression tool -- can be derived in a novel formulation within an expectation maximization (EM) framework, and thus can finally be given a probabilistic interpretation. Secondly, we show that the general framework of sparse Bayesian learning, and in particular the relevance vector machine (RVM), can be derived as a highly efficient algorithm using a Bayesian version of backfitting at its core. As we demonstrate on several regression and classification benchmarks, Bayesian backfitting offers a compelling alternative to current regression methods, especially when the size and dimensionality of the data challenge computational resources.
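To make the abstract's reference to backfitting concrete, the sketch below illustrates the classical backfitting loop (Hastie and Tibshirani's additive-model procedure) with plain linear component fits -- not the paper's Bayesian EM variant. All function and variable names here are our own illustrative choices:

```python
import numpy as np

def backfit_linear(X, y, n_iter=200, tol=1e-10):
    """Classical backfitting for an additive model
    y ~ mu + sum_j b[j] * x_j, with linear component "smoothers".
    Each sweep refits one component at a time to the partial residual
    obtained by holding all other components fixed."""
    Xc = X - X.mean(axis=0)          # center inputs
    mu = y.mean()                    # intercept absorbs the mean
    r = y - mu                       # current residual
    d = X.shape[1]
    b = np.zeros(d)
    for _ in range(n_iter):
        b_old = b.copy()
        for j in range(d):
            # put component j back into the residual, then refit it
            r = r + Xc[:, j] * b[j]
            b[j] = Xc[:, j] @ r / (Xc[:, j] @ Xc[:, j])
            r = r - Xc[:, j] * b[j]
        if np.max(np.abs(b - b_old)) < tol:
            break
    return mu, b
```

With linear components this loop converges to the ordinary least-squares solution; the appeal highlighted in the abstract is that each update touches only one input dimension at a time, which is what keeps backfitting cheap as dimensionality grows.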