Random Fourier features is a widely used, simple, and effective technique for scaling up kernel methods. The existing theoretical analysis of the approach, however, remains focused on specific learning tasks and typically gives pessimistic bounds which are at odds with the empirical results. We tackle these problems and provide the first unified risk analysis of learning with random Fourier features using the squared error and Lipschitz continuous loss functions. In our bounds, the trade-off between the computational cost and the expected risk convergence rate is problem specific and expressed in terms of the regularization parameter and the number of effective degrees of freedom. We study both the standard random Fourier features method for which we improve the existing bounds on the number of features required to guarantee the corresponding minimax risk convergence rate of kernel ridge regression, as well as a data-dependent modification which samples features proportional to ridge leverage scores and further reduces the required number of features. As ridge leverage scores are expensive to compute, we devise a simple approximation scheme which provably reduces the computational cost without loss of statistical efficiency.
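As a rough illustration of the standard method analysed here, the sketch below approximates a Gaussian (RBF) kernel with random Fourier features and plugs the resulting feature map into ridge regression. It is a minimal sketch only; the function and parameter names (random_fourier_features, gamma, n_features, lam) are illustrative and not taken from the paper.

```python
import numpy as np

def random_fourier_features(X, n_features=100, gamma=1.0, rng=None):
    """Random Fourier feature map approximating the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2), so that z(x)^T z(y) ~= k(x, y)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Sample frequencies from the kernel's spectral density: omega ~ N(0, 2*gamma*I)
    omega = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    # Random phases b ~ Uniform[0, 2*pi)
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ omega + b)

# Ridge regression in the approximate feature space (illustrative data)
X = np.random.randn(200, 5)
y = np.sin(X[:, 0]) + 0.1 * np.random.randn(200)
Z = random_fourier_features(X, n_features=300, gamma=0.5, rng=0)
lam = 1e-2  # regularization parameter
w = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
y_hat = Z @ w
```

The number of features trades off computation against approximation quality; the paper's bounds characterise how many features suffice to retain the minimax risk convergence rate of exact kernel ridge regression.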
Author Information
Zhu Li (University of Oxford)
Jean-Francois Ton (University of Oxford)
Dino Oglic (King's College London)
Dino Sejdinovic (University of Oxford)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Towards a Unified Analysis of Random Fourier Features
  Wed. Jun 12th 01:30 -- 04:00 AM, Room Pacific Ballroom #220
More from the Same Authors
- 2022 Expo Talk Panel: Towards Robust Waveform-Based Acoustic Models
  Dino Oglic
- 2020 Poster: Inter-domain Deep Gaussian Processes
  Tim G. J. Rudner · Dino Sejdinovic · Yarin Gal
- 2019: Networking Lunch (provided) + Poster Session
  Abraham Stanway · Alex Robson · Aneesh Rangnekar · Ashesh Chattopadhyay · Ashley Pilipiszyn · Benjamin LeRoy · Bolong Cheng · Ce Zhang · Chaopeng Shen · Christian Schroeder · Christian Clough · Clement DUHART · Clement Fung · Cozmin Ududec · Dali Wang · David Dao · di wu · Dimitrios Giannakis · Dino Sejdinovic · Doina Precup · Duncan Watson-Parris · Gege Wen · George Chen · Gopal Erinjippurath · Haifeng Li · Han Zou · Herke van Hoof · Hillary A Scannell · Hiroshi Mamitsuka · Hongbao Zhang · Jaegul Choo · James Wang · James Requeima · Jessica Hwang · Jinfan Xu · Johan Mathe · Jonathan Binas · Joonseok Lee · Kalai Ramea · Kate Duffy · Kevin McCloskey · Kris Sankaran · Lester Mackey · Letif Mones · Loubna Benabbou · Lynn Kaack · Matthew Hoffman · Mayur Mudigonda · Mehrdad Mahdavi · Michael McCourt · Mingchao Jiang · Mohammad Mahdi Kamani · Neel Guha · Niccolo Dalmasso · Nick Pawlowski · Nikola Milojevic-Dupont · Paulo Orenstein · Pedram Hassanzadeh · Pekka Marttinen · Ramesh Nair · Sadegh Farhang · Samuel Kaski · Sandeep Manjanna · Sasha Luccioni · Shuby Deshpande · Soo Kim · Soukayna Mouatadid · Sunghyun Park · Tao Lin · Telmo Felgueira · Thomas Hornigold · Tianle Yuan · Tom Beucler · Tracy Cui · Volodymyr Kuleshov · Wei Yu · yang song · Ydo Wexler · Yoshua Bengio · Zhecheng Wang · Zhuangfang Yi · Zouheir Malki
- 2019 Poster: Scalable Learning in Reproducing Kernel Krein Spaces
  Dino Oglic · Thomas Gaertner
- 2019 Oral: Scalable Learning in Reproducing Kernel Krein Spaces
  Dino Oglic · Thomas Gaertner
- 2018 Poster: Learning in Reproducing Kernel Kreı̆n Spaces
  Dino Oglic · Thomas Gaertner
- 2018 Oral: Learning in Reproducing Kernel Kreı̆n Spaces
  Dino Oglic · Thomas Gaertner
- 2017 Poster: Nyström Method with Kernel K-means++ Samples as Landmarks
  Dino Oglic · Thomas Gaertner
- 2017 Talk: Nyström Method with Kernel K-means++ Samples as Landmarks
  Dino Oglic · Thomas Gaertner