Diversity-Aware Recursive Feature Multiple Kernel Learning
Abstract
Multiple kernel learning~(MKL), which borrows ideas from ensemble learning, aims to improve generalization performance by treating individual kernels as base learners and combining them appropriately. However, existing MKL methods often lack a comprehensive consideration of diversity among base kernels, which has been shown to play an essential role in ensemble learning. Moreover, traditional kernels are predefined functions that treat all input features equally, ignoring feature diversity and yielding suboptimal performance. In this paper, we formally define kernel diversity and propose a novel class of data-driven kernels, the Recursive Feature Machine~(RFM) kernel, which learns feature importance directly from each dataset. In addition, we propose a novel kernel selection method that explicitly optimizes both kernel diversity and kernel quality. The resulting binary quadratic programming problem is NP-hard; we therefore reformulate it as a linear program and accelerate it via sketching techniques, and we provide a theoretical analysis of the estimation error based on covering number bounds. Extensive empirical studies demonstrate that the proposed method outperforms state-of-the-art MKL approaches.