

Poster

Deep Principal Support Vector Machines for Nonlinear Sufficient Dimension Reduction

YinFeng Chen · Jin Liu · Rui Qiu

East Exhibition Hall A-B #E-2310
Tue 15 Jul 11 a.m. PDT — 1:30 p.m. PDT

Abstract: The normal vectors obtained from the support vector machine (SVM) method offer the potential to achieve sufficient dimension reduction in both classification and regression scenarios. Motivated by this, we introduce a unified framework for nonlinear sufficient dimension reduction based on a classification ensemble. Kernel principal SVM, which leverages the reproducing kernel Hilbert space, can almost be regarded as a special case of this framework, and we generalize it by using a neural network function class for more flexible deep nonlinear reduction. We theoretically prove its unbiasedness with respect to the central $\sigma$-field and provide a nonasymptotic upper bound for the estimation error. Simulations and real data analysis demonstrate the considerable competitiveness of the proposed method, especially under heavy data contamination, large sample sizes, and complex inputs.
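The abstract describes the approach only at a high level. Below is a minimal, hypothetical sketch of the classification-ensemble idea it alludes to: dichotomize the response at several quantiles, fit an SVM-type hinge-loss classifier for each dichotomy on top of a shared neural-network representation, and treat the learned representation as the nonlinear reduction. All names (`DeepPSVM`, `fit`), the number of slices, and the ridge penalty are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code) of a deep principal-SVM-style
# classification ensemble for sufficient dimension reduction.
import torch
import torch.nn as nn

class DeepPSVM(nn.Module):
    def __init__(self, in_dim, red_dim, n_slices, hidden=64):
        super().__init__()
        # Shared nonlinear representation f_theta: R^p -> R^d (the learned reduction).
        self.feature = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, red_dim),
        )
        # One linear SVM-type classifier per dichotomy of the response.
        self.heads = nn.Linear(red_dim, n_slices)

    def forward(self, x):
        return self.heads(self.feature(x))

def fit(x, y, red_dim=2, n_slices=10, lam=1e-3, epochs=200, lr=1e-2):
    # Dichotomize a continuous response at equally spaced quantiles: z_h = sign(y - q_h).
    qs = torch.quantile(y, torch.linspace(0.1, 0.9, n_slices))
    z = torch.where(y.unsqueeze(1) > qs.unsqueeze(0), 1.0, -1.0)  # (n, H) labels

    model = DeepPSVM(x.shape[1], red_dim, n_slices)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        scores = model(x)                                        # (n, H) decision values
        hinge = torch.clamp(1.0 - z * scores, min=0.0).mean()    # ensemble hinge loss
        ridge = lam * sum((p ** 2).sum() for p in model.parameters())
        (hinge + ridge).backward()
        opt.step()
    return model  # model.feature(x) gives the low-dimensional representation
```

With a linear feature map this reduces to the classical principal-SVM construction, where the stacked normal vectors of the slice-wise SVMs span the reduction subspace; replacing the map with a neural network is what the deep variant in this poster does.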

Lay Summary:

In many real-world problems, data can be noisy, making it difficult to identify the most important information. Inspired by how support vector machines (SVMs) detect key patterns, we propose a general framework to achieve more efficient dimension reduction using multiple SVMs. This approach helps isolate the most informative structures within the data.

To increase flexibility, we incorporate neural networks, which are well-known for adapting to complex problems. By combining these tools, we can learn meaningful representations that capture both linear and nonlinear relationships. Our method performs well in both theoretical analysis and practical experiments. Overall, it represents a promising step toward making complex data easier to understand and analyze efficiently.
