Much of the theory for classical sparse recovery is based on conditions on the dictionary that are either necessary and sufficient (e.g., the nullspace property) or only sufficient (e.g., incoherence and restricted isometry). In contrast, much of the theory for subspace-preserving recovery, the theoretical underpinning of sparse subspace classification and clustering methods, is based on conditions on the subspaces and the data that are only sufficient (e.g., subspace incoherence and data inner-radius). This paper derives a necessary and sufficient condition for subspace-preserving recovery that is inspired by the classical nullspace property. Based on this novel condition, called here the subspace nullspace property, we derive equivalent characterizations that either admit a clear geometric interpretation relating data distribution and subspace separation to recovery success, or can be verified using a finite set of extreme points of a properly defined set. We further exploit these characterizations to derive new sufficient conditions, based on inner-radius and outer-radius measures and dual bounds, that generalize existing conditions while preserving the geometric interpretations. These results fill an important gap in the subspace-preserving recovery literature.
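For context, the classical nullspace property that inspires this work can be stated as follows. This is the standard formulation from the sparse recovery literature, not the paper's subspace variant:

```latex
% Classical nullspace property (NSP) of order s:
% every nonzero vector in the nullspace of A must have its
% \ell_1-mass concentrated outside any set of s coordinates.
A \in \mathbb{R}^{m \times n} \text{ satisfies the NSP of order } s
\iff
\forall\, v \in \ker(A)\setminus\{0\},\;
\forall\, S \subseteq \{1,\dots,n\} \text{ with } |S| \le s:\;
\|v_S\|_1 < \|v_{S^c}\|_1,
```

where $v_S$ denotes the restriction of $v$ to the coordinates in $S$. The NSP of order $s$ is necessary and sufficient for $\ell_1$-minimization to recover every $s$-sparse signal, which is the sense in which the abstract contrasts it with merely sufficient conditions such as incoherence and the restricted isometry property.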
Author Information
Mustafa D Kaba (eBay Inc)
Chong You (University of California Berkeley)
Daniel Robinson (Lehigh University)
Enrique Mallada (Johns Hopkins University)
Rene Vidal (Johns Hopkins University, USA)
Related Events (a corresponding poster, oral, or spotlight)
- 2021 Spotlight: A Nullspace Property for Subspace-Preserving Recovery »
  Wed. Jul 21st 02:25 -- 02:30 PM Room
More from the Same Authors
- 2023 Workshop: HiLD: High-dimensional Learning Dynamics Workshop »
  Courtney Paquette · Zhenyu Liao · Mihai Nica · Elliot Paquette · Andrew Saxe · Rene Vidal
- 2023 Poster: On the Convergence of Gradient Flow on Multi-layer Linear Models »
  Hancheng Min · Rene Vidal · Enrique Mallada
- 2023 Poster: Learning Globally Smooth Functions on Manifolds »
  Juan Cervino · Luiz Chamon · Benjamin Haeffele · Rene Vidal · Alejandro Ribeiro
- 2023 Poster: The Ideal Continual Learner: An Agent That Never Forgets »
  Liangzu Peng · Paris Giampouras · Rene Vidal
- 2022 Poster: Understanding Doubly Stochastic Clustering »
  Tianjiao Ding · Derek Lim · Rene Vidal · Benjamin Haeffele
- 2022 Spotlight: Understanding Doubly Stochastic Clustering »
  Tianjiao Ding · Derek Lim · Rene Vidal · Benjamin Haeffele
- 2022 Poster: On the Optimization Landscape of Neural Collapse under MSE Loss: Global Optimality with Unconstrained Features »
  Jinxin Zhou · Xiao Li · Tianyu Ding · Chong You · Qing Qu · Zhihui Zhu
- 2022 Poster: Robust Training under Label Noise by Over-parameterization »
  Sheng Liu · Zhihui Zhu · Qing Qu · Chong You
- 2022 Spotlight: Robust Training under Label Noise by Over-parameterization »
  Sheng Liu · Zhihui Zhu · Qing Qu · Chong You
- 2022 Spotlight: On the Optimization Landscape of Neural Collapse under MSE Loss: Global Optimality with Unconstrained Features »
  Jinxin Zhou · Xiao Li · Tianyu Ding · Chong You · Qing Qu · Zhihui Zhu
- 2022 Poster: Reverse Engineering $\ell_p$ attacks: A block-sparse optimization approach with recovery guarantees »
  Darshan Thaker · Paris Giampouras · Rene Vidal
- 2022 Spotlight: Reverse Engineering $\ell_p$ attacks: A block-sparse optimization approach with recovery guarantees »
  Darshan Thaker · Paris Giampouras · Rene Vidal
- 2021 Poster: Dual Principal Component Pursuit for Robust Subspace Learning: Theory and Algorithms for a Holistic Approach »
  Tianyu Ding · Zhihui Zhu · Rene Vidal · Daniel Robinson
- 2021 Spotlight: Dual Principal Component Pursuit for Robust Subspace Learning: Theory and Algorithms for a Holistic Approach »
  Tianyu Ding · Zhihui Zhu · Rene Vidal · Daniel Robinson
- 2021 Poster: Understanding the Dynamics of Gradient Flow in Overparameterized Linear Models »
  Salma Tarmoun · Guilherme Franca · Benjamin Haeffele · Rene Vidal
- 2021 Poster: On the Explicit Role of Initialization on the Convergence and Implicit Bias of Overparametrized Linear Networks »
  Hancheng Min · Salma Tarmoun · Rene Vidal · Enrique Mallada
- 2021 Spotlight: On the Explicit Role of Initialization on the Convergence and Implicit Bias of Overparametrized Linear Networks »
  Hancheng Min · Salma Tarmoun · Rene Vidal · Enrique Mallada
- 2021 Spotlight: Understanding the Dynamics of Gradient Flow in Overparameterized Linear Models »
  Salma Tarmoun · Guilherme Franca · Benjamin Haeffele · Rene Vidal
- 2020: Spotlight talk 1 - A Second-Order Optimization Algorithm for Solving Problems Involving Group Sparse Regularization »
  Daniel Robinson
- 2019 Poster: Noisy Dual Principal Component Pursuit »
  Tianyu Ding · Zhihui Zhu · Tianjiao Ding · Yunchen Yang · Daniel Robinson · Manolis Tsakiris · Rene Vidal
- 2019 Oral: Noisy Dual Principal Component Pursuit »
  Tianyu Ding · Zhihui Zhu · Tianjiao Ding · Yunchen Yang · Daniel Robinson · Manolis Tsakiris · Rene Vidal
- 2018 Poster: ADMM and Accelerated ADMM as Continuous Dynamical Systems »
  Guilherme Franca · Daniel Robinson · Rene Vidal
- 2018 Oral: ADMM and Accelerated ADMM as Continuous Dynamical Systems »
  Guilherme Franca · Daniel Robinson · Rene Vidal