
 
Oral
Dimensionality Reduction for Tukey Regression
Kenneth Clarkson · Ruosong Wang · David Woodruff

Thu Jun 13 11:20 AM -- 11:25 AM (PDT) @ Room 101
We give the first dimensionality reduction methods for the overconstrained Tukey regression problem. The Tukey loss function $\|y\|_M = \sum_i M(y_i)$ has $M(y_i) \approx |y_i|^p$ for residual errors $y_i$ smaller than a prescribed threshold $\tau$, but $M(y_i)$ becomes constant for errors $|y_i| > \tau$. Our results depend on a new structural result, proven constructively, showing that for any $d$-dimensional subspace $L \subset \mathbb{R}^n$, there is a fixed bounded-size subset of coordinates containing, for every $y \in L$, all the large coordinates, with respect to the Tukey loss function, of $y$. Our methods reduce a given Tukey regression problem to a smaller weighted version, whose solution is a provably good approximate solution to the original problem. Our reductions are fast, simple, and easy to implement, and we give empirical results demonstrating their practicality, using existing heuristic solvers for the small versions. We also give exponential-time algorithms giving provably good solutions, and hardness results suggesting that a significant speedup in the worst case is unlikely.
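
For intuition, here is a minimal sketch of a Tukey-style loss matching the description above, assuming the simplified form $M(y_i) = |y_i|^p$ for $|y_i| \le \tau$ and $M(y_i) = \tau^p$ otherwise; the function name tukey_loss and the defaults tau=1.0, p=2.0 are illustrative assumptions, and the paper's exact choice of $M$ may differ.

    import numpy as np

    def tukey_loss(y, tau=1.0, p=2.0):
        """Sum of M(y_i) over the residual vector y, with M capped at tau**p.

        Simplified illustrative form: M(t) = |t|**p if |t| <= tau, else tau**p.
        """
        y = np.abs(np.asarray(y, dtype=float))
        return np.sum(np.where(y <= tau, y**p, tau**p))

    # Residuals beyond the threshold contribute only the constant tau**p,
    # so a single gross outlier cannot dominate the objective.
    residuals = np.array([0.1, -0.5, 3.0, 100.0])
    print(tukey_loss(residuals, tau=1.0, p=2.0))  # 0.01 + 0.25 + 1.0 + 1.0 = 2.26

This bounded contribution of large residuals is what makes the loss robust to outliers, and is also why the structural result above focuses on isolating the coordinates that are "large" with respect to the Tukey loss.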

Author Information

Kenneth Clarkson (IBM Research)
Ruosong Wang (Carnegie Mellon University)
David Woodruff (Carnegie Mellon University)
