Privately Learning Subspaces
Vikrant Singhal · Thomas Steinke
Private data analysis suffers a costly curse of dimensionality. However, the data often has an underlying low-dimensional structure. For example, when optimizing via gradient descent, the gradients often lie in or near a low-dimensional subspace. If that low-dimensional structure can be identified, then we can avoid paying (in terms of privacy or accuracy) for the high ambient dimension.
We present differentially private algorithms that take input data sampled from a low-dimensional linear subspace (possibly with a small amount of error) and output that subspace (or an approximation to it). These algorithms can serve as a pre-processing step for other procedures.
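To make the task concrete, here is a minimal sketch of one generic baseline for private subspace estimation: perturb the empirical second-moment matrix with the Gaussian mechanism and return its top-k eigenvectors. This is NOT the algorithm from the paper, only an illustration of the problem setup; the function name, the row-clipping step, and the noise calibration are assumptions made for this sketch.

```python
import numpy as np

def private_subspace_estimate(X, k, epsilon, delta, clip_norm=1.0):
    """Sketch: Gaussian-mechanism baseline for subspace estimation.

    Not the paper's algorithm. Perturbs the empirical second-moment
    matrix A = X^T X and returns the top-k eigenvectors of the noisy
    matrix. Rows are clipped to `clip_norm` so that changing a single
    record moves A by at most clip_norm**2 in Frobenius norm.
    """
    n, d = X.shape
    # Clip each row so one person's data has bounded influence.
    norms = np.linalg.norm(X, axis=1, keepdims=True)
    X = X * np.minimum(1.0, clip_norm / np.maximum(norms, 1e-12))

    A = X.T @ X  # empirical second-moment matrix (d x d)

    # Gaussian mechanism: the L2 sensitivity of A is clip_norm**2.
    sigma = clip_norm**2 * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    noise = np.random.normal(0.0, sigma, size=(d, d))
    noise = (noise + noise.T) / np.sqrt(2.0)  # symmetrize, preserve variance
    A_noisy = A + noise

    # The top-k eigenvectors span the estimated subspace.
    eigvals, eigvecs = np.linalg.eigh(A_noisy)
    top = np.argsort(eigvals)[::-1][:k]
    return eigvecs[:, top]  # d x k orthonormal basis
```

When the data lie exactly in a k-dimensional subspace and the privacy budget is generous, the returned basis closely spans that subspace; with points only near a subspace, the eigenvalue gap shrinks and the noise matters more, which is the regime the paper's algorithms are designed for.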
Author Information
Vikrant Singhal (Northeastern University)
Thomas Steinke (IBM, Almaden)
More from the Same Authors
- 2021: The Distributed Discrete Gaussian Mechanism for Federated Learning with Secure Aggregation
  Peter Kairouz · Ziyu Liu · Thomas Steinke
- 2023: Privacy Auditing with One (1) Training Run
  Thomas Steinke · Milad Nasresfahani · Matthew Jagielski
- 2023 Poster: Why Is Public Pretraining Necessary for Private Model Training?
  Arun Ganesh · Mahdi Haghifam · Milad Nasresfahani · Sewoong Oh · Thomas Steinke · Om Thakkar · Abhradeep Guha Thakurta · Lun Wang
- 2022 Poster: Public Data-Assisted Mirror Descent for Private Model Training
  Ehsan Amid · Arun Ganesh · Rajiv Mathews · Swaroop Ramaswamy · Shuang Song · Thomas Steinke · Vinith Suriyakumar · Om Thakkar · Abhradeep Guha Thakurta
- 2022 Spotlight: Public Data-Assisted Mirror Descent for Private Model Training
  Ehsan Amid · Arun Ganesh · Rajiv Mathews · Swaroop Ramaswamy · Shuang Song · Thomas Steinke · Vinith Suriyakumar · Om Thakkar · Abhradeep Guha Thakurta
- 2021 Invited Talk: Thomas Steinke
- 2020 Poster: New Oracle-Efficient Algorithms for Private Synthetic Data Release
  Giuseppe Vietri · Grace Tian · Mark Bun · Thomas Steinke · Steven Wu