Poster

Position: Considerations for Differentially Private Learning with Large-Scale Public Pretraining

Florian Tramer · Gautam Kamath · Nicholas Carlini

Hall C 4-9 #2406
Best Paper
[ Paper PDF ]
Tue 23 Jul 2:30 a.m. PDT — 4 a.m. PDT
 
Oral presentation: Oral 1B (Positions on How We Do Machine Learning Research)
Tue 23 Jul 1:30 a.m. PDT — 2:30 a.m. PDT

Abstract:

The performance of differentially private machine learning can be boosted significantly by leveraging the transfer learning capabilities of non-private models pretrained on large public datasets. We critically review this approach. We primarily question whether the use of large Web-scraped datasets should be viewed as differential-privacy-preserving. We further scrutinize whether existing machine learning benchmarks are appropriate for measuring the ability of pretrained models to generalize to sensitive domains. Finally, we observe that reliance on large pretrained models may come at the expense of other forms of privacy, since sensitive data must be outsourced to a more compute-powerful third party.
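
For readers unfamiliar with the setting the paper critiques, the following is a minimal sketch of differentially private fine-tuning on top of a publicly pretrained model, written against the Opacus library's DP-SGD API. The frozen-backbone setup, the random placeholder data, and all hyperparameters are illustrative assumptions, not the authors' experimental protocol.

    # Sketch: train only a linear head, with DP-SGD, on features from a
    # frozen publicly pretrained encoder. All data/hyperparameters below
    # are placeholders for illustration.
    import torch
    from torch import nn, optim
    from torch.utils.data import DataLoader, TensorDataset
    from opacus import PrivacyEngine

    # Stand-in for embeddings produced by a frozen, publicly pretrained backbone.
    features = torch.randn(1024, 512)        # 1024 "private" examples, 512-dim features
    labels = torch.randint(0, 10, (1024,))   # 10 classes
    loader = DataLoader(TensorDataset(features, labels), batch_size=64)

    head = nn.Linear(512, 10)                # only the head is trained privately
    optimizer = optim.SGD(head.parameters(), lr=0.1)
    criterion = nn.CrossEntropyLoss()

    privacy_engine = PrivacyEngine()
    head, optimizer, loader = privacy_engine.make_private(
        module=head,
        optimizer=optimizer,
        data_loader=loader,
        noise_multiplier=1.0,                # illustrative DP-SGD noise level
        max_grad_norm=1.0,                   # per-example gradient clipping bound
    )

    for epoch in range(3):
        for x, y in loader:
            optimizer.zero_grad()
            loss = criterion(head(x), y)
            loss.backward()
            optimizer.step()

    # The accountant's epsilon covers only this fine-tuning phase.
    epsilon = privacy_engine.get_epsilon(delta=1e-5)
    print(f"fine-tuning satisfies (eps={epsilon:.2f}, delta=1e-5)")

Note that the reported (ε, δ) guarantee covers only the private fine-tuning step; the paper's central question is whether the Web-scraped pretraining data deserves its "public" label in the first place.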
