Privately Learning Mixtures of Axis-Aligned Gaussians
Ishaq Aden-Ali · Hassan Ashtiani · Christopher Liaw
We consider the problem of learning multivariate Gaussians under the constraint of approximate differential privacy. We prove that $\widetilde{O}(k^2 d \log^{3/2}(1/\delta) / \alpha^2 \varepsilon)$ samples are sufficient to learn a mixture of $k$ axis-aligned Gaussians in $\mathbb{R}^d$ to within total variation distance $\alpha$ while satisfying $(\varepsilon, \delta)$-differential privacy. This is the first result for privately learning mixtures of unbounded axis-aligned (or even unbounded univariate) Gaussians. If the covariance matrix of each Gaussian is the identity matrix, we show that $\widetilde{O}(kd/\alpha^2 + kd \log(1/\delta) / \alpha \varepsilon)$ samples are sufficient.
To prove our results, we design a new technique for privately learning mixture distributions. A class of distributions $\mathcal{F}$ is said to be list-decodable if there is an algorithm that, given "heavily corrupted" samples from $f \in \mathcal{F}$, outputs a list of distributions, $\widehat{\mathcal{F}}$, such that one of the distributions in $\widehat{\mathcal{F}}$ approximates $f$. We show that if $\mathcal{F}$ is privately list-decodable then we can learn mixtures of distributions in $\mathcal{F}$. Finally, we show that axis-aligned Gaussian distributions are privately list-decodable, thereby proving mixtures of such distributions are privately learnable.
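As a toy illustration of the list-decoding viewpoint (this is our own sketch, not the paper's algorithm, and it ignores privacy entirely): from the perspective of any single mixture component, the samples drawn from the other components act as "heavy corruption", yet a list decoder may still output a short list of candidates such that at least one candidate is close to each component. A minimal non-private 1D sketch, where the neighborhood-median candidate generation and the `width` parameter are purely illustrative choices:

```python
import random
import statistics


def list_decode_means(samples, width=1.0):
    """Toy list decoder for means of unit-variance 1D Gaussians.

    For each sample, propose the median of its width-neighborhood as a
    candidate mean, then prune near-duplicates so the list stays short.
    With enough data, some candidate lands near each true component mean,
    even though each component sees the others' samples as corruption.
    """
    candidates = sorted(
        statistics.median([x for x in samples if abs(x - s) <= width])
        for s in samples
    )
    pruned = []
    for c in candidates:
        if not pruned or c - pruned[-1] > width:
            pruned.append(c)
    return pruned


random.seed(0)
# Mixture of two unit-variance Gaussians centered at -5 and +5.
samples = [random.gauss(-5, 1) for _ in range(200)] + \
          [random.gauss(+5, 1) for _ in range(200)]
cands = list_decode_means(samples)
# The list is short, and some candidate is close to each true mean.
```

The point of the reduction is that one only needs *some* list element to approximate each component; a separate (private) selection step can then assemble the mixture from the list.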
Author Information
Ishaq Aden-Ali (McMaster University)
Hassan Ashtiani (McMaster University)
Christopher Liaw (Department of Computer Science, University of Toronto)
More from the Same Authors

2023 Poster: Polynomial Time and Private Learning of Unbounded Gaussian Mixture Models »
Jamil Arbas · Hassan Ashtiani · Christopher Liaw 
2020 Poster: Black-box Certification and Learning under Adversarial Perturbations »
Hassan Ashtiani · Vinayak Pathak · Ruth Urner