Distribution estimation is a cornerstone of statistical learning. Its classical min-max formulation minimizes the estimation error for the worst-case distribution, and hence under-performs on practical distributions that, like power laws, are often rather simple. Modern research has therefore focused on two frameworks: structural estimation, which improves learning accuracy by assuming a simple structure for the underlying distribution; and competitive, or instance-optimal, estimation, which achieves the performance of a genie-aided estimator up to a small excess error that vanishes as the sample size grows, regardless of the distribution. This paper combines and strengthens the two frameworks. It designs a single estimator whose excess error vanishes both at a universal rate as the sample size grows and as the (unknown) distribution gets simpler. We show that the resulting algorithm significantly improves the performance guarantees of numerous competitive- and structural-estimation results. The algorithm runs in near-linear time and is robust to model misspecification and domain-symbol permutations.
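The abstract's motivating point, that worst-case (min-max) guarantees are pessimistic because simple distributions such as power laws are much easier to learn, can be checked empirically. The toy sketch below (not the paper's estimator; all function names are illustrative) measures the expected total-variation error of the plain empirical estimator on a near-uniform distribution versus a power-law distribution over the same support, for the same sample size:

```python
import random
from collections import Counter

def empirical_estimate(samples, k):
    """Empirical frequency estimate over support {0, ..., k-1}."""
    counts = Counter(samples)
    n = len(samples)
    return [counts.get(i, 0) / n for i in range(k)]

def tv_distance(p, q):
    """Total-variation distance between two distributions on the same support."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def expected_tv(p, n, trials=200, seed=0):
    """Monte Carlo estimate of E[TV(p, p_hat)] for n i.i.d. samples from p."""
    rng = random.Random(seed)
    k = len(p)
    total = 0.0
    for _ in range(trials):
        samples = rng.choices(range(k), weights=p, k=n)
        total += tv_distance(p, empirical_estimate(samples, k))
    return total / trials

k, n = 100, 500
uniform = [1 / k] * k
# Power law with exponent 2: p_i proportional to 1 / i^2.
z = sum(1 / (i + 1) ** 2 for i in range(k))
power_law = [1 / (((i + 1) ** 2) * z) for i in range(k)]

print("uniform  :", expected_tv(uniform, n))
print("power law:", expected_tv(power_law, n))
```

The uniform distribution, close to the min-max worst case, incurs a noticeably larger expected error than the power law at the same sample size, which is exactly the gap that instance-optimal and structural estimators aim to exploit.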
Author Information
Yi Hao (University of California, San Diego)
Alon Orlitsky (UCSD)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Doubly-Competitive Distribution Estimation
  Wed. Jun 12th 01:30 -- 04:00 AM, Room: Pacific Ballroom #166
More from the Same Authors
- 2022 Poster: TURF: Two-Factor, Universal, Robust, Fast Distribution Learning Algorithm
  Yi Hao · Ayush Jain · Alon Orlitsky · Vaishakh Ravindrakumar
- 2022 Spotlight: TURF: Two-Factor, Universal, Robust, Fast Distribution Learning Algorithm
  Yi Hao · Ayush Jain · Alon Orlitsky · Vaishakh Ravindrakumar
- 2021 Poster: Compressed Maximum Likelihood
  Yi Hao · Alon Orlitsky
- 2021 Spotlight: Compressed Maximum Likelihood
  Yi Hao · Alon Orlitsky
- 2021 Poster: Robust Density Estimation from Batches: The Best Things in Life are (Nearly) Free
  Ayush Jain · Alon Orlitsky
- 2021 Oral: Robust Density Estimation from Batches: The Best Things in Life are (Nearly) Free
  Ayush Jain · Alon Orlitsky
- 2020 Poster: Optimal Sequential Maximization: One Interview is Enough!
  Moein Falahatgar · Alon Orlitsky · Venkatadheeraj Pichapati
- 2020 Poster: Optimal Robust Learning of Discrete Distributions from Batches
  Ayush Jain · Alon Orlitsky
- 2020 Poster: Data Amplification: Instance-Optimal Property Estimation
  Yi Hao · Alon Orlitsky
- 2018 Poster: The Limits of Maxing, Ranking, and Preference Learning
  Moein Falahatgar · Ayush Jain · Alon Orlitsky · Venkatadheeraj Pichapati · Vaishakh Ravindrakumar
- 2018 Oral: The Limits of Maxing, Ranking, and Preference Learning
  Moein Falahatgar · Ayush Jain · Alon Orlitsky · Venkatadheeraj Pichapati · Vaishakh Ravindrakumar
- 2017 Poster: A Unified Maximum Likelihood Approach for Estimating Symmetric Properties of Discrete Distributions
  Jayadev Acharya · Hirakendu Das · Alon Orlitsky · Ananda Suresh
- 2017 Poster: Maximum Selection and Ranking under Noisy Comparisons
  Moein Falahatgar · Alon Orlitsky · Venkatadheeraj Pichapati · Ananda Theertha Suresh
- 2017 Talk: A Unified Maximum Likelihood Approach for Estimating Symmetric Properties of Discrete Distributions
  Jayadev Acharya · Hirakendu Das · Alon Orlitsky · Ananda Suresh
- 2017 Talk: Maximum Selection and Ranking under Noisy Comparisons
  Moein Falahatgar · Alon Orlitsky · Venkatadheeraj Pichapati · Ananda Theertha Suresh