

Spotlight

TURF: Two-Factor, Universal, Robust, Fast Distribution Learning Algorithm

Yi Hao · Ayush Jain · Alon Orlitsky · Vaishakh Ravindrakumar

Hall G

Abstract: Approximating distributions from their samples is a canonical statistical-learning problem. One of its most powerful and successful modalities approximates every distribution to an $\ell_1$ distance essentially at most a constant times larger than its closest $t$-piece degree-$d$ polynomial, where $t \ge 1$ and $d \ge 0$. Letting $c_{t,d}$ denote the smallest such factor, clearly $c_{1,0} = 1$, and it can be shown that $c_{t,d} \ge 2$ for all other $t$ and $d$. Yet current computationally efficient algorithms show only $c_{t,1} \le 2.25$, and the bound rises quickly to $c_{t,d} \ge 3$ for $d \ge 9$. We derive a near-linear-time and essentially sample-optimal estimator that establishes $c_{t,d} = 2$ for all $(t,d) \ne (1,0)$. Additionally, for many practical distributions, the lowest approximation distance is achieved by polynomials with a vastly varying number of pieces. We provide a method that estimates this number near-optimally, hence helps approach the best possible approximation. Experiments combining the two techniques confirm improved performance over existing methodologies.
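To make the $t$-piece, degree-$d$ polynomial model concrete, the sketch below fits such a polynomial to a sample histogram by plain per-piece least squares and measures its $\ell_1$ distance to the true density. This is only an illustrative baseline under assumed equal-width pieces, not the paper's TURF estimator; the function `piecewise_poly_fit`, its parameters, and the Beta-distribution toy example are hypothetical choices for illustration.

```python
# Minimal illustrative sketch (NOT the paper's TURF algorithm): fit a t-piece,
# degree-d polynomial to a sample histogram by per-piece least squares and
# report its l1 distance to the true density. Equal-width pieces are assumed.
import numpy as np
from scipy.stats import beta

def piecewise_poly_fit(samples, t=4, d=2, bins=200, lo=0.0, hi=1.0):
    """Return bin centers and a t-piece degree-d polynomial density estimate."""
    hist, edges = np.histogram(samples, bins=bins, range=(lo, hi), density=True)
    centers = (edges[:-1] + edges[1:]) / 2
    est = np.empty_like(hist)
    piece_edges = np.linspace(lo, hi, t + 1)   # equal-width pieces (a simplification)
    for i in range(t):
        mask = (centers >= piece_edges[i]) & (centers <= piece_edges[i + 1])
        coeffs = np.polyfit(centers[mask], hist[mask], d)   # least-squares fit on this piece
        est[mask] = np.polyval(coeffs, centers[mask])
    return centers, np.clip(est, 0, None)

# Toy usage: 20k samples from a Beta(2, 5) density on [0, 1].
rng = np.random.default_rng(0)
samples = rng.beta(2, 5, size=20000)
x, est = piecewise_poly_fit(samples, t=4, d=2)
true_pdf = beta.pdf(x, 2, 5)
l1 = np.sum(np.abs(est - true_pdf)) * (x[1] - x[0])   # Riemann-sum l1 error
print(f"l1 distance of the 4-piece quadratic fit: {l1:.3f}")
```

In the paper's framing, the quantity of interest is how this kind of estimate compares against the best possible $t$-piece degree-$d$ approximation of the underlying distribution; the sketch only shows the model class and the $\ell_1$ yardstick, and choosing the number of pieces $t$ well is the second problem the abstract addresses.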
