Recent work has combined Stein's method with reproducing kernel Hilbert space theory to develop nonparametric goodness-of-fit tests for un-normalized probability distributions. However, the currently available tests apply exclusively to distributions with smooth density functions. In this work, we introduce a kernelized Stein discrepancy measure for discrete spaces, and develop a nonparametric goodness-of-fit test for discrete distributions with intractable normalization constants. Furthermore, we propose a general characterization of Stein operators that encompasses both discrete and continuous distributions, providing a recipe for constructing new Stein operators. We apply the proposed goodness-of-fit test to three statistical models involving discrete distributions, and our experiments show that the proposed test typically outperforms a two-sample test based on the maximum mean discrepancy.
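To make the idea concrete, below is a minimal, hypothetical sketch of how a kernelized discrete Stein discrepancy (KDSD) test could be assembled. It assumes a binary sample space {0,1}^d (so a cyclic coordinate permutation reduces to a bit flip), an exponentiated Hamming base kernel, one particular sign/difference convention for the Stein operator, and a multinomial-weight bootstrap for the null distribution of the degenerate U-statistic. All function and variable names (flip, difference_score, kdsd_test, etc.) are illustrative and not taken from the authors' code.

```python
# A minimal sketch of a kernelized discrete Stein discrepancy test for binary
# data with an unnormalized pmf p_tilde.  Conventions here are assumptions and
# may differ from the paper's exact operator definitions.
import numpy as np

def flip(x, i):
    """Flip coordinate i of a binary vector (the cyclic shift on {0,1})."""
    y = x.copy()
    y[i] = 1 - y[i]
    return y

def difference_score(log_p_tilde, x):
    """Discrete analogue of the score: s_i(x) = 1 - p(flip_i(x)) / p(x)."""
    d = len(x)
    return np.array([1.0 - np.exp(log_p_tilde(flip(x, i)) - log_p_tilde(x))
                     for i in range(d)])

def hamming_kernel(x, y, bandwidth=1.0):
    """Exponentiated Hamming kernel on binary vectors."""
    return np.exp(-np.sum(x != y) / (bandwidth * len(x)))

def stein_kernel(log_p_tilde, x, y, k=hamming_kernel):
    """Stein-modified kernel kappa_p(x, y) built from difference operators."""
    d = len(x)
    sx = difference_score(log_p_tilde, x)
    sy = difference_score(log_p_tilde, y)
    kxy = k(x, y)
    # Partial differences of the kernel in each argument and coordinate.
    dx = np.array([kxy - k(flip(x, i), y) for i in range(d)])
    dy = np.array([kxy - k(x, flip(y, i)) for i in range(d)])
    dxdy = np.array([kxy - k(flip(x, i), y) - k(x, flip(y, i))
                     + k(flip(x, i), flip(y, i)) for i in range(d)])
    return sx @ sy * kxy - sx @ dy - dx @ sy + dxdy.sum()

def kdsd_test(log_p_tilde, X, n_boot=200, alpha=0.05, seed=0):
    """U-statistic KDSD estimate with a multinomial-weight bootstrap."""
    rng = np.random.default_rng(seed)
    n = len(X)
    K = np.array([[stein_kernel(log_p_tilde, X[i], X[j]) for j in range(n)]
                  for i in range(n)])
    np.fill_diagonal(K, 0.0)                      # U-statistic: drop i == j terms
    stat = K.sum() / (n * (n - 1))
    boots = []
    for _ in range(n_boot):
        w = rng.multinomial(n, np.ones(n) / n) / n - 1.0 / n   # centered weights
        boots.append(w @ K @ w)                   # bootstrap replicate
    p_value = np.mean(np.array(boots) >= stat)
    return stat, p_value, p_value < alpha

# Example: test uniform samples against a product-of-Bernoulli(0.5) model.
if __name__ == "__main__":
    d, n = 5, 200
    log_p_tilde = lambda x: 0.0                   # uniform (unnormalized) pmf
    X = np.random.default_rng(1).integers(0, 2, size=(n, d))
    print(kdsd_test(log_p_tilde, X))
```

The key design point the sketch illustrates is that only the unnormalized pmf is needed: the normalization constant cancels in the ratio p(flip_i(x))/p(x), and the diagonal of the kernel matrix is dropped so the statistic is a degenerate U-statistic whose null distribution can be approximated by the weighted bootstrap above.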
Author Information
Jiasen Yang (Purdue University)
Qiang Liu (UT Austin)
Vinayak A Rao (Purdue University)
Jennifer Neville (Purdue University)
Related Events (a corresponding poster, oral, or spotlight)
- 2018 Poster: Goodness-of-fit Testing for Discrete Distributions via Stein Discrepancy
  Thu. Jul 12th, 04:15 -- 07:00 PM, Room Hall B #25
More from the Same Authors
- 2022 Poster: Centroid Approximation for Bootstrap: Improving Particle Quality at Inference
  Mao Ye · Qiang Liu
- 2022 Poster: How to Fill the Optimum Set? Population Gradient Descent with Harmless Diversity
  Chengyue Gong · · Qiang Liu
- 2022 Spotlight: How to Fill the Optimum Set? Population Gradient Descent with Harmless Diversity
  Chengyue Gong · · Qiang Liu
- 2022 Spotlight: Centroid Approximation for Bootstrap: Improving Particle Quality at Inference
  Mao Ye · Qiang Liu
- 2022 Poster: A Langevin-like Sampler for Discrete Distributions
  Ruqi Zhang · Xingchao Liu · Qiang Liu
- 2022 Spotlight: A Langevin-like Sampler for Discrete Distributions
  Ruqi Zhang · Xingchao Liu · Qiang Liu
- 2021 Poster: AlphaNet: Improved Training of Supernets with Alpha-Divergence
  Dilin Wang · Chengyue Gong · Meng Li · Qiang Liu · Vikas Chandra
- 2021 Oral: AlphaNet: Improved Training of Supernets with Alpha-Divergence
  Dilin Wang · Chengyue Gong · Meng Li · Qiang Liu · Vikas Chandra
- 2021 Poster: Coach-Player Multi-agent Reinforcement Learning for Dynamic Team Composition
  Bo Liu · Qiang Liu · Peter Stone · Animesh Garg · Yuke Zhu · Anima Anandkumar
- 2021 Oral: Coach-Player Multi-agent Reinforcement Learning for Dynamic Team Composition
  Bo Liu · Qiang Liu · Peter Stone · Animesh Garg · Yuke Zhu · Anima Anandkumar
- 2020 Poster: Good Subnetworks Provably Exist: Pruning via Greedy Forward Selection
  Mao Ye · Chengyue Gong · Lizhen Nie · Denny Zhou · Adam Klivans · Qiang Liu
- 2020 Poster: Go Wide, Then Narrow: Efficient Training of Deep Thin Networks
  Denny Zhou · Mao Ye · Chen Chen · Tianjian Meng · Mingxing Tan · Xiaodan Song · Quoc Le · Qiang Liu · Dale Schuurmans
- 2020 Poster: Accountable Off-Policy Evaluation With Kernel Bellman Statistics
  Yihao Feng · Tongzheng Ren · Ziyang Tang · Qiang Liu
- 2020 Poster: A Chance-Constrained Generative Framework for Sequence Optimization
  Xianggen Liu · Qiang Liu · Sen Song · Jian Peng
- 2019 Workshop: Stein’s Method for Machine Learning and Statistics
  Francois-Xavier Briol · Lester Mackey · Chris Oates · Qiang Liu · Larry Goldstein
- 2019 Poster: Improving Neural Language Modeling via Adversarial Training
  Dilin Wang · Chengyue Gong · Qiang Liu
- 2019 Oral: Improving Neural Language Modeling via Adversarial Training
  Dilin Wang · Chengyue Gong · Qiang Liu
- 2019 Poster: Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization
  Chengyue Gong · Jian Peng · Qiang Liu
- 2019 Poster: Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models
  Dilin Wang · Qiang Liu
- 2019 Poster: Relational Pooling for Graph Representations
  Ryan Murphy · Balasubramaniam Srinivasan · Vinayak A Rao · Bruno Ribeiro
- 2019 Oral: Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization
  Chengyue Gong · Jian Peng · Qiang Liu
- 2019 Oral: Relational Pooling for Graph Representations
  Ryan Murphy · Balasubramaniam Srinivasan · Vinayak A Rao · Bruno Ribeiro
- 2019 Oral: Nonlinear Stein Variational Gradient Descent for Learning Diversified Mixture Models
  Dilin Wang · Qiang Liu
- 2018 Poster: An Iterative, Sketching-based Framework for Ridge Regression
  Agniva Chowdhury · Jiasen Yang · Petros Drineas
- 2018 Poster: Learning to Explore via Meta-Policy Gradient
  Tianbing Xu · Qiang Liu · Liang Zhao · Jian Peng
- 2018 Poster: Stein Variational Gradient Descent Without Gradient
  Jun Han · Qiang Liu
- 2018 Oral: Stein Variational Gradient Descent Without Gradient
  Jun Han · Qiang Liu
- 2018 Oral: Learning to Explore via Meta-Policy Gradient
  Tianbing Xu · Qiang Liu · Liang Zhao · Jian Peng
- 2018 Oral: An Iterative, Sketching-based Framework for Ridge Regression
  Agniva Chowdhury · Jiasen Yang · Petros Drineas
- 2018 Poster: Stein Variational Message Passing for Continuous Graphical Models
  Dilin Wang · Zhe Zeng · Qiang Liu
- 2018 Oral: Stein Variational Message Passing for Continuous Graphical Models
  Dilin Wang · Zhe Zeng · Qiang Liu