Random Function Priors for Correlation Modeling
Aonan Zhang · John Paisley

Wed Jun 12th 02:25 -- 02:30 PM @ Room 101

Many hidden structures underlying high-dimensional data can be compactly expressed by a discrete random measure $\xi_n=\sum_{k\in[K]} Z_{nk}\delta_{\theta_k}$, where $(\theta_k)_{k\in[K]}\subset\Theta$ is a collection of hidden atoms shared across observations (indexed by $n$). Previous Bayesian nonparametric methods focus on embedding $\xi_n$ onto alternative spaces to resolve complex atom correlations. However, these methods can be rigid and hard to learn in practice. In this paper, we temporarily ignore the atom space $\Theta$ and embed the population of random measures $(\xi_n)_{n\in\mathbb{N}}$ altogether as $\xi'$ onto an infinite strip $[0,1]\times\mathbb{R}_+$, where the order of atoms is \textit{removed} by assuming separate exchangeability. Through a ``de Finetti-type'' result, we can represent $\xi'$ as a coupling of a 2d Poisson process and exchangeable random functions $(f_n)_{n\in\mathbb{N}}$, where each $f_n$ is an object-specific atom-sampling function. In this way, we transform the problem from learning complex correlations over discrete random measures into learning complex functions, which can be parameterized with deep neural networks. In practice, we introduce an efficient amortized variational inference algorithm that learns $f_n$ without pain; i.e., no local gradient steps are required during stochastic inference.
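As a rough illustration of the generative picture described above (not the authors' implementation), the sketch below samples a homogeneous 2d Poisson process on a truncated strip $[0,1]\times[0,w_{\max}]$ and applies per-object atom-sampling functions $f_n$ to produce atom indicators $Z_{nk}$. The truncation level `w_max`, the logistic form of `f`, and the per-object parameters `A`, `b` are all hypothetical choices made here for concreteness; the paper instead learns such functions with deep neural networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Truncated homogeneous 2d Poisson process on the strip [0,1] x [0, w_max].
# (The paper works with the full infinite strip [0,1] x R_+; w_max is a
# truncation level introduced here purely for illustration.)
rate, w_max = 5.0, 10.0
K = rng.poisson(rate * w_max)          # number of atoms in the window
U = rng.uniform(0.0, 1.0, size=K)      # first coordinate of each atom
W = rng.uniform(0.0, w_max, size=K)    # second coordinate of each atom

# Object-specific atom-sampling functions f_n. Here each f_n is a toy
# logistic function of (u, w) with hypothetical per-object weights.
N = 3
A = rng.normal(size=(N, 2))            # per-object parameters (illustrative)
b = rng.normal(size=N)

def f(n, u, w):
    """Toy atom-sampling function: probability that object n uses atom (u, w)."""
    return 1.0 / (1.0 + np.exp(-(A[n, 0] * u + A[n, 1] * w + b[n])))

# Z[n, k] plays the role of the atom weight Z_{nk} in the measure xi_n.
P = np.stack([f(n, U, W) for n in range(N)])   # (N, K) membership probabilities
Z = rng.uniform(size=(N, K)) < P               # Bernoulli atom indicators
```

Separate exchangeability shows up here in that permuting the atoms (columns) or the objects (rows) of `Z` leaves the joint distribution unchanged.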

Author Information

Aonan Zhang (Columbia University)
John Paisley (Columbia University)
