Gaussian Process Factor Analysis (GPFA) has been broadly applied to the problem of identifying smooth, low-dimensional temporal structure underlying large-scale neural recordings. However, spike trains are non-Gaussian, which motivates combining GPFA with discrete observation models for binned spike count data. The drawback to this approach is that GPFA priors are not conjugate to count model likelihoods, which makes inference challenging. Here we address this obstacle by introducing a fast, approximate inference method for non-conjugate GPFA models. Our approach uses orthogonal second-order polynomials to approximate the nonlinear terms in the non-conjugate log-likelihood, resulting in a method we refer to as polynomial approximate log-likelihood (PAL) estimators. This approximation allows for accurate closed-form evaluation of marginal likelihoods and fast numerical optimization for parameters and hyperparameters. We derive PAL estimators for GPFA models with binomial, Poisson, and negative binomial observations, and find that PAL estimation is highly accurate and converges faster than existing state-of-the-art inference methods. We also find that PAL hyperparameters can provide sensible initialization for black box variational inference (BBVI), which improves BBVI accuracy. We demonstrate that PAL estimators achieve fast and accurate extraction of latent structure from multi-neuron spike train data.
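The key idea — replacing a log-likelihood nonlinearity with a second-order polynomial so that expectations under a Gaussian latent become closed-form — can be sketched in a few lines. The sketch below is illustrative only: it uses a plain least-squares quadratic fit of the Poisson nonlinearity exp(x) on an arbitrary interval, not the paper's specific orthogonal-polynomial construction, and the interval and Gaussian moments are made-up values.

```python
import numpy as np

# Fit a quadratic surrogate to the Poisson log-likelihood nonlinearity
# exp(x). The interval [-2, 2] and the discrete least-squares fit are
# illustrative choices, not the paper's exact scheme.
xs = np.linspace(-2.0, 2.0, 201)
c2, c1, c0 = np.polyfit(xs, np.exp(xs), deg=2)  # exp(x) ~ c2*x^2 + c1*x + c0

# For a Gaussian latent x ~ N(m, v), the surrogate's expectation is
# closed-form in the first two moments:
#   E[c0 + c1*x + c2*x^2] = c0 + c1*m + c2*(v + m^2)
m, v = 0.5, 0.1
approx = c0 + c1 * m + c2 * (v + m**2)

# The exact expectation E[exp(x)] is the log-normal mean, available here
# only because exp() is special; for general nonlinearities the quadratic
# surrogate is what keeps the marginal likelihood tractable.
exact = np.exp(m + v / 2)
```

Because the surrogate is quadratic in x, the same trick applied across all time bins and neurons keeps the integrand jointly Gaussian-quadratic, which is what permits the closed-form marginal-likelihood evaluation described above.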
Author Information
Stephen Keeley (Princeton University)
David Zoltowski (Princeton University)
Yiyi Yu (UNC)
Spencer Smith (UC Santa Barbara)
Jonathan Pillow (Princeton University)
More from the Same Authors
- 2021 Poster: Factor-analytic inverse regression for high-dimension, small-sample dimensionality reduction
  Aditi Jha · Michael J. Morais · Jonathan Pillow
- 2021 Poster: Inferring Latent Dynamics Underlying Neural Population Activity via Neural Differential Equations
  Timothy Kim · Thomas Luo · Jonathan Pillow · Carlos Brody
- 2021 Spotlight: Factor-analytic inverse regression for high-dimension, small-sample dimensionality reduction
  Aditi Jha · Michael J. Morais · Jonathan Pillow
- 2021 Oral: Inferring Latent Dynamics Underlying Neural Population Activity via Neural Differential Equations
  Timothy Kim · Thomas Luo · Jonathan Pillow · Carlos Brody
- 2020 Poster: A general recurrent state space framework for modeling neural dynamics during decision-making
  David Zoltowski · Jonathan Pillow · Scott Linderman