Streaming Inference for Infinite Feature Models
Rylan Schaeffer · Yilun Du · Gabrielle K Liu · Ila R. Fiete

Wed Jul 20 10:50 AM -- 10:55 AM (PDT) @ Room 301 - 303

Unsupervised learning from a continuous stream of data is arguably one of the most common and challenging problems facing intelligent agents. One class of unsupervised models, collectively termed feature models, attempts unsupervised discovery of the latent features underlying the data; it includes common models such as PCA, ICA, and NMF. However, when data arrive in a continuous stream, determining the number of features is a significant challenge, and the number may grow over time. In this work, we make feature models significantly more applicable to streaming data by imbuing them with the ability to create new features online, in a probabilistic and principled manner. To achieve this, we derive a novel recursive form of the Indian Buffet Process, which we term the Recursive IBP (R-IBP). We demonstrate that R-IBP can be used as a prior for feature models to efficiently infer a posterior over an unbounded number of latent features, with quasilinear average time complexity and logarithmic average space complexity. We compare R-IBP to existing offline sampling and variational baselines in two feature models (Linear Gaussian and Factor Analysis) and demonstrate on synthetic and real data that R-IBP achieves comparable or better performance in significantly less time.
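The abstract does not reproduce the R-IBP recursion itself, but the classical Indian Buffet Process prior it builds on can be sketched as a generative process: customer n takes each previously sampled dish k with probability m_k / n (where m_k is the number of earlier customers who took dish k), then samples a Poisson(α / n) number of brand-new dishes. The function and parameter names below are illustrative, not the authors' implementation; this is a minimal sketch of the standard (offline) IBP, assuming a numpy environment.

```python
import numpy as np

def sample_ibp(n_customers, alpha, rng=None):
    """Sample a binary feature (customer x dish) matrix from the standard IBP.

    Customer n takes existing dish k with probability m_k / n, then
    samples Poisson(alpha / n) new dishes. Illustrative sketch only;
    the paper's R-IBP instead maintains this prior recursively online.
    """
    rng = np.random.default_rng(rng)
    counts = []  # counts[k] = number of customers who have taken dish k
    rows = []    # one 0/1 row per customer (ragged until padded below)
    for n in range(1, n_customers + 1):
        row = []
        for k in range(len(counts)):
            take = rng.random() < counts[k] / n
            counts[k] += take
            row.append(int(take))
        for _ in range(rng.poisson(alpha / n)):  # brand-new dishes
            counts.append(1)
            row.append(1)
        rows.append(row)
    # pad ragged rows into a rectangular binary matrix
    Z = np.zeros((n_customers, len(counts)), dtype=int)
    for i, row in enumerate(rows):
        Z[i, :len(row)] = row
    return Z
```

Note that the number of dishes (features) is unbounded a priori and grows roughly as α · log(n) in expectation, which is exactly the regime where a streaming-friendly recursive form becomes valuable.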

Author Information

Rylan Schaeffer (Stanford University)
Yilun Du (MIT)
Gabrielle K Liu (Massachusetts Institute of Technology)
Ila R. Fiete (MIT)
