
Latent LSTM Allocation: Joint clustering and non-linear dynamic modeling of sequence data
Manzil Zaheer · Amr Ahmed · Alex Smola

Mon Aug 07 11:42 PM -- 12:00 AM (PDT) @ Parkside 1

Recurrent neural networks, such as long short-term memory (LSTM) networks, are powerful tools for modeling sequential data like user browsing history (Tan et al., 2016; Korpusik et al., 2016) or natural language text (Mikolov et al., 2010). However, to generalize across different user types, LSTMs require a large number of parameters, notwithstanding the simplicity of the underlying dynamics, rendering them uninterpretable, which is highly undesirable in user modeling. The increase in complexity and parameters arises from a large action space in which many actions share a similar intent or topic. In this paper, we introduce Latent LSTM Allocation (LLA), which combines hierarchical Bayesian models with LSTMs for user modeling. In LLA, each user is modeled as a sequence of actions, and the model jointly groups actions into topics and learns the temporal dynamics over the topic sequence rather than over the action space directly. This leads to a model that is highly interpretable and concise, yet can capture intricate dynamics. We present an efficient Stochastic EM inference algorithm for our model that scales to millions of users/documents. Our experimental evaluations show that the proposed model compares favorably with several state-of-the-art baselines.
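The generative story the abstract describes (a recurrent network evolving over latent *topics*, with each action drawn from that topic's distribution over the action vocabulary) can be sketched as follows. This is a minimal illustration, not the paper's implementation: a plain numpy RNN stands in for the LSTM, and all sizes, weights, and variable names (`phi`, `Wxh`, `generate_user`, etc.) are hypothetical choices for the sketch.

```python
import numpy as np

# Hypothetical sizes -- not taken from the paper.
K, V, H = 5, 50, 8   # topics, action vocabulary, hidden units
rng = np.random.default_rng(0)

# Topic-action matrix (analogous to LDA's phi): each row is a
# per-topic distribution over the action vocabulary.
phi = rng.dirichlet(np.ones(V), size=K)

# A minimal vanilla RNN standing in for the LSTM over topics.
Wxh = rng.normal(0, 0.1, (K, H))   # input: one-hot previous topic
Whh = rng.normal(0, 0.1, (H, H))
Why = rng.normal(0, 0.1, (H, K))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def generate_user(T):
    """Sample a length-T action sequence via the LLA generative story:
    RNN state over past topics -> topic z_t -> action w_t ~ phi[z_t]."""
    h = np.zeros(H)
    z_prev = np.zeros(K)          # no previous topic at t = 0
    topics, actions = [], []
    for _ in range(T):
        h = np.tanh(z_prev @ Wxh + h @ Whh)     # recurrent update
        z = rng.choice(K, p=softmax(h @ Why))   # sample a topic
        w = rng.choice(V, p=phi[z])             # sample an action
        topics.append(int(z))
        actions.append(int(w))
        z_prev = np.eye(K)[z]
    return topics, actions

topics, actions = generate_user(T=10)
print(len(topics), len(actions))
```

The key design point the abstract emphasizes is visible here: the recurrent dynamics operate over the K-dimensional topic space rather than the much larger V-dimensional action space, which is what keeps the model concise and its topic sequences interpretable. Training with the paper's Stochastic EM would alternate between sampling topic assignments and updating the recurrent and topic parameters; that loop is omitted from this sketch.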

Author Information

Manzil Zaheer (Carnegie Mellon University)
Amr Ahmed (Google)
Alex Smola (Amazon)
