

Poster in Workshop: Time Series Workshop

Morning Poster Session: Prediction-Constrained Hidden Markov Models for Semi-Supervised Classification

Gabriel Hope


Abstract:

We develop a new framework for training hidden Markov models that balances generative and discriminative goals. Our approach requires likelihood-based or Bayesian learning to meet task-specific prediction-quality constraints, preventing model misspecification from leading to poor downstream predictions. When users specify an appropriate loss function to constrain predictions, our approach improves semi-supervised learning when labeled sequences are rare and boosts accuracy when label frequencies are unbalanced. Via automatic differentiation, we backpropagate gradients through the dynamic programming computation of the marginal likelihood, making training feasible without auxiliary bounds or approximations. Our approach is effective for human activity modeling and healthcare intervention forecasting, delivering accuracy competitive with well-tuned neural networks on fully labeled data and substantially better on partially labeled data. At the same time, our learned generative model illuminates the dynamical states driving predictions.
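To make the training idea concrete, the sketch below shows one way to differentiate through the forward-algorithm (dynamic programming) computation of the HMM marginal likelihood and combine it with a label-prediction penalty. This is a minimal illustration, not the authors' code: it assumes a discrete-emission HMM, a softmax classifier on the final filtered state distribution, and a Lagrangian-style relaxation of the prediction constraint. Names such as `pc_loss`, `predict_logits`, and the multiplier `lam` are hypothetical choices for this example.

```python
import jax
import jax.numpy as jnp
from jax.nn import log_softmax
from jax.scipy.special import logsumexp


def hmm_log_marginal(log_pi, log_A, log_B, obs):
    """Forward algorithm in log space; returns log p(obs) and the final log-alphas."""
    def step(log_alpha, o):
        # log_alpha: (K,) filtered log-probabilities over hidden states
        log_alpha = logsumexp(log_alpha[:, None] + log_A, axis=0) + log_B[:, o]
        return log_alpha, None

    log_alpha0 = log_pi + log_B[:, obs[0]]
    log_alpha, _ = jax.lax.scan(step, log_alpha0, obs[1:])
    return logsumexp(log_alpha), log_alpha


def predict_logits(log_alpha, W, b):
    # Hypothetical discriminative head: classify the whole sequence from the
    # normalized final-state posterior (one simple choice of features).
    posterior = jnp.exp(log_alpha - logsumexp(log_alpha))
    return posterior @ W + b


def pc_loss(params, sequences, labels, lam):
    """Prediction-constrained objective, relaxed as a penalty:
    minimize -log p(x) + lam * cross_entropy(y_hat, y), where the
    discriminative term is applied only to labeled sequences."""
    log_pi = log_softmax(params["pi"])
    log_A = log_softmax(params["A"], axis=1)
    log_B = log_softmax(params["B"], axis=1)
    total = 0.0
    for obs, y in zip(sequences, labels):
        ll, log_alpha = hmm_log_marginal(log_pi, log_A, log_B, obs)
        total += -ll                              # generative term (all sequences)
        if y is not None:                         # discriminative term (labeled only)
            logits = predict_logits(log_alpha, params["W"], params["b"])
            total += lam * -log_softmax(logits)[y]
    return total


# Autodiff handles the gradient of the dynamic-programming recursion directly,
# so no auxiliary bounds or approximations are needed for this toy setup.
grad_fn = jax.grad(pc_loss)
```

In this relaxation, `lam` plays the role of a multiplier on the prediction constraint: larger values force the learned HMM to prioritize label accuracy, while unlabeled sequences still contribute through the marginal-likelihood term, which is where the semi-supervised benefit comes from.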