Poster
in
Workshop: Structured Probabilistic Inference and Generative Modeling
SOLMformer - Incorporating Sequence and Observation Level Metadata for Categorical Time Series Modeling
Yamini Ananth · Gregory Benton · Jingxing Fang · Jerry Cheung · Xu Chu · Cong Yu
Keywords: [ generative modeling ] [ time series ] [ transformer ] [ categorical time series ] [ process mining ] [ predictive process modeling ]
Sequential modeling, such as time series forecasting or language generation, traditionally uses the set of previous observations to predict future outcomes. However, modeling the sequence of observations alone is often insufficient. Not only may the observations themselves depend on metadata about the sequence, but predicting future metadata may itself be of interest. To address the shortcomings of sequential modeling alone, we propose SOLMformer, a multi-task approach that incorporates Sequence and Observation Level metadata into the transformer architecture as both inputs and multi-task outputs. We evaluate SOLMformer on real-world process mining datasets, where it outperforms state-of-the-art deep learning methods.
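To make the idea concrete, the following is a minimal sketch of how sequence-level and observation-level metadata might be arranged as transformer inputs and multi-task targets. All function names and the exact layout are illustrative assumptions, not the paper's actual implementation; for simplicity, sequence-level metadata is assumed to share a dimension with observation-level metadata and is prepended as a context token.

```python
# Hedged sketch (assumed layout, not the paper's actual API): each timestep's
# input vector concatenates a one-hot event category with its observation-level
# metadata; sequence-level metadata is prepended as a leading context token.
# Multi-task targets pair the next event category with its metadata.
from typing import List, Tuple

def one_hot(index: int, size: int) -> List[float]:
    v = [0.0] * size
    v[index] = 1.0
    return v

def build_inputs(
    categories: List[int],        # event category id per timestep
    obs_meta: List[List[float]],  # observation-level metadata per timestep
    seq_meta: List[float],        # one metadata vector for the whole sequence
    n_categories: int,
) -> List[List[float]]:
    # sequence-level metadata becomes a leading context token
    # (its category slot is zeroed out)
    tokens = [[0.0] * n_categories + seq_meta]
    for cat, meta in zip(categories, obs_meta):
        tokens.append(one_hot(cat, n_categories) + meta)
    return tokens

def build_targets(
    categories: List[int], obs_meta: List[List[float]]
) -> List[Tuple[int, List[float]]]:
    # multi-task targets: next category and next observation's metadata
    return list(zip(categories[1:], obs_meta[1:]))

# Toy example: 3 events, 1-dim metadata, 3 category classes.
x = build_inputs([0, 2, 1], [[0.5], [1.0], [0.0]], [0.3], n_categories=3)
y = build_targets([0, 2, 1], [[0.5], [1.0], [0.0]])
```

In a full model, each row of `x` would be projected into the transformer's hidden dimension, and separate output heads would be trained jointly on the category and metadata targets in `y`.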