

Poster in Workshop: Structured Probabilistic Inference and Generative Modeling

STable: Permutation-based Framework for Table Generation in Sequence-to-Sequence Models

Michał Pietruszka · Michał Turski · Łukasz Borchmann · Tomasz Dwojak · Gabriela Pałka · Karolina Szyndler · Dawid Jurkiewicz · Łukasz Garncarek

Keywords: [ information extraction ] [ structured decoding ] [ text-to-table ] [ permutation-based language modeling ] [ flexible generation ] [ key information extraction ] [ heuristic search algorithm ]


Abstract: We present a permutation-based text-to-table neural framework that unifies diverse NLP tasks under a common table output format. During training, the framework maximizes the expected log-likelihood over all random permutations of the table-content factorization order. At inference, we exploit the model's ability to generate cells in any order, choosing a decoding order that minimizes model uncertainty and error propagation. Our method improves text-to-table performance by up to 15\% over previous solutions and accelerates inference by up to 4$\times$ on some datasets, all while preserving output quality.
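
One way to write the training objective described above is $\max_\theta \; \mathbb{E}_{\sigma \sim \mathcal{U}(S_n)} \big[ \sum_{i=1}^{n} \log p_\theta\big(c_{\sigma(i)} \mid c_{\sigma(1)}, \dots, c_{\sigma(i-1)}, x\big) \big]$, where $x$ is the input text, $c_1, \dots, c_n$ are the table cells, and $\sigma$ ranges over permutations of the cell order. The sketch below is a minimal, hypothetical illustration of both ideas: a Monte Carlo estimate of the permutation-averaged loss for training, and a greedy, confidence-ordered decoding loop for inference. The `model` interface (`cell_log_prob`, `cell_confidence`, `generate_cell`) is assumed for illustration only and is not the authors' API; the actual framework uses a sequence-to-sequence model with a dedicated search procedure.

```python
import random

def permutation_training_loss(model, source, cells, num_samples=4):
    """Monte Carlo estimate of the expected negative log-likelihood over
    random orderings of the table cells (one sample = one permutation)."""
    total = 0.0
    for _ in range(num_samples):
        order = list(range(len(cells)))
        random.shuffle(order)              # draw a random factorization order
        context = []
        for idx in order:
            # log p(cell | source text, cells decoded so far);
            # `cell_log_prob` is an assumed interface, not the authors' API.
            total += model.cell_log_prob(source, context, cells[idx])
            context.append(cells[idx])
    return -total / num_samples

def confidence_ordered_decoding(model, source, cell_slots):
    """Greedily fill the slot the model is currently most certain about,
    so that early mistakes are less likely to propagate to later cells."""
    decoded = {}
    remaining = set(cell_slots)
    while remaining:
        best = max(remaining,
                   key=lambda slot: model.cell_confidence(source, decoded, slot))
        decoded[best] = model.generate_cell(source, decoded, best)
        remaining.remove(best)
    return decoded
```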
