STable: Permutation-based Framework for Table Generation in Sequence-to-Sequence Models
Michał Pietruszka · Michał Turski · Łukasz Borchmann · Tomasz Dwojak · Gabriela Pałka · Karolina Szyndler · Dawid Jurkiewicz · Łukasz Garncarek
Keywords:
heuristic search algorithm
key information extraction
flexible generation
permutation-based language modeling
text-to-table
structured decoding
information extraction
Abstract
We present a permutation-based text-to-table neural framework that unifies diverse NLP tasks by casting their outputs as tables. During training, the framework takes a probabilistic approach, maximizing the expected log-likelihood of the table's content over all random permutations of the factorization order. At inference, we exploit the model's ability to generate cells in any order, selecting a decoding order that maximizes the model's confidence and limits error propagation. Our method accelerates inference by up to 4$\times$ on some datasets and improves text-to-table performance by up to 15\% over previous solutions, while preserving output quality.
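The abstract compresses two mechanisms: a training objective averaged over random cell orderings, and an inference procedure that chooses the decoding order by model confidence. Below is a minimal Python sketch of both ideas, not the paper's implementation; the `Cell` type and the `cell_logprob` / `propose` hooks are hypothetical stand-ins for the model's scoring interface.

```python
import math
import random
from typing import Callable, Dict, Sequence, Tuple

Cell = Tuple[int, int]  # (row, column) position in the target table


def permutation_loss(
    cells: Sequence[Cell],
    values: Dict[Cell, str],
    cell_logprob: Callable[[Cell, Dict[Cell, str]], float],
    num_samples: int = 4,
) -> float:
    """Monte Carlo estimate of the expected negative log-likelihood over
    random factorization orders. `cell_logprob(cell, context)` is a
    hypothetical hook returning the model's log-probability of the gold
    value of `cell` given the already-filled `context` cells."""
    total = 0.0
    for _ in range(num_samples):
        order = random.sample(list(cells), k=len(cells))  # one random permutation
        context: Dict[Cell, str] = {}
        for cell in order:
            total -= cell_logprob(cell, context)
            context[cell] = values[cell]  # teacher-force the gold value
    return total / num_samples


def confidence_first_decoding(
    cells: Sequence[Cell],
    propose: Callable[[Cell, Dict[Cell, str]], Tuple[str, float]],
) -> Dict[Cell, str]:
    """Greedy inference over cell order: at each step, commit the cell the
    model is most confident about. `propose(cell, context)` is a hypothetical
    hook returning the model's best value for `cell` and its log-probability,
    conditioned on the cells generated so far."""
    remaining = set(cells)
    table: Dict[Cell, str] = {}
    while remaining:
        best_cell, best_value, best_score = None, None, -math.inf
        for cell in remaining:
            value, score = propose(cell, table)
            if score > best_score:
                best_cell, best_value, best_score = cell, value, score
        table[best_cell] = best_value
        remaining.remove(best_cell)
    return table
```

In the actual framework the per-cell scores would come from a shared sequence-to-sequence decoder conditioned on the input document; the greedy loop above only illustrates why order flexibility helps: high-confidence cells are committed first, so their values can inform the harder cells and errors are less likely to propagate.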