Tokenisation is NP-Complete
Philip Whittington · Gregor Bachmann · Tiago Pimentel
Abstract
In this work, we prove the NP-completeness of two variants of tokenisation, defined as the problem of compressing a dataset to at most $\delta$ symbols by either finding a vocabulary directly (_direct_ tokenisation), or selecting a sequence of merge operations (_bottom-up_ tokenisation).
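To make the bottom-up variant concrete, here is a minimal sketch of BPE-style merge semantics, which is the standard setting for merge-based tokenisers; the function names and the toy dataset are illustrative, not from the paper. Each merge $(a, b)$ rewrites every adjacent occurrence of the pair $a, b$ as the single symbol $ab$, and the decision problem asks whether some merge sequence compresses the dataset to at most $\delta$ symbols.

```python
def apply_merge(seq: list[str], pair: tuple[str, str]) -> list[str]:
    """Replace every non-overlapping occurrence of `pair` with its concatenation."""
    out, i = [], 0
    while i < len(seq):
        if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
            out.append(seq[i] + seq[i + 1])
            i += 2
        else:
            out.append(seq[i])
            i += 1
    return out

def compressed_size(dataset: list[str], merges: list[tuple[str, str]]) -> int:
    """Total symbol count after applying `merges` in order to each word."""
    total = 0
    for word in dataset:
        seq = list(word)  # start from individual characters
        for pair in merges:
            seq = apply_merge(seq, pair)
        total += len(seq)
    return total

# Hypothetical toy instance: does some merge sequence reach <= delta symbols?
dataset = ["banana", "bandana"]
print(compressed_size(dataset, []))                        # 13: no merges applied
print(compressed_size(dataset, [("a", "n"), ("b", "an")]))  # 7: "ban an a" + "ban d an a"
```

The hardness result concerns searching over such merge sequences (or, in the direct variant, over vocabularies): evaluating one candidate is cheap, but finding an optimal one is NP-complete.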