

Poster

An Information-Theoretic Analysis of In-Context Learning

Hong Jun Jeon · Jason Lee · Qi Lei · Benjamin Van Roy


Abstract:

Previous theoretical results on meta-learning over sequences rest on contrived assumptions and are somewhat convoluted. We introduce new information-theoretic tools that yield an elegant and very general decomposition of error into two components: meta-learning error and intra-task error. These tools unify analyses across many meta-learning challenges. To illustrate, we apply them to establish new results about in-context learning with transformers. Our theoretical results characterize how error decays in both the number of training sequences and the sequence length. Our results are very general; for example, they avoid the contrived mixing-time assumptions made by all prior results that establish decay of error with sequence length.
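To make the claimed decomposition concrete, the following is a schematic sketch only; the symbols M (number of training sequences), T (sequence length), and the error terms are illustrative placeholders, not the paper's actual notation or bounds:

% Schematic rendering of the abstract's error decomposition.
% M, T, and the functions eps_meta, eps_task are assumed placeholders;
% the paper's precise quantities and decay rates are not reproduced here.
\[
\mathbb{E}[\text{error}]
  \;=\; \underbrace{\varepsilon_{\mathrm{meta}}(M)}_{\substack{\text{meta-learning error,}\\ \text{decays as } M \to \infty}}
  \;+\; \underbrace{\varepsilon_{\mathrm{task}}(T)}_{\substack{\text{intra-task error,}\\ \text{decays as } T \to \infty}}
\]

The first term reflects uncertainty about the task that training across many sequences resolves; the second reflects uncertainty within a task that a longer in-context sequence resolves, matching the abstract's statement that error decays in both quantities.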
