L$^2$M: Mutual Information Scaling Law for Long-Context Language Modeling (2503.04725v1)
Published 6 Mar 2025 in cs.CL, cs.AI, cs.IT, cs.LG, math.IT, and physics.data-an
Abstract: We rigorously establish a bipartite mutual information scaling law in natural language that governs long-range dependencies. This scaling law, which we show is distinct from and scales independently of the conventional two-point mutual information, is the key to understanding long-context language modeling. Using this scaling law, we formulate the Long-context Language Modeling (L$^2$M) condition, which relates a model's capacity for effective long-context modeling to the scaling of its latent state size for storing past information. Our results are validated through experiments on both transformers and state space models. This work establishes a theoretical foundation that guides the development of large language models toward longer context lengths.
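To make the abstract's contrast concrete, the sketch below restates the two quantities and the L$^2$M condition in schematic form. The notation ($L$, $\beta$, $\alpha$, $z_L$) and the specific power-law forms are our own shorthand for illustration, not verbatim statements from the paper; the precise definitions and exponents are given in the paper itself.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Schematic restatement of the quantities contrasted in the abstract
% (notation is ours, not taken verbatim from the paper).

% Bipartite mutual information between two adjacent length-L blocks of text,
% which the paper argues grows with L (a power-law form is assumed here):
\[
  I\bigl(X_{1:L};\, X_{L+1:2L}\bigr) \;\sim\; L^{\beta}, \qquad \beta > 0,
\]
% in contrast to the conventional two-point mutual information between
% individual tokens at separation d, which decays with distance:
\[
  I(x_i;\, x_{i+d}) \;\sim\; d^{-\alpha}, \qquad \alpha > 0.
\]
% L^2M condition (informal reading of the abstract): the latent state z_L that
% a model uses to store past information must scale at least as fast as the
% bipartite mutual information for effective modeling at context length ~ L:
\[
  \operatorname{size}(z_L) \;\gtrsim\; I\bigl(X_{1:L};\, X_{L+1:2L}\bigr).
\]
\end{document}
```

Read this way, the condition explains why architectures whose state for past information stays fixed as the context grows (e.g. a fixed-size recurrent state) can fall behind the bipartite mutual information, whereas architectures whose effective state grows with context length can keep pace.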
- Zhuo Chen (319 papers)
- Oriol Mayné i Comas (1 paper)
- Zhuotao Jin (1 paper)
- Di Luo (63 papers)
- Marin Soljačić (141 papers)