Distributional Scaling for Emergent Capabilities (2502.17356v3)

Published 24 Feb 2025 in cs.LG

Abstract: This paper explores the nature of sudden breakthroughs in LLM performance at scale, which stand in contrast to the smooth improvements governed by scaling laws. While advocates of "emergence" view breakthroughs as unlocked capabilities, others attribute them to thresholding effects of discontinuous metrics. We propose that breakthroughs are instead driven by continuous changes in the probability distribution of training outcomes when performance is bimodally distributed across random seeds. In synthetic length-generalization tasks, we show that different random seeds can produce either highly linear or emergent scaling trends. We reveal that sharp breakthroughs in metrics are produced by underlying continuous changes in their distribution across seeds. Furthermore, we provide a case study of inverse scaling. We validate our distributional scaling framework in realistic settings by measuring MMLU performance in LM populations. These insights emphasize the role of random variation in the effect of scale on LM capabilities.
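To make the distributional claim concrete, here is a minimal simulation sketch (not the authors' code; every variable name and numeric value is an illustrative assumption): each random seed is assigned a breakthrough scale drawn from a continuous distribution, so per-seed accuracy at any fixed scale is bimodal. An individual seed's scaling curve then shows an abrupt, "emergent" jump, while the population mean improves smoothly with scale.

```python
import numpy as np

rng = np.random.default_rng(0)

scales = np.linspace(0.0, 1.0, 11)   # hypothetical normalized model-scale axis
n_seeds = 1000

# Each seed gets a random breakthrough scale; the fraction of seeds that have
# crossed theirs grows continuously with scale (uniform CDF => linear growth).
thresholds = rng.random(n_seeds)

low_mode, high_mode = 0.05, 0.95     # the two modes of the bimodal accuracy
success = scales[None, :] >= thresholds[:, None]   # shape (n_seeds, n_scales)
accuracy = np.where(success, high_mode, low_mode)

# Population mean across seeds: smooth and near-linear in scale.
print("mean accuracy:", np.round(accuracy.mean(axis=0), 2))

# A single seed: flat, then one abrupt jump -- looks "emergent" in isolation.
print("seed 0:", accuracy[0])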
