
Zipf's law and L. Levin's probability distributions (1301.0427v2)

Published 3 Jan 2013 in cs.IT and math.IT

Abstract: Zipf's law in its basic incarnation is an empirical probability distribution governing the frequency of usage of words in a language. As Terence Tao recently remarked, it still lacks a convincing and satisfactory mathematical explanation. In this paper I suggest that at least in certain situations, Zipf's law can be explained as a special case of the a priori distribution introduced and studied by L. Levin. The Zipf ranking corresponding to diminishing probability appears then as the ordering determined by the growing Kolmogorov complexity. One argument justifying this assertion is the appeal to a recent interpretation by Yu. Manin and M. Marcolli of asymptotic bounds for error-correcting codes in terms of phase transition. In the respective partition function, Kolmogorov complexity of a code plays the role of its energy. This version contains minor corrections and additions.
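
To make the abstract's claim concrete, the sketch below restates it in standard notation. The displayed formulas are a paraphrase, not verbatim from the paper: \( \mathbf{m} \) denotes Levin's a priori (universal) semimeasure, \( K \) denotes prefix Kolmogorov complexity, and the partition function \( Z(\beta) \) is only a schematic rendering of the Manin-Marcolli construction alluded to in the abstract.

% Zipf's law in its basic form: relative frequency of the word of rank r
\[
  f(r) \;\approx\; \frac{C}{r^{s}}, \qquad s \approx 1 .
\]
% Levin's a priori distribution via the coding theorem (K = prefix Kolmogorov complexity)
\[
  \mathbf{m}(x) \;=\; 2^{-K(x) + O(1)} .
\]
% Heuristic link: among all objects ordered by growing K, the rank-r object has
% K(x_r) roughly log_2 r, since there are fewer than 2^{n+1} programs of length at most n
\[
  \mathbf{m}(x_r) \;\approx\; 2^{-\log_2 r} \;=\; \frac{1}{r} .
\]
% Schematic Gibbs-type partition function in which the Kolmogorov complexity of a code
% plays the role of its energy (after the Manin-Marcolli phase-transition interpretation)
\[
  Z(\beta) \;=\; \sum_{x} 2^{-\beta\, K(x)} .
\]

Under this reading, ordering objects by decreasing \( \mathbf{m} \) coincides, up to bounded error, with ordering them by growing \( K \), and the heuristic \( K(x_r) \approx \log_2 r \) for the rank-\( r \) object recovers the exponent-1 form of Zipf's law; the precise statements and the phase-transition argument are in the paper itself.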

Citations (10)


Authors (1)

Yuri I. Manin
