
Automata networks for memory loss effects in the formation of linguistic conventions (1508.01580v2)

Published 7 Aug 2015 in cs.CL and physics.soc-ph

Abstract: This work attempts to give new theoretical insights into the absence of intermediate stages in the evolution of language. In particular, an automata networks approach is developed to address a crucial question: how can a population of language users reach agreement on a linguistic convention? To describe the appearance of sharp transitions in the self-organization of language, an extremely simple model of (working) memory is adopted: at each time step, language users simply lose part of their word memories. Computer simulations on low-dimensional lattices exhibit sharp transitions at critical values that depend on the size of the vicinities of the individuals.
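The sketch below illustrates the kind of dynamic the abstract describes: a naming-game-style convention formation process on a low-dimensional lattice, with a memory-loss step applied to every agent at each sweep. It is not the paper's exact automata-network update rule; the interaction protocol, the Von Neumann neighborhood, the per-word forgetting probability `forget_p`, and all parameter values are illustrative assumptions.

```python
import random

L = 20          # lattice side length (assumed)
STEPS = 200     # number of sweeps (assumed)
forget_p = 0.1  # per-word forgetting probability per sweep (assumed)

# Each agent holds a set of known words; memories start empty.
memory = {(i, j): set() for i in range(L) for j in range(L)}
next_word = 0   # counter used to invent new words

def neighbors(i, j):
    """Von Neumann neighborhood with periodic boundaries (assumed vicinity)."""
    return [((i + 1) % L, j), ((i - 1) % L, j),
            (i, (j + 1) % L), (i, (j - 1) % L)]

random.seed(0)
for _ in range(STEPS):
    # One sweep: every site acts once as a speaker with a random neighbor.
    for i in range(L):
        for j in range(L):
            speaker, hearer = (i, j), random.choice(neighbors(i, j))
            if not memory[speaker]:            # invent a word if memory is empty
                memory[speaker].add(next_word)
                next_word += 1
            word = random.choice(tuple(memory[speaker]))
            if word in memory[hearer]:         # success: both keep only that word
                memory[speaker] = {word}
                memory[hearer] = {word}
            else:                              # failure: hearer learns the word
                memory[hearer].add(word)
    # Memory loss: each stored word is forgotten independently with prob. forget_p.
    for agent in memory:
        memory[agent] = {w for w in memory[agent] if random.random() > forget_p}

# Number of distinct surviving words is a rough proxy for convergence to a convention.
distinct = set().union(*memory.values())
print(f"distinct words after {STEPS} sweeps: {len(distinct)}")
```

Sweeping `forget_p` and the neighborhood size in a script like this is one way to probe, under these assumed rules, the kind of sharp transition at critical memory-loss values that the abstract reports.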

Citations (2)
