Learning thresholds lead to stable language coexistence (2406.14522v2)
Abstract: We introduce a language competition model that is based on the Abrams-Strogatz model and incorporates the effects of memory and learning in the language shift dynamics. On a coarse-grained time scale, the effects of memory and learning can be expressed as thresholds on the speaker fractions of the competing languages. In its simplest form, the resulting model is exactly solvable. Besides consensus on one of the two languages, the model describes additional equilibrium states that are not present in the Abrams-Strogatz model: a stable dynamical coexistence of the two languages and a frozen state coinciding with the initial state. We show numerically that these results are preserved for threshold functions of a more general shape. The comparison of the model predictions with historical datasets demonstrates that while the Abrams-Strogatz model fails to describe some relevant language competition situations, the proposed model provides a good fit.
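To make the construction concrete, the sketch below integrates the standard Abrams-Strogatz rate equation alongside a hypothetical step-threshold variant in which the flow toward a language is active only while that language's speaker fraction exceeds a threshold theta. The step form of the threshold, the parameter values (s, a, theta), and the initial conditions are illustrative assumptions, not the paper's actual threshold functions; the sketch only shows how a threshold can be layered onto the shift dynamics.

```python
import numpy as np

# Classic Abrams-Strogatz rate equation (standard form):
#   dx/dt = c * [ (1 - x) * s * x**a - x * (1 - s) * (1 - x)**a ]
# x: fraction speaking language X, s: prestige of X (0 < s < 1),
# a: volatility exponent (Abrams & Strogatz fit a ~ 1.3).
def abrams_strogatz_rate(x, s=0.55, a=1.31, c=1.0):
    return c * ((1.0 - x) * s * x**a - x * (1.0 - s) * (1.0 - x)**a)

# Hypothetical step-threshold variant (an illustrative assumption, not the
# paper's threshold functions): the flow toward a language is active only
# while that language's current speaker fraction exceeds a threshold theta.
def thresholded_rate(x, s=0.55, a=1.31, c=1.0, theta=0.6):
    gain = (1.0 - x) * s * x**a if x > theta else 0.0                  # Y -> X
    loss = x * (1.0 - s) * (1.0 - x)**a if (1.0 - x) > theta else 0.0  # X -> Y
    return c * (gain - loss)

def integrate(rate, x0, dt=0.01, steps=50_000, **kw):
    """Forward-Euler integration of dx/dt = rate(x), clipped to [0, 1]."""
    x = x0
    for _ in range(steps):
        x = float(np.clip(x + dt * rate(x, **kw), 0.0, 1.0))
    return x

if __name__ == "__main__":
    for x0 in (0.2, 0.45, 0.7):
        print(f"x0 = {x0:.2f}: Abrams-Strogatz -> {integrate(abrams_strogatz_rate, x0):.3f}, "
              f"step-threshold sketch -> {integrate(thresholded_rate, x0):.3f}")
```

With these assumed values the original model always drives one language to extinction, whereas the step-threshold variant freezes at intermediate initial fractions (both flows switched off when theta > 1/2), which qualitatively echoes the frozen state described in the abstract. The stable dynamical coexistence reported in the paper depends on its specific threshold functions and is not reproduced by this crude step form.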