
Cross-situational learning of large lexicons with finite memory (1809.11047v1)

Published 28 Sep 2018 in physics.soc-ph and cs.CL

Abstract: Cross-situational word learning, wherein a learner combines information about possible meanings of a word across multiple exposures, has previously been shown to be a very powerful strategy to acquire a large lexicon in a short time. However, this success may derive from idealizations that are made when modeling the word-learning process. In particular, an earlier model assumed that a learner could perfectly recall all previous instances of a word's use and the inferences that were drawn about its meaning. In this work, we relax this assumption and determine the performance of a model cross-situational learner who forgets word-meaning associations over time. Our main finding is that it is possible for this learner to acquire a human-scale lexicon by adulthood with word-exposure and memory-decay rates that are consistent with empirical research on childhood word learning, as long as the degree of referential uncertainty is not too high or the learner employs a mutual exclusivity constraint. Our findings therefore suggest that successful word learning does not necessarily demand either highly accurate long-term tracking of word and meaning statistics or hypothesis-testing strategies.
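
The abstract describes the learner only at a high level. As a concrete illustration, the following minimal Python sketch simulates a cross-situational learner that intersects candidate meanings across exposures, forgets stored associations over time, and can optionally apply a mutual exclusivity constraint. The lexicon size, context size, exponential-decay forgetting rule, and all parameter values are assumptions chosen for illustration, not details taken from the paper.

```python
import math
import random

def simulate_learner(
    n_words=500,           # toy lexicon size (the paper considers human-scale lexicons)
    context_size=3,        # referential uncertainty: distractor meanings per exposure
    decay=0.0005,          # assumed exponential memory-decay rate
    n_steps=200_000,       # total word exposures
    mutual_exclusivity=True,
    seed=0,
):
    """Toy cross-situational learner with forgetting.

    Meaning i is taken to be the true meaning of word i. Each exposure
    pairs a target word with its true meaning plus `context_size` random
    distractors; the learner intersects candidate-meaning sets across
    exposures. A stored set survives the gap since the word's last
    exposure with probability exp(-decay * gap) (lazy forgetting),
    otherwise it is reset. A word counts as learned once its candidate
    set shrinks to the single true meaning. This is an illustrative
    sketch, not the paper's exact model.
    """
    rng = random.Random(seed)
    candidates = {}   # word -> (set of plausible meanings, time of last exposure)
    learned = set()   # words whose meaning is currently uniquely identified

    for t in range(n_steps):
        word = rng.randrange(n_words)
        context = {word} | {rng.randrange(n_words) for _ in range(context_size)}

        if mutual_exclusivity:
            # Discard distractor meanings already assigned to other known words.
            context -= {w for w in learned if w != word}

        if word in candidates:
            old, last_t = candidates[word]
            if rng.random() < math.exp(-decay * (t - last_t)):
                context &= old          # memory survived: intersect as usual
            else:
                learned.discard(word)   # memory decayed: start over

        candidates[word] = (context, t)
        if len(context) == 1:
            learned.add(word)

    return len(learned)

if __name__ == "__main__":
    for me in (False, True):
        print(f"mutual exclusivity={me}: learned "
              f"{simulate_learner(mutual_exclusivity=me)} / 500 words")
```

The `mutual_exclusivity` flag lets one compare the two regimes the abstract contrasts: learning when referential uncertainty is low enough for intersection alone to succeed despite forgetting, versus learning aided by the constraint that a meaning already assigned to one word is ruled out as a candidate for others.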
