
Modeling Bilingual Sentence Processing: Evaluating RNN and Transformer Architectures for Cross-Language Structural Priming (2405.09508v2)

Published 15 May 2024 in cs.CL and cs.LG

Abstract: This study evaluates the performance of Recurrent Neural Network (RNN) and Transformer models in replicating cross-language structural priming, a key indicator of abstract grammatical representations in human language processing. Focusing on Chinese-English priming, which involves two typologically distinct languages, we examine how these models handle the robust phenomenon of structural priming, where exposure to a particular sentence structure increases the likelihood of selecting a similar structure subsequently. Our findings indicate that Transformers outperform RNNs in generating primed sentence structures, with accuracy rates that exceed those of RNNs by 25.84% to 33.33%. This challenges the conventional belief that human sentence processing relies primarily on recurrent and immediate processing, and suggests a role for cue-based retrieval mechanisms. This work contributes to our understanding of how computational models may reflect human cognitive processes across diverse language families.
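Structural priming in language models is typically quantified by comparing how strongly a model prefers a target structure after a structurally congruent prime versus an incongruent one. The sketch below illustrates that comparison with hypothetical log-probability values; `priming_effect` and the numbers shown are illustrative stand-ins, not the paper's actual method or data, and a real study would score many prime/target pairs with a bilingual language model.

```python
def priming_effect(logp_after_congruent: float,
                   logp_after_incongruent: float) -> float:
    """Priming effect as the log-probability boost for a target structure
    (e.g., an English passive) when it follows a structurally congruent
    prime (e.g., a Chinese passive) rather than an incongruent one."""
    return logp_after_congruent - logp_after_incongruent

# Hypothetical log-probabilities (stand-in numbers for illustration only):
congruent = -12.3    # log P(English passive target | Chinese passive prime)
incongruent = -13.1  # log P(English passive target | Chinese active prime)

effect = priming_effect(congruent, incongruent)
print(f"priming effect (log-prob difference): {effect:.2f}")
```

A positive effect means the model, like human bilingual speakers in behavioral experiments, becomes more likely to produce a structure it was just exposed to, even across languages.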
