Are Transformers a Modern Version of ELIZA? Observations on French Object Verb Agreement (2109.10133v1)

Published 21 Sep 2021 in cs.CL

Abstract: Many recent works have demonstrated that unsupervised sentence representations of neural networks encode syntactic information by observing that neural language models are able to predict the agreement between a verb and its subject. We take a critical look at this line of research by showing that it is possible to achieve high accuracy on this agreement task with simple surface heuristics, indicating a possible flaw in our assessment of neural networks' syntactic ability. Our fine-grained analyses of results on long-range French object-verb agreement show that, contrary to LSTMs, Transformers are able to capture a non-trivial amount of grammatical structure.

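To make the "simple surface heuristics" claim concrete, the sketch below shows one generic baseline of that kind: predict the verb's number from the closest preceding noun, ignoring syntax entirely. This is an illustrative assumption, not the authors' implementation; the token annotation format (word, POS, number) is hypothetical.

```python
# Minimal sketch (not the paper's code) of a surface heuristic for number
# agreement: copy the number of the nearest noun preceding the verb.
# Tokens are assumed to be (word, pos, number) triples, number in {"sg", "pl", None}.

def predict_verb_number(tokens, verb_index):
    """Return the number ("sg"/"pl") of the closest noun before the verb."""
    for word, pos, number in reversed(tokens[:verb_index]):
        if pos == "NOUN" and number is not None:
            return number
    return "sg"  # arbitrary default when no number-marked noun precedes the verb

# Example with an intervening "attractor" noun:
# "le chien des voisins aboie" -- subject "chien" is singular, but the
# nearest noun "voisins" is plural, so this lexical shortcut gets it wrong.
sentence = [
    ("le", "DET", "sg"),
    ("chien", "NOUN", "sg"),
    ("des", "DET", "pl"),
    ("voisins", "NOUN", "pl"),
    ("aboie", "VERB", None),
]
print(predict_verb_number(sentence, verb_index=4))  # -> "pl" (incorrect)
```

Because such attractor configurations are comparatively rare in natural corpora, a shortcut like this can still score high overall accuracy, which is why the paper argues for fine-grained analyses (here, on long-range French object-verb agreement) rather than aggregate accuracy alone.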
Authors (3)
  1. Bingzhi Li (6 papers)
  2. Guillaume Wisniewski (17 papers)
  3. Benoit Crabbé (2 papers)
Citations (6)