
Influence Paths for Characterizing Subject-Verb Number Agreement in LSTM Language Models (2005.01190v1)

Published 3 May 2020 in cs.CL

Abstract: LSTM-based recurrent neural networks are the state-of-the-art for many NLP tasks. Despite their performance, it is unclear whether, or how, LSTMs learn structural features of natural languages such as subject-verb number agreement in English. Lacking this understanding, the generality of LSTM performance on this task and their suitability for related tasks remains uncertain. Further, errors cannot be properly attributed to a lack of structural capability, training data omissions, or other exceptional faults. We introduce influence paths, a causal account of structural properties as carried by paths across gates and neurons of a recurrent neural network. The approach refines the notion of influence (the subject's grammatical number has influence on the grammatical number of the subsequent verb) into a set of gate or neuron-level paths. The set localizes and segments the concept (e.g., subject-verb agreement), its constituent elements (e.g., the subject), and related or interfering elements (e.g., attractors). We exemplify the methodology on a widely-studied multi-layer LSTM language model, demonstrating its accounting for subject-verb number agreement. The results offer both a finer and a more complete view of an LSTM's handling of this structural aspect of the English language than prior results based on diagnostic classifiers and ablation.
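The core idea — refining a single input-to-output influence into a sum of contributions along gate- or neuron-level paths — can be illustrated on a toy model. The sketch below uses a 2-unit linear recurrent net as a stand-in for an LSTM (an assumption for illustration; the paper works with actual LSTM gates), and checks that the per-path influences of the input on the output sum to the total influence:

```python
# Toy sketch of path-decomposed influence. Assumption: a 2-unit *linear*
# recurrent net replaces the LSTM, so the gradient of the output w.r.t. the
# input factors exactly into a sum over neuron-level paths.
import itertools

A = [[0.5, 0.2],
     [0.1, 0.3]]   # hidden-to-hidden weights
b = [1.0, -0.5]    # input-to-hidden weights (scalar input x enters at t=0)
c = [0.7, 0.4]     # hidden-to-output weights
T = 3              # recurrent steps after the input enters

def total_influence():
    # dy/dx computed by propagating the gradient through the recurrence:
    # h_0 = b*x, h_t = A h_{t-1}, y = c . h_T, so dy/dx = c^T A^T b.
    h = list(b)
    for _ in range(T):
        h = [sum(A[i][j] * h[j] for j in range(2)) for i in range(2)]
    return sum(c[i] * h[i] for i in range(2))

def path_influences():
    # One path = a sequence of neuron indices (i_0, ..., i_T); its influence
    # is the product of the weights traversed along that path.
    paths = {}
    for seq in itertools.product(range(2), repeat=T + 1):
        w = b[seq[0]]
        for t in range(1, T + 1):
            w *= A[seq[t]][seq[t - 1]]
        w *= c[seq[-1]]
        paths[seq] = w
    return paths

paths = path_influences()
# The decomposition is exact: path contributions sum to the total influence.
assert abs(sum(paths.values()) - total_influence()) < 1e-12
```

In the paper's setting the quantity localized per path is the influence of the subject's grammatical number on the verb's number, and individual paths (e.g., through particular forget-gate neurons) can then be ranked and ablated separately.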

Authors (5)
  1. Kaiji Lu (5 papers)
  2. Piotr Mardziel (18 papers)
  3. Klas Leino (14 papers)
  4. Matt Fredrikson (1 paper)
  5. Anupam Datta (51 papers)
Citations (9)
