Finding Universal Grammatical Relations in Multilingual BERT (2005.04511v2)

Published 9 May 2020 in cs.CL and cs.LG

Abstract: Recent work has found evidence that Multilingual BERT (mBERT), a transformer-based multilingual masked language model, is capable of zero-shot cross-lingual transfer, suggesting that some aspects of its representations are shared cross-lingually. To better understand this overlap, we extend recent work on finding syntactic trees in neural networks' internal representations to the multilingual setting. We show that subspaces of mBERT representations recover syntactic tree distances in languages other than English, and that these subspaces are approximately shared across languages. Motivated by these results, we present an unsupervised analysis method that provides evidence mBERT learns representations of syntactic dependency labels, in the form of clusters which largely agree with the Universal Dependencies taxonomy. This evidence suggests that even without explicit supervision, multilingual masked language models learn certain linguistic universals.
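
The "syntactic tree distances in subspaces" result builds on the structural-probe idea of Hewitt and Manning (2019): learn a linear projection under which squared distances between contextual word vectors approximate distances in the dependency parse tree. The sketch below illustrates that idea in PyTorch; the class name, dimensions, and training setup are illustrative assumptions, not the authors' exact implementation.

```python
# Minimal sketch of a structural probe in the spirit of Hewitt & Manning (2019),
# which this paper extends to mBERT representations across languages.
# Hyperparameters and names here are assumptions for illustration only.
import torch
import torch.nn as nn

class StructuralProbe(nn.Module):
    """Learns a linear map B so that squared L2 distances in the projected
    subspace approximate syntactic tree distances between word pairs."""
    def __init__(self, model_dim: int, probe_rank: int):
        super().__init__()
        # B projects contextual embeddings into a low-rank "syntactic" subspace.
        self.B = nn.Parameter(torch.randn(model_dim, probe_rank) * 0.01)

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, seq_len, model_dim) contextual vectors (e.g. from mBERT)
        projected = embeddings @ self.B                           # (batch, seq_len, rank)
        diffs = projected.unsqueeze(2) - projected.unsqueeze(1)   # all pairwise differences
        return (diffs ** 2).sum(dim=-1)                           # predicted squared tree distances

# Training sketch: minimize the gap between predicted distances and gold
# parse-tree distances taken from Universal Dependencies treebanks.
probe = StructuralProbe(model_dim=768, probe_rank=64)
optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
```

The paper's multilingual finding is that a single such subspace, fit on one set of languages, approximately recovers tree distances in held-out languages as well, which is the sense in which the syntactic subspace is "shared" across languages.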

Authors (3)
  1. Ethan A. Chi (8 papers)
  2. John Hewitt (24 papers)
  3. Christopher D. Manning (169 papers)
Citations (145)