Low-Resource Parsing with Crosslingual Contextualized Representations (1909.08744v1)

Published 19 Sep 2019 in cs.CL

Abstract: Despite advances in dependency parsing, languages with small treebanks still present challenges. We assess recent approaches to multilingual contextual word representations (CWRs), and compare them for crosslingual transfer from a language with a large treebank to a language with a small or nonexistent treebank, by sharing parameters between languages in the parser itself. We experiment with a diverse selection of languages in both simulated and truly low-resource scenarios, and show that multilingual CWRs greatly facilitate low-resource dependency parsing even without crosslingual supervision such as dictionaries or parallel text. Furthermore, we examine the non-contextual part of the learned language models (which we call a "decontextual probe") to demonstrate that polyglot language models better encode crosslingual lexical correspondence compared to aligned monolingual language models. This analysis provides further evidence that polyglot training is an effective approach to crosslingual transfer.
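The "decontextual probe" idea is concrete enough to sketch: strip context away by encoding each word type in isolation, then check whether translation pairs land near each other in the resulting embedding space. The sketch below is a minimal illustration, not the paper's procedure: it uses multilingual BERT (`bert-base-multilingual-cased`) as a stand-in for the paper's ELMo-style polyglot language models, a three-pair English-German dictionary invented for illustration, and mean-pooling over subwords as one reasonable decontextualization choice.

```python
# Minimal sketch of a decontextual probe (assumptions: mBERT stands in for
# the paper's polyglot LMs; the mini-dictionary below is illustrative only).
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-multilingual-cased"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def decontextualized(word: str) -> torch.Tensor:
    """Encode a word with no surrounding context; mean-pool its subword states."""
    inputs = tokenizer(word, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]  # (seq_len, dim)
    # Drop the [CLS]/[SEP] positions and average the remaining subword vectors.
    return hidden[1:-1].mean(dim=0)

# Hypothetical English-German translation pairs (not from the paper).
pairs = [("dog", "Hund"), ("house", "Haus"), ("water", "Wasser")]

for en, de in pairs:
    sim = torch.cosine_similarity(decontextualized(en), decontextualized(de), dim=0)
    print(f"{en:>6} ~ {de:<7} cosine = {sim.item():.3f}")
```

Under the paper's finding, a polyglot-trained encoder would assign higher similarity to true translation pairs than separately trained monolingual models aligned after the fact; a fuller version of this probe would rank each source word's nearest neighbors over the whole target vocabulary rather than scoring known pairs.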

Authors (3)
  1. Phoebe Mulcaire (8 papers)
  2. Jungo Kasai (38 papers)
  3. Noah A. Smith (224 papers)
Citations (20)
