UDapter: Language Adaptation for Truly Universal Dependency Parsing (2004.14327v2)

Published 29 Apr 2020 in cs.CL

Abstract: Recent advances in multilingual dependency parsing have brought the idea of a truly universal parser closer to reality. However, cross-language interference and restrained model capacity remain major obstacles. To address this, we propose a novel multilingual task adaptation approach based on contextual parameter generation and adapter modules. This approach enables the model to learn adapters via language embeddings while sharing model parameters across languages. It also allows for easy but effective integration of existing linguistic typology features into the parsing network. The resulting parser, UDapter, outperforms strong monolingual and multilingual baselines on the majority of both high-resource and low-resource (zero-shot) languages, showing the success of the proposed adaptation approach. Our in-depth analyses show that soft parameter sharing via typological features is key to this success.
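The abstract's central mechanism is contextual parameter generation: instead of training a separate adapter per language, a single shared generator network produces each adapter's weights from that language's embedding (which can in turn be derived from typological features). Below is a minimal NumPy sketch of this idea; all names, dimensions, and the random initialization are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(a):
    return np.maximum(a, 0.0)

class ContextualAdapter:
    """Bottleneck adapter whose projection weights are produced by a
    shared generator conditioned on a language embedding (a sketch of
    contextual parameter generation; dimensions are hypothetical)."""

    def __init__(self, hidden=768, bottleneck=64, lang_dim=32):
        self.h, self.b = hidden, bottleneck
        # Total adapter parameters: down/up projections plus both biases.
        n_out = 2 * hidden * bottleneck + bottleneck + hidden
        # One generator shared across ALL languages (soft parameter sharing).
        self.G = rng.normal(0.0, 0.02, size=(n_out, lang_dim))

    def __call__(self, x, lang_emb):
        h, b = self.h, self.b
        p = self.G @ lang_emb                      # generate this language's params
        w_down = p[: h * b].reshape(b, h); p = p[h * b:]
        w_up = p[: h * b].reshape(h, b); p = p[h * b:]
        b_down, b_up = p[:b], p[b:]
        z = relu(x @ w_down.T + b_down)            # down-project
        return x + z @ w_up.T + b_up               # up-project + residual

lang_emb = rng.normal(size=32)       # e.g. learned from typological features
x = rng.normal(size=(4, 10, 768))    # (batch, sequence, hidden) activations
out = ContextualAdapter()(x, lang_emb)
```

Because the generator `G` is shared, training on many languages updates the same parameters, while each language still receives its own adapter weights; a zero-shot language only needs an embedding (e.g. from typology) to obtain adapters.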

Authors (4)
  1. Ahmet Üstün (38 papers)
  2. Arianna Bisazza (43 papers)
  3. Gosse Bouma (11 papers)
  4. Gertjan van Noord (16 papers)
Citations (111)
