Meta-Learning for Fast Cross-Lingual Adaptation in Dependency Parsing (2104.04736v3)

Published 10 Apr 2021 in cs.CL and cs.AI

Abstract: Meta-learning, or learning to learn, is a technique that can help to overcome resource scarcity in cross-lingual NLP problems, by enabling fast adaptation to new tasks. We apply model-agnostic meta-learning (MAML) to the task of cross-lingual dependency parsing. We train our model on a diverse set of languages to learn a parameter initialization that can adapt quickly to new languages. We find that meta-learning with pre-training can significantly improve upon the performance of language transfer and standard supervised learning baselines for a variety of unseen, typologically diverse, and low-resource languages, in a few-shot learning setup.
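
To make the MAML idea from the abstract concrete, below is a minimal first-order MAML sketch in PyTorch. It is an illustration under stated assumptions, not the authors' implementation: `ToyParser`, `sample_language_batch`, and all hyperparameters are hypothetical placeholders standing in for a real dependency parser and per-language treebank batches.

```python
# Minimal first-order MAML sketch (FOMAML): learn an initialization that
# adapts to a new "language" (task) in a few gradient steps.
# ToyParser and sample_language_batch are hypothetical stand-ins, not the
# paper's parser or data pipeline.
import torch
import torch.nn as nn


class ToyParser(nn.Module):
    """Stand-in for a dependency parser: maps token features to head scores."""
    def __init__(self, dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, 64), nn.ReLU(), nn.Linear(64, dim))

    def forward(self, x):
        return self.net(x)


def sample_language_batch(dim=32, n=16):
    """Hypothetical stand-in for a support/query batch from one language."""
    return torch.randn(n, dim), torch.randn(n, dim)


def inner_adapt(model, loss_fn, x, y, steps=3, lr=1e-2):
    """Copy the meta-parameters and take a few SGD steps on the support set."""
    fast = ToyParser(x.shape[1])
    fast.load_state_dict(model.state_dict())
    opt = torch.optim.SGD(fast.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(fast(x), y).backward()
        opt.step()
    return fast


meta_model = ToyParser()
meta_opt = torch.optim.Adam(meta_model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for meta_step in range(100):
    meta_opt.zero_grad()
    for _ in range(4):  # sample a few tasks ("languages") per meta-update
        xs, ys = sample_language_batch()  # support set: few-shot adaptation data
        xq, yq = sample_language_batch()  # query set: evaluates the adapted model
        fast = inner_adapt(meta_model, loss_fn, xs, ys)
        query_loss = loss_fn(fast(xq), yq)
        # First-order approximation: apply the gradient computed at the
        # adapted parameters directly to the meta-parameters, rather than
        # differentiating through the inner-loop adaptation.
        grads = torch.autograd.grad(query_loss, fast.parameters())
        for p, g in zip(meta_model.parameters(), grads):
            p.grad = g if p.grad is None else p.grad + g
    meta_opt.step()
```

The first-order variant is used here only to keep the sketch short; full MAML would backpropagate through the inner-loop updates (e.g., via a higher-order autograd library). Either way, the meta-objective is query-set loss *after* adaptation, which is what pushes the initialization toward fast few-shot transfer.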

Authors (7)
  1. Anna Langedijk
  2. Verna Dankers
  3. Phillip Lippe
  4. Sander Bos
  5. Bryan Cardenas Guevara
  6. Helen Yannakoudakis
  7. Ekaterina Shutova
Citations (12)