
On Difficulties of Cross-Lingual Transfer with Order Differences: A Case Study on Dependency Parsing (1811.00570v3)

Published 1 Nov 2018 in cs.CL and cs.LG

Abstract: Different languages might have different word orders. In this paper, we investigate cross-lingual transfer and posit that an order-agnostic model will perform better when transferring to distant foreign languages. To test our hypothesis, we train dependency parsers on an English corpus and evaluate their transfer performance on 30 other languages. Specifically, we compare encoders and decoders based on Recurrent Neural Networks (RNNs) and modified self-attentive architectures. The former relies on sequential information, while the latter is more flexible at modeling word order. Rigorous experiments and detailed analysis show that RNN-based architectures transfer well to languages that are close to English, while self-attentive models have better overall cross-lingual transferability and perform especially well on distant languages.
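The core contrast in the abstract — sequential RNN encoders versus order-flexible self-attentive ones — can be illustrated with a small NumPy sketch. This is a hypothetical toy, not the paper's model: a single-head self-attention layer with no positional encodings is permutation-equivariant (reordering the input tokens just reorders the outputs), whereas a vanilla RNN's final state changes when the word order changes.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                              # toy hidden size
X = rng.normal(size=(5, d))        # 5 "token" embeddings

def self_attention(X, Wq, Wk, Wv):
    # Single-head self-attention WITHOUT positional encodings:
    # each output depends on the multiset of tokens, not their order.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(d)
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)   # row-wise softmax
    return w @ V

def rnn_last_state(X, Wx, Wh):
    # Vanilla RNN: the final state is built left-to-right,
    # so it is sensitive to token order.
    h = np.zeros(d)
    for x in X:
        h = np.tanh(x @ Wx + h @ Wh)
    return h

Wq, Wk, Wv, Wx, Wh = (rng.normal(size=(d, d)) for _ in range(5))
perm = np.array([2, 0, 4, 1, 3])   # a fixed reordering of the tokens

# Self-attention is permutation-equivariant: permuting the inputs
# permutes the outputs the same way (order carries no information).
attn_equivariant = np.allclose(self_attention(X, Wq, Wk, Wv)[perm],
                               self_attention(X[perm], Wq, Wk, Wv))

# The RNN's final state differs once the "word order" changes.
rnn_order_sensitive = not np.allclose(rnn_last_state(X, Wx, Wh),
                                      rnn_last_state(X[perm], Wx, Wh))
```

This order-agnostic behavior is what the paper's hypothesis targets: a model that does not hard-code English word order has less to unlearn when transferring to a typologically distant language.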

Authors (6)
  1. Wasi Uddin Ahmad (41 papers)
  2. Zhisong Zhang (31 papers)
  3. Xuezhe Ma (50 papers)
  4. Eduard Hovy (115 papers)
  5. Kai-Wei Chang (292 papers)
  6. Nanyun Peng (205 papers)
Citations (8)
