Cross-lingual Dependency Parsing with Unlabeled Auxiliary Languages (1909.09265v1)
Abstract: Cross-lingual transfer learning has become an important tool for battling the unavailability of annotated resources for low-resource languages. One of the fundamental techniques for transferring across languages is learning \emph{language-agnostic} representations, in the form of word embeddings or contextual encodings. In this work, we propose to leverage unannotated sentences from auxiliary languages to help learn language-agnostic representations. Specifically, we explore adversarial training for learning contextual encoders that produce representations invariant across languages, thereby facilitating cross-lingual transfer. We conduct experiments on cross-lingual dependency parsing, where we train a dependency parser on a source language and transfer it to a wide range of target languages. Experiments on 28 target languages demonstrate that adversarial training significantly improves overall transfer performance under several different settings. We also conduct a careful analysis to evaluate the language-agnostic representations that result from adversarial training.
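The abstract does not spell out the adversarial setup, so the sketch below illustrates one common formulation of the idea: a language discriminator attached to a shared encoder through gradient reversal, so that the encoder is pushed toward language-invariant representations while the parser is trained on the source language. All class names, shapes, and the loss weighting here are hypothetical illustrations, not the paper's exact implementation.

```python
# Minimal sketch of adversarial language-invariance training (PyTorch).
# Assumption: a gradient-reversal formulation; the paper's abstract does
# not specify whether it uses gradient reversal or alternating updates.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; reverses (and scales) gradients backward."""

    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip the gradient sign so the encoder *maximizes* the
        # discriminator's loss, encouraging language-agnostic features.
        return -ctx.lambd * grad_output, None


class LanguageDiscriminator(nn.Module):
    """Predicts which language an encoded representation came from."""

    def __init__(self, hidden_dim: int, num_languages: int):
        super().__init__()
        self.classifier = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, num_languages),
        )

    def forward(self, encoded: torch.Tensor, lambd: float = 1.0) -> torch.Tensor:
        reversed_feats = GradReverse.apply(encoded, lambd)
        return self.classifier(reversed_feats)


# Usage (all names and shapes hypothetical):
#   enc = encoder(tokens)                          # (batch, hidden_dim)
#   lang_logits = discriminator(enc, lambd=0.5)
#   adv_loss = nn.functional.cross_entropy(lang_logits, lang_labels)
#   total_loss = parser_loss + adv_loss            # joint objective
```

Unlabeled auxiliary-language sentences fit naturally here: they carry no parse trees, but they still provide language labels for the discriminator, so they contribute to the adversarial term without requiring annotation.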
- Wasi Uddin Ahmad
- Zhisong Zhang
- Xuezhe Ma
- Kai-Wei Chang
- Nanyun Peng