Unsupervised Cross-lingual Adaptation for Sequence Tagging and Beyond (2010.12405v3)
Abstract: Cross-lingual adaptation with multilingual pre-trained language models (mPTLMs) mainly follows two lines of work: the zero-shot approach and the translation-based approach, both of which have been studied extensively on sequence-level tasks. We further verify the efficacy of these cross-lingual adaptation approaches by evaluating their performance on more fine-grained sequence tagging tasks. After re-examining their strengths and drawbacks, we propose a novel framework that consolidates the zero-shot approach and the translation-based approach for better adaptation performance. Instead of simply augmenting the source data with the machine-translated data, we tailor-make a warm-up mechanism that quickly updates the mPTLMs with gradients estimated on a few translated examples. The adaptation approach is then applied to the refined parameters, so the cross-lingual transfer proceeds in a warm-start fashion. Experimental results on nine target languages demonstrate that our method benefits the cross-lingual adaptation of various sequence tagging tasks.
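The warm-start idea in the abstract can be sketched as a two-stage optimization: a few gradient steps on a small machine-translated set to refine the initial parameters, followed by the standard adaptation run from those refined parameters. The toy model, function names, and hyperparameters below are illustrative assumptions for exposition, not the paper's actual implementation (which operates on mPTLM weights):

```python
# Hedged sketch of warm-start cross-lingual adaptation on a toy scalar model.
# Stage 1 (warm-up): a few SGD steps on a handful of translated examples.
# Stage 2 (adaptation): the usual training, started from the refined weights.
# All names (warm_start_adapt, sgd_steps) are hypothetical.

def grad(w, x, y):
    # Gradient of the squared error (w*x - y)^2 with respect to w.
    return 2.0 * (w * x - y) * x

def sgd_steps(w, data, lr, epochs):
    # Plain SGD over (x, y) pairs; stands in for fine-tuning an mPTLM.
    for _ in range(epochs):
        for x, y in data:
            w -= lr * grad(w, x, y)
    return w

def warm_start_adapt(w0, translated_few, source_data, lr=0.05):
    # Stage 1: quickly refine the initial parameters on a few translated examples.
    w = sgd_steps(w0, translated_few, lr, epochs=3)
    # Stage 2: run the adaptation approach from the refined parameters.
    return sgd_steps(w, source_data, lr, epochs=20)

if __name__ == "__main__":
    # Toy task: recover the mapping y = 2x.
    translated_few = [(1.0, 2.0)]                 # tiny translated set
    source_data = [(1.0, 2.0), (2.0, 4.0)]        # source-language training data
    w = warm_start_adapt(0.0, translated_few, source_data)
    print(w)  # converges toward 2.0
```

The key design point is that the translated data is consumed only in the short warm-up phase rather than being mixed into the main training set, so the adaptation stage itself is unchanged apart from its starting point.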
- Xin Li (980 papers)
- Lidong Bing (144 papers)
- Wenxuan Zhang (75 papers)
- Zheng Li (326 papers)
- Wai Lam (117 papers)