Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking (2010.09828v2)
Published 19 Oct 2020 in cs.CL
Abstract: Cross-language entity linking grounds mentions in multiple languages to a single-language knowledge base. We propose a neural ranking architecture for this task that uses multilingual BERT representations of the mention and the context in a neural network. We find that the multilingual ability of BERT leads to robust performance in monolingual and multilingual settings. Furthermore, we explore zero-shot language transfer and find surprisingly robust performance. We investigate the zero-shot degradation and find that it can be partially mitigated by a proposed auxiliary training objective, but that the remaining error can best be attributed to domain shift rather than language transfer.
- Elliot Schumacher (10 papers)
- James Mayfield (21 papers)
- Mark Dredze (66 papers)
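The abstract's ranking architecture can be sketched at a high level: a mention (with its context) and each candidate knowledge-base entity are embedded, and a small feedforward scorer ranks the candidates. The sketch below is illustrative only, assuming pre-computed vectors in place of the multilingual BERT representations the paper uses; the scorer shape, names (`score`, `rank_candidates`), and dimensions are hypothetical, not the authors' implementation.

```python
import numpy as np

def score(mention_vec, entity_vec, W, b):
    """Feedforward scorer over concatenated mention/entity vectors.
    Stand-in for the paper's neural ranker; in practice the inputs
    would be multilingual BERT encodings of mention, context, and
    candidate entity."""
    x = np.concatenate([mention_vec, entity_vec])
    h = np.maximum(0.0, W @ x + b)  # ReLU hidden layer (hypothetical shape)
    return float(h.sum())           # scalar ranking score

def rank_candidates(mention_vec, candidates, W, b):
    """Return candidate entity ids sorted by descending score."""
    scored = [(eid, score(mention_vec, vec, W, b)) for eid, vec in candidates]
    return [eid for eid, _ in sorted(scored, key=lambda t: -t[1])]

# Toy usage with random stand-in embeddings.
rng = np.random.default_rng(0)
d = 8
W = rng.standard_normal((4, 2 * d))
b = np.zeros(4)
mention = rng.standard_normal(d)
cands = [("Q1", rng.standard_normal(d)), ("Q2", rng.standard_normal(d))]
print(rank_candidates(mention, cands, W, b))
```

In the zero-shot setting the paper studies, the same scorer would be trained on mentions in one language and applied unchanged to mentions in another, relying on mBERT's shared multilingual space.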