
Zero-Resource Cross-Lingual Named Entity Recognition (1911.09812v1)

Published 22 Nov 2019 in cs.CL and cs.LG

Abstract: Recently, neural methods have achieved state-of-the-art (SOTA) results in Named Entity Recognition (NER) tasks for many languages without the need for manually crafted features. However, these models still require manually annotated training data, which is not available for many languages. In this paper, we propose an unsupervised cross-lingual NER model that can transfer NER knowledge from one language to another in a completely unsupervised way without relying on any bilingual dictionary or parallel data. Our model achieves this through word-level adversarial learning and augmented fine-tuning with parameter sharing and feature augmentation. Experiments on five different languages demonstrate the effectiveness of our approach, outperforming existing models by a good margin and setting a new SOTA for each language pair.
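The abstract names word-level adversarial learning as the mechanism for unsupervised cross-lingual transfer, but does not spell it out. Below is a minimal NumPy sketch of the general idea behind such methods: a linear mapping is trained to project source-language word embeddings into the target embedding space so that a logistic-regression discriminator cannot tell mapped source vectors from real target vectors. All data, dimensions, and hyperparameters here are hypothetical toy values, not the authors' actual model or code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "embeddings": target vectors are a rotation of the source vectors,
# so a linear mapping W can in principle align the two spaces.
d, n = 8, 200
src = rng.normal(size=(n, d))
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))  # hidden ground-truth rotation
tgt = src @ Q

W = np.eye(d)            # mapping: source space -> target space
w_disc = np.zeros(d + 1) # logistic-regression discriminator (weights + bias)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr_disc, lr_map = 0.1, 0.05
for step in range(500):
    # 1) Discriminator step: label mapped source as 0, real target as 1.
    mapped = src @ W
    X = np.vstack([mapped, tgt])
    X1 = np.hstack([X, np.ones((2 * n, 1))])          # append bias feature
    y = np.concatenate([np.zeros(n), np.ones(n)])
    p = sigmoid(X1 @ w_disc)
    w_disc += lr_disc * X1.T @ (y - p) / (2 * n)      # maximize log-likelihood

    # 2) Mapping (adversarial) step: update W so mapped source vectors
    #    are classified as "target", i.e. fool the discriminator.
    p_src = sigmoid(np.hstack([src @ W, np.ones((n, 1))]) @ w_disc)
    W += lr_map * src.T @ ((1 - p_src)[:, None] * w_disc[:d][None, :]) / n
```

This alternating discriminator/mapping loop is the adversarial core; the paper additionally applies augmented fine-tuning with parameter sharing and feature augmentation on top of the transferred representations.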

Authors (3)
  1. M Saiful Bari (22 papers)
  2. Shafiq Joty (187 papers)
  3. Prathyusha Jwalapuram (9 papers)
Citations (49)
