
Coupling Distant Annotation and Adversarial Training for Cross-Domain Chinese Word Segmentation (2007.08186v2)

Published 16 Jul 2020 in cs.CL and cs.LG

Abstract: Fully supervised neural approaches have achieved significant progress in the task of Chinese word segmentation (CWS). Nevertheless, the performance of supervised models tends to drop dramatically when they are applied to out-of-domain data. Performance degradation is caused by the distribution gap across domains and the out-of-vocabulary (OOV) problem. In order to simultaneously alleviate these two issues, this paper proposes to couple distant annotation and adversarial training for cross-domain CWS. For distant annotation, we rethink the essence of "Chinese words" and design an automatic distant annotation mechanism that does not need any supervision or pre-defined dictionaries from the target domain. The approach can effectively discover domain-specific words and distantly annotate the raw texts of the target domain. For adversarial training, we develop a sentence-level training procedure to perform noise reduction and to maximally utilize the source-domain information. Experiments on multiple real-world datasets across various domains show the superiority and robustness of our model, which significantly outperforms previous state-of-the-art cross-domain CWS methods.
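To make the distant-annotation idea concrete: once a domain lexicon has been mined from raw target-domain text, a simple dictionary-based segmenter can produce silver-standard labels. The sketch below uses forward maximum matching, a common baseline for this kind of lexicon-driven annotation; it is an illustrative assumption, not the paper's actual mechanism, and the toy lexicon (`fmm_segment`, the words in `lexicon`) is hypothetical.

```python
# Illustrative sketch: dictionary-based distant annotation via forward
# maximum matching. The lexicon here is a toy stand-in for a mined
# domain-specific word list; it is NOT the paper's actual method.

def fmm_segment(text, lexicon, max_len=4):
    """Greedy forward maximum matching: at each position, take the longest
    lexicon word that matches; fall back to a single character, which may
    flag a potential OOV item for further mining."""
    words = []
    i = 0
    while i < len(text):
        # Try the longest candidate first, down to a single character.
        for length in range(min(max_len, len(text) - i), 0, -1):
            cand = text[i:i + length]
            if length == 1 or cand in lexicon:
                words.append(cand)
                i += length
                break
    return words

lexicon = {"中文", "分词", "跨领域"}  # hypothetical domain lexicon
print(fmm_segment("跨领域中文分词", lexicon))  # → ['跨领域', '中文', '分词']
```

Such silver segmentations are noisy, which is one motivation for the paper's sentence-level adversarial training step: it down-weights unreliable distantly annotated sentences while still exploiting the clean source-domain data.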

Authors (7)
  1. Ning Ding (122 papers)
  2. Dingkun Long (23 papers)
  3. Guangwei Xu (18 papers)
  4. Muhua Zhu (8 papers)
  5. Pengjun Xie (85 papers)
  6. Xiaobin Wang (39 papers)
  7. Hai-Tao Zheng (94 papers)
Citations (17)
