
Bidirectional LSTM-CRF Attention-based Model for Chinese Word Segmentation (2105.09681v1)

Published 20 May 2021 in cs.LG

Abstract: Chinese word segmentation (CWS) is the foundation of Chinese NLP, and the quality of segmentation directly affects downstream NLP tasks. Recently, with the renewed rise of artificial intelligence, the Long Short-Term Memory (LSTM) neural network, which readily models sequences, has been widely applied to a variety of NLP tasks and performs well. The attention mechanism is an ingenious way to alleviate the memory-compression problem of LSTMs. Inspired by the power of bidirectional LSTMs for sequence modeling and of CRFs for decoding, we propose a Bidirectional LSTM-CRF Attention-based Model in this paper. Experiments on the PKU and MSRA benchmark datasets show that our model outperforms baseline methods built on other neural networks.
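The CRF decoding step the abstract refers to can be illustrated with Viterbi decoding over the BMES tagging scheme commonly used in CWS (Begin/Middle/End of a word, or Single-character word). The sketch below is illustrative only: the emission scores stand in for the BiLSTM-with-attention outputs, and the transition scores are hypothetical, not values from the paper.

```python
# Minimal sketch of CRF-style Viterbi decoding over BMES tags for Chinese
# word segmentation. Emission scores stand in for BiLSTM(+attention)
# outputs; the transition table below is an illustrative assumption that
# simply forbids impossible BMES transitions (e.g. B followed by B).

TAGS = ["B", "M", "E", "S"]  # Begin, Middle, End, Single
NEG = -1e9                   # score for disallowed transitions

# Only legal BMES transitions get a finite score.
TRANS = {
    ("B", "M"): 0.0, ("B", "E"): 0.0,
    ("M", "M"): 0.0, ("M", "E"): 0.0,
    ("E", "B"): 0.0, ("E", "S"): 0.0,
    ("S", "B"): 0.0, ("S", "S"): 0.0,
}

def viterbi(emissions):
    """emissions: one dict {tag: score} per character; returns best tag path."""
    # score[t] = best score of any path ending in tag t at current position
    score = {t: emissions[0].get(t, NEG) for t in TAGS}
    back = []  # back pointers for path recovery
    for em in emissions[1:]:
        new_score, ptr = {}, {}
        for t in TAGS:
            # pick the best previous tag under the transition constraints
            best_prev = max(TAGS, key=lambda p: score[p] + TRANS.get((p, t), NEG))
            new_score[t] = score[best_prev] + TRANS.get((best_prev, t), NEG) + em.get(t, NEG)
            ptr[t] = best_prev
        score = new_score
        back.append(ptr)
    # backtrack from the best final tag
    tag = max(score, key=score.get)
    path = [tag]
    for ptr in reversed(back):
        tag = ptr[tag]
        path.append(tag)
    return list(reversed(path))
```

For example, for a four-character input where the network strongly prefers word boundaries after the second and fourth characters, `viterbi([{"B": 1.0}, {"E": 1.0}, {"B": 1.0}, {"E": 1.0}])` returns `["B", "E", "B", "E"]`, i.e. two two-character words. In the paper's model, the real gain of the CRF layer is exactly this: transition scores are learned jointly with the BiLSTM so that structurally invalid tag sequences are ruled out at decoding time.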

Authors (4)
  1. Chen Jin (18 papers)
  2. Zhuangwei Shi (7 papers)
  3. Weihua Li (43 papers)
  4. Yanbu Guo (2 papers)
Citations (7)