
Fine-tuning BERT for Joint Entity and Relation Extraction in Chinese Medical Text (1908.07721v2)

Published 21 Aug 2019 in cs.CL

Abstract: Entity and relation extraction is a necessary step in structuring medical text. However, the feature extraction ability of the bidirectional long short-term memory network used in existing models does not achieve the best effect. At the same time, pre-trained language models have achieved excellent results on more and more natural language processing tasks. In this paper, we present a focused attention model for the joint entity and relation extraction task. Our model integrates the well-known BERT language model into joint learning through a dynamic range attention mechanism, thus improving the feature representation ability of the shared parameter layer. Experimental results on coronary angiography texts collected from Shuguang Hospital show that the F1-scores of the named entity recognition and relation classification tasks reach 96.89% and 88.51%, outperforming state-of-the-art methods by 1.65% and 1.22%, respectively.
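
Below is a minimal sketch of the joint architecture the abstract describes: a shared BERT encoder whose representations feed both a token-level named entity recognition head and a sentence-level relation classification head. This is an illustrative reconstruction, not the authors' code; the model name, label counts, pooling strategy, and linear heads are assumptions, and the paper's dynamic range attention mechanism is not reproduced here.

```python
# Illustrative sketch only: a shared BERT encoder (the "shared parameter
# layer") with two task heads for joint entity and relation extraction.
# Label counts and the use of bert-base-chinese are assumptions.
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class JointBertExtractor(nn.Module):
    def __init__(self, model_name="bert-base-chinese",
                 num_entity_labels=9, num_relation_labels=5):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)  # shared parameters
        hidden = self.encoder.config.hidden_size
        self.ner_head = nn.Linear(hidden, num_entity_labels)   # per-token BIO tags
        self.rel_head = nn.Linear(hidden, num_relation_labels) # sentence-level relation

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        ner_logits = self.ner_head(out.last_hidden_state)  # (batch, seq, entity_labels)
        rel_logits = self.rel_head(out.pooler_output)       # (batch, relation_labels)
        return ner_logits, rel_logits

tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
model = JointBertExtractor()
batch = tokenizer(["患者冠状动脉造影示前降支狭窄"], return_tensors="pt")
ner_logits, rel_logits = model(batch["input_ids"], batch["attention_mask"])
```

In a joint-learning setup like the paper's, the two task losses would be combined (e.g., summed) during training so that gradients from both heads update the shared encoder.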

Authors (6)
  1. Kui Xue
  2. Yangming Zhou
  3. Zhiyuan Ma
  4. Tong Ruan
  5. Huanhuan Zhang
  6. Ping He
Citations: 85