KBioXLM: A Knowledge-anchored Biomedical Multilingual Pretrained Language Model (2311.11564v1)

Published 20 Nov 2023 in cs.CL

Abstract: Most biomedical pretrained language models are monolingual and cannot handle the growing cross-lingual requirements. The scarcity of non-English domain corpora, not to mention parallel data, poses a significant hurdle in training multilingual biomedical models. Since knowledge forms the core of domain-specific corpora and can be translated into various languages accurately, we propose a model called KBioXLM, which transforms the multilingual pretrained model XLM-R into the biomedical domain using a knowledge-anchored approach. We construct a biomedical multilingual corpus by incorporating knowledge alignments at three granularities (entity, fact, and passage levels) into monolingual corpora. We then design three corresponding training tasks (entity masking, relation masking, and passage relation prediction) and continue training on top of XLM-R to enhance its cross-lingual ability in the domain. To validate the effectiveness of our model, we translate the English benchmarks of multiple tasks into Chinese. Experimental results demonstrate that our model significantly outperforms monolingual and multilingual pretrained models in cross-lingual zero-shot and few-shot scenarios, achieving improvements of up to 10+ points. Our code is publicly available at https://github.com/ngwlh-gl/KBioXLM.
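To make the first of the three training tasks concrete, the sketch below shows one way entity masking could work as a continued-pretraining step on XLM-R: the subword span of an aligned biomedical entity is masked as a whole and the MLM loss is computed only on those positions. This is a minimal illustrative sketch, not the authors' implementation (see the repository linked above); the example sentence, the entity, and the span-matching logic are assumptions for demonstration.

```python
# Sketch of whole-entity masking for continued MLM pretraining of XLM-R.
# The sentence, entity span, and matching logic are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

text = "Aspirin inhibits cyclooxygenase."  # toy corpus sentence
entity = "cyclooxygenase"                  # an aligned entity to mask whole

enc = tokenizer(text, return_tensors="pt")
labels = enc["input_ids"].clone()

# Locate the subword positions covering the entity and mask them all,
# so the model must recover the entire entity span, not a random subword.
entity_ids = tokenizer(entity, add_special_tokens=False)["input_ids"]
ids = enc["input_ids"][0].tolist()
for start in range(len(ids) - len(entity_ids) + 1):
    if ids[start:start + len(entity_ids)] == entity_ids:
        for j in range(start, start + len(entity_ids)):
            enc["input_ids"][0, j] = tokenizer.mask_token_id
        break

labels[enc["input_ids"] != tokenizer.mask_token_id] = -100  # ignore unmasked tokens
loss = model(**enc, labels=labels).loss
loss.backward()  # one continued-pretraining step (optimizer omitted)
```

The relation-masking and passage-relation-prediction tasks described in the abstract would follow the same pattern with different masking targets and objectives.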

Authors (9)
  1. Lei Geng (10 papers)
  2. Xu Yan (130 papers)
  3. Ziqiang Cao (34 papers)
  4. Juntao Li (89 papers)
  5. Wenjie Li (183 papers)
  6. Sujian Li (82 papers)
  7. Xinjie Zhou (3 papers)
  8. Yang Yang (883 papers)
  9. Jun Zhang (1008 papers)