
Mixture-of-Partitions: Infusing Large Biomedical Knowledge Graphs into BERT (2109.04810v1)

Published 10 Sep 2021 in cs.CL

Abstract: Infusing factual knowledge into pre-trained models is fundamental for many knowledge-intensive tasks. In this paper, we propose Mixture-of-Partitions (MoP), an infusion approach that can handle a very large knowledge graph (KG) by partitioning it into smaller sub-graphs and infusing their specific knowledge into various BERT models using lightweight adapters. To leverage the overall factual knowledge for a target task, these sub-graph adapters are further fine-tuned along with the underlying BERT through a mixture layer. We evaluate MoP with three biomedical BERTs (SciBERT, BioBERT, PubMedBERT) on six downstream tasks (including NLI, QA, and classification), and the results show that MoP consistently improves the task performance of the underlying BERTs and achieves new state-of-the-art (SOTA) results on five of the evaluated datasets.
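
The core mechanism described in the abstract, a mixture layer that combines per-partition adapters on top of BERT, can be sketched compactly. The snippet below is a minimal, illustrative PyTorch rendering rather than the paper's reference implementation: the bottleneck adapter design, gating on the first-token representation, and all class and parameter names (Adapter, MixtureOfAdapters, bottleneck) are assumptions made for exposition.

```python
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual.

    One adapter per KG partition; assumed to carry that sub-graph's knowledge.
    """

    def __init__(self, hidden_size: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))


class MixtureOfAdapters(nn.Module):
    """Mixture layer over K sub-graph adapters.

    A learned softmax gate weighs the adapters' outputs so the target task
    can draw on the overall factual knowledge spread across partitions.
    """

    def __init__(self, hidden_size: int, num_partitions: int, bottleneck: int = 64):
        super().__init__()
        self.adapters = nn.ModuleList(
            Adapter(hidden_size, bottleneck) for _ in range(num_partitions)
        )
        self.gate = nn.Linear(hidden_size, num_partitions)

    def forward(self, hidden: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq_len, hidden_size) from the underlying BERT.
        # Gate on the first-token representation (an assumption; other
        # pooling schemes would work equally well here).
        weights = torch.softmax(self.gate(hidden[:, 0]), dim=-1)  # (batch, K)
        outs = torch.stack([a(hidden) for a in self.adapters], dim=1)  # (batch, K, seq, d)
        return torch.einsum("bk,bksd->bsd", weights, outs)


if __name__ == "__main__":
    # Stand-in for BERT hidden states: batch of 2, 16 tokens, 768-dim, 4 partitions.
    mix = MixtureOfAdapters(hidden_size=768, num_partitions=4)
    h = torch.randn(2, 16, 768)
    print(mix(h).shape)  # torch.Size([2, 16, 768])
```

Per the abstract, each adapter would first be infused with the knowledge of one KG partition; the gate and adapters are then fine-tuned jointly with the underlying BERT on the downstream task.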

Authors (5)
  1. Zaiqiao Meng (42 papers)
  2. Fangyu Liu (59 papers)
  3. Thomas Hikaru Clark (3 papers)
  4. Ehsan Shareghi (54 papers)
  5. Nigel Collier (83 papers)
Citations (36)
