Integrating Graph Contextualized Knowledge into Pre-trained Language Models (1912.00147v3)

Published 30 Nov 2019 in cs.CL and cs.AI

Abstract: Complex node interactions are common in knowledge graphs, and these interactions also contain rich knowledge information. However, traditional methods usually treat a triple as a training unit during the knowledge representation learning (KRL) procedure, neglecting contextualized information of the nodes in knowledge graphs (KGs). We generalize the modeling object to a very general form, which theoretically supports any subgraph extracted from the knowledge graph, and these subgraphs are fed into a novel transformer-based model to learn the knowledge embeddings. To broaden the usage scenarios of this knowledge, pre-trained language models are utilized to build a model that incorporates the learned knowledge representations. Experimental results demonstrate that our model achieves state-of-the-art performance on several medical NLP tasks, and the improvement over TransE indicates that our KRL method captures the graph contextualized information effectively.
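
The contrast the abstract draws is between triple-level training, where a model such as TransE scores each (head, relation, tail) independently, and subgraph-level training, where a whole neighbourhood is encoded at once so each node's embedding is contextualized by the others. Below is a minimal PyTorch sketch of that idea; it is not the authors' implementation, and the class, parameter, and function names (SubgraphEncoder, transe_score, etc.) are illustrative assumptions.

```python
# Sketch only: linearize a subgraph of triples into a token sequence and
# encode it with a transformer, so every entity/relation attends to the
# rest of the subgraph. Contrast with the triple-at-a-time TransE score.
import torch
import torch.nn as nn


class SubgraphEncoder(nn.Module):
    def __init__(self, num_entities, num_relations, dim=128, layers=2, heads=4):
        super().__init__()
        # Shared embedding table: entity ids first, relation ids after them.
        self.embed = nn.Embedding(num_entities + num_relations, dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=layers)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len), a subgraph linearized as
        # [h1, r1, t1, h2, r2, t2, ...]. Self-attention lets each node's
        # representation depend on the whole subgraph, i.e. it becomes
        # "graph contextualized" rather than triple-local.
        return self.encoder(self.embed(token_ids))


def transe_score(h, r, t):
    # TransE baseline: a triple (h, r, t) is plausible when h + r ≈ t,
    # so the score is the negative L2 distance ||h + r - t||.
    return -torch.norm(h + r - t, p=2, dim=-1)


if __name__ == "__main__":
    enc = SubgraphEncoder(num_entities=1000, num_relations=50)
    subgraph = torch.randint(0, 1050, (1, 9))  # three linearized triples
    ctx_embeddings = enc(subgraph)             # shape: (1, 9, 128)
    print(ctx_embeddings.shape)
```

In this sketch, the self-attention over the linearized triples is what makes a node's embedding depend on its subgraph context, which is the property the abstract credits for the improvement over TransE.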

Authors (7)
  1. Bin He (58 papers)
  2. Di Zhou (60 papers)
  3. Jinghui Xiao (9 papers)
  4. Qun Liu (230 papers)
  5. Nicholas Jing Yuan (22 papers)
  6. Tong Xu (113 papers)
  7. Xin Jiang (242 papers)
Citations (74)
