UNTER: A Unified Knowledge Interface for Enhancing Pre-trained Language Models (2305.01624v2)

Published 2 May 2023 in cs.CL

Abstract: Recent research demonstrates that external knowledge injection can advance pre-trained language models (PLMs) in a variety of downstream NLP tasks. However, existing knowledge injection methods are applicable either to structured knowledge or to unstructured knowledge, lacking a unified treatment. In this paper, we propose a UNified knowledge inTERface, UNTER, which provides a unified perspective for exploiting both structured and unstructured knowledge. In UNTER, the decoder serves as a unified knowledge interface, aligning span representations obtained from the encoder with their corresponding knowledge. This enables the encoder to uniformly invoke span-related knowledge from its parameters for downstream applications. Experimental results show that, with both forms of knowledge injected, UNTER achieves consistent improvements on a series of knowledge-driven NLP tasks, including entity typing, named entity recognition, and relation extraction, especially in low-resource scenarios.
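The core idea of aligning encoder span representations with their corresponding knowledge can be illustrated with a toy sketch. This is not the authors' implementation: the mean-pooled span representation, the cosine-based alignment objective, and all vectors below are hypothetical stand-ins for the paper's encoder features and decoder-based knowledge alignment.

```python
from math import sqrt

def mean_pool(vectors):
    """Toy span representation: average the token vectors in the span."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def alignment_loss(span_vec, knowledge_vec):
    """Hypothetical alignment objective: pull the span representation toward
    the embedding of its knowledge, whether that knowledge came from a
    verbalized KG triple (structured) or a text description (unstructured)."""
    return 1.0 - cosine(span_vec, knowledge_vec)

# Toy token embeddings for an entity span, as produced by an encoder.
span_tokens = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]]
# Toy embedding of the span's knowledge, e.g. a verbalized triple or a
# dictionary definition; both forms share this one interface.
knowledge = [0.7, 0.2, 0.1]

span_repr = mean_pool(span_tokens)      # [0.85, 0.15, 0.05]
loss = alignment_loss(span_repr, knowledge)
print(round(loss, 3))
```

Because structured and unstructured knowledge are both reduced to an embedding before alignment, the encoder is trained against a single objective regardless of the knowledge source, which is the "unified interface" the abstract describes.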

Authors (4)
  1. Deming Ye (10 papers)
  2. Yankai Lin (125 papers)
  3. Zhengyan Zhang (46 papers)
  4. Maosong Sun (337 papers)