A Knowledge-Grounded Dialog System Based on Pre-Trained Language Models (2106.14444v1)

Published 28 Jun 2021 in cs.CL

Abstract: We present a knowledge-grounded dialog system developed for the ninth Dialog System Technology Challenge (DSTC9) Track 1 - Beyond Domain APIs: Task-oriented Conversational Modeling with Unstructured Knowledge Access. We leverage transfer learning with existing language models to accomplish the tasks in this challenge track. Specifically, we divided the task into four sub-tasks and fine-tuned several Transformer models on each of the sub-tasks. We made additional changes that yielded gains in both performance and efficiency, including the combination of the model with traditional entity-matching techniques, and the addition of a pointer network to the output layer of the language model.
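The pointer network mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' code; it assumes the standard pointer-generator formulation, where a generation gate `p_gen` blends the decoder's vocabulary distribution with a copy distribution (attention weights) over source tokens, letting the model copy rare entity names directly from the knowledge snippet:

```python
# Hedged sketch of a pointer-generator output layer (assumed formulation,
# not the paper's implementation). All names here are illustrative.
from collections import defaultdict

def pointer_mix(vocab_probs, attn_weights, source_tokens, p_gen):
    """Blend generation and copy distributions.

    vocab_probs:   dict token -> probability from the decoder softmax
    attn_weights:  attention weights, one per source token (sum to 1)
    source_tokens: tokens of the dialog context / knowledge snippet
    p_gen:         scalar in [0, 1]; probability of generating from vocab
    """
    final = defaultdict(float)
    # Contribution from the ordinary vocabulary softmax.
    for tok, p in vocab_probs.items():
        final[tok] += p_gen * p
    # Contribution from copying source tokens via attention.
    for w, tok in zip(attn_weights, source_tokens):
        final[tok] += (1.0 - p_gen) * w
    return dict(final)

# Example: with p_gen = 0.5, "DSTC9" can be copied from the source even
# though it has no mass in the vocabulary distribution.
probs = pointer_mix({"the": 0.6, "a": 0.4}, [0.7, 0.3],
                    ["DSTC9", "track"], 0.5)
```

Copying out-of-vocabulary entity names this way is a common motivation for adding a pointer mechanism to knowledge-grounded generation, since knowledge snippets are full of names the base vocabulary rarely covers.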

Authors (5)
  1. Weijie Zhang (8 papers)
  2. Jiaoxuan Chen (2 papers)
  3. Haipang Wu (5 papers)
  4. Sanhui Wan (1 paper)
  5. Gongfeng Li (1 paper)
Citations (4)