Training Conversational Agents with Generative Conversational Networks (2110.08383v1)

Published 15 Oct 2021 in cs.CL

Abstract: The rich, open-domain textual data available on the web has driven great advances in language processing. However, while that data may be suitable for language processing tasks, it is mostly non-conversational and lacks many of the phenomena that appear in human interactions, which is one of the reasons we still face many unsolved challenges in conversational AI. In this work, we attempt to address this by using Generative Conversational Networks to automatically generate data and train social conversational agents. We evaluate our approach on TopicalChat with automatic metrics and human evaluators, showing that with 10% of seed data it performs close to the baseline that uses 100% of the data.
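
The abstract only outlines the approach, so the sketch below illustrates the kind of generator/learner loop that Generative Conversational Networks describe: a generator produces synthetic conversations from a small seed set, the conversational agent is trained on that synthetic data, and the agent's evaluation score is fed back to improve the generator. All function names and the toy models here are illustrative stand-ins under that assumption, not the authors' implementation.

```python
# Hypothetical sketch of a GCN-style data-generation loop (not the paper's code).
import random

def train_generator(seed_dialogues, reward=0.0):
    """Toy 'generator': samples from the seed dialogues.
    A real generator would be a language model updated with a
    policy-gradient-style signal derived from `reward`."""
    def generate(n):
        return [random.choice(seed_dialogues) for _ in range(n)]
    return generate

def train_learner(synthetic_dialogues):
    """Toy 'learner': in the paper this is the social conversational
    agent trained on the generated conversations."""
    return {"training_size": len(synthetic_dialogues)}

def evaluate(learner, validation_dialogues):
    """Stand-in for automatic metrics / human evaluation on TopicalChat."""
    return min(1.0, learner["training_size"] / (10 * len(validation_dialogues)))

# 10% "seed" split, mirroring the setting described in the abstract.
all_dialogues = [f"dialogue_{i}" for i in range(100)]
seed, validation = all_dialogues[:10], all_dialogues[10:]

reward = 0.0
for round_ in range(3):                        # alternate generator and learner
    generator = train_generator(seed, reward)  # generator conditioned on reward
    synthetic = generator(n=50)                # generate synthetic conversations
    learner = train_learner(synthetic)         # train the agent on them
    reward = evaluate(learner, validation)     # score feeds back to the generator
    print(f"round {round_}: reward={reward:.2f}")
```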

Authors (4)
  1. Yen-Ting Lin (117 papers)
  2. Alexandros Papangelis (23 papers)
  3. Seokhwan Kim (29 papers)
  4. Dilek Hakkani-Tur (94 papers)