PVGRU: Generating Diverse and Relevant Dialogue Responses via Pseudo-Variational Mechanism (2212.09086v4)

Published 18 Dec 2022 in cs.CL

Abstract: We investigate response generation for multi-turn dialogue in generative chatbots. Existing generative models based on RNNs (Recurrent Neural Networks) usually employ the last hidden state to summarize a sequence, which leaves them unable to capture the subtle variability observed across different dialogues or to distinguish dialogues that are similar in composition. In this paper, we propose a Pseudo-Variational Gated Recurrent Unit (PVGRU) component that, without posterior knowledge, introduces a recurrent summarizing variable into the GRU to aggregate the accumulated distribution variations of subsequences. PVGRU perceives subtle semantic variability through summarizing variables optimized by the devised distribution-consistency and reconstruction objectives. In addition, we build a Pseudo-Variational Hierarchical Dialogue (PVHD) model based on PVGRU. Experimental results demonstrate that PVGRU broadly improves the diversity and relevance of responses on two benchmark datasets.
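
To make the abstract's mechanism concrete, below is a minimal, hypothetical sketch of a pseudo-variational GRU cell based only on the description above: a recurrent summarizing variable is sampled via reparameterization from parameters computed from the current input and previous states (no posterior network) and fed into the GRU update. The class, layer names, and wiring are illustrative assumptions, not the paper's implementation, and the distribution-consistency and reconstruction objectives are omitted.

```python
import torch
import torch.nn as nn


class PVGRUCell(nn.Module):
    """Hypothetical sketch of a pseudo-variational GRU cell.

    Alongside the usual hidden state h_t, it carries a recurrent
    summarizing variable z_t drawn from a Gaussian whose parameters
    depend on the input and the previous states (an assumption based
    on the abstract, not the paper's released code).
    """

    def __init__(self, input_size: int, hidden_size: int, latent_size: int):
        super().__init__()
        self.gru = nn.GRUCell(input_size + latent_size, hidden_size)
        # Distribution parameters for the summarizing variable, inferred
        # from the current input and the previous hidden/latent states.
        self.to_mu = nn.Linear(input_size + hidden_size + latent_size, latent_size)
        self.to_logvar = nn.Linear(input_size + hidden_size + latent_size, latent_size)

    def forward(self, x_t, h_prev, z_prev):
        stats_in = torch.cat([x_t, h_prev, z_prev], dim=-1)
        mu, logvar = self.to_mu(stats_in), self.to_logvar(stats_in)
        # Reparameterization trick: sample z_t without a learned posterior.
        z_t = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # Condition the GRU update on the sampled summarizing variable.
        h_t = self.gru(torch.cat([x_t, z_t], dim=-1), h_prev)
        return h_t, z_t, (mu, logvar)


# Example unroll over a token sequence (shapes are arbitrary).
cell = PVGRUCell(input_size=16, hidden_size=32, latent_size=8)
x = torch.randn(4, 10, 16)          # (batch, seq_len, input_size)
h = torch.zeros(4, 32)
z = torch.zeros(4, 8)
for t in range(x.size(1)):
    h, z, (mu, logvar) = cell(x[:, t], h, z)
```

In this reading, z_t acts as the "recurrent summarizing variable" that accumulates distributional variation across subsequences, which is what the paper's training objectives would then constrain.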

Authors (5)
  1. Yongkang Liu (35 papers)
  2. Shi Feng (95 papers)
  3. Daling Wang (35 papers)
  4. Yifei Zhang (167 papers)
  5. Hinrich Schütze (250 papers)
Citations (6)
