
Dialogue Generation on Infrequent Sentence Functions via Structured Meta-Learning (2010.01495v1)

Published 4 Oct 2020 in cs.CL, cs.AI, and cs.LG

Abstract: Sentence function is an important linguistic feature indicating the communicative purpose in uttering a sentence. Incorporating sentence functions into conversations has shown improvements in the quality of generated responses. However, the number of utterances for different types of fine-grained sentence functions is extremely imbalanced. Besides a small number of high-resource sentence functions, a large portion of sentence functions is infrequent. Consequently, dialogue generation conditioned on these infrequent sentence functions suffers from data deficiency. In this paper, we investigate a structured meta-learning (SML) approach for dialogue generation on infrequent sentence functions. We treat dialogue generation conditioned on different sentence functions as separate tasks, and apply model-agnostic meta-learning to high-resource sentence functions data. Furthermore, SML enhances meta-learning effectiveness by promoting knowledge customization among different sentence functions but simultaneously preserving knowledge generalization for similar sentence functions. Experimental results demonstrate that SML not only improves the informativeness and relevance of generated responses, but also can generate responses consistent with the target sentence functions.
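
The paper's own implementation is not shown here, but the core mechanism the abstract references — model-agnostic meta-learning (MAML), with dialogue generation under each sentence function treated as a separate task — can be illustrated with a minimal first-order MAML sketch. This is a toy example on linear-regression "tasks" standing in for sentence functions; the model, learning rates, and task structure are all illustrative assumptions, not the authors' setup.

```python
import numpy as np

def maml_step(theta, tasks, inner_lr=0.05, meta_lr=0.1, inner_steps=1):
    """One first-order MAML meta-update over a batch of tasks.

    Each task is a (X, y) pair; the toy model is linear, y_hat = X @ theta.
    In the paper's setting a "task" would be dialogue generation conditioned
    on one sentence function, and the model a seq2seq generator.
    """
    meta_grad = np.zeros_like(theta)
    for X, y in tasks:
        # Inner loop: adapt a copy of the shared parameters to this task.
        phi = theta.copy()
        for _ in range(inner_steps):
            grad = 2 * X.T @ (X @ phi - y) / len(y)  # MSE gradient
            phi = phi - inner_lr * grad
        # First-order MAML: evaluate the gradient at the adapted parameters
        # and accumulate it into the meta-gradient (Hessian terms dropped).
        meta_grad += 2 * X.T @ (X @ phi - y) / len(y)
    # Outer loop: update the shared initialization across all tasks.
    return theta - meta_lr * meta_grad / len(tasks)
```

The structured variant (SML) described in the abstract would additionally customize parts of the parameters per sentence-function cluster rather than sharing a single initialization across all tasks.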

Authors (6)
  1. Yifan Gao (69 papers)
  2. Piji Li (75 papers)
  3. Wei Bi (62 papers)
  4. Xiaojiang Liu (27 papers)
  5. Michael R. Lyu (176 papers)
  6. Irwin King (170 papers)
Citations (7)
