
Keywords and Instances: A Hierarchical Contrastive Learning Framework Unifying Hybrid Granularities for Text Generation (2205.13346v1)

Published 26 May 2022 in cs.CL

Abstract: Contrastive learning has achieved impressive success in generation tasks, mitigating the "exposure bias" problem and discriminatively exploiting references of different quality. Existing works mostly apply contrastive learning at the instance level without discriminating the contribution of each word, even though keywords are the gist of a text and dominate the constrained mapping relationships. Hence, in this work, we propose a hierarchical contrastive learning mechanism that unifies semantic meaning across hybrid granularities in the input text. Concretely, we first construct a keyword graph via contrastive correlations of positive-negative pairs to iteratively polish the keyword representations. Then, we build intra-contrasts at the instance level and the keyword level, where words are treated as nodes sampled from a sentence distribution. Finally, to bridge the gap between the independent contrast levels and tackle the common contrast-vanishing problem, we propose an inter-contrast mechanism that measures the discrepancy between contrastive keyword nodes and the instance distribution. Experiments demonstrate that our model outperforms competitive baselines on paraphrasing, dialogue generation, and storytelling tasks.
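To make the instance-level contrast concrete, the sketch below implements a generic InfoNCE-style contrastive loss: a high-quality reference (positive) is pulled toward the anchor representation while low-quality or unrelated references (negatives) are pushed away. This is a minimal, hedged illustration of the general idea only, not the paper's hierarchical keyword-graph or inter-contrast formulation; the function name, embedding dimensions, and temperature value are illustrative choices.

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """Generic InfoNCE-style contrastive loss (illustrative, not the
    paper's exact objective): softmax over cosine similarities, with
    the positive pair treated as the correct class."""
    def cos(a, b):
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # Similarity of the anchor to the positive and to each negative.
    sims = np.array([cos(anchor, positive)] +
                    [cos(anchor, n) for n in negatives]) / temperature
    sims = sims - sims.max()  # numerical stability before exponentiation
    # Negative log-probability of the positive pair among all candidates.
    return -(sims[0] - np.log(np.exp(sims).sum()))

rng = np.random.default_rng(0)
anchor = rng.normal(size=8)
positive = anchor + 0.05 * rng.normal(size=8)        # near-duplicate reference
negatives = [rng.normal(size=8) for _ in range(4)]   # unrelated references

loss_good = info_nce_loss(anchor, positive, negatives)
loss_bad = info_nce_loss(anchor, negatives[0], negatives[1:] + [positive])
```

A well-aligned positive yields a much smaller loss (`loss_good`) than a mismatched one (`loss_bad`), which is the gradient signal that lets the model exploit references of different quality.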

Authors (11)
  1. Mingzhe Li (85 papers)
  2. XieXiong Lin (2 papers)
  3. Xiuying Chen (80 papers)
  4. Jinxiong Chang (4 papers)
  5. Qishen Zhang (7 papers)
  6. Feng Wang (408 papers)
  7. Taifeng Wang (22 papers)
  8. Zhongyi Liu (19 papers)
  9. Wei Chu (118 papers)
  10. Dongyan Zhao (144 papers)
  11. Rui Yan (250 papers)
Citations (10)
