
Flexible and Creative Chinese Poetry Generation Using Neural Memory (1705.03773v1)

Published 10 May 2017 in cs.AI and cs.CL

Abstract: It has been shown that Chinese poems can be successfully generated by sequence-to-sequence neural models, particularly with the attention mechanism. A potential problem of this approach, however, is that neural models can only learn abstract rules, while poem generation is a highly creative process that involves not only rules but also innovations for which pure statistical models are not appropriate in principle. This work proposes a memory-augmented neural model for Chinese poem generation, where the neural model and the augmented memory work together to balance the requirements of linguistic accordance and aesthetic innovation, leading to innovative generations that are still rule-compliant. In addition, it is found that the memory mechanism provides interesting flexibility that can be used to generate poems with different styles.
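The abstract describes blending a seq2seq model's predictions with an attention read over an external memory to trade off rule compliance against innovation and style. The paper's exact formulation is not reproduced here; the following is a minimal, assumption-laden sketch in pure Python, where `lam` is a hypothetical mixing weight and the memory is a list of (key vector, token id) pairs drawn from exemplar poem lines:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def memory_augmented_probs(model_probs, query, memory_keys, memory_token_ids, lam=0.5):
    """Blend the decoder's next-token distribution with an attention read
    over a memory of exemplar tokens (a simplified illustration, not the
    paper's exact mechanism). `lam` shifts mass from the trained model
    (rule accordance) toward the memory (innovation / style control)."""
    # Dot-product attention scores between the query and each memory key.
    scores = [sum(q * k for q, k in zip(query, key)) for key in memory_keys]
    attn = softmax(scores)
    # Scatter attention weights onto the vocabulary positions of the
    # tokens stored in memory.
    mem_probs = [0.0] * len(model_probs)
    for a, tok in zip(attn, memory_token_ids):
        mem_probs[tok] += a
    # Linear interpolation of the two distributions.
    return [(1 - lam) * p + lam * m for p, m in zip(model_probs, mem_probs)]
```

Swapping the memory contents (e.g. exemplar lines from a different poet) is one way such a mechanism could yield the style flexibility the abstract mentions, since the model weights stay fixed while the memory changes.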

Authors (7)
  1. Jiyuan Zhang (57 papers)
  2. Yang Feng (230 papers)
  3. Dong Wang (628 papers)
  4. Yang Wang (672 papers)
  5. Andrew Abel (7 papers)
  6. Shiyue Zhang (39 papers)
  7. Andi Zhang (15 papers)
Citations (72)
