An Iterative Polishing Framework based on Quality Aware Masked Language Model for Chinese Poetry Generation (1911.13182v1)

Published 29 Nov 2019 in cs.CL and cs.AI

Abstract: Owing to its unique literary and aesthetic characteristics, automatic generation of Chinese poetry remains challenging in Artificial Intelligence and can hardly be realized straightforwardly by end-to-end methods. In this paper, we propose a novel iterative polishing framework for high-quality Chinese poetry generation. In the first stage, an encoder-decoder structure is used to generate a poem draft. Afterwards, our proposed Quality-Aware Masked Language Model (QA-MLM) is employed to polish the draft toward higher quality in terms of linguistics and literariness. Based on a multi-task learning scheme, QA-MLM is able to determine whether polishing is needed for a given draft. Furthermore, QA-MLM can localize improper characters in the draft and substitute them with newly predicted ones. Benefiting from the masked language model structure, QA-MLM incorporates global context information into the polishing process, yielding more appropriate polishing results than unidirectional sequential decoding. Moreover, the iterative polishing process terminates automatically once QA-MLM regards the processed poem as qualified. Both human and automatic evaluations have been conducted, and the results demonstrate that our approach effectively improves the performance of the encoder-decoder structure.
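The polishing procedure described in the abstract (judge the draft, locate improper characters, re-predict them from full context, repeat until qualified) can be sketched as the loop below. This is a minimal illustration, not the paper's implementation: `detect_improper` and `refine` are hypothetical stand-ins for QA-MLM's quality-judgment and masked-prediction roles.

```python
def iterative_polish(draft, detect_improper, refine, max_iters=10):
    """Iteratively polish a poem draft, per the framework's description.

    detect_improper(poem) -> list of character positions judged improper;
                             an empty list means QA-MLM deems the poem
                             qualified and polishing stops.
    refine(poem, pos)     -> replacement character for position `pos`,
                             predicted from the full (bidirectional) context.
    Both callables are hypothetical interfaces, not the paper's API.
    """
    poem = list(draft)
    for _ in range(max_iters):
        positions = detect_improper("".join(poem))
        if not positions:
            break  # draft judged qualified; terminate automatically
        for pos in positions:
            # substitute the improper character with a newly predicted one
            poem[pos] = refine("".join(poem), pos)
    return "".join(poem)
```

As a toy usage, a `detect_improper` that flags every `'x'` and a `refine` that always predicts `'o'` turn `"gxod"` into `"good"` in one pass; in the real framework both roles are played by the multi-task QA-MLM.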

Authors (8)
  1. Liming Deng (1 paper)
  2. Jie Wang (480 papers)
  3. Hangming Liang (1 paper)
  4. Hui Chen (298 papers)
  5. Zhiqiang Xie (15 papers)
  6. Bojin Zhuang (10 papers)
  7. Shaojun Wang (29 papers)
  8. Jing Xiao (267 papers)
Citations (22)
