Improving Chinese Story Generation via Awareness of Syntactic Dependencies and Semantics (2210.10618v1)

Published 19 Oct 2022 in cs.CL and cs.AI

Abstract: Story generation aims to produce a long narrative conditioned on a given input. Despite the success of prior work applying pre-trained models, current neural models for Chinese stories still struggle to generate high-quality long narratives. We hypothesise that this stems from ambiguity in syntactically parsing the Chinese language, which lacks explicit delimiters for word segmentation; as a result, neural models capture features of Chinese narratives inefficiently. In this paper, we present a new generation framework that enhances feature capturing by informing the generation model of dependencies between words, and further augments semantic representation learning through synonym denoising training. We conduct a range of experiments, and the results show that our framework outperforms state-of-the-art Chinese generation models on all evaluation metrics, demonstrating the benefits of enhanced dependency and semantic representation learning.
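The segmentation ambiguity the authors point to can be illustrated with a toy example. The sketch below is not from the paper: it implements a simple forward-maximum-matching segmenter (a classic baseline technique, not the authors' method) and applies it to a well-known ambiguous sentence, showing how two plausible dictionaries yield two different word boundaries for the same character string.

```python
# Illustrative only: Chinese has no spaces between words, so the same
# character sequence can be segmented in multiple valid ways. This toy
# greedy forward-maximum-matching segmenter makes that concrete; the
# dictionaries and sentence are hypothetical examples, not the paper's data.

def fmm_segment(text, vocab, max_len=4):
    """Greedily match the longest dictionary word at each position;
    fall back to a single character when nothing matches."""
    words, i = [], 0
    while i < len(text):
        for j in range(min(len(text), i + max_len), i, -1):
            if text[i:j] in vocab or j == i + 1:
                words.append(text[i:j])
                i = j
                break
    return words

sentence = "南京市长江大桥"  # "Nanjing Yangtze River Bridge"

# Two reasonable dictionaries lead the same algorithm to different parses:
vocab_a = {"南京市", "长江大桥"}          # city + bridge
vocab_b = {"南京", "市长", "大桥"}        # city + "mayor" + bridge

print(fmm_segment(sentence, vocab_a))  # ['南京市', '长江大桥']
print(fmm_segment(sentence, vocab_b))  # ['南京', '市长', '江', '大桥']
```

Because downstream models consume these word boundaries, a wrong segmentation propagates into syntactic parsing and feature extraction, which is the failure mode the proposed dependency-aware framework is designed to mitigate.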

Authors (5)
  1. Henglin Huang (2 papers)
  2. Chen Tang (94 papers)
  3. Tyler Loakman (13 papers)
  4. Frank Guerin (30 papers)
  5. Chenghua Lin (127 papers)
Citations (11)
