
EtriCA: Event-Triggered Context-Aware Story Generation Augmented by Cross Attention (2210.12463v1)

Published 22 Oct 2022 in cs.CL and cs.AI

Abstract: One of the key challenges of automatic story generation is how to generate a long narrative that can maintain fluency, relevance, and coherence. Despite recent progress, current story generation systems still face the challenge of effectively capturing contextual and event features, which has a profound impact on a model's generation performance. To address these challenges, we present EtriCA, a novel neural generation model, which improves the relevance and coherence of the generated stories by residually mapping context features to event sequences with a cross-attention mechanism. Such a feature-capturing mechanism allows our model to better exploit the logical relatedness between events when generating stories. Extensive experiments based on both automatic and human evaluations show that our model significantly outperforms state-of-the-art baselines, demonstrating the effectiveness of our model in leveraging context and event features.
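The abstract's core mechanism is cross-attention that residually maps context features onto an event sequence. The sketch below is a minimal NumPy illustration of that idea under our own assumptions (function names, shapes, and the single-head, unprojected formulation are all hypothetical simplifications, not the authors' implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def residual_cross_attention(event_feats, context_feats):
    """Fuse context features into an event sequence via cross-attention
    plus a residual connection (hypothetical simplification of the
    feature-capturing mechanism described in the abstract)."""
    d = event_feats.shape[-1]
    # Events act as queries; context tokens act as keys and values.
    scores = event_feats @ context_feats.T / np.sqrt(d)   # (n_events, n_ctx)
    attn = softmax(scores, axis=-1)
    attended = attn @ context_feats                        # (n_events, d)
    # Residual mapping: add attended context back onto the event features.
    return event_feats + attended

rng = np.random.default_rng(0)
events = rng.normal(size=(4, 8))    # 4 events, feature dim 8
context = rng.normal(size=(6, 8))   # 6 context tokens, feature dim 8
fused = residual_cross_attention(events, context)
print(fused.shape)  # (4, 8)
```

The output keeps the event sequence's shape, so the fused features can replace the original event features in a downstream decoder.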

Authors (5)
  1. Chen Tang (94 papers)
  2. Chenghua Lin (127 papers)
  3. Henglin Huang (2 papers)
  4. Frank Guerin (30 papers)
  5. Zhihao Zhang (61 papers)
Citations (17)
