
Incorporating Commonsense Knowledge into Story Ending Generation via Heterogeneous Graph Networks (2201.12538v1)

Published 29 Jan 2022 in cs.CL and cs.AI

Abstract: Story ending generation is an interesting and challenging task that aims to generate a coherent and reasonable ending given a story context. The key challenges of the task lie in comprehending the story context sufficiently and handling the implicit knowledge behind story clues effectively, both of which remain under-explored in previous work. In this paper, we propose a Story Heterogeneous Graph Network (SHGN) to explicitly model both the story context at different granularity levels and the multi-grained interactive relations among them. Specifically, we consider commonsense knowledge, words, and sentences as three types of nodes. To aggregate non-local information, a global node is also introduced. Given this heterogeneous graph, the node representations are updated through graph propagation, which adequately exploits commonsense knowledge to facilitate story comprehension. Moreover, we design two auxiliary tasks to implicitly capture the sentiment trend and the key events in the context. The auxiliary tasks are jointly optimized with the primary story ending generation task in a multi-task learning strategy. Extensive experiments on the ROCStories Corpus show that the proposed model achieves new state-of-the-art performance. A human study further demonstrates that our model generates more reasonable story endings.
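The abstract describes the architecture only at a high level. The sketch below is a minimal, hypothetical PyTorch illustration of the general idea, not the authors' SHGN implementation: commonsense, word, and sentence nodes plus a global node share one graph, node states are updated by simple mean-aggregation message passing over a user-supplied adjacency matrix, and an ending-generation loss is combined with two auxiliary classification losses (sentiment trend, key events) in a weighted multi-task objective. Every module name, dimension, head size, and loss weight here is an assumption made for illustration.

```python
# Minimal sketch (not the authors' code) of a heterogeneous story graph with
# multi-task training, under the assumptions stated above.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HeteroGraphLayer(nn.Module):
    """One round of propagation: each node aggregates its neighbours given by `adj`."""

    def __init__(self, dim: int):
        super().__init__()
        self.msg = nn.Linear(dim, dim)    # transform incoming messages
        self.upd = nn.GRUCell(dim, dim)   # gated state update per node

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, dim); adj: (num_nodes, num_nodes) with 1 marking an edge
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        m = adj @ self.msg(h) / deg       # mean of transformed neighbour states
        return self.upd(m, h)


class StoryGraphSketch(nn.Module):
    """Toy stand-in for a story heterogeneous graph encoder with multi-task heads."""

    def __init__(self, dim: int = 128, vocab: int = 30000, num_layers: int = 2):
        super().__init__()
        # separate input projections per node type (word / sentence / knowledge / global)
        self.type_proj = nn.ModuleDict(
            {t: nn.Linear(dim, dim) for t in ("word", "sent", "know", "global")}
        )
        self.layers = nn.ModuleList([HeteroGraphLayer(dim) for _ in range(num_layers)])
        self.gen_head = nn.Linear(dim, vocab)   # primary: ending-token prediction
        self.sent_head = nn.Linear(dim, 3)      # auxiliary: sentiment trend (assumed 3-way)
        self.event_head = nn.Linear(dim, 2)     # auxiliary: key-event detection (assumed binary)

    def forward(self, feats, types, adj):
        # feats: (N, dim) initial node features; types: list of N type strings
        h = torch.stack([self.type_proj[t](x) for x, t in zip(feats, types)])
        for layer in self.layers:
            h = layer(h, adj)
        g = h[-1]  # convention here: the last node is the global node
        return self.gen_head(g), self.sent_head(g), self.event_head(h)


def multi_task_loss(gen_logits, sent_logits, event_logits,
                    gen_tgt, sent_tgt, event_tgt, alpha=0.5, beta=0.5):
    """Joint objective: primary generation loss plus weighted auxiliary losses."""
    l_gen = F.cross_entropy(gen_logits.unsqueeze(0), gen_tgt)
    l_sent = F.cross_entropy(sent_logits.unsqueeze(0), sent_tgt)
    l_event = F.cross_entropy(event_logits, event_tgt)
    return l_gen + alpha * l_sent + beta * l_event


# Toy usage: 6 nodes (3 words, 1 knowledge, 1 sentence, 1 global), fully connected.
model = StoryGraphSketch(dim=16, vocab=100)
feats = torch.randn(6, 16)
types = ["word"] * 3 + ["know", "sent", "global"]
adj = torch.ones(6, 6)
gen, sent, event = model(feats, types, adj)
loss = multi_task_loss(gen, sent, event,
                       torch.tensor([7]), torch.tensor([1]),
                       torch.zeros(6, dtype=torch.long))
```

In the paper's setting the primary head would feed a full decoder that generates the ending token by token; the single-step cross-entropy above only stands in for that objective to keep the example self-contained.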

Authors (7)
  1. Jiaan Wang (35 papers)
  2. Beiqi Zou (2 papers)
  3. Zhixu Li (43 papers)
  4. Jianfeng Qu (17 papers)
  5. Pengpeng Zhao (25 papers)
  6. An Liu (91 papers)
  7. Lei Zhao (808 papers)
Citations (7)
