Diversifying Content Generation for Commonsense Reasoning with Mixture of Knowledge Graph Experts (2203.07285v1)

Published 14 Mar 2022 in cs.CL and cs.AI

Abstract: Generative commonsense reasoning (GCR) in natural language is the task of reasoning about commonsense knowledge while generating coherent text. Recent years have seen a surge of interest in improving the generation quality of commonsense reasoning tasks. Nevertheless, these approaches have seldom investigated diversity in GCR tasks, which aim to generate alternative explanations for a real-world situation or to predict all possible outcomes. Diversifying GCR is challenging because it requires generating multiple outputs that are not only semantically different but also grounded in commonsense knowledge. In this paper, we propose MoKGE, a novel method that diversifies generative reasoning via a mixture-of-experts (MoE) strategy on commonsense knowledge graphs (KG). A set of knowledge experts seek diverse reasoning paths on the KG to encourage varied generation outputs. Empirical experiments demonstrate that MoKGE significantly improves diversity while achieving on-par accuracy on two GCR benchmarks, under both automatic and human evaluations.
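To make the mixture-of-experts idea concrete, here is a minimal Python sketch of the general scheme the abstract describes: several experts each attend to a different slice of the knowledge-graph neighborhood of the input concepts, so that conditioning a generator on each expert's concepts yields diverse outputs. All names here (`KG`, `expert_concepts`, `generate`) are hypothetical stand-ins for illustration, not the authors' released code.

```python
# Illustrative sketch only (hypothetical names; not the MoKGE implementation).
# Each "expert" is biased toward a different subset of KG concepts, so the
# experts collectively produce semantically diverse generations.

# Toy commonsense KG: concept -> neighboring concepts.
KG = {
    "river": ["water", "boat", "fish"],
    "swim": ["water", "exercise"],
    "boat": ["river", "travel"],
}

def expert_concepts(input_concepts, expert_id, num_experts):
    """Assign each expert a distinct slice of the KG neighborhood,
    encouraging a different reasoning path per expert."""
    neighbors = sorted({n for c in input_concepts for n in KG.get(c, [])})
    # Round-robin partition so experts select non-overlapping concepts.
    return [n for i, n in enumerate(neighbors) if i % num_experts == expert_id]

def generate(input_concepts, selected):
    """Stand-in for a seq2seq generator conditioned on selected concepts."""
    return (f"Sentence about {', '.join(input_concepts)} "
            f"grounded in: {', '.join(selected) or '(none)'}")

if __name__ == "__main__":
    inputs = ["river", "swim"]
    for k in range(3):
        print(generate(inputs, expert_concepts(inputs, k, 3)))
```

In the actual method, expert-to-concept assignments would typically be learned jointly with the generator rather than fixed by a round-robin partition as in this toy version; the sketch only shows why per-expert concept selection pushes outputs apart.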

Authors (6)
  1. Wenhao Yu (139 papers)
  2. Chenguang Zhu (100 papers)
  3. Lianhui Qin (34 papers)
  4. Zhihan Zhang (54 papers)
  5. Tong Zhao (121 papers)
  6. Meng Jiang (126 papers)
Citations (26)