
Context Reinforced Neural Topic Modeling over Short Texts (2008.04545v1)

Published 11 Aug 2020 in cs.IR and cs.CL

Abstract: As one of the prevalent topic mining tools, neural topic modeling has attracted considerable interest for its high training efficiency and strong generalization ability. However, due to the lack of context in each short text, existing neural topic models may suffer from feature sparsity on such documents. To alleviate this issue, we propose a Context Reinforced Neural Topic Model (CRNTM), whose characteristics can be summarized as follows. First, by assuming that each short text covers only a few salient topics, CRNTM infers the topic for each word within a narrow range. Second, our model exploits pre-trained word embeddings by treating topics as multivariate Gaussian distributions or Gaussian mixture distributions in the embedding space. Extensive experiments on two benchmark datasets validate the effectiveness of the proposed model on both topic discovery and text classification.
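To illustrate the second idea in the abstract, the following sketch shows one common way a topic can be represented as a Gaussian over pre-trained word embeddings, with the topic-word distribution obtained by normalizing Gaussian log-densities over the vocabulary. This is an illustrative assumption, not the paper's actual implementation; the embeddings and topic parameters below are random placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D = 1000, 50                      # vocabulary size, embedding dimension
word_emb = rng.normal(size=(V, D))   # stand-in for pre-trained embeddings

mu = rng.normal(size=D)              # hypothetical topic mean in embedding space
log_sigma2 = np.zeros(D)             # diagonal log-variance of the topic Gaussian

def topic_word_probs(emb, mu, log_sigma2):
    """Softmax over Gaussian log-densities -> a topic-word distribution."""
    var = np.exp(log_sigma2)
    # log N(e; mu, diag(var)) for each word, up to a constant shared by all words
    log_dens = -0.5 * (((emb - mu) ** 2) / var + log_sigma2).sum(axis=1)
    log_dens -= log_dens.max()       # subtract max for numerical stability
    p = np.exp(log_dens)
    return p / p.sum()

beta_k = topic_word_probs(word_emb, mu, log_sigma2)  # one row of the topic-word matrix
```

A Gaussian mixture variant, as the abstract also mentions, would sum such densities over mixture components before normalizing.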

Authors (5)
  1. Jiachun Feng (1 paper)
  2. Zusheng Zhang (3 papers)
  3. Cheng Ding (16 papers)
  4. Yanghui Rao (10 papers)
  5. Haoran Xie (106 papers)
Citations (28)
