Deep Temporal-Recurrent-Replicated-Softmax for Topical Trends over Time (1711.05626v2)

Published 15 Nov 2017 in cs.CL, cs.AI, cs.IR, and cs.LG

Abstract: Dynamic topic modeling facilitates the identification of topical trends over time in temporal collections of unstructured documents. We introduce a novel unsupervised neural dynamic topic model named the Recurrent Neural Network-Replicated Softmax Model (RNN-RSM), where the topics discovered at each time step influence topic discovery in the subsequent time steps. We account for the temporal ordering of documents by explicitly modeling a joint distribution of latent topical dependencies over time, using distributional estimators with temporal recurrent connections. Applying RNN-RSM to 19 years of articles on NLP research, we demonstrate that, compared to state-of-the-art topic models, RNN-RSM shows better generalization, topic interpretation, evolution, and trends. We also introduce a metric (named SPAN) to quantify the capability of a dynamic topic model to capture word evolution in topics over time.
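The abstract describes a Replicated Softmax model whose per-time-step parameters are conditioned on a recurrent summary of earlier document collections. The sketch below is an illustrative, simplified rendering of that idea, not the paper's exact formulation: the class name, dimensions, and the use of PyTorch are assumptions, the visible bias is shown but unused, and the contrastive-divergence training loop is omitted.

```python
import torch
import torch.nn as nn


class RNNRSMSketch(nn.Module):
    """Minimal sketch of an RNN-conditioned Replicated Softmax model.

    At each time step t, an RNN hidden state u (summarizing earlier
    document collections) produces the visible/hidden biases of a
    Replicated Softmax RBM over the bag-of-words counts at time t,
    so topics discovered earlier influence topic discovery later.
    """

    def __init__(self, vocab_size: int, n_topics: int, rnn_dim: int):
        super().__init__()
        # Shared RSM weight matrix between word counts and topic (hidden) units.
        self.W = nn.Parameter(0.01 * torch.randn(vocab_size, n_topics))
        # Recurrent summary of the document stream.
        self.rnn = nn.RNNCell(vocab_size, rnn_dim)
        # Time-dependent biases, conditioned on the recurrent state.
        self.to_bv = nn.Linear(rnn_dim, vocab_size)  # visible bias (used in CD training, omitted here)
        self.to_bh = nn.Linear(rnn_dim, n_topics)    # hidden (topic) bias

    def topic_activation(self, v_counts: torch.Tensor, bh: torch.Tensor) -> torch.Tensor:
        """P(h = 1 | v) for a Replicated Softmax: the hidden bias scales with document length D."""
        D = v_counts.sum(dim=1, keepdim=True)
        return torch.sigmoid(v_counts @ self.W + D * bh)

    def forward(self, counts_per_step: list[torch.Tensor]) -> list[torch.Tensor]:
        """counts_per_step: one (batch, vocab_size) bag-of-words count tensor per time step."""
        batch = counts_per_step[0].size(0)
        u = torch.zeros(batch, self.rnn.hidden_size)
        topic_trajectories = []
        for v in counts_per_step:
            bh = self.to_bh(u)                     # topic bias conditioned on history
            topic_trajectories.append(self.topic_activation(v, bh))
            u = self.rnn(v, u)                     # update the recurrent summary
        return topic_trajectories
```

In the full model these time-dependent biases enter the RSM energy and the parameters are fit with contrastive divergence unrolled through time; the sketch only shows how the recurrent connections tie each time step's topic inference to the preceding ones.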

Authors (4)
  1. Pankaj Gupta (33 papers)
  2. Subburam Rajaram (2 papers)
  3. Hinrich Schütze (250 papers)
  4. Bernt Andrassy (3 papers)
Citations (12)