Improving Neural Topic Models using Knowledge Distillation (2010.02377v1)

Published 5 Oct 2020 in cs.CL, cs.IR, and cs.LG

Abstract: Topic models are often used to identify human-interpretable topics to help make sense of large document collections. We use knowledge distillation to combine the best attributes of probabilistic topic models and pretrained transformers. Our modular method can be straightforwardly applied with any neural topic model to improve topic quality, which we demonstrate using two models having disparate architectures, obtaining state-of-the-art topic coherence. We show that our adaptable framework not only improves performance in the aggregate over all estimated topics, as is commonly reported, but also in head-to-head comparisons of aligned topics.
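To make the idea concrete, below is a minimal, hedged sketch of how knowledge distillation can be combined with a neural topic model's usual objective: a reconstruction loss over the document's bag of words is interpolated with a KL term that pulls the student's word distribution toward a teacher's (e.g., a pretrained transformer's) softened distribution. The function name, the interpolation weight `alpha`, and the `temperature` value are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn.functional as F

def distilled_topic_loss(student_logits, teacher_logits, doc_bow,
                         alpha=0.5, temperature=2.0):
    """Illustrative sketch: blend a bag-of-words reconstruction loss with a
    distillation term on per-document word distributions (assumed setup)."""
    # Standard neural-topic-model reconstruction: NLL of the observed words
    log_p_student = F.log_softmax(student_logits, dim=-1)
    recon = -(doc_bow * log_p_student).sum(dim=-1).mean()

    # Distillation: match the teacher's temperature-softened word distribution
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    distill = F.kl_div(soft_student, soft_teacher,
                       reduction="batchmean") * temperature ** 2

    # Interpolate the two objectives (alpha is a hypothetical weight)
    return alpha * recon + (1.0 - alpha) * distill
```

Because the loss is defined purely on the student's output distribution, a term like this can in principle be attached to any neural topic model's training objective, which is the sense in which the authors describe their method as modular.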

Authors (3)
  1. Alexander Hoyle (13 papers)
  2. Pranav Goel (10 papers)
  3. Philip Resnik (20 papers)
Citations (43)
