
Learning Hidden Markov Models from Pairwise Co-occurrences with Application to Topic Modeling (1802.06894v2)

Published 19 Feb 2018 in cs.CL, cs.LG, eess.SP, and stat.ML

Abstract: We present a new algorithm for identifying the transition and emission probabilities of a hidden Markov model (HMM) from the emitted data. Expectation-maximization becomes computationally prohibitive for long observation records, which are often required for identification. The new algorithm is particularly suitable for cases where the available sample size is large enough to accurately estimate second-order output probabilities, but not higher-order ones. We show that if one is only able to obtain a reliable estimate of the pairwise co-occurrence probabilities of the emissions, it is still possible to uniquely identify the HMM if the emission probability is \emph{sufficiently scattered}. We apply our method to hidden topic Markov modeling, and demonstrate that we can learn topics with higher quality if documents are modeled as observations of HMMs sharing the same emission (topic) probability, compared to the simple but widely used bag-of-words model.
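The abstract's key object is the second-order statistic of the emissions: for consecutive observations of an HMM, the pairwise co-occurrence matrix factors as $\Omega = M\,\mathrm{diag}(\pi)\,A\,M^\top$, where $M$ is the emission probability matrix, $A$ the transition matrix, and $\pi$ the stationary distribution. The sketch below (not the paper's identification algorithm, just an illustration of the statistic it relies on) simulates a toy HMM and checks that the empirical co-occurrence matrix of consecutive emissions approaches this factorization; all sizes and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy HMM: K hidden states, V emission symbols (hypothetical sizes).
K, V = 3, 5
A = rng.dirichlet(np.ones(K), size=K)    # K x K transition matrix, rows sum to 1
M = rng.dirichlet(np.ones(V), size=K).T  # V x K emission matrix, columns sum to 1

# Stationary distribution pi of A (left eigenvector for eigenvalue 1).
w, vecs = np.linalg.eig(A.T)
pi = np.real(vecs[:, np.argmax(np.real(w))])
pi /= pi.sum()

# Simulate a long observation record.
T = 200_000
states = np.empty(T, dtype=int)
obs = np.empty(T, dtype=int)
states[0] = rng.choice(K, p=pi)
for t in range(1, T):
    states[t] = rng.choice(K, p=A[states[t - 1]])
for t in range(T):
    obs[t] = rng.choice(V, p=M[:, states[t]])

# Empirical pairwise co-occurrence of consecutive emissions:
# Omega_hat[i, j] estimates P(y_t = i, y_{t+1} = j).
Omega_hat = np.zeros((V, V))
for t in range(T - 1):
    Omega_hat[obs[t], obs[t + 1]] += 1
Omega_hat /= T - 1

# Model-implied co-occurrence: Omega = M diag(pi) A M^T.
Omega = M @ np.diag(pi) @ A @ M.T

err = np.abs(Omega_hat - Omega).max()
print(err)  # shrinks as T grows
```

The point of the paper is the converse direction: given only a reliable estimate of $\Omega$ (feasible from moderate data, unlike higher-order statistics), $M$ and $A$ can be uniquely recovered when $M$ is sufficiently scattered.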

Authors (3)
  1. Kejun Huang (23 papers)
  2. Xiao Fu (92 papers)
  3. Nicholas D. Sidiropoulos (70 papers)
Citations (23)
