Communication-Efficient Parallel Belief Propagation for Latent Dirichlet Allocation (1206.2190v1)

Published 11 Jun 2012 in cs.LG

Abstract: This paper presents a novel communication-efficient parallel belief propagation (CE-PBP) algorithm for training latent Dirichlet allocation (LDA). Based on the synchronous belief propagation (BP) algorithm, we first develop a parallel belief propagation (PBP) algorithm on a parallel architecture. Because extensive communication delay often causes low efficiency in parallel topic modeling, we further use Zipf's law to reduce the total communication cost in PBP. Extensive experiments on different data sets demonstrate that CE-PBP achieves higher topic modeling accuracy and reduces communication cost by more than 80% compared with the state-of-the-art parallel Gibbs sampling (PGS) algorithm.
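The abstract's key idea is that word frequencies in natural-language corpora are roughly Zipfian, so a small head of the vocabulary accounts for most token occurrences. A minimal sketch of why this enables large communication savings, assuming a simple 1/rank frequency model (the vocabulary size, the 10% head threshold, and the function names here are illustrative, not the paper's actual implementation):

```python
# Hypothetical sketch: why Zipf's law lets parallel LDA skip most
# per-word message synchronization. With frequency f(r) proportional
# to 1/rank, a small head of the vocabulary covers most tokens, so
# synchronizing topic messages only for that head (and deferring the
# long tail) cuts communication volume sharply at little accuracy cost.

V = 100_000                                  # vocabulary size (assumed)
freq = [1.0 / r for r in range(1, V + 1)]    # Zipfian frequencies, f(r) ~ 1/r
total = sum(freq)

def coverage(top_k):
    """Fraction of all token occurrences covered by the top_k most frequent words."""
    return sum(freq[:top_k]) / total

def messages_saved(top_k):
    """Fraction of per-word messages NOT transmitted when only top_k words sync."""
    return 1.0 - top_k / V

top_k = V // 10                              # sync only the top 10% of words
print(f"token coverage: {coverage(top_k):.1%}")    # ~81% of tokens covered
print(f"messages saved: {messages_saved(top_k):.1%}")  # 90% fewer word messages
```

Under this toy model, transmitting messages for only 10% of the vocabulary still covers roughly 81% of token occurrences, which is consistent in spirit with the paper's reported communication reduction of over 80%.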

Authors (4)
  1. Zhi-Qiang Liu (11 papers)
  2. Yang Gao (761 papers)
  3. Jia Zeng (45 papers)
  4. Jian-Feng Yan (2 papers)
Citations (7)
