Belief propagation for general graphical models with loops (2411.04957v1)

Published 7 Nov 2024 in quant-ph

Abstract: Belief Propagation (BP) decoders for quantum error correcting codes are not always precise. There is a growing interest in the application of tensor networks to quantum error correction in general and, in particular, in degenerate quantum maximum likelihood decoding and the tensor network decoder. We develop a unified view to make the generalized BP proposal by Kirkley et al. explicit on arbitrary graphical models. We derive BP schemes and provide inference equations for BP on loopy tensor networks and, more generally, loopy graphical models. In doing so we introduce a tree-equivalent approach which allows us to relate the tensor network BlockBP to a generalized BP for loopy networks. Moreover, we show that the tensor network message passing approach relies essentially on the same approximation as the method by Kirkley. This allows us to make tensor network message passing available for degenerate quantum maximum likelihood decoding. Our method and results are key to establishing guidelines for how the trade-off between complexity and decoding accuracy plays out between BP and tensor network decoders. Finally, we discuss how the tree-equivalent method and the method by Kirkley can justify why message scheduling improves the performance of BP.
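For background on the message-passing scheme the paper generalizes, the standard sum-product BP updates on a factor graph (textbook material, not the paper's generalized construction) read:

$$
m_{x \to f}(x) = \prod_{g \in N(x)\setminus\{f\}} m_{g \to x}(x),
\qquad
m_{f \to x}(x) = \sum_{\mathbf{x}_{N(f)\setminus\{x\}}} f\big(\mathbf{x}_{N(f)}\big) \prod_{y \in N(f)\setminus\{x\}} m_{y \to f}(y),
$$

with approximate marginals $b(x) \propto \prod_{f \in N(x)} m_{f \to x}(x)$. These updates are exact on trees and only approximate on graphs with loops, which is the regime that the generalized BP and tensor network message passing approaches discussed in the abstract address.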
