Two-Dimensional Tail-Biting Convolutional Codes (1109.3876v1)

Published 18 Sep 2011 in cs.IT and math.IT

Abstract: Multidimensional convolutional codes extend the notion of convolutional codes (CCs) to several dimensions of time. This paper explores the class of two-dimensional convolutional codes (2D CCs), and 2D tail-biting convolutional codes (2D TBCCs) in particular, from several aspects. First, we derive several basic algebraic properties of these codes, applying algebraic methods to find bijective encoders, construct parity-check matrices, and invert encoders. Next, we discuss the minimum distance and weight distribution properties of these codes. Extending an existing tree-search algorithm to two dimensions, we apply it to find codes with high minimum distance. Word-error probability asymptotes for sample codes are given and compared with other codes. The results of this approach suggest that 2D TBCCs can perform better than comparable 1D TBCCs or other codes. We then present several novel iterative suboptimal algorithms for soft decoding 2D CCs, which are based on belief propagation. Two main approaches to decoding are considered. We first focus on a decoder which extends the concept of trellis decoding to two dimensions. Second, we investigate algorithms which use the code's parity-check matrices. We apply conventional BP in the parity domain, but improve it with a novel modification. Next, we test the generalized belief propagation (GBP) algorithm. Performance results are presented and compared with optimum decoding techniques and bounds. The results show that our suboptimal algorithms achieve respectable results, in some cases coming within 0.2 dB of optimal (maximum-likelihood) decoding; however, for some of the codes there is still a large gap to the optimal decoder.
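
As an illustration of the tail-biting structure described in the abstract, the following minimal Python sketch encodes a rate-1/2 2D TBCC as 2D cyclic convolution of a binary information array with two generator arrays over GF(2). The generator taps, array sizes, and rate are hypothetical placeholders chosen for illustration and are not taken from the paper.

```python
import numpy as np

def cyclic_conv2d_gf2(u, g):
    """2D cyclic (tail-biting) convolution of binary arrays, mod 2.

    u : (M, N) information array over GF(2)
    g : generator array whose support fits inside (M, N)
    Returns the (M, N) output array; indices wrap around both axes,
    which is what makes the code tail-biting rather than zero-terminated.
    """
    M, N = u.shape
    out = np.zeros((M, N), dtype=np.uint8)
    for (a, b), tap in np.ndenumerate(g):
        if tap:  # only taps equal to 1 contribute over GF(2)
            out ^= np.roll(np.roll(u, a, axis=0), b, axis=1)
    return out

# Hypothetical rate-1/2 example: two small generator arrays applied
# to a 4x4 information array on a torus.
g1 = np.array([[1, 1],
               [0, 1]], dtype=np.uint8)
g2 = np.array([[1, 0],
               [1, 1]], dtype=np.uint8)

u = np.random.randint(0, 2, size=(4, 4), dtype=np.uint8)
codeword = np.stack([cyclic_conv2d_gf2(u, g1),
                     cyclic_conv2d_gf2(u, g2)])  # shape (2, 4, 4)
```

This is only a sketch of the encoding map under the assumptions above; the paper itself works with general algebraic constructions of encoders, their inverses, and parity-check matrices for such codes.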

Citations (3)