
An Innovations Approach to Viterbi Decoding of Convolutional Codes (1710.11310v3)

Published 31 Oct 2017 in cs.IT and math.IT

Abstract: We introduce the notion of innovations for Viterbi decoding of convolutional codes. First, we define a kind of innovation corresponding to the received data, i.e., the input to a Viterbi decoder. Then the structure of a Scarce-State-Transition (SST) Viterbi decoder is derived in a natural manner. It is shown that the newly defined innovation is precisely the input to the main decoder in an SST Viterbi decoder and generates the same syndrome as the original received data. A similar result holds for Quick-Look-In (QLI) codes as well; in this case, however, the precise innovation is not defined. We see that this innovation-like quantity is related to the linear smoothed estimate of the information. The essence of the innovations approach to a linear filtering problem is first to whiten the observed data, and then to treat the resulting, simpler white-noise observation problem. In our case, this corresponds to the reduction of decoding complexity in the main decoder of an SST Viterbi decoder. We show that the distributions related to the main decoder (i.e., the input distribution and the state distribution in the code trellis of the main decoder) are heavily biased under moderately noisy conditions, and that these biased distributions are what lead to the reduction in complexity of the main decoder. Furthermore, it is shown that the proposed innovations approach can be extended to maximum-likelihood (ML) decoding of block codes as well.
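The pipeline described in the abstract (pre-decode the received data, re-encode the estimate, and feed the difference to the main decoder) can be illustrated concretely. The sketch below is a minimal illustration of a generic SST-style front end for a rate-1/2 convolutional code, not the paper's exact construction: the (7,5) generator polynomials, the polynomial inverse used as the pre-decoder, and the noise model are all illustrative assumptions.

```python
# Minimal sketch (assumptions noted above) of the SST idea for a rate-1/2
# convolutional code: a simple pre-decoder estimates the information from
# hard-decision data, the estimate is re-encoded, and the XOR of the received
# data with the re-encoded sequence (the "innovation") is what the main
# Viterbi decoder would see.

import random

def gf2_conv(x, taps):
    """Multiply the binary sequence x by the polynomial given as a tap list
    (taps[k] is the coefficient of D^k): a binary convolution mod 2."""
    y = [0] * (len(x) + len(taps) - 1)
    for i, xi in enumerate(x):
        if xi:
            for k, t in enumerate(taps):
                y[i + k] ^= t
    return y

def xor(a, b):
    """Bitwise XOR of two binary sequences, zero-padded to equal length."""
    n = max(len(a), len(b))
    a = a + [0] * (n - len(a))
    b = b + [0] * (n - len(b))
    return [ai ^ bi for ai, bi in zip(a, b)]

# (7,5) code: g1 = 1 + D + D^2, g2 = 1 + D^2 (illustrative choice).
g1, g2 = [1, 1, 1], [1, 0, 1]
# Pre-decoder taken as a polynomial inverse: a*g1 + b*g2 = 1 with a = D, b = 1 + D.
a, b = [0, 1], [1, 1]

def syndrome(r1, r2):
    # s = r1*g2 + r2*g1 (mod 2): zero for any codeword, so it depends only on the noise.
    return xor(gf2_conv(r1, g2), gf2_conv(r2, g1))

# Encode a random information sequence and add sparse hard-decision errors.
u = [random.randint(0, 1) for _ in range(20)]
v1, v2 = gf2_conv(u, g1), gf2_conv(u, g2)
e1 = [1 if random.random() < 0.05 else 0 for _ in v1]
e2 = [1 if random.random() < 0.05 else 0 for _ in v2]
r1, r2 = xor(v1, e1), xor(v2, e2)

# Pre-decoder: u_hat = a*r1 + b*r2 (an exact encoder inverse on noiseless data).
u_hat = xor(gf2_conv(r1, a), gf2_conv(r2, b))
# Re-encode and form the innovation z = r XOR encode(u_hat).
z1 = xor(r1, gf2_conv(u_hat, g1))
z2 = xor(r2, gf2_conv(u_hat, g2))

# The innovation has low weight under light noise, and it generates the same
# syndrome as the received data, the property highlighted in the abstract.
print("innovation weight:", sum(z1) + sum(z2), "of", len(z1) + len(z2), "bits")
print("syndromes equal:", sum(xor(syndrome(r1, r2), syndrome(z1, z2))) == 0)
```

On noiseless data the pre-decoder output equals the information exactly, so the innovation reduces to the channel-error pattern filtered through the encoder; this is why, under moderate noise, the main decoder's input is strongly biased toward zero, which is the biased-distribution effect the abstract ties to the complexity reduction.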

Citations (7)
