
Information Structures of Maximizing Distributions of Feedback Capacity for General Channels with Memory & Applications (1604.01063v2)

Published 4 Apr 2016 in cs.IT and math.IT

Abstract: For any class of channel conditional distributions with finite memory dependence on channel input RVs $A^n \triangleq \{A_i : i=0, \ldots, n\}$ or channel output RVs $B^n \triangleq \{B_i : i=0, \ldots, n\}$, or both, we characterize the sets of channel input distributions which maximize directed information, defined by $I(A^n \rightarrow B^n) \triangleq \sum_{i=0}^{n} I(A^i; B_i | B^{i-1})$, and we derive the corresponding expressions, called "characterizations of Finite Transmission Feedback Information (FTFI) capacity". The main theorems state that optimal channel input distributions occur in subsets $\mathcal{P}_{[0,n]}^{CI} \subseteq \mathcal{P}_{[0,n]} \triangleq \big\{ \mathbf{P}_{A_i | A^{i-1}, B^{i-1}} : i=0, \ldots, n \big\}$ which satisfy conditional independence on past information. We derive similar characterizations when general transmission cost constraints are imposed. Moreover, we show that the structural properties apply to general nonlinear and linear autoregressive channel models defined by discrete-time recursions on general alphabet spaces and driven by arbitrarily distributed noise processes. We derive these structural properties by invoking stochastic optimal control theory and variational equalities of directed information to identify tight upper bounds on $I(A^n \rightarrow B^n)$, which are achievable over subsets of conditional distributions $\mathcal{P}_{[0,n]}^{CI} \subseteq \mathcal{P}_{[0,n]}$ that satisfy conditional independence and are specified by the dependence of the channel distributions and transmission cost functions on input and output symbols. We apply the characterizations to recursive Multiple Input Multiple Output Gaussian Linear Channel Models with limited memory, and we show a separation principle for the computation of the elements of the optimal strategies.
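As a reading aid, the extremum problem underlying the "FTFI capacity" terminology can be sketched as follows; the notation follows the abstract, while the memory-$M$ form of the optimal input distribution at the end is an illustrative assumption rather than a quotation of the paper's theorems:

$$ C_{A^n \rightarrow B^n}^{FB} \;\triangleq\; \sup_{\mathcal{P}_{[0,n]}} I(A^n \rightarrow B^n) \;=\; \sup_{\mathcal{P}_{[0,n]}} \sum_{i=0}^{n} I(A^i; B_i \mid B^{i-1}). $$

The structural results state that this supremum is not reduced when the optimization is restricted to the conditionally independent subset $\mathcal{P}_{[0,n]}^{CI} \subseteq \mathcal{P}_{[0,n]}$; for example, if the channel distribution at time $i$ depends only on the current input and the last $M$ outputs, one expects optimal channel input distributions of the reduced form $\mathbf{P}_{A_i \mid B_{i-M}^{i-1}}$, $i=0,\ldots,n$.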
