
Bounded Capacity Theorem

Updated 24 December 2025
  • The Bounded Capacity Theorem is a framework that establishes rigorous upper bounds on the capacity of systems via submultiplicative properties and LP relaxations, using tools such as the Lovász theta function and Haemers’ rank bounds.
  • It demonstrates how feedback in queueing theory and fading conditions in MIMO and wireless channels can sharply alter capacity limits, leading to practical design implications.
  • In neural networks and semantic parsing, bounded parameters and memory constraints define finite effective degrees of freedom, influencing approximation capabilities and model expressiveness.

The Bounded Capacity Theorem refers to a collection of rigorous upper bounds on the capacity—broadly construed as the maximal rate of reliable information transfer, approximation, or expressive power—of a range of mathematical and engineering systems under explicit resource constraints. The term encompasses advances in Shannon-theoretic graph capacity, feedback in queueing theory, MIMO and wireless network communication, neural networks with parameter bounds, and syntactic or semantic mechanisms with bounded state or memory. Fundamentally, such results characterize the exact limits imposed by finiteness, boundedness, or submultiplicativity in the system, and often reveal thresholds or sharp separations between finite and unbounded capacity regimes.

1. The Bounded Capacity Theorem in Shannon Graph Theory

The Bounded Capacity Theorem in the context of graph theory strengthens the classic approach to upper bounding Shannon capacity via the independence number, fractional independence number, Lovász theta function $\theta(G)$, and Haemers’ minimum rank bound $\mu_F(G)$.

Let $G=(V,E)$ be a finite, undirected simple graph. The Shannon capacity $\Theta(G)$ is defined as

$$\Theta(G) = \lim_{n\to\infty} \alpha(G^{\otimes n})^{1/n} = \sup_n \alpha(G^{\otimes n})^{1/n},$$

where $G^{\otimes n}$ is the strong product of $n$ copies of $G$, and $\alpha(\cdot)$ denotes the independence number.
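
For example, for the 5-cycle $C_5$ one has $\alpha(C_5)=2$ but $\alpha(C_5^{\otimes 2})=5$ (witnessed by the independent set $\{(i,\,2i \bmod 5) : i\in\mathbb{Z}_5\}$), so $\Theta(C_5)\geq\sqrt{5}$; Lovász's theta function supplies the matching upper bound $\theta(C_5)=\sqrt{5}$, hence $\Theta(C_5)=\sqrt{5}$.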

The Bounded Capacity Theorem (Hu et al., 2018) states: given any function $f$ on graphs satisfying (i) $\alpha(G)\leq f(G)$ for all $G$, and (ii) $f(G\otimes H)\leq f(G)\,f(H)$ (submultiplicativity), define $f^*(G)$ as the value of the following linear program:

  • Primal form:

maximize $\sum_{x\in V(G)} w(x)$ subject to $\sum_{x\in S} w(x)\leq f(G_S)$ for all $S\subseteq V(G)$ (where $G_S$ is the subgraph of $G$ induced by $S$) and $w(x)\geq 0$;

  • Dual form:

minimize $\sum_{S\subseteq V(G)} q(S)\,f(G_S)$ subject to $\sum_{S\ni x} q(S)\geq 1$ for all $x\in V(G)$ and $q(S)\geq 0$.

Then,

$$\Theta(G) \leq f^*(G).$$

If $f$ is the Lovász theta function, then $f^*(G)=\theta(G)$. For Haemers’ minimum rank bound $\mu_F(G)$, the LP yields a new quantity $\mu_F^*(G)$ that can strictly improve on both $\theta(G)$ and $\mu_F(G)$, as demonstrated on concrete graph classes. The LP relaxation thus unifies and sharpens previous approaches, and the technique extends to new bounds on the index-coding broadcast rate (Hu et al., 2018). A small computational sketch of the LP follows.
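
To make the LP concrete, the sketch below (an illustration of the program's mechanics, not an implementation from the paper) solves the primal form for a small graph with SciPy, treating $f$ as an oracle on induced subgraphs. Purely as a sanity check we plug in the independence number itself, which satisfies (i) but not submultiplicativity (ii), so the value it returns, $f^*(C_5)=\alpha(C_5)=2$, is not a capacity bound; a genuine application would supply $\theta$ or $\mu_F$ as the oracle.

```python
# A minimal sketch of the primal LP defining f*(G), assuming networkx and
# scipy are available; the oracle f here is alpha, used only as a sanity check.
from itertools import combinations

import networkx as nx
from scipy.optimize import linprog


def independence_number(G):
    """Brute-force alpha(G); fine for the tiny graphs used here."""
    nodes = list(G.nodes)
    for k in range(len(nodes), 0, -1):
        for S in combinations(nodes, k):
            if G.subgraph(S).number_of_edges() == 0:
                return k
    return 0


def f_star(G, f):
    """Primal LP: maximize sum_x w(x) s.t. sum_{x in S} w(x) <= f(G_S) for all S."""
    nodes = list(G.nodes)
    n = len(nodes)
    A_ub, b_ub = [], []
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            A_ub.append([1.0 if i in S else 0.0 for i in range(n)])
            b_ub.append(f(G.subgraph([nodes[i] for i in S])))
    # linprog minimizes, so negate the objective; bounds enforce w(x) >= 0
    res = linprog(c=[-1.0] * n, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * n)
    return -res.fun


C5 = nx.cycle_graph(5)
print(f_star(C5, independence_number))  # ~2.0, matching alpha(C5)
```

Note that the program ranges over all $2^{|V|}-1$ induced subgraphs; this exponential constraint set is precisely the efficiency obstacle listed among the open problems in Section 7.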

2. Queueing Theory: Feedback and Capacity Under Bounded Support

In discrete-time or continuous-time single-server FIFO queues with i.i.d. service times $S$ supported within $[a,b]$ ($0<a<b<\infty$), the Bounded Capacity Theorem (Sahasranand et al., 2023) establishes a strict separation between the capacities achievable with and without feedback.

Let $C_{\mathrm{noFb}}$, $C_{\mathrm{wFb}}$, and $C_{\mathrm{fFb}}$ denote the capacities without feedback, with weak feedback, and with full feedback, respectively, defined via appropriate single-letter mutual information rates. In the bounded-support regime:

  • For any output rate $X < p = 1/\mathbb{E}[S]$,

$$C_{\mathrm{fFb}}(X) > C_{\mathrm{noFb}}(X).$$

That is, full feedback strictly increases capacity whenever service times are supported on a finite interval. This arises from the non-invertibility of the integral operator induced by the compact support, which implies that constrained waiting-time strategies under weak feedback fail to exhaust the entropy maximization available under full feedback. The capacity separation can be numerically demonstrated for truncated exponential and uniform service-time distributions, and persists for both discrete and continuous time (Sahasranand et al., 2023).
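
The bounded-support setting can be made concrete with a short simulation. The sketch below illustrates the setup only, not the capacity computation: it samples truncated-exponential service times on $[a,b]$ (one of the distributions used in the paper's numerics), estimates the service rate $p=1/\mathbb{E}[S]$, and runs the Lindley waiting-time recursion to show stability for arrival rates below $p$. The constants $a$, $b$, and $\lambda$ are arbitrary choices.

```python
# A minimal sketch of the bounded-support queueing setup; a, b, lam arbitrary.
import numpy as np

rng = np.random.default_rng(0)
a, b, lam = 0.5, 2.0, 1.0  # service support [a, b]; Exp(lam) truncated to it


def truncated_exp(n):
    """Inverse-CDF sampling of Exp(lam) conditioned on [a, b]."""
    u = rng.uniform(size=n)
    Fa, Fb = 1 - np.exp(-lam * a), 1 - np.exp(-lam * b)
    return -np.log(1 - (Fa + u * (Fb - Fa))) / lam


S = truncated_exp(100_000)
p = 1 / S.mean()  # the service rate p = 1/E[S] that caps any input rate X
print(f"support respected: {a <= S.min() and S.max() <= b}, p ~ {p:.3f}")

# Lindley recursion W_{k+1} = max(0, W_k + S_k - A_k): waiting times stay
# stable for arrival rate X < p and blow up for X > p.
for X in (0.8 * p, 1.05 * p):
    A = rng.exponential(1 / X, size=len(S))  # i.i.d. inter-arrival times
    W, tail = 0.0, []
    for k in range(len(S)):
        W = max(0.0, W + S[k] - A[k])
        if k >= len(S) // 2:
            tail.append(W)
    print(f"X/p = {X/p:.2f}: mean wait (second half) = {np.mean(tail):.2f}")
```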

3. Multi-Antenna and Wireless Channels: Sublinear and Bounded Capacity

a. Multi-Antenna MIMO Channels

The Bounded Capacity Theorem (Bentosela et al., 2012) addresses MIMO channels constructed from physical scattering models. When the fading matrix $F$ (obtained by Fourier-transforming $HH^\ast$) obeys:

  • (A1) Almost-diagonal structure: for all $i$, the off-diagonal entries decay as a bounded multiple of $f_{i,i}$;
  • (A2) Power-law decay: $f_i \leq f_1\, i^{-r}$ for $r>1$;

then, for any $y>1$,

$$C_M \leq A\,M^{1/y} + B\ln M,$$

for suitable constants $A,B$, implying that capacity scaling is strictly sub-linear in the number of antennas $M$ (Bentosela et al., 2012).
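
A toy calculation makes the scaling visible. The sketch below is an illustration under assumption (A2) only, not the paper's model: it treats the $M$ antenna modes as parallel Gaussian subchannels with power-law gains $f_i = f_1 i^{-r}$ and computes the waterfilling capacity for arbitrary constants $f_1$, $r$, $P$. Because $r>1$, the capacity grows far more slowly than $M$; in this simplified profile it essentially saturates, consistent with (indeed stronger than) the sub-linear envelope above.

```python
# A minimal sketch: parallel-channel proxy for C_M with power-law gains
# f_i = f1 * i**-r (assumption (A2) only); f1, r, P are arbitrary choices.
import numpy as np


def waterfilling_capacity(g, P):
    """Capacity (bits/use) of parallel Gaussian channels, gains g, total power P."""
    g = np.sort(np.asarray(g, dtype=float))[::-1]
    for k in range(len(g), 0, -1):
        mu = (P + (1.0 / g[:k]).sum()) / k   # candidate water level
        if mu >= 1.0 / g[k - 1]:             # weakest active channel stays wet
            return float(np.log2(mu * g[:k]).sum())
    return 0.0


f1, r, P = 1.0, 1.5, 10.0
for M in (4, 16, 64, 256, 1024):
    f = f1 * np.arange(1, M + 1) ** (-r)
    print(f"M={M:5d}  C_M ~ {waterfilling_capacity(f, P):.3f} bits")
```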

b. Noncoherent and Interference Networks

For discrete-time, noncoherent multipath fading channels with path variances $\{\alpha_\ell\}$ not decaying faster than geometrically, i.e.,

$$\liminf_{\ell\to\infty} \frac{\alpha_{\ell+1}}{\alpha_\ell} > 0,$$

the capacity $C(\mathrm{SNR})$ is bounded uniformly over all SNR (0711.3152). Similarly, in infinite wireless interference networks with exponentially or slower decaying per-interferer fading variances $\alpha_\ell$, the SNR-independent upper bound holds:

$$C(\mathrm{SNR}) \leq \mathcal{O}(1/\rho), \quad \forall\, \mathrm{SNR}>0,$$

where $\rho$ denotes the geometric decay parameter (Villacrés et al., 2015).
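
To illustrate the decay condition: for a geometric profile $\alpha_\ell = \rho^\ell$ with $0<\rho<1$, the ratio $\alpha_{\ell+1}/\alpha_\ell = \rho > 0$ for every $\ell$, so the hypothesis holds and the capacity remains bounded in SNR; for a super-geometric profile such as $\alpha_\ell = e^{-\ell^2}$, the ratio $e^{-(2\ell+1)} \to 0$, the hypothesis fails, and the bounded-capacity conclusion no longer applies.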

c. Vector Gaussian Channels with Peak and Average Power

In deterministic vector Gaussian (AWGN) channels with identity channel matrix and constraints $\|X\|^2 \leq u_p$ (peak) and $\mathbb{E}\|X\|^2 \leq u_a$ (average), the Bounded Capacity Theorem (Rassouli et al., 2014) shows that the capacity-achieving input law has a finite set of amplitude values (a discrete support in magnitude), and the capacity is correspondingly bounded. In the high-dimensional, relaxed average-power regime, constant-amplitude signaling is optimal, and the capacity converges to that achieved by Gaussian signaling as the number of antennas grows.
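
The discreteness phenomenon can be reproduced numerically in the scalar special case. The sketch below runs a standard Blahut–Arimoto iteration on an amplitude grid for a peak-constrained scalar AWGN channel; it is not the paper's method, and the peak amplitude, noise variance, and grid sizes are arbitrary choices. The optimizing input law collapses onto a handful of mass points, mirroring the finite-amplitude structure asserted by the theorem.

```python
# A minimal sketch, scalar case: Blahut-Arimoto over a peak-constrained
# amplitude grid for unit-variance AWGN. All constants are arbitrary choices.
import numpy as np

A = 2.0                                  # peak amplitude: |x| <= A
xs = np.linspace(-A, A, 81)              # candidate input mass points
ys = np.linspace(-A - 5, A + 5, 601)     # output grid
W = np.exp(-0.5 * (ys[None, :] - xs[:, None]) ** 2)
W /= W.sum(axis=1, keepdims=True)        # row-stochastic p(y|x) on the grid

p = np.full(len(xs), 1.0 / len(xs))      # start from the uniform input law
for _ in range(3000):
    q = p @ W                            # induced output law
    D = (W * np.log(np.maximum(W, 1e-300) / np.maximum(q, 1e-300))).sum(axis=1)
    p *= np.exp(D)                       # Blahut-Arimoto multiplicative update
    p /= p.sum()

print("capacity ~", (p * D).sum() / np.log(2), "bits/use")
print("mass points:", xs[p > 1e-3])      # a small discrete set survives
```

With these constants the surviving support sits (up to grid resolution) near $\{-A, 0, A\}$, in line with Smith-type discreteness results for amplitude-constrained Gaussian channels.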

4. Bounded Capacity in Machine Learning and Neural Approximation

For deep feed-forward neural networks with real-analytic activation functions and all nonlinear parameters (weights and biases) bounded within a compact set $S$, the Bounded Capacity Theorem (Liu et al., 2024) establishes the following (a numerical sketch follows the list):

  • For any fixed numerical tolerance $\epsilon>0$, there exists $M=M(\epsilon,g,S,L,n)<\infty$ such that any $L$-layer network of arbitrary width, with activation $g$ and parameters in $S$, can realize at most $M$ effective degrees of freedom (the Numerical Span Dimension) at approximation scale $\epsilon$.
  • Thus, neural networks with bounded weights are not universal in the practical, numerical sense: bounded numerical capacity replaces the unbounded approximation property of the classical Universal Approximation Theorem.
  • Empirically, as the width increases, the singular-value spectrum of the hidden-layer output matrix saturates rapidly, indicating severe redundancy once the width exceeds $M$.
  • Additional theoretical results establish equivalence (at infinite width) between back-propagation networks and random-parameter networks, and clarify why regularization, depth, and parameter-condensation phenomena emerge.
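
The saturation described in the third bullet is easy to reproduce. The sketch below is a minimal illustration, not the paper's experiment: it draws random bounded-parameter tanh layers of increasing width (with $S=[-1,1]$, tolerance $\epsilon$, input dimension, and sample count all chosen arbitrarily) and counts the singular values of the hidden-layer output matrix above the tolerance; the count stalls far below the width.

```python
# A minimal sketch: effective degrees of freedom of a random bounded-weight
# tanh layer at tolerance eps. All constants are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n, eps = 2, 1e-3                          # input dimension, tolerance
X = rng.uniform(-1, 1, size=(512, n))     # fixed batch of inputs

for width in (16, 64, 256, 1024, 4096):
    W = rng.uniform(-1, 1, size=(n, width))   # weights drawn from S = [-1, 1]
    b = rng.uniform(-1, 1, size=width)        # biases drawn from S
    H = np.tanh(X @ W + b)                    # hidden-layer output matrix
    s = np.linalg.svd(H, compute_uv=False)
    dof = int((s / s[0] > eps).sum())         # relative singular values above eps
    print(f"width={width:5d}  effective dof ~ {dof}")
```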

5. Capacity Bounds under Adversarial and Stochastic Channel Models

In physical communication channels subject to continuous-time delay and noise errors, Khanna & Sudan (Khanna et al., 2011) provide a dichotomy captured by the Bounded Capacity Theorem:

  • The capacity is finite if and only if at least one of the two error mechanisms (delay or noise) is adversarial with sufficient power (e.g., adversarial delay, or adversarial noise that knows the delay realization).
  • If both delay and noise are purely stochastic (independent), or if noise is adversarial but unaware of delay, the channel admits unbounded capacity through arbitrarily fine time discretization.
  • These results extend the classical amplitude-constrained capacity limits by showing that adversarial timing joins noise as an essential mechanism for enforcing finite per-unit-time bit rates.

6. Compositional Semantics, Memory Constraints, and Expressive Capacity

In compositional semantic parsing models, the Bounded Capacity Theorem (Venant et al., 2019) formally relates memory limits and the class of generable semantic structures:

  • For any $n$-bounded, projective (adjacent-span combining) semantic mechanism, there exists a sentence of length $2(n+1)$ whose intended cross-serial semantic dependencies cannot be constructed by the mechanism.
  • This places a strict upper bound on the expressive capacity of projective, finite-state or bounded-memory models: they fail on semantic structures with unbounded cross-serial dependencies.
  • Nonprojective combinators or non-finite-memory mechanisms are necessary to represent wider classes of semantic relations. This result clarifies capacity limitations in both grammar-based and neural parsing architectures (a small projectivity check is sketched below).
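
To make the projectivity notion concrete, the sketch below (an illustration, not the paper's construction) checks whether a set of dependency arcs over token positions is projective, i.e., pairwise non-crossing. Nested dependencies pass, while the cross-serial pattern of Dutch verb clusters fails; that crossing structure is exactly what the theorem shows bounded projective mechanisms cannot assemble.

```python
# A minimal sketch: projectivity check on dependency arcs (token positions).
def is_projective(arcs):
    for a, b in arcs:
        for c, d in arcs:
            lo1, hi1 = sorted((a, b))
            lo2, hi2 = sorted((c, d))
            if lo1 < lo2 < hi1 < hi2:   # the two arcs cross
                return False
    return True

# Nested dependencies (center embedding): projective.
print(is_projective([(0, 5), (1, 4), (2, 3)]))  # True
# Cross-serial dependencies, as in Dutch "Jan Piet Marie zag helpen zwemmen",
# where noun i depends on verb i: every pair of arcs crosses.
print(is_projective([(0, 3), (1, 4), (2, 5)]))  # False
```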

7. Perspectives, Open Problems, and Broader Implications

The bounded capacity paradigm unifies a variety of settings wherein inherent physical or combinatorial constraints induce sharp, quantifiable upper bounds on system capacity:

  • In channel coding, such bounds guide practical system design by illuminating which physical effects—multipath, delay, feedback, or adversarial interference—cause capacity saturation or require fundamentally new architectures to overcome.
  • For neural function approximation, bounded-capacity metrics such as the $\epsilon$-outer measure or the Numerical Span Dimension (NSdim) explain empirical bottlenecks and supply a rigorous foundation for regularization and architectural decisions.
  • In semantics and formal language theory, bounded-state limits yield rigorous separation results regarding the necessity of unbounded memory or non-projective transformations.

Several technical challenges remain open, for example:

  • Developing efficient algorithms for the linear programs with exponentially many constraints that underlie the graph capacity bounds (Hu et al., 2018).
  • Extending vector-minrank and LP bounds to directed graphs or capacity-like invariants (Hu et al., 2018).
  • Characterizing the gap between finite-parameter and infinite-parameter expressive power in deep architectures quantitatively (Liu et al., 2024).

A plausible implication is that, across communication, computation, and modeling, the search for tight bounded capacity results will remain central wherever scalable reliability, performance, or expressive power are required under real-world constraints.
