Sharp Bounds on Rényi Entropy

Updated 16 December 2025
  • The paper introduces explicit sharp bounds on Rényi entropy using reverse Young's inequalities and log-concave comparisons to establish tight entropy power inequalities.
  • It demonstrates implications across operational characterizations, including channel capacity estimation and quantum state analysis, bridging theory and practice.
  • Methodologies leverage convex optimization, extremal transport techniques, and majorization arguments to derive optimal bounds that recover classical results in limiting cases.

Rényi entropy, a one-parameter generalization of Shannon entropy, quantifies the diversity, uncertainty, or randomness of a probability distribution, with rich applications in information theory, probability, mathematical physics, and high-dimensional geometry. Sharp bounds on Rényi entropy play a critical role in establishing fundamental inequalities, continuity properties, and operational characterizations in classical, discrete, and quantum contexts.

1. Definition and Basic Properties

For a probability density $f$ on $\mathbb{R}^n$ (or mass function $p$ on a finite or countable set), the Rényi entropy of order $\alpha > 0$, $\alpha \neq 1$, is defined as

$$H_\alpha(f) = \frac{1}{1-\alpha} \log \int f(x)^\alpha \, dx,$$

or, in discrete settings,

$$H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha.$$

The limits $\alpha \to 1$, $\alpha \to 0$, and $\alpha \to \infty$ yield the Shannon entropy, Hartley entropy (log of support size), and min-entropy, respectively:

  • $H_1(p) = -\sum_x p(x) \log p(x)$
  • $H_0(p) = \log |\{ x : p(x) > 0 \}|$
  • $H_\infty(p) = -\log \|p\|_\infty$

Rényi entropy is monotonic in its parameter: for any probability distribution, the mapping $\alpha \mapsto H_\alpha$ is non-increasing.
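
To make the definitions concrete, here is a minimal Python sketch (our own illustration, not from any cited paper) that computes $H_\alpha$ for a discrete distribution, including the three limiting orders, and checks the monotonicity property numerically:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy (in nats) of a discrete distribution p, order alpha >= 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                        # restrict to the support
    if np.isclose(alpha, 1.0):          # Shannon limit
        return -np.sum(p * np.log(p))
    if np.isinf(alpha):                 # min-entropy limit
        return -np.log(p.max())
    if alpha == 0:                      # Hartley entropy: log of support size
        return np.log(len(p))
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

p = [0.5, 0.25, 0.15, 0.1]
orders = [0, 0.5, 1, 2, 5, np.inf]
values = [renyi_entropy(p, a) for a in orders]
for a, h in zip(orders, values):
    print(f"H_{a}(p) = {h:.4f}")

# alpha -> H_alpha(p) is non-increasing
assert all(x >= y - 1e-12 for x, y in zip(values, values[1:]))
```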

2. Sharp Rényi Entropy Power Inequalities for $0 < r < 1$

Let $X$ and $Y$ be independent random vectors in $\mathbb{R}^n$ with log-concave densities. For $r \in (0,1)$, the Rényi entropy power is $N_r(X) = \exp\left(\frac{2}{n} H_r(X)\right)$. Using a sharp reverse Young's inequality and a comparison lemma due to Fradelizi, Madiman, and Wang, sharp entropy power inequalities (EPIs) for $0 < r < 1$ are established, notably:

$$N_r(X+Y)^{\alpha(r)} \geq N_r(X)^{\alpha(r)} + N_r(Y)^{\alpha(r)},$$

with explicit exponent

$$\alpha(r) = \frac{(1-r)\ln 2}{(1+r)\ln(1+r) + r\ln(1/(4r))}.$$

Equality holds asymptotically as $r \to 1$, recovering the Shannon EPI, while as $r \to 0$, $\alpha(r) \to \infty$, corresponding to the Brunn–Minkowski regime. For the special case of uniform distributions or i.i.d. log-concave variables, further sharp constants and bounds on $k$-fold sums are provided, demonstrating tightness up to absolute constants. These results are sharp in the sense that matching lower bounds are attained by extremal exponential or Gaussian laws (Marsiglietti et al., 2017).
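
The exponent $\alpha(r)$ is elementary to evaluate; the following sketch (our own code) tabulates it and illustrates the two limiting behaviors noted above, $\alpha(r) \to 1$ as $r \to 1$ and $\alpha(r) \to \infty$ as $r \to 0$:

```python
import numpy as np

def epi_exponent(r):
    """Exponent alpha(r) in the sharp Rényi EPI for orders 0 < r < 1."""
    return ((1 - r) * np.log(2)) / ((1 + r) * np.log(1 + r) + r * np.log(1 / (4 * r)))

for r in [0.999, 0.9, 0.5, 0.1, 0.01, 0.001]:
    print(f"r = {r:6.3f}  ->  alpha(r) = {epi_exponent(r):10.4f}")
```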

3. Rényi Entropy Bounds for Log-Concave Measures

Sharp lower bounds for Rényi entropy, particularly for symmetric log-concave random variables, play a central role in additive combinatorics, channel capacity estimation, and convex geometry. For one-dimensional, symmetric, log-concave $X$ and any $\alpha \in [0,1]$, the optimal bound is:

$$H_\alpha(X) \geq \log \| X \|_p + \log \left[2 (p+1)^{1/p}\right],$$

where $\| X \|_p = (\mathbb{E}|X|^p)^{1/p}$ for $p > 0$, with equality if and only if $X$ is uniform on a symmetric interval. In the variance case ($p = 2$), this becomes

$$H_\alpha(X) \geq \log \sigma + \log(2\sqrt{3}),$$

with sharpness achieved only by the uniform distribution (Madiman et al., 2018). These bounds yield consequences for the capacity of additive noise channels and lead to explicit reverse entropy power inequalities.
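
As a concrete check of the variance case, the standard Laplace law is symmetric and log-concave with $\sigma = \sqrt{2}$ and admits the closed form $H_\alpha = \log 2 - \frac{\log \alpha}{1-\alpha}$; the sketch below (our own illustration) confirms the bound for several orders:

```python
import numpy as np

def renyi_entropy_laplace(alpha):
    """Closed-form Rényi entropy (nats) of the density f(x) = exp(-|x|)/2."""
    return np.log(2) - np.log(alpha) / (1 - alpha)

sigma = np.sqrt(2.0)                            # std. deviation of the standard Laplace law
bound = np.log(sigma) + np.log(2 * np.sqrt(3))  # sharp lower bound, attained by uniform laws

for alpha in [0.1, 0.25, 0.5, 0.9]:             # the bound is stated for alpha in [0, 1]
    h = renyi_entropy_laplace(alpha)
    print(f"alpha = {alpha:4.2f}:  H_alpha = {h:.4f}  >=  {bound:.4f}")
    assert h >= bound
```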

For discrete log-concave integer-valued distributions, similar sharp reversals hold: for any log-concave $X$ on $\mathbb{Z}$,

$$H(X) - H_\infty(X) \leq \ln e = 1,$$

with equality approached by geometric distributions as their parameter tends to zero. For all Rényi orders, a universal upper bound on the difference from the min-entropy is established (Melbourne et al., 2020).
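
This gap is easy to inspect numerically: for a geometric law on $\{0, 1, \dots\}$ with success probability $q$, a direct computation gives $H - H_\infty = -\frac{1-q}{q}\ln(1-q)$, which increases to $1$ as $q \to 0$. A minimal sketch (our own code):

```python
import numpy as np

def entropy_gap_geometric(q):
    """H(X) - H_inf(X) in nats for P(X = k) = q * (1 - q)**k on {0, 1, ...}."""
    return -(1 - q) / q * np.log(1 - q)

for q in [0.5, 0.1, 0.01, 1e-4]:
    gap = entropy_gap_geometric(q)
    print(f"q = {q:8.4f}:  H - H_inf = {gap:.6f}")
    assert gap <= 1.0                   # universal bound for log-concave laws on Z
```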

4. Sharp Continuity and Order-Comparison Bounds

Uniform continuity and sharp comparison between Rényi entropies of different orders are fundamental for robustness, information stability, and operational interpretations. For any two quantum states $\rho$ and $\sigma$ on a $d$-dimensional Hilbert space, the sharp bound is provided in terms of their trace distance $\delta$:

$$|H_\alpha(\rho) - H_\alpha(\sigma)| \leq f_\alpha(\delta, d),$$

where for $\alpha > 1$,

$$f_\alpha(\delta, d) = \frac{d^{\alpha-1}}{\alpha-1} \left[1 - (1-\delta)^\alpha - (d-1)^{1-\alpha} \delta^\alpha \right]$$

and similarly for $0 < \alpha < 1$ (Chen et al., 2017). Equality is attained for commuting states with prescribed eigenvalues.
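
As a numerical illustration (our own code, not from the paper), the sketch below samples pairs of commuting, i.e. diagonal, qudit states and compares their Rényi entropy difference against $f_\alpha(\delta, d)$ as stated above:

```python
import numpy as np

rng = np.random.default_rng(0)

def renyi_entropy(spec, alpha):
    """Rényi entropy (nats) of a spectrum; equals H_alpha(rho) for diagonal rho."""
    spec = spec[spec > 0]
    return np.log(np.sum(spec ** alpha)) / (1.0 - alpha)

def f_bound(delta, d, alpha):
    """Continuity bound f_alpha(delta, d) for alpha > 1, as stated in the text."""
    return d ** (alpha - 1) / (alpha - 1) * (
        1 - (1 - delta) ** alpha - (d - 1) ** (1 - alpha) * delta ** alpha)

d, alpha = 4, 2.0
for _ in range(5):
    p = rng.dirichlet(np.ones(d))       # eigenvalues of rho
    q = rng.dirichlet(np.ones(d))       # eigenvalues of sigma (commuting case)
    delta = 0.5 * np.abs(p - q).sum()   # trace distance for commuting states
    diff = abs(renyi_entropy(p, alpha) - renyi_entropy(q, alpha))
    print(f"delta = {delta:.3f}:  |dH| = {diff:.4f}  vs  bound = {f_bound(delta, d, alpha):.4f}")
```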

The relationship between Rényi entropies of distinct positive orders ($\alpha \neq \beta$) for finite probability vectors is characterized via extremal permutations of "one hot plus flat" ($v_n$) and "as many $p$'s as fit" ($w_n$) distributions. For $p \in \Delta_n$, sharp inequalities of the form

$$B_w(\beta) \leq H_\beta(p) \leq B_v(\beta)$$

hold in specific order regimes, with all bounds tight and explicit. Applications include sharp bounds for Arimoto's mutual information and Gallager's exponents (Sakai et al., 2016).

For conditional Rényi entropy, sharp uniform continuity bounds for Arimoto's definition are established:

$$| H_\alpha(X|Y)_p - H_\alpha(X|Y)_q | \leq \frac{1}{1-\alpha} \log \big((1-\varepsilon)^\alpha + (|X|-1)^{1-\alpha} \varepsilon^\alpha \big)$$

for distributions $p$ and $q$ at total variation distance at most $\varepsilon$, where $|X|$ denotes the alphabet size; the extremizers are constructed explicitly (Jabbour et al., 2020).
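
The right-hand side is a simple closed form; the sketch below (our own code) evaluates it for a few parameter choices, showing how it tightens as $\varepsilon \to 0$ and grows with the alphabet size:

```python
import numpy as np

def arimoto_continuity_bound(eps, alphabet_size, alpha):
    """RHS of the uniform continuity bound for Arimoto's conditional Rényi entropy."""
    return np.log((1 - eps) ** alpha
                  + (alphabet_size - 1) ** (1 - alpha) * eps ** alpha) / (1 - alpha)

for eps in [0.01, 0.05, 0.2]:
    for m in [2, 16]:
        print(f"eps = {eps:4.2f}, |X| = {m:2d}:  bound = "
              f"{arimoto_continuity_bound(eps, m, alpha=2.0):.4f}")
```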

5. Sharp Rényi Entropy Power Inequalities for $\alpha > 1$

The sharpest known (additive, multiplicative, or exponent-modified) entropy power inequalities for Rényi entropy are derived using sharpened Young's inequalities and convex optimization. For independent continuous random vectors $X_k$ ($k = 1, \dots, n$) with sum $S_n = X_1 + \cdots + X_n$, and $\alpha > 1$:

First Improvement (over Bobkov–Chistyakov):

$$N_\alpha(S_n) \geq c_\alpha^{(n)} \sum_{k=1}^n N_\alpha(X_k)$$

with constant

$$c_\alpha^{(n)} = \alpha^{1/(\alpha-1)} \left(1 - \frac{1}{n\alpha'}\right)^{n\alpha'-1}, \qquad \alpha' = \frac{\alpha}{\alpha-1}.$$

This sharp constant improves on all previous multiplicative REPIs (Ram et al., 2016).
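
The constant $c_\alpha^{(n)}$ is fully explicit and easy to tabulate; the following sketch (our own code) evaluates it and illustrates the limits $c_\alpha^{(n)} \to 1$ as $\alpha \to 1$ (recovering the Shannon EPI) and $c_\alpha^{(n)} \to (1 - 1/n)^{n-1}$ as $\alpha \to \infty$:

```python
import numpy as np

def repi_constant(alpha, n):
    """Multiplicative constant c_alpha^(n) in the Rényi EPI, for alpha > 1."""
    ap = alpha / (alpha - 1.0)          # Hölder conjugate alpha'
    return alpha ** (1.0 / (alpha - 1.0)) * (1.0 - 1.0 / (n * ap)) ** (n * ap - 1.0)

for alpha in [1.001, 1.5, 2.0, 10.0, 1000.0]:
    print(f"alpha = {alpha:8.3f}:  c^(2) = {repi_constant(alpha, 2):.4f},"
          f"  c^(10) = {repi_constant(alpha, 10):.4f}")
```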

Second (Tightest) Bound:

Involves a further optimized multiplicative factor (given semi-explicitly via convex analysis, or in closed form for $n = 2$), always achieving strict improvement and sharpness.

Limit behaviors recover the Shannon EPI ($\alpha \to 1$) and give optimal constants as $\alpha \to \infty$. Equality holds for Gaussians with proportional covariances (Ram et al., 2016).

6. Sharp Bounds in Quantum Systems and Geometry

For quantum many-body systems, rigorously sharp growth bounds for the instantaneous rate of change of Rényi entropies are established. For non-local Hamiltonians and a subsystem $A$ of size $|A|$ with Hilbert space dimension $d_0^{|A|}$:

$$|S_\alpha'(t)| \leq \frac{2\alpha}{|\alpha-1|}\|H\|\, d_0^{2|A|},$$

whereas the corresponding bound for the von Neumann entropy ($\alpha = 1$) scales only linearly in $|A|$. In geometrically local models with sufficiently fast-decaying interactions, $|S_\alpha'(t)|$ is controlled by the boundary size $|\partial A|$ (Shi, 2022).

In mathematical physics, sharp upper bounds for Rényi entropy in spherically symmetric potentials, and on Lie groups with anisotropy, have been derived, with explicit optimal constants in terms of moments and group structural data (Chatzakou et al., 15 Feb 2024, Sánchez-Moreno et al., 2013).

7. Technical Methods and Extremality Structures

The proofs of sharp bounds heavily utilize reverse Young's inequalities with optimal constants, comparison lemmas for log-concave measures, convex-analytic optimization, extremality and majorization arguments, information-theoretic representations, and normal (optimal) transport techniques. For many of the strongest results, the extremals (i.e., distributions attaining equality) are either uniform, exponential, or Gaussian, depending on the scenario and parameter regime. These techniques yield all previously known sharp constants and often underpin new, broader regimes (notably $0 < r < 1$) for generalized entropy inequalities (Marsiglietti et al., 2017, Rioul, 2019, Madiman et al., 2018).


Key Reference Papers

| Paper Title | Area | Reference |
| --- | --- | --- |
| "On the entropy power inequality for the Rényi entropy of order [0,1]" | Sharp Rényi EPI ($0 < r < 1$) | Marsiglietti et al., 2017 |
| "Rényi entropy power inequality and a reverse" | REPI and reverse, $p > 1$ | Li, 2017 |
| "Sharp moment-entropy inequalities and capacity bounds for log-concave distributions" | Sharp lower bounds, applications | Madiman et al., 2018 |
| "Sharp Bounds Between Two Rényi Entropies of Distinct Positive Orders" | Cross-order bounds, mutual information | Sakai et al., 2016 |
| "Sharp continuity bounds for entropy and conditional entropy" | Continuity bounds | Chen et al., 2017 |
| "On Renyi Entropy Power Inequalities" | Improved multiplicative REPI | Ram et al., 2016 |
| "A tight uniform continuity bound for the Arimoto-Rényi conditional entropy ..." | Arimoto conditional Rényi bounds | Jabbour et al., 2020 |
| "Rigorous bounds for Renyi entropies of spherically symmetric potentials" | Quantum/geometry, sharp upper bounds | Sánchez-Moreno et al., 2013 |
| "Bernoulli sums and Rényi entropy inequalities" | Discrete/variance, Fourier methods | Madiman et al., 2021 |
| "Reversals of Rényi Entropy Inequalities under Log-Concavity" | Min-entropy, discrete log-concave | Melbourne et al., 2020 |
| "Rényi Bounds on Information Combining" | Conditional Rényi, polarization | Hirche, 2020 |
| "Sharp bounds on $p$-norms for sums of independent uniform random variables, $0 < p < 1$" | Sums of uniforms, sharp lower bounds | Chasapis et al., 2021 |
| "Rényi Entropy Power and Normal Transport" | Unification, info-theoretic proofs | Rioul, 2019 |
| "Bounds on Renyi entropy growth in many-body quantum systems" | Quantum entropy dynamics | Shi, 2022 |
| "Sharp upper bound for anisotropic Rényi entropy and Heisenberg uncertainty principle" | Lie groups, sharp upper bounds | Chatzakou et al., 2024 |

In summary, the theory of sharp bounds on Rényi entropy encompasses explicit, tight inequalities governing the entropy of sums, transformations, or mixtures of random variables or quantum states, tightly connecting functional inequalities, operational criteria, convex geometry, and extremal analysis in a unified, principled framework.
