The Maximal Variance of Unilaterally Truncated Gaussian and Chi Distributions (2511.11566v1)
Abstract: This work explores bounds on the variance of unilaterally truncated Gaussian distributions (UTGDs) and scaled chi distributions (UTSCDs) with fixed means. For an arbitrary Gaussian density $f(x;\mu,\sigma)$ with a fixed, finite mean $M$ on the truncated domain $x \ge a$, where $a \in \mathbb{R}$, it is proven that the variance is bounded: specifically, $\sup \mathrm{Var}(x)\big|_{x \ge a} = \sup \mathrm{Var}(x)\big|_{x \le a} = (M-a)^2$. For a fixed cutoff $a$, the variance can be treated as a function of only $M$, $a$, and the location parameter $\mu$. Examples of such approximating functions, useful for model calibration, are developed alongside other, related calibration methods. For UTSCDs, numerical evidence indicates that for $n \in \mathbb{Z}^+$ degrees of freedom, or dimensions, and a fixed, finite mean, the variance $\mathrm{Var}(R)$ over $R \in [a,\infty)$ attains its maximum value $M^2(\pi-2)/2$ at $a=0$, $n=1$. For a fixed cutoff value, the variance has a local maximum as a function of $n$, and the number of dimensions yielding the maximal variance, $n_{\mathrm{vmx}}$, increases with the cutoff value. However, for $n \in \mathbb{R}$, as the cutoff approaches $0$, $n_{\mathrm{vmx}}$ approaches $-1$, while $\mathrm{Var}(R)$ appears to grow without bound.
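The two headline claims can be probed numerically. The sketch below (not taken from the paper) uses the standard closed forms for the moments of a truncated normal distribution via the inverse Mills ratio. The first check verifies that $\mathrm{Var} < (M-a)^2$ for a Gaussian truncated to $x \ge a$, with the ratio approaching 1 along the path $\mu = a - \sigma^2$, $\sigma \to \infty$ (an illustrative route toward the exponential limit, where the bound is attained). The second check confirms that for the chi distribution with $n=1$ and cutoff $a=0$ (the half-normal case), $\mathrm{Var}(R) = M^2(\pi-2)/2$ exactly. The parameter choices are illustrative assumptions, not values from the paper.

```python
import math

def trunc_normal_stats(mu, sigma, a):
    """Mean and variance of N(mu, sigma^2) truncated to x >= a,
    using the standard closed forms with the inverse Mills ratio."""
    alpha = (a - mu) / sigma
    phi = math.exp(-0.5 * alpha * alpha) / math.sqrt(2.0 * math.pi)
    Q = 0.5 * math.erfc(alpha / math.sqrt(2.0))   # upper-tail probability
    lam = phi / Q                                 # inverse Mills ratio
    M = mu + sigma * lam
    V = sigma * sigma * (1.0 + alpha * lam - lam * lam)
    return M, V

# Check 1: Var stays below (M - a)^2, and the ratio climbs toward 1
# along mu = a - sigma^2 as sigma grows (exponential limit).
a = 0.0
ratios = []
for sigma in (1.0, 2.0, 5.0, 10.0):
    M, V = trunc_normal_stats(a - sigma**2, sigma, a)
    ratios.append(V / (M - a) ** 2)
print([round(r, 4) for r in ratios])   # strictly below 1, increasing

# Check 2: chi distribution, n = 1, cutoff a = 0 is half-normal:
# M = sigma*sqrt(2/pi), Var = sigma^2*(1 - 2/pi) = M^2*(pi-2)/2.
sigma = 1.0
M_hn = sigma * math.sqrt(2.0 / math.pi)
V_hn = sigma**2 * (1.0 - 2.0 / math.pi)
print(abs(V_hn - M_hn**2 * (math.pi - 2.0) / 2.0) < 1e-12)   # True
```

The half-normal identity is exact: $\mathrm{Var}/M^2 = (1 - 2/\pi)/(2/\pi) = (\pi-2)/2$, matching the stated maximum for UTSCDs at $a=0$, $n=1$.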