Non-Gaussian Infinitely Divisible RVs
- Non-Gaussian infinitely divisible random variables are defined by a Lévy–Khintchine representation with a vanishing Gaussian component and a nontrivial Lévy measure that captures jump behavior.
- These variables exhibit diverse tail behaviors and are pivotal in applications like stochastic processes, extreme value modeling, actuarial science, and wireless communications.
- Their analytical structure enables closure under convolution and mixture operations, facilitating advanced probabilistic modeling and risk assessment.
A non-Gaussian infinitely divisible random variable is a random variable whose law admits a Lévy–Khintchine representation with vanishing Gaussian component (i.e., the continuous martingale term is absent) but a nontrivial Lévy measure, which governs jumps or singularities in its distribution. Such distributions exhibit a diverse range of tail behaviors, dependence structures, and density representations, and occupy a central position in modern probability theory and its applications, notably in stochastic processes, extreme value modeling, actuarial science, and wireless communications (Sibisi, 2022).
1. Canonical Structure: Lévy–Khintchine Representation
A random variable $X$ is infinitely divisible (ID) if, for every $n \in \mathbb{N}$, $X$ can be represented in distribution as the sum of $n$ i.i.d. random variables. The characteristic function takes the Lévy–Khintchine form:
$$\mathbb{E}\big[e^{itX}\big] = \exp\!\left( ibt - \tfrac{1}{2}\sigma^2 t^2 + \int_{\mathbb{R}\setminus\{0\}} \big( e^{itx} - 1 - itx\,\mathbf{1}_{\{|x|\le 1\}} \big)\, \nu(dx) \right),$$
where $b \in \mathbb{R}$, $\sigma^2 \ge 0$, and $\nu$ is a Lévy measure satisfying $\int_{\mathbb{R}\setminus\{0\}} \min(1, x^2)\,\nu(dx) < \infty$ (Sun et al., 2023). The Gaussian component corresponds to $\sigma^2 > 0$; non-Gaussian ID laws are characterized by $\sigma^2 = 0$ and nonzero $\nu$ (pure jump type), or by $\sigma^2 > 0$ and $\nu \ne 0$ (mixed type).
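As a concrete numerical illustration (my own sketch, not taken from the cited sources): the Gamma$(\alpha,\beta)$ law is ID because a sum of $n$ i.i.d. Gamma$(\alpha/n,\beta)$ variables is again Gamma$(\alpha,\beta)$, which a seeded simulation confirms through the first two moments.

```python
import numpy as np

# Infinite divisibility of Gamma(alpha, beta): a sum of n i.i.d.
# Gamma(alpha/n, beta) variables is again Gamma(alpha, beta).
# Checked by simulation (seeded, so the check is reproducible).
rng = np.random.default_rng(0)
alpha, beta, n = 3.0, 2.0, 5
size = 400_000

# Sum of n independent Gamma(alpha/n, rate beta) draws.
parts = rng.gamma(shape=alpha / n, scale=1.0 / beta, size=(n, size))
total = parts.sum(axis=0)

# Gamma(alpha, beta) has mean alpha/beta and variance alpha/beta^2.
print(abs(total.mean() - alpha / beta))    # small sampling error
print(abs(total.var() - alpha / beta**2))  # small sampling error
```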
For nonnegative ID random variables on $[0,\infty)$, the Laplace transform has the form:
$$\mathbb{E}\big[e^{-sX}\big] = \exp\!\left( -ds - \int_0^\infty \big(1 - e^{-sx}\big)\,\nu(dx) \right), \qquad s \ge 0,$$
with drift $d \ge 0$ and Lévy measure $\nu$ on $(0,\infty)$ satisfying $\int_0^\infty \min(1,x)\,\nu(dx) < \infty$. This structure underpins the theory of subordinators, generalized Gamma convolutions (GGCs), and related classes (Sibisi, 2022).
2. Distinguished Non-Gaussian ID Families
Gamma and Stable Laws
- Gamma: For shape $\alpha > 0$, rate $\beta > 0$,
- Laplace transform: $\mathbb{E}[e^{-sX}] = (1 + s/\beta)^{-\alpha}$
- Lévy measure: $\nu(dx) = \alpha\, x^{-1} e^{-\beta x}\,dx$, $x > 0$
- No drift, no Gaussian component (Sibisi, 2022).
- Positive $\alpha$-Stable: For $0 < \alpha < 1$ and scale $c > 0$,
- Laplace transform: $\mathbb{E}[e^{-sX}] = e^{-c s^{\alpha}}$
- Lévy measure: $\nu(dx) = \dfrac{c\,\alpha}{\Gamma(1-\alpha)}\, x^{-1-\alpha}\,dx$, $x > 0$ (Sibisi, 2022, Rajan et al., 2015).
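These Laplace-transform/Lévy-measure pairs can be checked numerically. The sketch below (my own illustration, using scipy) integrates $\int_0^\infty (1-e^{-sx})\,\nu(dx)$ for both Lévy measures and compares against the closed-form Laplace exponents $\alpha\log(1+s/\beta)$ and $c\,s^{\alpha}$.

```python
import numpy as np
from math import gamma as Gamma
from scipy.integrate import quad

def levy_exponent(levy_density, s):
    """phi(s) = int_0^inf (1 - e^{-s*x}) * levy_density(x) dx, split at 1
    so quad handles the (integrable) singularity of the density at 0."""
    f = lambda x: (1.0 - np.exp(-s * x)) * levy_density(x)
    head, _ = quad(f, 0, 1)
    tail, _ = quad(f, 1, np.inf)
    return head + tail

s = 1.7

# Gamma(alpha, beta): nu(dx) = alpha * x^{-1} e^{-beta*x} dx,
# with Laplace exponent alpha * log(1 + s/beta).
alpha_g, beta = 3.0, 2.0
phi_gamma = levy_exponent(lambda x: alpha_g * np.exp(-beta * x) / x, s)

# Positive alpha-stable: nu(dx) = c*a/Gamma(1-a) * x^{-1-a} dx,
# with Laplace exponent c * s^a.
a, c = 0.5, 1.3
phi_stable = levy_exponent(lambda x: c * a / Gamma(1 - a) * x ** (-1 - a), s)

print(phi_gamma, alpha_g * np.log(1 + s / beta))
print(phi_stable, c * s ** a)
```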
Generalized Gamma Convolutions (GGCs)
A random variable is a GGC if it is the limit in distribution of finite sums of independent gamma or positive stable random variables, with Laplace exponent
$$\phi(s) = ds + \int_0^\infty \log\!\left(1 + \frac{s}{t}\right) U(dt),$$
where $U$ is the Thorin measure, satisfying $\int_{(0,1]} |\log t|\,U(dt) < \infty$ and $\int_{(1,\infty)} t^{-1}\,U(dt) < \infty$. The Lévy density becomes $\ell(x) = x^{-1} \int_0^\infty e^{-xt}\,U(dt)$, a completely monotone function divided by $x$ (Sibisi, 2022).
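For a discrete Thorin measure $U = \sum_i a_i\,\delta_{t_i}$ (a worked example of my own), the GGC is just a finite convolution of independent Gamma$(a_i, t_i)$ laws, and the Laplace exponent reduces to $\sum_i a_i \log(1 + s/t_i)$:

```python
import numpy as np

# Thorin measure with two atoms: U = a1*delta_{t1} + a2*delta_{t2}.
# The corresponding GGC is the convolution Gamma(a1, t1) * Gamma(a2, t2).
atoms = [(1.5, 0.7), (2.0, 3.0)]   # (mass a_i, location t_i)

def ggc_laplace_exponent(s):
    # phi(s) = int_0^inf log(1 + s/t) U(dt), here a finite sum.
    return sum(a * np.log(1 + s / t) for a, t in atoms)

def ggc_levy_density(x):
    # l(x) = x^{-1} int_0^inf e^{-x*t} U(dt): completely monotone over x.
    return sum(a * np.exp(-x * t) for a, t in atoms) / x

# Check exp(-phi(s)) against the GGC's Laplace transform by simulation.
rng = np.random.default_rng(1)
s = 0.9
draws = sum(rng.gamma(shape=a, scale=1.0 / t, size=500_000) for a, t in atoms)
print(np.mean(np.exp(-s * draws)), np.exp(-ggc_laplace_exponent(s)))
```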
Other Representative Families
- Laplace (double exponential): density $f(x) = \frac{1}{2b}\, e^{-|x-\mu|/b}$.
- Pareto: density $f(x) = \alpha\, x^{-(\alpha+1)}$ for $x \ge 1$, $\alpha > 0$.
- Student's $t$: Lévy measure expressible via Bessel functions.
- Inverse Gaussian, Gumbel, Logistic, Log-normal: Each admits concrete Lévy measures and ID characterizations (Sun et al., 2023, Rajan et al., 2015).
Non-Gaussian ID laws also encapsulate the exp-normal family (1803.09838), the variance Gamma and CGMY families (Barman et al., 2 Aug 2024), as well as geometric infinitely divisible (gid) and Bondesson-class distributions (Dhull et al., 2023, Mai et al., 2018).
3. Generation, Closure, and Algebraic Structure
Non-Gaussian infinitely divisible classes possess remarkable closure and generation properties:
- Convolutions and Mixtures: The sum or independent mixture (random scaling) of ID variables remains ID (Sibisi, 2022, Rajan et al., 2015).
- Compound Poissonization: Any ID law with a finite Lévy measure is a compound Poisson law, i.e., the law of a Poisson-distributed number of i.i.d. nontrivial jumps.
- Mixture of Bernstein Functions: For subordinators, combinations of Laplace exponents (Bernstein functions) via integral mixtures generate rich non-Gaussian ID classes (Mai et al., 2018).
- Generalized Gamma Convolution (GGC) Operations: GGCs are closed under convolution, mixture, and weak limits, forming the minimal closed class containing all gamma laws under these operations (Sibisi, 2022, Pérez-Abreu et al., 2012).
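The compound-Poisson point can be made concrete. The sketch below (an illustrative construction of my own, not from the cited papers) builds $X = \sum_{i=1}^{N} J_i$ with $N \sim \mathrm{Poisson}(\lambda)$, where $\lambda = \nu(\mathbb{R})$ is the total mass of a finite Lévy measure and the jumps $J_i$ are i.i.d. draws from $\nu/\lambda$, then checks the moment identities $\mathbb{E}X = \lambda\,\mathbb{E}J$ and $\operatorname{Var}X = \lambda\,\mathbb{E}J^2$.

```python
import numpy as np

# Finite Levy measure: nu = lam * (law of J), here with J ~ Exp(1) jumps.
lam = 2.5                      # total mass nu(R)
rng = np.random.default_rng(2)
size = 300_000

# X = J_1 + ... + J_N with N ~ Poisson(lam): one compound Poisson draw
# per sample, vectorized by assigning each jump to its sample index.
counts = rng.poisson(lam, size=size)
jumps = rng.exponential(1.0, size=counts.sum())
owner = np.repeat(np.arange(size), counts)
X = np.bincount(owner, weights=jumps, minlength=size)

# Compound Poisson moments: E[X] = lam*E[J], Var(X) = lam*E[J^2].
print(X.mean(), lam * 1.0)     # E[J] = 1 for Exp(1)
print(X.var(), lam * 2.0)      # E[J^2] = 2 for Exp(1)
```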
4. Commutative Diagrams, Moment Identities, and Special Functions
A unified visual formalism termed the Lévy–Khintchine commutative diagram (LKCD) succinctly relates densities, Laplace transforms, Lévy measures, and limiting compound-Poisson densities for central ID classes. The LKCD enables systematic navigation among these objects via Laplace transforms, logarithmic derivatives, and limits (Sibisi, 2022).
Special functions arise naturally in explicit expressions for densities and transformations:
- Confluent hypergeometric, Bessel, Mittag-Leffler, and parabolic cylinder functions: Explicit in GGC convolution and mixture densities.
- Moment formulas: Explicit fractional moment identities for gamma, stable, and fractional-gamma laws (e.g., $\mathbb{E}[X^p] = \Gamma(\alpha+p)/\big(\Gamma(\alpha)\,\beta^{p}\big)$ for the Gamma$(\alpha,\beta)$ law when $p > -\alpha$; $\mathbb{E}[X^p]$ exists iff $p < \alpha$ for the positive $\alpha$-stable law) (Sibisi, 2022).
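The gamma fractional-moment identity $\mathbb{E}[X^p] = \Gamma(\alpha+p)/\big(\Gamma(\alpha)\,\beta^{p}\big)$, valid for $p > -\alpha$, can be verified by direct integration against the density (a small check of my own, using scipy):

```python
from math import gamma as Gamma
from scipy.integrate import quad
from scipy.stats import gamma as gamma_dist

alpha, beta, p = 2.5, 1.5, 0.75   # fractional order p, with p > -alpha

# E[X^p] = int_0^inf x^p f(x) dx against the Gamma(alpha, rate beta) density.
num, _ = quad(lambda x: x**p * gamma_dist.pdf(x, a=alpha, scale=1 / beta),
              0, float("inf"))

# Closed form: Gamma(alpha + p) / (Gamma(alpha) * beta^p).
closed = Gamma(alpha + p) / (Gamma(alpha) * beta**p)
print(num, closed)
```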
5. Lévy Measure, Inversion, and Non-Gaussian Diagnostics
The Lévy measure is central for distinguishing non-Gaussianity: a nontrivial $\nu$ with $\sigma^2 = 0$ signals a purely non-Gaussian ID law (Klebanov et al., 2015). An inversion formula reconstructs $\nu$ directly from the characteristic exponent $\psi(t) = \log \mathbb{E}[e^{itX}]$: the second difference
$$\psi(t) - \tfrac{1}{2}\big[\psi(t+h) + \psi(t-h)\big] = \tfrac{1}{2}\sigma^2 h^2 + \int_{\mathbb{R}} e^{itx}\big(1 - \cos(hx)\big)\,\nu(dx)$$
is, up to the constant $\sigma^2 h^2/2$, the Fourier transform of the finite measure $(1-\cos(hx))\,\nu(dx)$, ensuring uniqueness and direct construction of non-Gaussian ID laws by prescribing $\nu$ (Burnaev, 2021).
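The classical second-difference identity behind such inversion, $\psi(t) - \tfrac12[\psi(t+h)+\psi(t-h)] = \tfrac12\sigma^2 h^2 + \int e^{itx}(1-\cos(hx))\,\nu(dx)$, can be verified for a concrete pure-jump case. The example below is my own: the exponential Lévy density $\nu(dx) = \lambda e^{-x}\,dx$ on $(0,\infty)$, for which $\psi(t) = \lambda\, it/(1-it)$ with $\sigma^2 = 0$.

```python
import numpy as np
from scipy.integrate import quad

lam = 1.8
# Characteristic exponent of the finite Levy measure nu(dx) = lam*e^{-x}dx:
# psi(t) = lam * int_0^inf (e^{itx} - 1) e^{-x} dx = lam * it / (1 - it).
psi = lambda t: lam * (1j * t) / (1 - 1j * t)

t, h = 0.6, 0.4
lhs = psi(t) - 0.5 * (psi(t + h) + psi(t - h))

# rhs: int_0^inf e^{itx} (1 - cos(hx)) lam e^{-x} dx  (sigma^2 = 0 here),
# computed with real and imaginary parts integrated separately.
f = lambda x: (1 - np.cos(h * x)) * lam * np.exp(-x)
re, _ = quad(lambda x: np.cos(t * x) * f(x), 0, np.inf)
im, _ = quad(lambda x: np.sin(t * x) * f(x), 0, np.inf)
rhs = re + 1j * im
print(lhs, rhs)   # the two sides agree
```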
A necessary and sufficient condition for a vanishing Gaussian component in a symmetric ID law is $\sigma^2 = 0$ in the canonical decomposition, with the non-Gaussian class then governed entirely by jump activity ((Klebanov et al., 2015), Section 1.1).
6. Multivariate and Cone-Valued Extensions
Non-Gaussian ID distributions generalize to multivariate and cone-valued settings, notably through multivariate and matrix gamma laws, and generalized gamma convolutions on convex cones (including the cone of positive semidefinite matrices) (Pérez-Abreu et al., 2012):
- The characteristic function for a $d$-variate Gamma distribution involves integration over the unit sphere and a radial Lévy measure.
- The cone-valued class admits a Wiener–Gamma (Itô–Wiener) representation, allowing explicit construction via Poisson random measures and mixing functions.
- New matrix-variate examples provide infinitely divisible positive definite random matrix analogues with explicit ties to Wishart distributions and spectral limits.
7. Applications, Inequalities, and Further Properties
Non-Gaussian ID laws are pivotal in:
- Signal Processing: Unification of classical fading distributions as GGC or ID laws (Nakagami-$m$, Rayleigh-lognormal, generalized-$K$, etc.) enables succinct calculation of asymptotic SER, diversity order, and information-theoretic performance measures via Thorin measures (Rajan et al., 2015).
- Variance and Concentration Inequalities: For many non-Gaussian ID laws, the "one standard deviation" concentration inequality $\mathbb{P}\big(|X - \mathbb{E}X| \le \sqrt{\operatorname{Var}X}\big) \ge c$ holds for a universal constant $c$, demonstrating high mass concentration even for heavy-tailed or skewed non-Gaussian families (Sun et al., 2023). General covariance identities and two-sided variance bounds of Cacoullos and Stein type enable fine control of risk, premiums, and statistical functionals (Barman et al., 2 Aug 2024).
- Learning Theory: Empirical process risk bounds for non-Gaussian ID data decay faster than classical i.i.d. rates under mild assumptions, due to tight concentration governed by the Lévy measure (Zhang et al., 2012).
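The one-standard-deviation concentration claim is easy to probe for specific skewed and heavy-tailed ID families (an illustrative check of my own using exact CDFs from scipy; the 0.7 threshold is simply what these examples exhibit, not a constant from the cited work):

```python
from scipy.stats import gamma, laplace

def one_sigma_mass(dist):
    """P(|X - E X| <= SD(X)), computed from the exact CDF."""
    mu, sd = dist.mean(), dist.std()
    return dist.cdf(mu + sd) - dist.cdf(mu - sd)

# Skewed ID example: Gamma(shape=2); symmetric heavier-tailed ID
# example: Laplace. Both are infinitely divisible and non-Gaussian.
for dist in (gamma(a=2.0), laplace()):
    print(one_sigma_mass(dist))   # both put well over half their
                                  # mass within one standard deviation
```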
The structure, explicit construction, and analytic tractability of non-Gaussian infinitely divisible laws make them a foundational object across probabilistic analysis, applied mathematics, and statistical modeling (Sibisi, 2022).