LENQD Random Variables: Theory & Applications
- LENQD random variables are defined by a controlled, linearly extendable form of negative quadrant dependence, bridging full independence and strong negative dependence.
- Their framework extends classical limit theorems, concentration inequalities, and stochastic orderings with explicit dominating constants and convergence rates.
- Applications span risk aggregation, nonparametric regression, machine learning, and numerical integration, quantifying performance degradation under dependence.
Linearly Extended Negative Quadrant Dependent (LENQD) random variables are a class of dependent random variables characterized by a controlled, linearly extendable negative quadrant dependence among components. This property interpolates between full independence and stronger negative dependence conditions and is particularly important for limit theorems, concentration inequalities, stochastic orderings, and risk aggregation under dependence uncertainty. LENQD variables enable the extension of classical probabilistic results to dependently structured data, with quantifiable degradation in performance as compared to the independent case.
1. Definition and Characterization
LENQD random variables generalize the classical notion of negative quadrant dependence (NQD) by relaxing the requirement that the joint probability of all lower (or upper) tail events is less than or equal to the product of the marginal probabilities. For an array $X_1, \ldots, X_n$, LENQD (also called extended negative quadrant dependent or END in the literature) means there is a dominating constant $M \ge 1$ such that for any real numbers $x_1, \ldots, x_n$:

$$\mathbb{P}(X_1 \le x_1, \ldots, X_n \le x_n) \le M \prod_{i=1}^{n} \mathbb{P}(X_i \le x_i)
\quad \text{and} \quad
\mathbb{P}(X_1 > x_1, \ldots, X_n > x_n) \le M \prod_{i=1}^{n} \mathbb{P}(X_i > x_i).$$

The minimal possible $M$ quantifies the "strength" of the negative dependence, serving as a linear or multiplicative "extension" over standard NQD (which corresponds to $M = 1$).
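To make the role of the dominating constant concrete, here is a minimal Python sketch (our own illustration; the helper `estimate_dominating_constant` is hypothetical, not from the cited literature) that estimates, over a grid of thresholds, how large $M$ would need to be for a given bivariate sample by comparing empirical joint lower-tail probabilities with products of empirical marginals.

```python
import numpy as np

def estimate_dominating_constant(x, y, n_grid=25):
    """Finite-sample proxy for the smallest M with
    P(X <= s, Y <= t) <= M * P(X <= s) * P(Y <= t)
    over a grid of thresholds (illustrative only)."""
    s_grid = np.quantile(x, np.linspace(0.05, 0.95, n_grid))
    t_grid = np.quantile(y, np.linspace(0.05, 0.95, n_grid))
    worst_ratio = 0.0
    for s in s_grid:
        for t in t_grid:
            joint = np.mean((x <= s) & (y <= t))
            prod = np.mean(x <= s) * np.mean(y <= t)
            if prod > 0:
                worst_ratio = max(worst_ratio, joint / prod)
    return worst_ratio

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)

# Antithetic pair (U, 1 - U): a classical NQD example, so the ratio stays near 1.
print("antithetic: ", estimate_dominating_constant(u, 1.0 - u))

# Comonotonic pair (U, U): positively dependent, so the ratio grows well above 1.
print("comonotonic:", estimate_dominating_constant(u, u))
```

For the antithetic pair the ratio stays at or below 1 up to sampling noise, consistent with NQD (so $M = 1$ suffices), whereas no finite $M$ works for the comonotonic pair, which is therefore not LENQD.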
Variants in sub-linear expectation settings replace probabilities by expectations of products of monotone functions and admit analogous definitions, often introducing a multiplicative constant in the corresponding inequalities (Zhang, 2014, Zhang, 2016).
LENQD random variables subsume independent random variables (with $M = 1$), and any further slack is explicitly accounted for in concentration and limit theorems.
2. Fundamental Limit Theorems
LENQD random variables permit the extension of strong laws of large numbers (SLLN), laws of the iterated logarithm (LIL), and central limit theorems (CLT) with explicit convergence rates and normalizing constants.
- Strong Laws: Under integrability and weak mean domination, SLLNs for LENQD triangular arrays (or sequences) mirror classical forms with norming constants adjusted for dependence and the dominating sequence (Silva, 2019). For pairwise NQD, the SLLN with sharp normalization is attainable for $1 < p < 2$, and the result extends to LENQD sequences via appropriate linearization (Silva, 2020).
- Law of the Iterated Logarithm: In sub-linear expectation spaces with extended negative dependence, LIL-type results hold; the normalized partial sums satisfy

$$\limsup_{n \to \infty} \frac{S_n}{\sqrt{2 n \log \log n}} \le \overline{\sigma} \quad \text{quasi-surely},$$

where $S_n = \sum_{i=1}^{n} X_i$ and the upper variance $\overline{\sigma}^2$ is defined via the sub-linear expectation (Zhang, 2016).
- Central Limit Theorem with Rate: For bounded LENQD sequences with sufficiently fast decay of covariances and non-degenerate normalized variance, a Berry–Esseen bound

$$\sup_{x \in \mathbb{R}} \left| \mathbb{P}\!\left( \frac{S_n - \mathbb{E} S_n}{\sqrt{\operatorname{Var}(S_n)}} \le x \right) - \Phi(x) \right| \le C\, n^{-\rho}$$

holds with an explicit exponent $\rho > 0$, where $\Phi$ is the standard normal distribution function (Alem et al., 18 Sep 2025); a simulation sketch after this list illustrates the normal approximation under negative dependence.
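As a hedged numerical illustration (our own sketch under stated assumptions, not code from the cited papers), the snippet below uses simple random sampling without replacement, whose selection variables are negatively associated and hence LENQD with dominating constant 1, to check the SLLN-type convergence of sample means and the approximate normality of standardized sums.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Fixed finite population; simple random sampling WITHOUT replacement yields
# negatively associated (hence NQD / LENQD with M = 1) summands.
N = 10_000
population = rng.standard_normal(N)
pop_mean = population.mean()

# SLLN-type behaviour: the sample mean approaches the population mean.
for n in (100, 1_000, 5_000):
    sample = rng.choice(population, size=n, replace=False)
    print(f"n={n:5d}  |sample mean - population mean| = "
          f"{abs(sample.mean() - pop_mean):.4f}")

# CLT-type behaviour: standardized sums look approximately normal.
n, reps = 500, 4_000
sums = np.array([rng.choice(population, size=n, replace=False).sum()
                 for _ in range(reps)])
z = (sums - sums.mean()) / sums.std()

# Kolmogorov-Smirnov distance to N(0,1) as a crude proxy for the
# Berry-Esseen sup-distance.
print("KS distance to N(0,1):", stats.kstest(z, "norm").statistic)
```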
These theorems demonstrate that performance loss due to negative dependence is quantifiable and, in some cases, quite mild, allowing practical use of classical probabilistic methodology in the presence of LENQD.
3. Stochastic Orderings and Concentration Inequalities
LENQD structures yield powerful orderings and concentration bounds:
- Stochastic Orderings: If $W$ is a sum of LENQD indicators with $\lambda = \mathbb{E} W$, then $W^{s} \le_{\mathrm{st}} W + 1$ (where $W^{s}$ denotes the size-biased version of $W$), which in turn ensures that $W \le_{\mathrm{cx}} \mathrm{Po}(\lambda)$ (convex order against a Poisson variable with mean $\lambda$). This unlocks various entropic and approximation results:
- $H(W) \le H(\mathrm{Po}(\lambda))$ (entropy maximization)
- Poisson approximation bounds and Wasserstein/total variation estimates, e.g., $d_{\mathrm{TV}}(W, \mathrm{Po}(\lambda)) \le \frac{1 - e^{-\lambda}}{\lambda}\,(\lambda - \operatorname{Var} W)$ (Daly, 2015)
- Tail bounds with exponents akin to those in the independent case.
- Concentration Inequalities: For bounded LENQD variables, high-probability bounds on deviations from the mean are available. Soft-cover arguments and "coloring numbers" derived from dependency graphs yield Hoeffding-type inequalities in which the exponent of the classical bound is rescaled by quantities that measure dependence strength and "soft cover" complexity (Lampert et al., 2018). As these quantities approach their minimal values, the independent case is recovered.
- Generalization to other concentration regimes: For binary-valued LENQD random variables, even weaker, parameterized forms of negative dependence (with a correlation number potentially growing with dimension) still support Chernoff–Hoeffding-type bounds, which are valuable for randomized quasi-Monte Carlo integration using Latin hypercube or scrambled net sampling (Doerr et al., 2021); an empirical illustration follows this list.
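The sketch below (our own illustration, not code from the cited papers) takes the count $W$ of "special" items drawn under sampling without replacement, an exactly hypergeometric and negatively dependent sum of indicators, compares an exact upper-tail probability with the independent-case Hoeffding bound, and reports the exact total-variation distance to the Poisson law with the same mean alongside $\lambda - \operatorname{Var} W$, the quantity driving Stein–Chen-type bounds.

```python
import numpy as np
from scipy import stats

# Indicators of "item i is selected" under simple random sampling without
# replacement are negatively associated, hence NQD / LENQD with constant 1.
N, n, K = 1_000, 200, 20          # population size, draws, number of "special" items
lam = n * K / N                   # mean of W = number of special items drawn

# W follows the hypergeometric law, so pmf and tails are available exactly.
W = stats.hypergeom(M=N, n=K, N=n)

# (a) Chernoff-Hoeffding-style tail: compare the exact tail with the
#     independent-case bound exp(-2 t^2 / n) for n variables bounded in [0, 1].
t = 5
tail = W.sf(lam + t - 1)          # P(W >= lam + t)
print(f"P(W - EW >= {t}) = {tail:.2e}  vs  Hoeffding bound {np.exp(-2 * t**2 / n):.2e}")

# (b) Poisson approximation: exact total-variation distance to Po(lam) and
#     lam - Var(W), which drives Stein-Chen-type bounds for negatively
#     related indicators.
ks = np.arange(0, n + 1)
d_tv = 0.5 * np.sum(np.abs(W.pmf(ks) - stats.poisson(lam).pmf(ks)))
print(f"d_TV(W, Po(lam)) = {d_tv:.4f},  lam - Var(W) = {lam - W.var():.4f}")
```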
4. Construction and Examples
The construction of strongly negatively dependent sequences such as END (Extremely Negatively Dependent) or SEND (Strongly Extremely Negatively Dependent) provides the archetype for LENQD:
- For any marginal distribution $F$ with finite mean $\mu$, one can build a sequence $X_1, X_2, \ldots$ with $X_i \sim F$ such that $\mathbb{E}\,|S_n - n\mu| \le C$ for all $n$, with $C$ depending only on $F$ (Wang et al., 2014).
- If $F$ is $n$-completely mixable, this mixability can be propagated to an infinite END (and even SEND) sequence by periodic replication, ensuring minimal variance in sums; see the sketch below.
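Here is a minimal sketch of the periodic-replication idea, using the simplest case, the 2-complete mixability of the uniform distribution via the antithetic pair $(U, 1-U)$, rather than the general construction of the cited paper; the helper name `send_uniform_sequence` is our own.

```python
import numpy as np

rng = np.random.default_rng(42)

def send_uniform_sequence(n_pairs):
    """Periodic replication of the antithetic pair (U, 1 - U): each block of two
    terms sums exactly to 1, so |S_n - n/2| never exceeds 1/2."""
    u = rng.uniform(size=n_pairs)
    return np.column_stack([u, 1.0 - u]).ravel()   # U_1, 1-U_1, U_2, 1-U_2, ...

x = send_uniform_sequence(50_000)                  # 100,000 terms with U(0,1) marginals
n = np.arange(1, x.size + 1)
deviation = np.abs(np.cumsum(x) - n / 2.0)         # |S_n - n * mu| with mu = 1/2

print("max_n |S_n - n/2| for the replicated pair:", deviation.max())
print("same quantity for i.i.d. uniforms (grows like sqrt(n)):",
      np.abs(np.cumsum(rng.uniform(size=x.size)) - n / 2.0).max())
```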
In risk theory, LENQD structure arises naturally in risk aggregation with dependence uncertainty, yielding minimal aggregate variance and optimal lower bounds for quantities such as Value-at-Risk (VaR) and Expected Shortfall (ES). The asymptotic equivalence of worst-case VaR and ES under such dependence uncertainty is a central result in this direction (Wang et al., 2014).
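To illustrate why the dependence structure matters for aggregate risk, the following sketch (our own example with uniform marginals; `var_es` is a hypothetical helper) compares empirical VaR and ES of a two-risk aggregate under comonotonic, independent, and antithetic (negatively dependent) couplings of identical marginals.

```python
import numpy as np

rng = np.random.default_rng(7)

def var_es(sample, alpha=0.99):
    """Empirical Value-at-Risk and Expected Shortfall at level alpha."""
    q = np.quantile(sample, alpha)
    return q, sample[sample >= q].mean()

n_sim = 200_000
u = rng.uniform(size=n_sim)

# Aggregate S = X + Y for two risks with identical U(0,1) marginals under
# three different dependence structures.
aggregates = {
    "comonotonic": u + u,
    "independent": u + rng.uniform(size=n_sim),
    "antithetic (negatively dependent)": u + (1.0 - u),   # constant sum: minimal variance
}

for name, s in aggregates.items():
    v, e = var_es(s)
    print(f"{name:35s} VaR_0.99 = {v:5.3f}   ES_0.99 = {e:5.3f}")
```

The antithetic coupling is the two-dimensional completely mixable case: the aggregate is constant, so its variance and tail risk measures are as small as the marginals allow.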
5. Applications in Data Science, Statistical Modeling, and Machine Learning
LENQD random variables enable robust analysis and generalization in dependent data settings:
- Nonparametric Regression: For the wavelet estimator of the regression function $g$ in the model $Y_i = g(t_i) + \varepsilon_i$ with LENQD errors $\{\varepsilon_i\}$, a CLT with an explicit Berry–Esseen rate holds under standard smoothness and design density assumptions, and the Berry–Esseen bound for the normalized estimator persists after bias correction under additional smoothness (Alem et al., 18 Sep 2025).
- Generalization Bounds in Learning: In machine learning or risk minimization tasks, when samples or losses are only LENQD (not independent), dependency-dependent concentration inequalities allow the use of classical empirical risk minimization bounds with explicit “penalties” that vanish as the pairwise dependence among samples decreases (Lampert et al., 2018).
- Probabilistic Numerical Methods: In randomized quasi-Monte Carlo, discrepancy bounds and convergence guarantees for integration are retained even when the underlying sampling (e.g., LHS or scrambled $(t,m,s)$-nets) yields only a weaker, parameterized form of negative dependence with a moderately growing dependence parameter (Doerr et al., 2021); a short variance-reduction illustration follows this list.
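The following sketch (our illustration; it uses SciPy's `scipy.stats.qmc.LatinHypercube` sampler) compares plain Monte Carlo with Latin hypercube sampling, whose stratification induces negative dependence among the sample points, on a simple integration problem, showing the variance reduction that dependence-aware guarantees of this kind quantify.

```python
import numpy as np
from scipy.stats import qmc

def integrand(x):
    """Smooth test integrand on [0,1]^d: prod_j (x_j + 0.5), with integral 1."""
    return np.prod(x + 0.5, axis=1)

d, n, reps = 4, 256, 200
true_value = 1.0                     # each factor integrates to 1 on [0, 1]
rng = np.random.default_rng(0)

mc_est, lhs_est = [], []
for r in range(reps):
    # Plain Monte Carlo: i.i.d. uniform points.
    mc_est.append(integrand(rng.uniform(size=(n, d))).mean())

    # Latin hypercube sampling: stratified, negatively dependent design.
    x_lhs = qmc.LatinHypercube(d=d, seed=r).random(n)
    lhs_est.append(integrand(x_lhs).mean())

print("MC  RMSE:", np.sqrt(np.mean((np.array(mc_est) - true_value) ** 2)))
print("LHS RMSE:", np.sqrt(np.mean((np.array(lhs_est) - true_value) ** 2)))
```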
6. Comparison with Other Dependence Structures and Literature Position
LENQD is situated within a family of negative dependence concepts, including NQD, END, SEND, and further parameterized extensions of negative dependence. The explicit inclusion of a dominating constant $M$ (or its analogues in sub-linear expectation and binary settings) parametrizes the "gap to independence," allowing sharp theorems with rates and thresholds that depend explicitly on the degree of negative dependence. LENQD encompasses many important dependent structures encountered in actuarial mathematics, quasi-Monte Carlo integration, and machine learning.
Extensive recent work has addressed convergence rates for sums and general limit theorems, e.g., complete convergence for extended negatively dependent (hence LENQD) sequences under optimal moment conditions and with regularly varying normalization (Dzung et al., 2021). The sharpness of the moment conditions and normalization rates has been demonstrated through explicit counterexamples.
7. Summary Table of Key Results
Property or Theorem | Condition | Rate / Bound |
---|---|---|
SLLN for LENQD arrays | Weak mean domination, integrability; controlled dominating sequence | Almost sure convergence under classical norming, adjusted for dependence (Silva, 2019) |
CLT Berry–Esseen rate | Covariance decay, variance non-degeneracy, boundedness | Explicit polynomial rate $n^{-\rho}$ (Alem et al., 18 Sep 2025) |
Weighted SLLN | Weights and moments dominated by a controlling sequence | Weighted sums converge a.s. (Silva, 2019) |
Entropy bound for sum | $W$ a sum of LENQD indicators | $H(W) \le H(\mathrm{Po}(\mathbb{E} W))$ (Daly, 2015) |
Concentration (soft cover) | Small dependence strength, low soft-cover complexity | Nearly independent tail decay (Lampert et al., 2018) |
LENQD random variables thereby provide a mathematically rigorous and practically versatile framework for analyzing negatively dependent systems that are not fully independent, with quantifiable control over convergence rates, tail behavior, and aggregate risk, and extensive applicability in modern probability, statistics, risk management, and machine learning.