Set-Valued Random Sequences
- Set-valued random sequences are stochastic processes whose values are closed subsets of Banach spaces, extending traditional single-valued random variables.
- They utilize measurable selections and support functions to rigorously analyze convergence in the Hausdorff metric and in the Kuratowski–Mosco sense.
- Key methods involve employing φ-mixing conditions and generalizing the laws of large numbers to support applications in random optimization and stochastic geometry.
A set-valued random sequence is a stochastic process with values in the family of closed subsets of a Banach space, extending the concept of random variables from single-valued to set-valued entities. Such sequences arise naturally in areas such as random optimization, stochastic geometry, and multivalued analysis, where uncertainty or imprecision is best modeled by random closed sets instead of random points. Rigorous study of their stability, convergence, and limit behavior requires generalizing classical probabilistic results—including strong and weak laws of large numbers—to accommodate the set-valued context as defined by measurable selections and convergence of set sequences in appropriate topologies (Tuyen, 14 Jan 2026, Guan et al., 2020).
1. Foundational Definitions
Given a complete probability space $(\Omega, \mathcal{A}, P)$ and a real separable Banach space $\mathbb{X}$, let $\mathcal{C}(\mathbb{X})$ denote the collection of all nonempty closed subsets of $\mathbb{X}$. A set-valued random variable, or random closed set, is a measurable mapping $F : \Omega \to \mathcal{C}(\mathbb{X})$, where for any closed $C \subseteq \mathbb{X}$ the event $F^{-1}(C) = \{\omega : F(\omega) \cap C \neq \emptyset\}$ belongs to $\mathcal{A}$.
A selection of $F$ is an ordinary measurable function $f : \Omega \to \mathbb{X}$ with $f(\omega) \in F(\omega)$ almost surely. The space of $L^p$-integrable selections is $S_F^p = \{f \in L^p(\Omega; \mathbb{X}) : f(\omega) \in F(\omega) \text{ a.s.}\}$, and $F$ is $L^p$-integrably bounded if $\|F\| = \sup_{x \in F} \|x\|$ is in $L^p$.
The expectation of a set-valued random variable, called the Aumann (or Bochner) integral, is defined as
$$\mathbb{E}F = \operatorname{cl}\Big\{ \int_\Omega f \, dP : f \in S_F^1 \Big\}.$$
If $F$ takes values in the compact convex subsets, the support functions $s(x^*, F) = \sup_{x \in F} \langle x^*, x \rangle$ (for $x^*$ in the dual unit sphere $S^*$) characterize $F$ completely.
For sequences $(F_n)_{n \ge 1}$, weak stationarity requires $\mathbb{E}F_n = M$ for all $n$, for some fixed closed set $M$ (Tuyen, 14 Jan 2026).
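As a concrete illustration of the support-function description of compact convex sets, the sketch below (an illustration of the general idea, not code from the cited papers; the polygons and the direction-sampling resolution are choices of this example) computes the Hausdorff distance between two convex polygons in $\mathbb{R}^2$ by maximizing the support-function difference over sampled unit directions.

```python
import numpy as np

def support(points, direction):
    """Support function s(x*, A) = max_{a in A} <x*, a> of the convex hull of `points`."""
    return np.max(points @ direction)

def hausdorff_convex(A, B, n_dirs=360):
    """Hausdorff distance between conv(A) and conv(B) in R^2, approximated by
    sup over sampled unit directions of |s(x*, A) - s(x*, B)|."""
    thetas = np.linspace(0.0, 2 * np.pi, n_dirs, endpoint=False)
    dirs = np.stack([np.cos(thetas), np.sin(thetas)], axis=1)
    return max(abs(support(A, d) - support(B, d)) for d in dirs)

# A unit square and a horizontally shifted copy: d_H should equal the shift.
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
shifted = square + np.array([0.5, 0.0])
print(hausdorff_convex(square, shifted))  # → 0.5
```

The direction $\theta = 0$ realizes the supremum here, so the sampled maximum is exact for this pair; in general, finer direction grids tighten the approximation.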
2. Dependence and Mixing in Set-Valued Sequences
Beyond independence, $\varphi$-mixing extends to set-valued random sequences as follows. For sub-$\sigma$-algebras $\mathcal{A}_1, \mathcal{A}_2 \subseteq \mathcal{A}$,
$$\varphi(\mathcal{A}_1, \mathcal{A}_2) = \sup\{\, |P(B \mid A) - P(B)| : A \in \mathcal{A}_1,\ P(A) > 0,\ B \in \mathcal{A}_2 \,\},$$
and for a sequence $(F_n)_{n \ge 1}$, set $\mathcal{F}_m^k = \sigma(F_i : m \le i \le k)$. The $\varphi$-mixing coefficient at lag $n$ is
$$\varphi(n) = \sup_{k \ge 1} \varphi(\mathcal{F}_1^k, \mathcal{F}_{k+n}^\infty),$$
and the sequence is called $\varphi$-mixing if $\varphi(n) \to 0$ as $n \to \infty$ (Tuyen, 14 Jan 2026).
Alternatively, pairwise uncorrelatedness is formulated in terms of support functions: two set-valued random variables $F$ and $G$ are uncorrelated if, for each $x^* \in S^*$, $s(x^*, F)$ and $s(x^*, G)$ are uncorrelated real random variables (Guan et al., 2020).
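In one dimension this notion is easy to see concretely: a compact convex set is an interval $[a, b]$, the dual unit sphere is $\{+1, -1\}$, and $s(+1, [a, b]) = b$, $s(-1, [a, b]) = -a$. The following sketch (a hypothetical example; the interval laws are arbitrary choices) checks empirically that two intervals built from independent uniforms have uncorrelated support functions in every pair of directions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical random intervals F = [U, U+1] and G = [V, V+2], U, V independent.
U = rng.uniform(size=n)
V = rng.uniform(size=n)

# In R^1 the dual unit sphere is {+1, -1}:
#   s(+1, [a, b]) = b  and  s(-1, [a, b]) = -a.
sF = {+1: U + 1.0, -1: -U}   # support functions of F
sG = {+1: V + 2.0, -1: -V}   # support functions of G

# All four direction pairs should give near-zero empirical correlation.
for i in (+1, -1):
    for j in (+1, -1):
        r = float(np.corrcoef(sF[i], sG[j])[0, 1])
        print(i, j, round(r, 3))
```

Independence of $U$ and $V$ makes every correlation vanish up to Monte Carlo error; dependent but uncorrelated endpoint constructions would pass the same check.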
3. Laws of Large Numbers for Set-Valued Random Sequences
Multiple versions of weak and strong laws of large numbers (WLLN, SLLN) are established for set-valued sequences, with convergence measured in the Hausdorff or Kuratowski–Mosco sense.
For $\varphi$-mixing, weakly stationary sequences $(F_n)$ with compact convex values and shared expectation $M = \mathbb{E}F_n$:
- If the mixing coefficients are summable, $\sum_{n=1}^{\infty} \varphi(n) < \infty$,
- And for every $x^* \in S^*$ the support-function variances satisfy a Kolmogorov-type condition, $\sum_{n=1}^{\infty} n^{-2} \operatorname{Var} s(x^*, F_n) < \infty$,

then almost surely,
$$\lim_{n \to \infty} d_H\Big( \frac{1}{n} \sum_{k=1}^{n} F_k,\ M \Big) = 0,$$
where $d_H$ is the Hausdorff metric and the average is taken in the Minkowski sense (Tuyen, 14 Jan 2026). Analogous SLLN results hold for compact nonconvex and for general closed values, under additional selection and summability conditions, leading to Kuratowski–Mosco convergence.
For pairwise uncorrelated set-valued sequences with common expectation $M$ in finite-dimensional Banach spaces, the SLLN states: if
$$\sum_{n=1}^{\infty} \frac{\operatorname{Var} s(x^*, F_n)}{n^2} < \infty \quad \text{for every } x^* \in S^*,$$
then almost surely,
$$\lim_{n \to \infty} d_H\Big( \frac{1}{n} \sum_{k=1}^{n} F_k,\ M \Big) = 0,$$
where $d_H$ is the Hausdorff metric (Guan et al., 2020).
Each result generalizes the classical single-valued (scalar or vector) SLLNs, using support function calculus and measurable selection techniques.
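A minimal simulation of the interval-valued special case makes the convergence visible (an illustration, not the theorems' full generality: the intervals here are i.i.d., hence in particular $\varphi$-mixing and pairwise uncorrelated, and the distribution is an arbitrary choice).

```python
import numpy as np

rng = np.random.default_rng(1)

# i.i.d. intervals F_k = [0, xi_k] with xi_k ~ Uniform(0, 2), so E F_k = [0, 1].
# The Minkowski average of intervals is the interval of averaged endpoints, so
# d_H((1/n) * sum F_k, [0, 1]) = |mean(xi_1, ..., xi_n) - 1|.
xi = rng.uniform(0.0, 2.0, size=100_000)
for n in (100, 10_000, 100_000):
    print(n, abs(xi[:n].mean() - 1.0))
```

The printed Hausdorff distances shrink toward $0$ as $n$ grows, at the usual $O(n^{-1/2})$ Monte Carlo rate.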
4. Key Methods and Proof Strategies
Central to these results is the reduction to scalar convergence via support functions. For compact convex sets $A, B$,
$$d_H(A, B) = \sup_{x^* \in S^*} |s(x^*, A) - s(x^*, B)|,$$
enabling comparison in the sup-norm on the dual unit sphere.
Given the mixing or uncorrelatedness condition, scalar-valued SLLNs (e.g., for $\varphi$-mixing sequences with summable coefficients) are applied pointwise in $x^*$. Uniform convergence over $S^*$ then follows by diagonalization or compactness arguments.
Kuratowski–Mosco convergence for general closed sets exploits the structure of measurable selections and convex combinations to approximate all points of the closed convex hull $\overline{\mathrm{co}}\, M$. Separation arguments using the dual space allow the exclusion of points outside the asymptotic limit set.
For uncorrelated sequences, the Borel–Cantelli lemma and the Chebyshev inequality are applied to show that the maximal deviations of the averaged support functions become negligible almost surely, controlling both the rate and the uniformity over $S^*$ (Guan et al., 2020).
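For a single direction $x^* \in S^*$, this step can be sketched as follows (a standard Rajchman-type outline, stated here under a uniformly-bounded-variance assumption rather than exactly as in the cited paper). Write $Y_k = s(x^*, F_k) - \mathbb{E}\, s(x^*, F_k)$; uncorrelatedness gives $\operatorname{Var}\big(\sum_{k=1}^{n} Y_k\big) = \sum_{k=1}^{n} \operatorname{Var} Y_k \le Cn$, so Chebyshev's inequality along the subsequence $n^2$ yields
$$P\Big( \Big| \frac{1}{n^2} \sum_{k=1}^{n^2} Y_k \Big| > \varepsilon \Big) \;\le\; \frac{C n^2}{\varepsilon^2 n^4} \;=\; \frac{C}{\varepsilon^2 n^2},$$
which is summable in $n$. Borel–Cantelli then gives almost sure convergence along $(n^2)$, and bounding the maximal fluctuation between consecutive squares extends the convergence to the full sequence.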
5. Illustrative Examples and Necessity of Hypotheses
Concrete constructions illustrate both validity and optimality of the various theorems. For instance:
- Random intervals: suppose $(\xi_n)$ is a bounded, nonnegative real $\varphi$-mixing sequence with $\mathbb{E}\xi_n = \mu$, and set $F_n = [0, \xi_n]$; then $\frac{1}{n} \sum_{k=1}^{n} F_k = \big[0, \frac{1}{n} \sum_{k=1}^{n} \xi_k\big]$, and the SLLN yields convergence in Hausdorff distance to $[0, \mu]$, provided the variance and mixing decay are sufficient.
- “Needle + shrinking halo”: in $\mathbb{R}^2$, let $N$ be a fixed segment (the needle) and set $F_n = N \cup B_n$, where $B_n$ is a ball (the halo) whose radius shrinks to $0$ as $n \to \infty$; the Minkowski average then collapses to $N$ in both the Hausdorff and Kuratowski–Mosco senses.
- Necessity of support summability: if for certain directions the support function may be infinite with nonzero probability, or the summability hypothesis fails, then the SLLN and the collapse to the limit set can fail, demonstrating the sharpness of the theorems’ conditions (Tuyen, 14 Jan 2026).
6. Relation to Single-Valued and Classical Theories
All principal limit theorems for set-valued sequences reduce in special cases to the classical SLLN for real/multivariate sequences. For i.i.d. or independent sequences, support functions become independent random variables and the results match the classical Hausdorff (or Mosco) limit theorems for random sets.
The extension to $\varphi$-mixing or merely uncorrelated (rather than fully independent) set-valued random sequences produces genuinely new regimes, since independence is sufficient but not necessary for uncorrelatedness of supports or for mixing. This hierarchical generalization mirrors developments in the theory of scalar random variables (such as Taylor’s work on the SLLN for uncorrelated sequences).
7. Convergence Topologies, Applications, and Further Directions
Different topologies are used for convergence: the Hausdorff metric for compact convex or compact sets, and Kuratowski–Mosco convergence for more general closed noncompact values. The former translates set convergence into uniform convergence of support functions, while the latter requires deeper manipulation of measurable selections and set operations.
Practical applications include random optimization (especially in robust control and decision under uncertainty), stochastic geometry, and probabilistic programming models where set-valued noise or output arises. Ongoing research addresses extensions to infinite-dimensional spaces, weakening of dependence assumptions, and new quantitative rates.
The versatility of support function methods and convex analytical techniques ensures continuing development in the study of set-valued random sequences, with convergence results forming the backbone for further probabilistic and analytic investigations (Tuyen, 14 Jan 2026, Guan et al., 2020).