Grouped Variance Sensitivity Analysis
- Grouped variance-based sensitivity analysis is a method that decomposes the total output variance into additive contributions from groups of input parameters and their interactions.
- It employs unbiased Monte Carlo estimators, orthogonal projections, and nonlinearity coefficients to quantify both linear effects and higher-order interactions within complex systems.
- This framework facilitates model simplification, improves uncertainty quantification, and guides experimental planning by clearly distinguishing between individual and collective parameter influences.
Grouped variance-based sensitivity analysis is a class of methodologies for decomposing the variance of model outputs into additive contributions from groups of input parameters and their interactions, enabling researchers to assess collective and individual sources of uncertainty in stochastic or deterministic systems. The concept extends classical variance decomposition and Sobol indices to address conditional moments, stochastic models, and high-dimensional cases where the output may itself be a random variable conditioned on uncertain parameters. Rigorous Monte Carlo (MC) estimation schemes, orthogonal projections, and new metrics such as nonlinearity coefficients are employed to quantify not only additive (linear) but also higher-order (interaction and nonlinear) effects among groups of inputs. This framework enables model simplification, the design of MC estimators with quantifiable efficiency, and robust discrimination between linear and nonlinear sources of output variability.
1. ANOVA Decomposition and Grouped Sensitivity Indices
Variance-based sensitivity analysis begins with an ANOVA-type decomposition of a function $f(X)$, with $X = (X_1, \dots, X_d)$ a vector of input parameters. The function is expanded as

$$f(X) = \sum_{u \subseteq D} f_u(X_u),$$

where $D = \{1, \dots, d\}$ is the set of input indices, $X_u$ denotes the subset of parameters indexed by $u$, and $f_u$ is the orthogonal effect function for $u$.
The output variance admits a corresponding decomposition:

$$\operatorname{Var}\big(f(X)\big) = \sum_{\emptyset \neq u \subseteq D} \operatorname{Var}\big(f_u(X_u)\big).$$
For each group $u$, the grouped sensitivity index is given by

$$S_u = \frac{\operatorname{Var}\big(f_u(X_u)\big)}{\operatorname{Var}\big(f(X)\big)},$$
and for individual inputs, classical first-order ($S_i$) and total-effect ($S_i^{\mathrm{tot}}$) indices follow as special cases. The decomposition extends to arbitrary groups, enabling the quantification of main effects, joint interactions, and overall (total) impact of parameter subsets.
In models where output stochasticity arises via an additional noise variable $R$, one defines $Y = f(X, R)$ and typically considers conditional moments or functionals such as $\mathbb{E}[Y \mid X]$ as analysis targets.
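As a minimal numerical sketch (not from the source), the additivity of the ANOVA decomposition can be verified for a hypothetical toy model with independent uniform inputs, whose orthogonal effect functions are known in closed form; the model and all names below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 500_000
x1, x2, x3 = rng.uniform(size=(3, N))

# hypothetical toy model: one additive term plus one pairwise interaction
y = x1 + x2 * x3

# closed-form orthogonal effect functions for independent U(0,1) inputs
f1 = x1 - 0.5                     # main effect of x1
f2 = 0.5 * (x2 - 0.5)             # main effect of x2: E[y|x2] - E[y]
f3 = 0.5 * (x3 - 0.5)             # main effect of x3
f23 = (x2 - 0.5) * (x3 - 0.5)     # pure {x2, x3} interaction term

total = y.var()
parts = f1.var() + f2.var() + f3.var() + f23.var()
# both quantities approximate the exact output variance 19/144
print(round(total, 3), round(parts, 3))
```

Both estimates converge to the exact total variance $19/144$, confirming that the component variances sum to the output variance with no cross-terms.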
2. Unbiased Estimation Schemes for Grouped Indices
A central methodological contribution is the construction of unbiased MC estimators (termed “estimation schemes”) for grouped variance-based indices and related quantities involving conditional moments. By employing independent samples of both the parameters $X$ and the noise $R$, and judiciously pairing parameter swaps or perturbations, the schemes ensure unbiasedness and produce MC error estimates.
For first-order sensitivity with respect to parameter $X_i$, a representative estimator is:

$$\widehat{V}_i = \frac{1}{N}\sum_{k=1}^{N} f\big(X^{(k)}\big)\, f\big(X_i^{(k)}, X_{\sim i}'^{(k)}\big) \;-\; \left(\frac{1}{N}\sum_{k=1}^{N} f\big(X^{(k)}\big)\right)\left(\frac{1}{N}\sum_{k=1}^{N} f\big(X'^{(k)}\big)\right),$$

where $f(X^{(k)})$ and $f(X'^{(k)})$ denote model outputs for independent samples, and $f(X_i^{(k)}, X_{\sim i}'^{(k)})$ uses a swapped or “perturbed” parameter vector that retains $X_i^{(k)}$ while replacing the remaining coordinates with the independent resample. Similar direct MC estimators are derived for total-effect, interaction, and grouped indices.
The generalization to vector-valued estimands enables the simultaneous estimation of sets of projection coefficients or conditional moments within one simulation campaign, leveraging the same underlying model evaluations.
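A sketch of such a scheme in code, using the standard Ishigami benchmark as a stand-in model (an assumption for illustration; the source's models are chemical reaction networks). Swapping a coordinate between two independent samples yields a pick-freeze first-order estimator, and a Jansen-type squared difference yields the total effect:

```python
import numpy as np

rng = np.random.default_rng(2)

def ishigami(x, a=7.0, b=0.1):
    """Standard Ishigami benchmark (an illustration, not from the source)."""
    return (np.sin(x[:, 0]) + a * np.sin(x[:, 1])**2
            + b * x[:, 2]**4 * np.sin(x[:, 0]))

N = 200_000
A = rng.uniform(-np.pi, np.pi, size=(N, 3))  # base parameter sample
B = rng.uniform(-np.pi, np.pi, size=(N, 3))  # independent resample
yA = ishigami(A)
varY = yA.var()

S, ST = [], []
for i in range(3):
    Ci = B.copy(); Ci[:, i] = A[:, i]   # keep x_i, swap the rest -> first order
    Di = A.copy(); Di[:, i] = B[:, i]   # swap only x_i -> total effect
    yC, yD = ishigami(Ci), ishigami(Di)
    # covariance ("pick-freeze") estimator for the first-order index
    S.append(((yA * yC).mean() - yA.mean() * yC.mean()) / varY)
    # Jansen-type estimator for the total-effect index
    ST.append(0.5 * ((yA - yD)**2).mean() / varY)

print([round(s, 3) for s in S], [round(t, 3) for t in ST])
```

For the Ishigami function the analytic values are $S \approx (0.314, 0.442, 0)$ and $S^{\mathrm{tot}} \approx (0.558, 0.442, 0.244)$, which the estimates recover within MC error.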
3. Orthogonal Approximations and Projection Coefficients
Orthogonal projection approximations are used to characterize the dependence structure between parameters and output, and to construct efficient alternative estimators for grouped variance components. If $\{g_1, \dots, g_m\}$ is an orthonormal basis of a subspace $V$, the projection of $f$ onto $V$ takes the form:

$$Pf = \sum_{k=1}^{m} c_k\, g_k, \qquad c_k = \mathbb{E}\big[f(X)\, g_k(X)\big].$$
The mean squared error of this approximation, $\mathbb{E}\big[(f - Pf)^2\big] = \mathbb{E}\big[f^2\big] - \sum_{k=1}^{m} c_k^2$, partitions the variance between the component explained by linear combinations of the basis functions and the unexplained residual.
The estimators for $c_k$ are constructed as MC averages, and the grouped structure is captured by selecting the basis functions $g_k$ to be supported on functions of specified input groups. Such projections are especially useful in high-dimensional problems for separating main and joint effects, and for model simplification when linear approximations are adequate.
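A sketch of the projection construction for a simple toy model (the model and the shifted-Legendre basis are illustrative assumptions, not taken from the source). The coefficients $c_k = \mathbb{E}[f(X)\,g_k(X_1)]$ are estimated as plain MC averages:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 400_000
x1, x2, x3 = rng.uniform(size=(3, N))
y = x1 + x2 * x3                  # hypothetical toy model

# orthonormal shifted Legendre polynomials on [0,1], functions of x1 only
g1 = np.sqrt(3.0) * (2.0 * x1 - 1.0)
g2 = np.sqrt(5.0) * (6.0 * x1**2 - 6.0 * x1 + 1.0)

# MC estimates of the projection coefficients c_k = E[f(X) g_k(X_1)]
c1 = (y * g1).mean()              # exact value sqrt(3)/6: the linear part
c2 = (y * g2).mean()              # exact value 0: y is linear in x1

explained = c1**2 + c2**2         # variance captured by the projection (1/12)
residual = y.var() - explained    # unexplained, non-x1 part (7/144)
print(round(explained, 3), round(residual, 3))
```

Here the basis supported on $x_1$ captures exactly the $x_1$ main-effect variance, and the residual equals the variance of the terms involving the other inputs.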
4. Nonlinearity Coefficients and Quantification of Non-Additivity
To systematically quantify the non-additive (nonlinear) remainder after projection onto a specified linear space, the nonlinearity coefficient is defined for a group $u$ as:

$$N_u = V_u^{\mathrm{tot}} - \sum_k c_k^2,$$

where $V_u^{\mathrm{tot}}$ is the total variance attributable to $X_u$ (including all interactions), and the $c_k$ are the orthogonal projection coefficients for a best linear predictor based on $X_u$. The normalized form, $N_u / V_u^{\mathrm{tot}}$, ranges from 0 (perfect linearity) to 1 (entirely nonlinear effect).
These coefficients have applications in providing lower bounds for the probability of output change under input perturbations and in identifying whether grouped effects are primarily linear or dominated by interactions and nonlinearities.
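A numerical sketch for a toy model with one interaction (all specifics below are illustrative assumptions, not from the source): the nonlinearity coefficient of the group $\{x_2, x_3\}$ is the group's total variance minus the variance captured by its best linear predictor, leaving exactly the interaction variance:

```python
import numpy as np

rng = np.random.default_rng(4)
N = 400_000
x1, x2, x3 = rng.uniform(size=(3, N))
y = x1 + x2 * x3                     # hypothetical toy model

# total variance attributable to the group u = {x2, x3}: here it is the
# output variance minus the purely-x1 component, known in closed form
V_u_tot = y.var() - np.var(x1)       # exact value 7/144

# projection onto orthonormal *linear* functions of the group members
g2 = np.sqrt(3.0) * (2.0 * x2 - 1.0)
g3 = np.sqrt(3.0) * (2.0 * x3 - 1.0)
c2 = (y * g2).mean()
c3 = (y * g3).mean()

NL_u = V_u_tot - (c2**2 + c3**2)     # nonlinearity coefficient, exact 1/144
NL_norm = NL_u / V_u_tot             # normalized: 0 = fully linear effect
print(round(NL_u, 4), round(NL_norm, 3))
```

The normalized coefficient comes out near $1/7$: most of the group's effect is linear, with a small but nonzero interaction remainder.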
5. Monte Carlo Inefficiency Constants and Algorithmic Performance
The cost-effectiveness of estimation schemes is characterized using the inefficiency constant:

$$c = \tau\,\sigma^2,$$

where $\tau$ is the average computation time per MC step and $\sigma^2$ is the variance of the single-step estimator. For $N$ steps, the MC error variance decays as $\sigma^2/N$ while the cost grows as $\tau N$, so $c$ captures both algorithmic variance and computational expense.
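A minimal sketch of measuring the inefficiency constant for a hypothetical single-step estimator (the step function below is an illustrative assumption):

```python
import time
import numpy as np

rng = np.random.default_rng(5)

def single_step():
    # one MC step of a hypothetical single-step estimator
    x = rng.uniform(size=3)
    return x[0] + x[1] * x[2]

# estimate the average per-step cost tau and per-step variance sigma^2
M = 20_000
t0 = time.perf_counter()
samples = np.array([single_step() for _ in range(M)])
tau = (time.perf_counter() - t0) / M
sigma2 = samples.var(ddof=1)

# inefficiency constant c = tau * sigma^2: for N steps the error variance
# decays as sigma^2 / N at cost ~ tau * N, so c is invariant to N and
# directly comparable across estimation schemes
c_ineff = tau * sigma2
print(f"tau={tau:.2e}s  sigma2={sigma2:.3f}  c={c_ineff:.2e}")
```

Because $c$ is independent of the number of steps, two schemes can be ranked by their inefficiency constants regardless of the sample sizes used to measure them.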
Extensive comparisons using chemical reaction network models (SB, GTS, MBMD) simulated via the Gillespie Direct or Random Time Change algorithms reveal that the variance—and thus $c$—of an estimator depends strongly on both the simulation algorithm and the ordering of reactions within it. In some cases, the new grouped sensitivity schemes outperform previously proposed estimators.
| Scheme | Output | MC Variance | Inefficiency Constant ($c$) |
|---|---|---|---|
| SVar | SB | lower | lower |
| SE | GTS | mixed | sensitive to simulation |
| EM | MBMD | varies | depends on reaction order |
This tabular summary reflects the influence of algorithm and simulation method on grouped sensitivity analysis.
6. Applications and Implications in Stochastic Reaction Networks
The methodologies were demonstrated for several stochastic chemical kinetics models under MC evaluation. Grouped indices estimated via the developed schemes enabled:
- Identification of parameters and groups with dominant main/interaction effects
- Quantification of model nonlinearity and guidance for reduced-order modeling
- Practical assessment of estimator efficiency under realistic simulation conditions
Sensitivity coefficients for conditional expectations were particularly informative in isolating key kinetic parameters, validating that grouped indices provide actionable diagnostics beyond standard one-at-a-time (OAT) sensitivity approaches.
7. Impact on Model Simplification, Experiment Planning, and Theoretical Advances
By generalizing variance-based sensitivity analysis to accommodate conditional moments and orthogonal projections in stochastic systems, this framework supports robust model reduction, prioritization of experimental measurements, and rigorous quantification of both additive and interactive uncertainty sources. The combined approach—spanning unbiased estimation, orthogonal approximation, and nonlinearity quantification—enables comprehensive grouped variance-based sensitivity analysis for complex models where standard deterministic methods are inadequate.
This work—through precise estimator construction, nonlinearity diagnostics, MC inefficiency characterization, and application to canonical stochastic models—establishes a rigorous foundation and unified language for grouped variance-based sensitivity analysis in contemporary stochastic modeling contexts (Badowski, 2013).