Fragility-Curve Modeling
- Fragility-curve modeling is a probabilistic framework that defines the likelihood of reaching or exceeding damage states as a function of hazard intensity measures.
- It employs statistical methods, including lognormal/probit models, ordinal regressions, and surrogate approaches like kriging and machine learning.
- The methodology informs seismic, wind, material, and network reliability analyses, enhancing risk assessment and resilience planning.
Fragility-curve modeling is a central quantitative framework across structural engineering, materials science, and network reliability, describing the conditional probability that a system, component, or material reaches or exceeds a specified damage or failure state as a function of an external hazard intensity measure. Fragility curves are formulated, estimated, and interpreted differently in various contexts, ranging from seismic performance-based earthquake engineering, power grid infrastructure, and wind resilience assessment, to the fundamental relaxation kinetics of glass-forming liquids and the systemic tipping points of interacting networks.
1. Mathematical Definitions and Parametric Forms
A fragility curve, typically denoted $P_f(x)$ or $P(DS \ge ds \mid \text{IM} = x)$, represents the probability that a damage state $ds$ is reached or exceeded given an intensity measure (IM) characterizing the loading or hazard. In classical applications, the IM might be peak ground acceleration (PGA) for seismic events, 3-second gust wind speed for wind hazards, or a thermodynamic variable such as the reduced temperature $T_g/T$ for materials under cooling. The form of fragility curves depends on the context:
- Lognormal/Probit Model (most common in engineering):

$$P(DS \ge ds \mid \text{IM} = x) = \Phi\!\left(\frac{\ln x - \ln \alpha}{\beta}\right),$$

where $\Phi$ is the standard normal cumulative distribution function, $\ln \alpha$ is the log-median IM for exceedance, and $\beta$ is the log-standard deviation controlling dispersion (Karagiannakis et al., 11 Apr 2025, Raj et al., 2021, Sudret et al., 2014). This model is also referred to as the lognormal CDF parametrization; a numerical sketch follows the list below.
- Ordinal Regression Framework (multiple damage states):
The fragility curve for exceeding state $ds_k$ is

$$P(DS \ge ds_k \mid \text{IM} = x) = 1 - g\!\left(\theta_{k-1} - \beta \ln x\right).$$

Several ordinal generalized linear models are available—cumulative/proportional odds, sequential/continuation-ratio, and adjacent-category—each differing in the construction and interpretation of cut-points and latent variable scales. For example, the cumulative model assumes

$$P(DS \le ds_k \mid \text{IM} = x) = g\!\left(\theta_k - \beta \ln x\right),$$

with $g$ as a link function (probit or logit), and the cut-points $\theta_k$ and slope $\beta$ as parameters (Chen, 2024).
- Extensions for Power Systems and Compound Hazards:
Models incorporating multi-hazard IMs and explicit mechanical demand are parametrized as

$$P_f(v, t) = \Phi\!\left(\frac{\ln D(v, t) - \ln C}{\beta}\right),$$

with $v$ as wind speed, $t$ as ice thickness, $D(v, t)$ as mechanical demand, and $C$ as the median capacity (Karagiannakis et al., 11 Apr 2025).
- Materials and Glass Transitions:
For glass-forming liquids and polymers, the fragility curve is the plot of the logarithm of relaxation time or viscosity versus inverse temperature reduced to $T_g$, the glass transition temperature:

$$\log_{10} \tau(T) \ \text{versus} \ T_g/T,$$

or, in terms of the steepness (fragility) index,

$$m = \left.\frac{d \log_{10} \tau}{d\,(T_g/T)}\right|_{T = T_g}$$

(Ginzburg et al., 1 Jan 2025, Ciarella et al., 2019, Puosi et al., 2011, Premkumar et al., 2015).
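To make the lognormal parametrization concrete, the following minimal Python sketch evaluates the curve at a grid of IM values. The function name and the parameter values (median 0.4 g, dispersion 0.5) are illustrative placeholders, not values from the cited studies.

```python
import numpy as np
from scipy.stats import norm

def lognormal_fragility(im, alpha, beta):
    """P(DS >= ds | IM = im) under the lognormal/probit model.

    alpha : median IM at which the exceedance probability is 50%
    beta  : log-standard deviation (dispersion)
    """
    im = np.asarray(im, dtype=float)
    return norm.cdf((np.log(im) - np.log(alpha)) / beta)

# Illustrative parameters (placeholders, not fitted values).
alpha, beta = 0.4, 0.5            # median PGA [g], dispersion
pga = np.linspace(0.05, 2.0, 50)
p_exceed = lognormal_fragility(pga, alpha, beta)
# Sanity check: the exceedance probability equals 0.50 at the median.
print(f"P(exceed | PGA = {alpha} g) = {lognormal_fragility(alpha, alpha, beta):.2f}")
```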
2. Classical, Bayesian, and Nonparametric Estimation Methods
Parametric Inference
- Maximum Likelihood Estimation (MLE):
For binary (failure/no failure) data $\{(x_i, y_i)\}_{i=1}^N$, the likelihood under the lognormal model,

$$\mathcal{L}(\alpha, \beta) = \prod_{i=1}^{N} \Phi\!\left(\frac{\ln x_i - \ln\alpha}{\beta}\right)^{y_i} \left[1 - \Phi\!\left(\frac{\ln x_i - \ln\alpha}{\beta}\right)\right]^{1 - y_i},$$

is optimized to yield $(\hat\alpha, \hat\beta)$ (Sudret et al., 2014, Mai et al., 2017); see the sketch after this list.
- Bin-and-Fit and Least-squares:
Empirical exceedance probabilities computed in IM bins are passed through the inverse probit transformation and fitted by least squares to estimate the lognormal parameters (Raj et al., 2021).
- Ordinal Regression and Model Selection:
Multistate fragility is modeled using cumulative, sequential, and adjacent-category ordinal regressions, with model validation via leave-one-out cross-validation (PSIS-LOO) and surrogate residuals to diagnose fit quality (Chen, 2024).
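A minimal sketch of the MLE step above, using synthetic binary failure data; the optimizer and the parametrization (optimizing $\ln\alpha$ and $\ln\beta$ to enforce positivity) are implementation choices for illustration, not prescriptions from the cited papers.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic binary data from a "true" lognormal fragility (alpha=0.4, beta=0.5).
im = rng.uniform(0.05, 2.0, size=200)
p_true = norm.cdf((np.log(im) - np.log(0.4)) / 0.5)
y = rng.random(200) < p_true          # True = failure observed

def neg_log_lik(theta):
    """Negative Bernoulli log-likelihood in (log alpha, log beta)."""
    log_alpha, log_beta = theta
    p = norm.cdf((np.log(im) - log_alpha) / np.exp(log_beta))
    p = np.clip(p, 1e-12, 1 - 1e-12)  # guard against log(0)
    return -np.sum(y * np.log(p) + (~y) * np.log1p(-p))

res = minimize(neg_log_lik, x0=[0.0, 0.0], method="Nelder-Mead")
alpha_hat, beta_hat = np.exp(res.x)
print(f"alpha_hat = {alpha_hat:.3f}, beta_hat = {beta_hat:.3f}")
```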
Bayesian Inference
- Jeffreys Reference Prior and Posterior Sampling:
The Jeffreys reference prior for the parameters $(\alpha, \beta)$ of the lognormal-probit model is derived from the Fisher information, ensuring a proper posterior with robust credible intervals even for small binary data sets (Biesbroeck et al., 2023, Biesbroeck et al., 10 Mar 2025).
- Constrained Reference Prior and Sequential Design:
Additional constraints on the reference prior eliminate degeneracy when the likelihood is flat. Active experimental design maximizes information gain per test, sharply reducing the number of experiments required for robust posterior fragility estimation (Biesbroeck et al., 10 Mar 2025); a minimal posterior-sampling sketch follows this list.
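As a hedged illustration of posterior sampling for $(\alpha, \beta)$, the sketch below runs a random-walk Metropolis sampler with a wide Gaussian prior standing in for the reference priors of the cited works (whose exact forms are not reproduced here).

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

# Small synthetic binary data set, as in the MLE sketch above.
im = rng.uniform(0.05, 2.0, size=30)
y = rng.random(30) < norm.cdf((np.log(im) - np.log(0.4)) / 0.5)

def log_post(theta):
    """Log-posterior in (log alpha, log beta); N(0, 10^2) priors are a
    placeholder for the reference priors of the cited works."""
    la, lb = theta
    p = np.clip(norm.cdf((np.log(im) - la) / np.exp(lb)), 1e-12, 1 - 1e-12)
    log_lik = np.sum(y * np.log(p) + (~y) * np.log1p(-p))
    log_prior = -0.5 * (la**2 + lb**2) / 100.0
    return log_lik + log_prior

# Random-walk Metropolis.
theta, lp, samples = np.zeros(2), log_post(np.zeros(2)), []
for _ in range(20000):
    prop = theta + 0.3 * rng.standard_normal(2)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples[5000:])          # discard burn-in
alpha_ci = np.exp(np.percentile(samples[:, 0], [5, 95]))
print("90% credible interval for alpha:", alpha_ci)
```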
Nonparametric Approaches
- Binned Monte Carlo Simulation (bMCS):
Empirical exceedance rates are computed in IM bins after local scaling of response variables, with minimal assumptions on functional form (Mai et al., 2017, Sudret et al., 2014); a minimal binned-estimate sketch appears after this list.
- Kernel Density Estimation (KDE):
Joint and conditional densities $f_{\text{IM,EDP}}$ and $f_{\text{EDP} \mid \text{IM}}$ of the intensity measure and the engineering demand parameter (EDP) are estimated via smoothing kernels; the fragility is obtained by integrating the conditional density over the response tail. This approach avoids the bias of an imposed parametric form and captures non-Gaussian features absent in lognormal fits (Sudret et al., 2014, Mai et al., 2017).
- Stochastic Polynomial Chaos Expansions (SPCE):
Aleatory and epistemic uncertainties are captured in surrogate models representing the conditional law of EDP given IM. The fragility function is efficiently computed from the surrogate, overcoming the computational costs of large-scale simulation (Zhu et al., 2022).
- Kriging (Gaussian Process Surrogates):
For high-dimensional uncertainty and simulation-based settings, nonparametric GP surrogates propagate both model and parameter uncertainties, yielding fragility curves with credible bands. Sobol and kernel-based indices provide global sensitivity analysis (Gauchy et al., 2022).
- Machine Learning Surrogates (Random Forests, SVM):
Random forests trained on simulation data provide rapid, nonparametric failure probability estimation in high-dimensional parameter spaces. Support vector machines with Platt calibration enable efficient active-learning estimation of fragility curves, optimizing expensive simulation budgets (Mangalathu et al., 2018, Sainct et al., 2018).
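The binned estimate referenced above can be sketched in a few lines; the bin width and the synthetic EDP model are illustrative assumptions, and the local-scaling step of the full bMCS procedure is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic simulation output: IM samples and engineering demand (EDP),
# with failure declared when EDP exceeds a threshold (toy demand model).
im = rng.uniform(0.05, 2.0, size=5000)
edp = 0.8 * im * np.exp(0.4 * rng.standard_normal(5000))
threshold = 0.6

bins = np.linspace(0.05, 2.0, 21)
centers = 0.5 * (bins[:-1] + bins[1:])
frag = np.full(len(centers), np.nan)
for k in range(len(centers)):
    in_bin = (im >= bins[k]) & (im < bins[k + 1])
    if in_bin.sum() > 0:
        frag[k] = np.mean(edp[in_bin] > threshold)  # empirical exceedance rate

print(np.c_[centers, frag])  # empirical fragility curve, no parametric form assumed
```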
3. Fragility-Curve Modeling in Materials Science
In the context of glass-forming liquids and polymers, fragility curves formalize the non-Arrhenius temperature dependence of relaxation time or viscosity. Key results:
- Universal Master Curves and Fragility Index $m$:
- In Lennard-Jones liquids, $m$ is calculated using density functional theory to predict configurational entropy, Adam–Gibbs relations for relaxation time, and solutions of fluctuating nonlinear hydrodynamics (Premkumar et al., 2015).
- For polymers, relaxation curves collapse onto a single functional form, with $T_g$ and $m$ obeying Fox–Flory linearity in $1/M$ (molecular weight), consistent with the SL-TS2 theory and thermodynamic pressure scaling (Ginzburg et al., 1 Jan 2025).
- Mode-coupling theory (MCT) links the temperature sensitivity of the static structure factor to the full spectrum of fragile-to-strong behavior, grounded in first-principles microscopic simulation (Ciarella et al., 2019). A closed-form evaluation of $m$ under a VFT law is sketched after this list.
- Elastic Softening and the Cavity Model:
The universal parabola

$$\log_{10} \tau_\alpha = a + b\,\frac{G_p}{T} + c\left(\frac{G_p}{T}\right)^2,$$

with coefficients $a$, $b$, $c$, parameterizes 18 decades of relaxation time through the softening of the plateau modulus $G_p$, connecting thermodynamics and kinetics (Puosi et al., 2011).
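When the relaxation time follows a Vogel–Fulcher–Tammann (VFT) law $\tau = \tau_0 \exp[B/(T - T_0)]$, the steepness index defined in Section 1 has the closed form $m = B T_g / [\ln(10)\,(T_g - T_0)^2]$. A minimal sketch with illustrative VFT parameters (not values from the cited studies):

```python
import numpy as np

def fragility_index_vft(B, T0, Tg):
    """Steepness index m = d(log10 tau)/d(Tg/T) at T = Tg for a VFT law
    tau = tau0 * exp(B / (T - T0)); closed form: m = B*Tg / (ln(10)*(Tg-T0)^2)."""
    return B * Tg / (np.log(10.0) * (Tg - T0) ** 2)

# Illustrative VFT parameters in kelvin (placeholders, not from the cited works).
B, T0, Tg = 1500.0, 150.0, 200.0
print(f"fragility index m = {fragility_index_vft(B, T0, Tg):.1f}")  # ~52
```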
4. Network Reliability and Lifeline Infrastructure
When fragility is defined at the system or network level, as in interconnected power grids or transportation systems, the modeling becomes more complex:
- Network Fragility via Subset Simulation:
The probability that a network is disconnected under hazard IM is expressed as

$$P_f(x) = P\big(\text{network disconnected} \mid \text{IM} = x\big) = \mathbb{E}\big[\mathbf{1}\{C(\mathbf{S}) = 0\} \mid \text{IM} = x\big],$$

where $C(\mathbf{S}) \in \{0, 1\}$ indicates connectivity given the vector of component states $\mathbf{S}$. Specialized subset simulation algorithms reformulate the binary connectivity function into piecewise continuous limit-state surrogates (most-reliable-path or shortest-path) to guide rare-event estimation efficiently. A single simulation can yield the entire curve across IM levels, exploiting the affine dependence of safety margins on IM in the lognormal framework (Lee et al., 2023). A plain Monte Carlo simplification is sketched after this list.
- Compound Hazards and Multi-IM Surfaces:
Fragility models are extended to account for simultaneous or sequential hazards (wind + ice, earthquake + wind), usually by multivariate lognormal or probit parameterizations, with parameters fitted via empirical or simulated data and validated against observed failures (Karagiannakis et al., 11 Apr 2025).
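As a simplified stand-in for the subset-simulation machinery described above, the sketch below estimates network fragility points by plain Monte Carlo: edges fail independently with IM-dependent lognormal fragilities, and source-terminal connectivity is checked by breadth-first search. The network data and parameters are hypothetical.

```python
import numpy as np
from collections import deque
from scipy.stats import norm

rng = np.random.default_rng(3)

# Hypothetical 5-node network as an edge list (u, v, median_im, beta).
edges = [(0, 1, 0.8, 0.4), (1, 2, 0.9, 0.4), (0, 3, 0.7, 0.5),
         (3, 4, 1.0, 0.4), (4, 2, 0.8, 0.5), (1, 4, 1.1, 0.3)]
source, terminal, n_nodes = 0, 2, 5

def connected(surviving):
    """BFS: does a path of surviving edges link source to terminal?"""
    adj = [[] for _ in range(n_nodes)]
    for (u, v, *_), alive in zip(edges, surviving):
        if alive:
            adj[u].append(v); adj[v].append(u)
    seen, queue = {source}, deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in seen:
                seen.add(v); queue.append(v)
    return terminal in seen

def network_fragility(im, n_samples=20000):
    """P(source and terminal disconnected | IM = im), edge failures independent."""
    p_fail = np.array([norm.cdf((np.log(im) - np.log(a)) / b) for *_, a, b in edges])
    disconnect = 0
    for _ in range(n_samples):
        surviving = rng.random(len(edges)) >= p_fail
        disconnect += not connected(surviving)
    return disconnect / n_samples

for im in (0.3, 0.6, 1.0):
    print(f"IM = {im}: P(disconnected) ~ {network_fragility(im):.3f}")
```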
5. Practical Applications and Impact
Fragility-curve modeling underpins quantitative risk and resilience analysis in:
- Seismic Performance-Based Engineering: Integration with hazard curves for loss estimation, retrofit prioritization, and resilience improvement (Mai et al., 2017, Chen, 2024); the hazard-fragility convolution is sketched after this list.
- Wind/Climate Risk to Power Systems: Empirical and analytical fragility curves for towers, poles, and substations inform climate adaptation strategies; uncertainty in IM quantification dominates total uncertainty (Raj et al., 2021, Karagiannakis et al., 11 Apr 2025).
- Material Design: MCT and structure-based models enable rational design of polymers and glasses with tailored fragility, important for compositional optimization and processibility (Ginzburg et al., 1 Jan 2025, Ciarella et al., 2019).
- Network Resilience: System-level fragility quantifies the probability of cascading failures, guiding prioritization of upgrades in power grids and transportation networks (Lee et al., 2023).
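The loss-estimation integration mentioned above convolves a fragility curve with a hazard curve: the mean annual frequency of damage-state exceedance is $\lambda_{DS} = \int P(DS \ge ds \mid \text{IM} = x)\,\big|\frac{d\lambda_{IM}}{dx}\big|\,dx$. A minimal numerical sketch, with an illustrative power-law hazard curve and placeholder fragility parameters:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

# Fragility: lognormal with placeholder parameters.
alpha, beta = 0.4, 0.5
x = np.linspace(0.01, 3.0, 2000)              # IM grid (PGA in g)
frag = norm.cdf((np.log(x) - np.log(alpha)) / beta)

# Hazard curve: illustrative power law lambda_IM(x) = k0 * x**(-k).
k0, k = 1e-4, 2.5
hazard_density = k0 * k * x ** (-k - 1)       # |d lambda_IM / dx|

lam_ds = trapezoid(frag * hazard_density, x)  # mean annual frequency
print(f"mean annual frequency of damage-state exceedance ~ {lam_ds:.2e}")
```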
6. Limitations, Sensitivities, and Advanced Directions
- Model Selection and Assumptions: Strict lognormality is often empirically refuted, especially in multi-state or multi-modal cases; nonparametric and ordinal-regression approaches yield improved out-of-sample accuracy (Mai et al., 2017, Chen, 2024).
- Uncertainty Quantification: Bayesian frameworks with reference or constrained priors address credible-interval estimation and degeneracy under small sample sizes (Biesbroeck et al., 2023, Biesbroeck et al., 10 Mar 2025).
- Sensitivity Analysis: Kriging-based approaches allow global Sobol and kernel-based sensitivity analysis of the fragility surface with respect to all random input parameters, supporting risk-informed design and parameter screening (Gauchy et al., 2022); a pick-freeze Sobol sketch follows this list.
- Efficiency in Large-scale Systems: Machine learning surrogates and advanced importance sampling schemes (IS-AL) reduce the computational cost of estimating fragility curves in high-dimensional structural and network problems (Mangalathu et al., 2018, Gauchy et al., 2021).
- Adaptation to Climate Change: Nonstationary hazards and novel compound risk models are emerging frontiers for fragility-curve integration into infrastructure adaptation strategies (Karagiannakis et al., 11 Apr 2025).
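To illustrate the flavor of the global sensitivity analyses cited above, the sketch below computes first-order Sobol indices for a toy limit-state function via the pick-freeze (Saltelli) estimator; it is a generic stand-in, not the kriging-based procedure of Gauchy et al.

```python
import numpy as np

rng = np.random.default_rng(4)

def limit_state(X):
    """Toy demand-minus-capacity model in 3 random inputs (hypothetical)."""
    return 1.5 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.1 * X[:, 2]

n, d = 100_000, 3
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))
yA, yB = limit_state(A), limit_state(B)
var_y = yA.var()

for i in range(d):
    ABi = B.copy()
    ABi[:, i] = A[:, i]           # "freeze" input i, resample the rest
    yABi = limit_state(ABi)
    # Pick-freeze estimator of the first-order variance contribution.
    S_i = np.mean(yA * (yABi - yB)) / var_y
    print(f"first-order Sobol index S_{i} ~ {S_i:.3f}")
```

On this toy model the linear term dominates ($S_0 \approx 0.8$), mirroring how such indices rank which random inputs drive the fragility surface.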
Fragility-curve modeling thus provides a mathematically rigorous and contextually adaptable framework for quantifying and managing vulnerability under diverse loading, material, and systemic uncertainties, spanning both physical and engineered systems. Ongoing methodological advances continue to expand its predictive fidelity and computational tractability in complex, uncertainty-laden domains.