Dynamic Structural Causal Models
- Dynamic Structural Causal Models (DSCMs) generalize static causal models by modeling entire time-evolving processes with explicit causal interventions.
- They integrate discrete and continuous frameworks using time-series graphs, differential equations, and spectral methods to capture complex causal effects.
- DSCMs enhance structural learning and control in diverse fields such as robotics, econometrics, and epidemiology by addressing time-dependent dynamics and latent confounding.
Dynamic Structural Causal Models (DSCMs) generalize the classical SCM formalism to systems whose variables are entire time-evolving processes, allowing for the explicit modeling of causal relations and interventions in dynamical, often stochastic, environments. DSCMs are foundational for causal inference in time-series, control systems, physical processes, and any domain where dynamic evolution, feedback, latent confounding, or time-dependent interventions are essential. Their formalization encompasses both discrete-time and continuous-time domains, with close connections to differential equations, stochastic processes, and process graphs.
1. Formal Foundations and Process Graph Representations
DSCMs extend SCMs by assigning each endogenous variable an entire trajectory, typically defined on a time-set $\mathcal{T}$ (e.g., $\mathbb{Z}$ or $\mathbb{R}_{\ge 0}$), yielding models whose variables are processes rather than static scalars. In discrete time, the structure often starts with a time-series graph, where each vertex represents a process $X^i$ at a time point $t$ and edges encode both contemporaneous and time-lagged causal influences. To analyze this structure efficiently, the graph is collapsed in time, producing a "process graph" whose nodes correspond to entire processes and whose edges summarize all lagged or contemporaneous influences between processes. This representation is especially powerful for multivariate SVAR (structural vector autoregressive) models: it enables direct reading of influence structure, path coefficients, and identification conditions by treating each process as a single graph node rather than unrolling in time (Reiter et al., 2023).
The process graph formalism underlies both the classical time-domain filter approach and the frequency-domain analysis. Linear stationarity and bounded-lag assumptions allow time-domain expansion into filter-convolutions, whereas Fourier transformation produces compact expressions for causal effects in the frequency domain, enabling direct and efficient visualization of multivariate causal architectures.
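The time-collapsing step described above can be sketched concretely. The following minimal example, with illustrative edge conventions and variable names (not from any specific library), collapses a lagged time-series graph into a process graph whose edges are annotated with the set of lags at which one process influences another:

```python
# Sketch: collapsing a time-series (time-unrolled) graph into a process graph.
# Edge convention: ((i, lag), j) means process i at time t - lag influences
# process j at time t; lag 0 encodes a contemporaneous edge.

def collapse_to_process_graph(lagged_edges):
    """Collapse lagged edges into process-level edges i -> j,
    each annotated with the set of lags at which i influences j."""
    process_edges = {}
    for (i, lag), j in lagged_edges:
        process_edges.setdefault((i, j), set()).add(lag)
    return process_edges

# A 3-process system: X -> Y contemporaneously and at lag 1; Y -> Z at lag 2.
lagged = [(("X", 0), "Y"), (("X", 1), "Y"), (("Y", 2), "Z")]
pg = collapse_to_process_graph(lagged)
# pg == {("X", "Y"): {0, 1}, ("Y", "Z"): {2}}
```

The process graph has one node per process and two edges, however many time points the unrolled graph contains.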
2. DSCMs for Stochastic and Deterministic Dynamics
DSCMs have strong ties to dynamical systems modeled by ODEs, SDEs, or Markov processes. In continuous time, DSCMs are defined as collections of structural differential or stochastic differential equations, possibly with measurement or driving noise (Peters et al., 2020). Each equation specifies the evolution

$$\frac{dX^i_t}{dt} = f_i\big(X^{\mathrm{pa}(i)}_t\big),$$

or, in stochastic form,

$$dX^i_t = f_i\big(X^{\mathrm{pa}(i)}_t\big)\,dt + \sigma_i\big(X^{\mathrm{pa}(i)}_t\big)\,dW^i_t,$$

where $\mathrm{pa}(i)$ is the parent set for component $i$, $f_i$ and $\sigma_i$ are drift and diffusion functions, and the $W^i$ are independent Brownian motions.
Discrete-time DSCMs often emerge via subsampling or time-splitting operations on the continuous process, preserving much of the causal semantics but sometimes losing invertibility unless the sampling is sufficiently fine (Boeken et al., 3 Jun 2024).
Structural interventions in both formulations involve replacing functional modules (ODE/SDE terms) for a component with new functions or constants. These interventions, when modeled as “do-trajectories,” can be time-dependent and reconstruct the system's counterfactual or perturbed evolution.
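A structural intervention of this module-replacement kind can be sketched numerically. The Euler–Maruyama simulation below, with illustrative drift/diffusion functions and an illustrative do-trajectory (none taken from a specific paper), clamps one component of a two-component SDE system to a constant trajectory and lets the rest of the system respond:

```python
import numpy as np

# Minimal sketch: a two-component SDE-based DSCM simulated with Euler-Maruyama.
# The drift/diffusion functions and the do-trajectory below are illustrative.

def simulate(drifts, sigmas, x0, dt=0.01, n_steps=1000, do=None, seed=0):
    """Simulate dX^i = f_i(X) dt + sigma_i(X) dW^i; `do` maps a component
    index to a time-dependent do-trajectory t -> value (module replacement)."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for k in range(n_steps):
        t = k * dt
        dW = rng.normal(0.0, np.sqrt(dt), size=len(x))
        x = x + np.array([f(x) for f in drifts]) * dt \
              + np.array([s(x) for s in sigmas]) * dW
        if do:  # structural intervention: overwrite intervened components
            for i, traj in do.items():
                x[i] = traj(t + dt)
        path.append(x.copy())
    return np.array(path)

# X0 is mean-reverting; X1 is driven by X0 (parent set pa(1) = {0}).
drifts = [lambda x: -x[0], lambda x: x[0] - x[1]]
sigmas = [lambda x: 0.1, lambda x: 0.1]

observational = simulate(drifts, sigmas, x0=[1.0, 0.0])
# do(X0 := 2): clamp component 0 to the constant trajectory t -> 2
intervened = simulate(drifts, sigmas, x0=[1.0, 0.0], do={0: lambda t: 2.0})
```

Under the intervention, the child component drifts toward the clamped parent value, while the parent's own structural module is discarded entirely.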
For equilibrium settings (e.g., stationary states of ODEs or CTMCs), equilibrium-induced DSCMs take the form of causal constraints models (CCMs), capturing constraints that survive or are broken under specific interventions, and correctly reflect dependencies on initial conditions and conserved quantities (Blom et al., 2018, Ness et al., 2019).
3. Path, Trek, and Frequency-Domain Rules
DSCMs admit generalizations of Wright’s path-rule and trek-rule to dynamic processes, enabling the calculation of total causal effects, auto-covariances, and spectral decompositions. For a linear SVAR(p), the observed time-series block can be expressed as a matrix-convolution

$$X = A \ast X + B \ast \varepsilon,$$

where $A$ and $B$ are the filter matrices (direct and latent-to-observed effects, respectively), and $\varepsilon$ comprises internal innovations. Iteration yields the geometric sum $X = \big(\sum_{k \ge 0} A^{\ast k}\big) \ast B \ast \varepsilon$ for the total effect, under stability conditions.
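The geometric sum of convolution powers of the direct-effect filter can be computed directly by treating each filter as a stack of lag matrices. The sketch below is illustrative (filter names and truncation length are assumptions, not from a specific implementation); for a stable system the truncated sum approximates the total-effect filter:

```python
import numpy as np

# Sketch: total-effect filter of a stable SVAR as a truncated geometric sum
# of convolution powers of the direct-effect filter A (names illustrative).

def conv_filters(F, G):
    """Convolve two matrix filters given as arrays of shape (lags, d, d)."""
    d = F.shape[1]
    out = np.zeros((F.shape[0] + G.shape[0] - 1, d, d))
    for a in range(F.shape[0]):
        for b in range(G.shape[0]):
            out[a + b] += F[a] @ G[b]
    return out

def total_effect_filter(A, n_terms=20):
    """Truncated geometric sum  sum_{k>=0} A^{*k}  (identity at lag 0)."""
    d = A.shape[1]
    identity = np.zeros((1, d, d))
    identity[0] = np.eye(d)
    total, power = identity.copy(), identity.copy()
    for _ in range(n_terms):
        power = conv_filters(power, A)
        # pad `total` with zero lags so the shapes line up, then accumulate
        pad = np.zeros((power.shape[0] - total.shape[0], d, d))
        total = np.concatenate([total, pad]) + power
    return total

# SVAR(1) with a single lagged edge X0 -> X1 of strength 0.5
A = np.zeros((2, 2, 2))
A[1, 1, 0] = 0.5
T = total_effect_filter(A)
# T[1][1, 0] is the lag-1 total effect of X0 on X1
```

For this acyclic single-edge example the geometric series terminates after one term, so the total effect coincides with the direct effect.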
The generalized trek-rule expresses the auto-covariance as a sum over treks,

$$\mathcal{C}_X = \sum_{\tau \in \mathrm{Treks}} a(\tau_{\mathrm{left}}) \ast c_\tau \ast a(\tau_{\mathrm{right}}),$$

where each trek $\tau$ contributes a monomial filter composed of the path filters $a(\cdot)$ along its two sides and the internal covariance $c_\tau$ at its top.
Frequency-domain methods exploit the fact that convolution becomes multiplication under the Fourier transform. The observed spectral density is given by

$$\mathcal{S}_X(\omega) = H(\omega)\,\mathcal{S}_\varepsilon(\omega)\,H(\omega)^{*}, \qquad H(\omega) = \big(I - \hat{A}(\omega)\big)^{-1}\hat{B}(\omega),$$

where $\hat{A}(\omega)$ is the Fourier transform of the filter matrix $A$. Direct transfer functions and path sums have rational-polynomial forms, enabling closed-form calculation of dynamic causal effects across all frequencies. Graphical identification criteria (backdoor, front-door, instrumental variables, etc.) carry over verbatim from the SCM literature, with causal transfer functions recoverable from observed spectra under appropriate unconfoundedness (Reiter et al., 2023).
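For a concrete SVAR(1), the spectral density follows from the transfer function in a few lines. The sketch below assumes unit-variance white-noise innovations and an illustrative coefficient matrix; it is a minimal numerical illustration, not an identification procedure:

```python
import numpy as np

# Sketch: spectral density of a stable SVAR(1), X_t = Phi X_{t-1} + eps_t,
# via the transfer function H(w) = (I - A_hat(w))^{-1} (names illustrative).

def spectral_density(Phi, omega, Sigma_eps=None):
    """Return S_X(w) = H(w) Sigma_eps H(w)^* at angular frequency omega."""
    d = Phi.shape[0]
    Sigma_eps = np.eye(d) if Sigma_eps is None else Sigma_eps
    A_hat = Phi * np.exp(-1j * omega)      # Fourier transform of the lag-1 filter
    H = np.linalg.inv(np.eye(d) - A_hat)   # causal transfer function
    return H @ Sigma_eps @ H.conj().T

Phi = np.array([[0.5, 0.0], [0.3, 0.2]])   # stable: eigenvalues inside unit circle
S0 = spectral_density(Phi, omega=0.0)      # spectrum at frequency zero
# S_X(w) is Hermitian with a real, positive diagonal at every frequency w.
```

Scanning `omega` over $[0, \pi]$ traces out the full multivariate spectrum, from which transfer functions along unconfounded paths can be read off.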
4. Structural Learning, Identification, and Discovery
Structural learning in DSCMs involves uncovering both the functional relationships and the time-dependent or time-invariant causal graph governing the system. Approaches include:
- Score-based and constraint-based graph discovery via time-series conditional independence tests, leveraging σ-separation in process graphs or directed mixed graphs (DMGs). Standard constraint-based algorithms (e.g., FCI, which outputs partial ancestral graphs) are sound and complete for DSCMs under faithfulness and Markov properties (Boeken et al., 3 Jun 2024).
- Distributional modeling with uncertainty over structure. DSCMs can model causal structure distributions, sampling graphs (e.g., encoder-multidecoder architectures for robots) that inform latent representations, yielding robust downstream policies under epistemic uncertainty (Murillo-Gonzalez et al., 8 Aug 2025).
- Simulation-free approaches (e.g., StructureFlow) use Schrödinger Bridge principles, optimal transport couplings, and flow-matching regressions to learn both structure and SDE dynamics from population snapshots under interventions, without trajectory simulation during training (Rimawi-Fine et al., 18 Oct 2025).
- Time-varying causal graphs are identified by parameterizing adjacency matrices as smooth functions of time (splines, B-splines), learned via acyclicity-penalized continuous optimization and variational inference. Both past and future graphs can be predicted, with smoothness, sparsity, and acyclicity constraints yielding interpretable recovery of dynamic causal structure (Wang et al., 11 Jan 2025).
Table: Modes of Structural Learning in DSCMs
| Approach | Data Domain | Graph Dynamics |
|---|---|---|
| Score/constraint CI (σ-sep, FCI) | Time-series | Time-invariant/cyclic |
| Simulation-free flow-matching | Snapshot + Interventions | Process-inferred, stochastic |
| Graph distribution sampling | Episodic, robotics | Sparse, uncertain |
| Basis-function inference | Panel, time-series | Smooth time-varying |
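The basis-function mode in the last table row can be sketched directly. The example below parameterizes each adjacency entry as a smooth function of time, $A_{ij}(t) = \sum_k \theta_{ijk}\, b_k(t)$, using a small Gaussian radial basis in place of B-splines; all names, shapes, and coefficient values are illustrative:

```python
import numpy as np

# Sketch of a basis-function parameterization of a time-varying adjacency
# matrix: A_ij(t) = sum_k theta_ijk * b_k(t). A Gaussian radial basis stands
# in for B-splines here; names and coefficient values are illustrative.

def basis(t, centers, width=0.2):
    """Evaluate K Gaussian basis functions at time t."""
    return np.exp(-((t - centers) ** 2) / (2 * width ** 2))

def adjacency_at(theta, t, centers):
    """theta has shape (d, d, K); returns the d x d adjacency at time t."""
    return theta @ basis(t, centers)

d, K = 3, 5
centers = np.linspace(0.0, 1.0, K)
rng = np.random.default_rng(0)
theta = rng.normal(scale=0.1, size=(d, d, K))  # stand-in for learned weights

A_early = adjacency_at(theta, 0.1, centers)
A_late = adjacency_at(theta, 0.9, centers)     # the graph drifts smoothly in t
```

In a learning setting, `theta` would be fit by penalized optimization with sparsity and acyclicity regularizers on `adjacency_at`; extrapolating `t` beyond the observed window gives predicted past or future graphs.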
5. Interventions, Counterfactuals, and Control
DSCMs support direct modeling of interventions, including time-dependent do-operations (trajectories, rates, module replacements). In continuous time, interventions may alter the drift/diffusion of targeted components. For discrete processes or robotic systems, interventions correspond to masking, latching, or direct assignment in the structural update rules. Counterfactual inference follows the classical SCM three-step recipe (abduction, action, prediction), though in dynamic settings inference is typically performed through posterior updates on process noise or latent representations, followed by recomputation of trajectories under the intervened process (Ness et al., 2019).
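The three-step counterfactual recipe is easiest to see in a linear discrete-time DSCM, where abduction reduces to solving for the noise trajectory. The sketch below uses illustrative coefficients and a constant intervened value:

```python
import numpy as np

# Sketch of abduction-action-prediction in a linear discrete-time DSCM
# X_t = Phi X_{t-1} + eps_t (coefficients and numbers are illustrative).

def abduct(Phi, observed):
    """Abduction: recover the noise trajectory consistent with the data."""
    return [observed[t] - Phi @ observed[t - 1] for t in range(1, len(observed))]

def predict(Phi, x0, eps, do=None):
    """Prediction: rerun the dynamics under the given noise; `do` maps a
    component index to an intervened value held fixed over time."""
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for e in eps:
        x = Phi @ x + e
        if do:
            for i, v in do.items():
                x[i] = v        # action: structural replacement of component i
        path.append(x.copy())
    return np.array(path)

Phi = np.array([[0.5, 0.0], [0.4, 0.3]])
rng = np.random.default_rng(1)
factual = predict(Phi, [1.0, 0.0], rng.normal(0.0, 0.1, size=(20, 2)))

eps_hat = abduct(Phi, factual)                       # step 1: abduction
counterfactual = predict(Phi, factual[0], eps_hat,   # steps 2-3: action + prediction
                         do={0: 2.0})
```

Rerunning `predict` with the abducted noise and no intervention reproduces the factual trajectory exactly, which is the consistency property the recipe relies on.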
In control and planning, DSCMs enable robust sequential decision making by sampling plausible causal graphs and marginalizing performance across model-uncertainty. Probabilistic or sampling-based planners (e.g., CEM, MPPI) simulate outcomes under sampled dynamic causal graphs, improving reliability in novel or corrupted environments (Murillo-Gonzalez et al., 8 Aug 2025).
For systems governed by PDEs or high-dimensional dynamics (e.g., portfolio optimization), DSCMs underpin the causal structure in forward-backward PDE control problems. Conditional measures (scenario-driven risk-neutral distributions), causal projections, and completeness theorems support no-arbitrage, replicability, and robust optimization, with PINN architectures providing computational tractability in very high dimensions (Dominguez, 11 Sep 2025).
6. Graphical Markov Properties, Local Independence, and Subsampling
DSCMs exhibit graphical Markov properties analogous to those of static SCMs. σ-separation in directed-mixed graphs reads off process-level conditional independences among trajectories, preserving faithfulness and identification. Time-splitting operations enable analysis of continuous-time Granger non-causality as local independence, separating history, instantaneous, and future process segments for precise intervention or independence discovery. Subsampling yields discrete-time DSCMs, useful for practical time-series analysis, with the loss of invertibility rigorously documented unless sampling is sufficiently fine (Boeken et al., 3 Jun 2024).
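The subsampling point above has a crisp linear illustration: subsampling a linear SDE $dX_t = B X_t\,dt + dW_t$ at interval $\Delta$ yields a discrete-time VAR(1) with coefficient $e^{B\Delta}$, and coarse $\Delta$ obscures the continuous-time structure. The drift matrix and intervals below are illustrative:

```python
import numpy as np

# Sketch: subsampling the linear SDE dX_t = B X_t dt + dW_t at interval Delta
# gives a VAR(1) with coefficient exp(B * Delta). At fine Delta the support
# of B is readable from the VAR coefficient; at coarse Delta the lag-1
# coefficients decay and the continuous-time structure becomes ambiguous.

def expm_series(M, n_terms=30):
    """Matrix exponential via its truncated power series (fine for small M)."""
    out, term = np.eye(M.shape[0]), np.eye(M.shape[0])
    for k in range(1, n_terms):
        term = term @ M / k
        out = out + term
    return out

# Continuous-time drift: X0 decays, X0 drives X1 (edge 0 -> 1 only).
B = np.array([[-1.0, 0.0], [0.8, -1.0]])

A_fine = expm_series(B * 0.01)   # fine subsampling: A is approx I + B * Delta
A_coarse = expm_series(B * 2.0)  # coarse subsampling: coefficients decay
```

At $\Delta = 0.01$ the VAR coefficient is close to $I + B\Delta$, so signs and support of $B$ are recoverable; at $\Delta = 2$ the entries have shrunk substantially, which is the invertibility loss documented for insufficiently fine sampling.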
7. Applications, Limitations, and Theoretical Extensions
DSCM frameworks span econometrics, robotics, gene regulatory dynamics, epidemiology, control engineering, and autonomous systems. Notable theoretical extensions include:
- Modular monadic architectures enabling encapsulation, late binding, and dynamic agent-set management while preserving stationarity of structural laws, critical for multi-agent and cyber-physical systems (Howard et al., 3 Jun 2024).
- Causal constraints models for equilibrium and functional laws, resolving the limitations of SCMs in semistable or functional-invariant settings (Blom et al., 2018).
- Generalized SEMs (GSEMs) for dynamical systems with infinitely many variables, making explicit the mapping from interventions and contexts to outcomes, particularly in hybrid systems or systems with piecewise trajectories (Peters et al., 2021).
Limitations arise when the system fails to converge stably under intervention (violating required regularity), when the required trajectory spaces are not modular or sufficiently rich (e.g., in semistable oscillators), when certain constraints cannot be modeled as pure equations (necessitating CCMs), or when structure discovery is confounded by high dimensionality, noise, or non-faithfulness.
Further theoretical challenges include the development of a full do-calculus for dynamic continuous-time systems, extensions to partial differential equations and hybrid models, robustness to latent confounding, and optimal experimental design for trajectory-based interventions (Peters et al., 2020, Boeken et al., 3 Jun 2024).
Bibliographic Overview
Major formal developments of DSCMs and related frameworks are provided by Boeken & Mooij (Boeken et al., 3 Jun 2024), Bongers et al., Peters et al. (Peters et al., 2020), Rubenstein et al. (Rubenstein et al., 2016), Halpern & Peters (Peters et al., 2021). Empirical and algorithmic advances are documented in robotics and reinforcement learning (Murillo-Gonzalez et al., 8 Aug 2025), time-varying causal structure discovery (Wang et al., 11 Jan 2025), simulation-free learning (Rimawi-Fine et al., 18 Oct 2025), modularization and encapsulation in autonomous systems (Howard et al., 3 Jun 2024), and the integration of process graphs for spectral and path-rule causal inference (Reiter et al., 2023).
Taken together, Dynamic Structural Causal Models unify the theory and practice of causal inference in dynamical systems. They provide a bridge between statistical time-series, stochastic dynamics, control theory, and structural causal semantics—supporting rigorous identification, efficient representation, and principled policy synthesis in complex temporal domains.