
Dynamic Structural Causal Models

Updated 5 December 2025
  • DSCMs are dynamic extensions of SCMs that model time-indexed, stochastic processes with evolving causal mechanisms.
  • They integrate trajectories, differential equations, and VAR models to enable rigorous intervention analysis and equilibrium representation.
  • Applications span robotic planning and high-dimensional control, demonstrating robust causal inference and structural modularity.

Dynamic Structural Causal Models (DSCMs) generalize classical Structural Causal Models (SCMs) to encompass temporally evolving systems, where each endogenous variable is a function or stochastic process over time. DSCMs integrate the intervention semantics and graphical representations of SCMs with the dynamics of stochastic processes, ordinary or stochastic differential equations, and time-varying structural equations. This enables principled reasoning about interventions and identification of causal effects in time-dependent, high-dimensional, and potentially cyclic or confounded domains.

1. Formal Definition and Core Constructs

A Dynamic Structural Causal Model formalizes temporal causality by treating each endogenous "variable" as a time-indexed process (discrete or continuous), possibly vector-valued and stochastic. In both discrete and continuous time, a DSCM is defined by:

  • A set of endogenous processes $V_p$ and exogenous (noise) processes $W$;
  • A time domain $\mathcal{T}$, which may be discrete ($\{1,\ldots,T\}$) or continuous ($[0,T]$);
  • Domains of the processes, $\mathcal{X}_V = \prod_{(v,\mathcal{I})} D(\mathcal{I}, \mathbb{R})$ for Skorokhod-space trajectories;
  • Causal mechanisms $f = (f_{v^{\mathcal{I}}})_{(v,\mathcal{I})}$ mapping parent process histories and exogenous input histories to each process, with the constraint that $f_{v^{\mathcal{I}}}$ at time $t \in \mathcal{I}$ depends only on parent and noise histories up to time $t$;
  • An exogenous noise law with a product structure.

This framework admits both cycles and bidirected edges (from shared exogenous sources), generalizing the acyclic structure of conventional SCMs. In continuous-time settings, these mechanisms correspond to collections of (possibly coupled) differential or stochastic differential equations, while in discrete time they reduce to systems of difference equations or VAR (vector autoregressive) models (Boeken et al., 3 Jun 2024, Cinquini et al., 14 Oct 2024, Bongers et al., 2018, Rubenstein et al., 2016).

2. Intervention Semantics and Markov Properties

Interventions in DSCMs extend the do-operator to trajectories or finite intervals:

  • Hard interventions: Replacing a process or derivative (over all or part of $\mathcal{T}$) by a fixed value or exogenous trajectory.
  • Additive and forcing interventions (in discrete time): Shifting a component by a fixed value or forcing it towards a target process, respectively.

Upon intervention, the post-interventional DSCM replaces the mechanism(s) for the intervened variables by constants or functions reflecting the intervention, leaving other mechanisms invariant (Cinquini et al., 14 Oct 2024, Peters et al., 2020).
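As a minimal illustration, this intervention semantics can be sketched for a toy two-process discrete-time system. The mechanisms and coefficients below are hypothetical, chosen only so the post-interventional behavior is easy to verify; they are not taken from the cited papers:

```python
import numpy as np

def simulate_dscm(T=200, seed=0, do_x=None):
    """Simulate a toy two-process discrete-time DSCM.

    Hypothetical mechanisms (illustrative only):
        X_t = 0.8 * X_{t-1} + eps_X(t)
        Y_t = 0.5 * X_{t-1} + 0.3 * Y_{t-1} + eps_Y(t)
    A hard intervention do(X_t = do_x) replaces X's mechanism by a
    constant over the whole time domain; Y's mechanism stays invariant.
    """
    rng = np.random.default_rng(seed)
    X, Y = np.zeros(T), np.zeros(T)
    for t in range(1, T):
        X[t] = do_x if do_x is not None else 0.8 * X[t - 1] + rng.normal()
        Y[t] = 0.5 * X[t - 1] + 0.3 * Y[t - 1] + rng.normal()
    return X, Y

X_obs, Y_obs = simulate_dscm()           # observational regime
X_int, Y_int = simulate_dscm(do_x=2.0)   # post-interventional regime
```

Under do(X = 2) only X's mechanism is replaced, so Y's stationary mean shifts to $0.5 \cdot 2 / (1 - 0.3) \approx 1.43$: the intervention is local to the replaced mechanism, which is the modularity property the surgical semantics is designed to preserve.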

DSCMs inherit a graphical Markov property: conditional independence relations implied by d-separation (or σ-separation, in the cyclic or latent-confounded case) in the mixed graph of the DSCM reflect conditional independence in the joint law of the processes, conditional on the past or appropriate adjustment sets (Boeken et al., 3 Jun 2024, Bongers et al., 2018).

3. Discrete-Time Systems, VAR Models, and Equilibrium Representation

For discrete-time multivariate systems, such as those modeled by VAR(p) processes,

$$X_t = \nu + A_1 X_{t-1} + \cdots + A_p X_{t-p} + \epsilon_t,$$

DSCMs provide a route to equilibrium SCMs. Under stability (all roots of $\det A(z)$, the VAR characteristic polynomial, lying outside the unit disk) and contractivity of the noise, the process $X_t$ converges in distribution to a unique $X_\infty$. At equilibrium, the law of $X_\infty$ can be represented by a (possibly cyclic and confounded) linear SCM,

$$X = \bar{A} X + U, \qquad \bar{A} = A_1 + \cdots + A_p,$$

with $U \sim N(0, \Sigma_U)$, and $\Sigma_U$ derived from the original innovations and the impulse response of the process (Cinquini et al., 14 Oct 2024, Ferreira et al., 2023).

Interventional calculations (additive, forcing) at equilibrium correspond to modifications of $\nu$ or $\bar{A}$, and causal effects at any forecast horizon can be computed analytically using the impulse response matrices. The pipeline for DSCM inference thus involves estimation (e.g., OLS), equilibrium SCM construction, and causal effect computation via do-calculus (Cinquini et al., 14 Oct 2024, Ferreira et al., 2023).
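The three-step pipeline can be sketched for a bivariate VAR(1). The coefficients and the additive shift $\delta$ below are illustrative assumptions, not values from the cited papers:

```python
import numpy as np

# Sketch of the DSCM inference pipeline for a stable VAR(1):
# (1) OLS estimation of (nu, A1); (2) equilibrium SCM X = A_bar X + U;
# (3) effect of an additive intervention that shifts nu by delta.

rng = np.random.default_rng(1)
A1 = np.array([[0.5, 0.2],
               [0.0, 0.3]])           # stable: spectral radius < 1
nu = np.array([1.0, -1.0])

# Simulate the VAR(1) process.
T = 5000
X = np.zeros((T, 2))
for t in range(1, T):
    X[t] = nu + A1 @ X[t - 1] + rng.normal(scale=0.1, size=2)

# (1) OLS: regress X_t on [1, X_{t-1}].
Z = np.hstack([np.ones((T - 1, 1)), X[:-1]])
coef, *_ = np.linalg.lstsq(Z, X[1:], rcond=None)
nu_hat, A1_hat = coef[0], coef[1:].T

# (2) Equilibrium SCM: X solves X = nu + A_bar X, with A_bar = A1 for p=1.
X_eq = np.linalg.solve(np.eye(2) - A1_hat, nu_hat)

# (3) An additive intervention shifting nu by delta propagates to the
# equilibrium via (I - A_bar)^{-1} delta.
delta = np.array([1.0, 0.0])
X_eq_do = np.linalg.solve(np.eye(2) - A1_hat, nu_hat + delta)
effect = X_eq_do - X_eq
```

Here the equilibrium is $(I-\bar{A})^{-1}\nu$, so the causal effect of the additive shift is $(I-\bar{A})^{-1}\delta$, a special case of the analytic impulse-response computation described above.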

4. Dynamic Structure, Time-Varying and Nonlinear Models

DSCMs can be extended to accommodate time-varying structures, where the contemporaneous and lagged coefficients evolve as smooth functions of time. In this context:

  • Time-varying graphs $G(t)$ (with adjacency matrices $B_t$) encode causal relations that can drift temporally.
  • Basis expansions (e.g., B-splines) are used to parameterize $B_t$ and lagged weights $W_t$, constraining the system to a low-dimensional, smoothly varying subspace.
  • Structure learning is formulated as a constrained variational inference problem, augmenting a structural VAE with continuous acyclicity penalization (e.g., via NOTEARS-type constraints).
  • The approach enables both forecasting of future system structure and counterfactual or interventional predictions with time-dependent effect estimation (Wang et al., 11 Jan 2025).

Identifiability of the contemporaneous graph at each $t$ is ensured under independent non-Gaussian or isotropic Gaussian noise. Dynamic causal contrasts (e.g., time-indexed average treatment effects) can be expressed in closed form and tracked over time (Wang et al., 11 Jan 2025).
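A stripped-down version of the basis-expansion idea, with a single time-varying coefficient and a monomial basis standing in for B-splines (all values illustrative, not from the cited paper):

```python
import numpy as np

# One drifting coefficient b(t) in Y_t = b(t) X_t + eps_t, recovered by
# least squares over a low-dimensional basis expansion of b(t).

rng = np.random.default_rng(2)
T = 3000
t = np.linspace(0.0, 1.0, T)
b_true = 1.0 + 0.5 * np.sin(2 * np.pi * t)   # smoothly drifting effect

X = rng.normal(size=T)
Y = b_true * X + 0.05 * rng.normal(size=T)

# Basis expansion: b(t) ~ sum_k theta_k t^k (monomials as a stand-in for
# B-splines; same low-dimensional smoothness constraint).
K = 8
Phi = np.vander(t, K, increasing=True)       # (T, K) basis matrix
design = Phi * X[:, None]                    # regressors t^k * X_t
theta = np.linalg.lstsq(design, Y, rcond=None)[0]
b_hat = Phi @ theta

max_err = float(np.max(np.abs(b_hat - b_true)))
```

The fitted trajectory $\hat{b}(t)$ tracks the drifting true coefficient pointwise, which is exactly what makes time-indexed effect estimates (e.g., a treatment effect at a given $t$) available in closed form once the basis weights are learned.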

5. Continuous-Time Systems, ODEs, and SDEs

Continuous-time DSCMs generalize to systems described by ODEs or SDEs:

  • Deterministic kinetic models: Each process evolves according to $dX_k(t)/dt = f_k(X_{\mathrm{pa}(k)}(t); \theta_k)$, with initial values and (optionally) independent measurement noise (Peters et al., 2020).
  • Stochastic kinetic models: SDEs of the form $dX_k(t) = f_k(\cdot)\,dt + h_k(\cdot)\,dW_k(t)$ capture continuous stochastic perturbations.
  • Structural equations in the DSCM correspond to maps from parent trajectories (and initial conditions) to response trajectories; interventions correspond to replacement (surgical) modifications of the relevant drift and/or diffusion terms (Rubenstein et al., 2016, Peters et al., 2020, Boeken et al., 3 Jun 2024).

DSCMs admit time-splitting and subsampling operations, making it possible to relate continuous-time structural models to vector-autoregressive forms at discrete time grids, and to deploy constraint-based discovery algorithms (e.g., FCI) post-discretization (Boeken et al., 3 Jun 2024).
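Subsampling can be illustrated on an Ornstein–Uhlenbeck process, whose restriction to a coarse grid is an exact AR(1), so discrete-time VAR machinery applies directly after discretization. The parameters below are illustrative:

```python
import numpy as np

# Ornstein-Uhlenbeck SDE dX = -theta * X dt + sigma dW, simulated on a
# fine Euler grid and then subsampled with spacing Delta. On the coarse
# grid the process is AR(1) with coefficient exp(-theta * Delta).

rng = np.random.default_rng(4)
theta, sigma = 1.0, 0.5
dt, n_fine = 1e-3, 200000
X = np.zeros(n_fine)
for t in range(1, n_fine):
    X[t] = X[t - 1] - theta * X[t - 1] * dt + sigma * np.sqrt(dt) * rng.normal()

# Subsample every Delta = 0.1 time units (100 fine steps).
stride = 100
Xs = X[::stride]

# OLS AR(1) fit on the subsampled series.
phi = np.linalg.lstsq(Xs[:-1, None], Xs[1:], rcond=None)[0][0]
# Theory predicts phi close to exp(-theta * 0.1) ~ 0.905.
```

The recovered autoregressive coefficient matches the theoretical discretization, which is the sense in which a continuous-time structural model induces a well-defined VAR on any discrete time grid.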

At equilibrium (as $t \to \infty$), under stability or attractivity, continuous-time DSCMs (deterministic or stochastic) reduce to classical SCMs, allowing established identification and adjustment criteria to apply (Bongers et al., 2018, Rubenstein et al., 2016).
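A minimal sketch of this reduction, assuming a stable linear ODE with illustrative coefficients: the Euler-integrated trajectory settles at the solution of the induced linear SCM, and a hard intervention clamps one coordinate while the others keep their drift:

```python
import numpy as np

# Stable linear ODE dX/dt = A X + c. As t -> infinity the trajectory
# settles at X* = -A^{-1} c, i.e. the solution of the induced SCM
# 0 = A X + c. A and c are illustrative.

A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])           # eigenvalues -1, -2: stable
c = np.array([1.0, 2.0])

def integrate(A, c, x0, dt=1e-3, steps=20000):
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (A @ x + c)      # explicit Euler step
    return x

x_inf = integrate(A, c, x0=[0.0, 0.0])
x_scm = np.linalg.solve(-A, c)        # equilibrium SCM solution

# Hard intervention do(X_2 = 3): the second mechanism is removed and
# X_2 is clamped; X_1 still follows its own ODE.
def integrate_do(A, c, x0, clamp, dt=1e-3, steps=20000):
    x = np.array(x0, dtype=float)
    x[1] = clamp
    for _ in range(steps):
        dx = A @ x + c
        dx[1] = 0.0                   # intervened coordinate held fixed
        x = x + dt * dx
    return x

x_do = integrate_do(A, c, x0=[0.0, 0.0], clamp=3.0)
# Post-intervention equilibrium for X_1 solves 0 = -x1 + 0.5*3 + 1.
```

The intervened system equilibrates at $x_1 = 2.5$, obtained from the surgically modified SCM with $X_2$ fixed, mirroring the discrete-time do-semantics.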

6. Modularization, Expressivity, and Computational Aspects

Dynamic SCMs achieve modularity and temporal representational efficiency via:

  • Socket and buffer variables, enabling dynamic attachment/detachment of agents, modular interaction, and constant-space temporal representation (as in multi-agent robotics and cyber-physical systems).
  • Time context variables and buffer monads, allowing efficient threading of time indices without full time-unrolled graphs.
  • Relaxed stationarity, permitting agent populations and structure to evolve while preserving local modeling invariances (Howard et al., 3 Jun 2024).

In robotic planning and high-dimensional control applications, DSCM-based learning leverages causal structure distributions, distributing causal and functional uncertainty explicitly across the system. Latent-variable parameterizations (encoder-multidecoder constructions) and distributional inference over graph structure yield robust, sample-efficient forecasting and effective adaptation to interventions or distributional shifts (Murillo-Gonzalez et al., 8 Aug 2025).

7. Identifiability, Graphical Criteria, and Estimation

Identifiability in DSCMs depends on the following:

  • Stationarity and stability of the underlying process (VARs, ODE/SDEs).
  • Known or estimable block structure (for noise orthogonalization or ICA-based identification in the presence of hidden confounders).
  • For time-varying causal graphs, identifiability of the DAG at each timepoint with independent non-Gaussian or isotropic Gaussian errors (Cinquini et al., 14 Oct 2024, Wang et al., 11 Jan 2025, Ferreira et al., 2023).

Graphical criteria from the summary or full-time causal graph (SCG or FTCG) provide systematic conditions for when direct effects are identifiable from observed data, including explicit adjustment set constructions (big and single-door adjusters) and partial regression estimators in linear-Gaussian cases (Ferreira et al., 2023).
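In the linear-Gaussian case, adjustment-based identification reduces to partial regression. The toy model below (one confounder, illustrative coefficients) shows the bias of the unadjusted estimate and its removal by conditioning on a valid adjustment set:

```python
import numpy as np

# Z confounds X -> Y. Regressing Y on X alone is biased by the open
# back-door path; including the adjustment set {Z} (valid here)
# recovers the direct effect of 1.5.

rng = np.random.default_rng(3)
n = 20000
Z = rng.normal(size=n)
X = 0.8 * Z + rng.normal(size=n)
Y = 1.5 * X + 1.0 * Z + rng.normal(size=n)

# Naive regression of Y on X (confounded).
beta_naive = np.linalg.lstsq(X[:, None], Y, rcond=None)[0][0]

# Partial regression with adjustment set {Z}.
D = np.column_stack([X, Z])
beta_adj = np.linalg.lstsq(D, Y, rcond=None)[0][0]
```

The same partial-regression logic underlies the big- and single-door adjustment estimators mentioned above, with the adjustment set read off the summary or full-time causal graph rather than assumed.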

DSCMs thus unify and extend core causal identification principles—back-door, front-door, and more general ID algorithms—to time-indexed, possibly cyclic and multi-agent domains.

