
Kallianpur-Striebel Formula in Nonlinear Filtering

Updated 16 September 2025
  • The Kallianpur-Striebel formula is a fundamental tool in nonlinear filtering that normalizes the conditional distribution of a signal process via change-of-measure techniques.
  • It underpins both Zakai and Kushner–Stratonovich filtering equations, enabling reliable inference in models with diffusions, jumps, and rough paths.
  • Its relaxed technical conditions and broad extensions support practical applications in finance, engineering, and machine learning.

The Kallianpur–Striebel formula is a foundational result in nonlinear filtering theory, providing an explicit characterization of the conditional distribution of a signal process given noisy observations. Central to both the classical and modern stochastic filtering frameworks, the formula expresses the normalized conditional law—the filter—as the appropriately normalized conditional expectation of a weighted (unnormalized) process, often arising via a change-of-measure or as the solution to a linear stochastic (or rough) equation. Its generality spans Markov and non-Markov signals, diffusions, jump processes, Volterra Gaussian rough paths, and even abstract innovations-driven machine learning contexts, underlining its indispensable role in both theoretical developments and practical algorithms for inference in stochastic dynamical systems.

1. Fundamental Structure and Change-of-Measure Approach

The Kallianpur–Striebel formula asserts that the conditional distribution of a signal $X_t$ given observations $Y$ up to time $t$ (with filtration $\mathcal{V}_t$) can be written as

$$T_t(\varphi) = \mathbb{E}[\varphi(X_t) \mid \mathcal{V}_t] = \frac{\mathbb{E}[Z_t \varphi(X_t) \mid \mathcal{V}_t]}{\mathbb{E}[Z_t \mid \mathcal{V}_t]},$$

where $Z_t$ is an exponential martingale derived from the observation process and a link function $h$:
$$Z_t = \exp\left\{ -\int_0^t h(X_s)^\top \, dW_s - \frac{1}{2} \int_0^t |h(X_s)|^2 \, ds \right\}.$$
This martingale is constructed via a change of measure, typically making the observation process $Y$ into a standard Brownian motion under the new measure (by Girsanov's theorem), thereby simplifying the conditional law (Cass et al., 2014).

A transformed average energy condition ensures the integrability and martingale property of $Z_t$, broadening the method's scope beyond classical Novikov or Kazamaki-type assumptions. The formula's abstract form is

$$T_t(\varphi) = p_t(\varphi)/p_t(1), \quad \text{with} \quad p_t(\varphi) = \mathbb{E}[Z_t \varphi(X_t) \mid \mathcal{V}_t].$$

This structure is robust to general Markov signal processes characterized via the martingale problem, allowing coverage of diffusions, jump-diffusions, and broad signal classes.
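The normalization above lends itself directly to weighted Monte Carlo approximation: sample signal paths under the reference measure, weight each by its likelihood, and normalize. The sketch below is an illustrative assumption on top of the formula, not a method from the cited works: it takes a one-dimensional standard Brownian motion signal, unit observation noise, and weights paths by $\exp\{\int h\,dY - \tfrac{1}{2}\int h^2\,ds\}$ (sign conventions for the Girsanov density vary); `ks_estimate`, `phi`, and `h` are made-up names.

```python
import numpy as np

def ks_estimate(phi, h, x0, dy, dt, n_paths=5000, seed=0):
    """Monte Carlo sketch of the Kallianpur-Striebel normalization:
    T_t(phi) ~ sum_i Z_t^i * phi(X_t^i) / sum_i Z_t^i,
    for a fixed sequence of observation increments dy.
    The signal model (standard Brownian motion) is an illustrative assumption."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0, dtype=float)
    log_z = np.zeros(n_paths)  # accumulate log Z_t for numerical stability
    for dyk in dy:
        hx = h(x)
        # Ito increment of log Z: h(X_s) dY_s - 0.5 |h(X_s)|^2 ds (left endpoint)
        log_z += hx * dyk - 0.5 * hx**2 * dt
        x += rng.normal(0.0, np.sqrt(dt), n_paths)  # propagate signal paths
    w = np.exp(log_z - log_z.max())  # shift max to 0 before exponentiating
    return float(np.sum(w * phi(x)) / np.sum(w))
```

Because the weights enter only as a ratio, any common constant in $Z_t$ cancels — the same cancellation that makes the unnormalized dynamics sufficient for computing the filter.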

2. Relaxation of Technical Conditions and Martingale Criteria

In its original form, the formula required strong technical conditions on the integrability and martingale properties of the likelihood weight $Z_t$. The transformed energy criterion,

$$\mathbb{E}\left[ \int_0^t Z_s\, |h(X_s)|^2 \, ds \right] < \infty,$$

serves to guarantee the true martingale property of $Z_t$ and is shown to be strictly weaker than Kazamaki's condition in certain regimes (Cass et al., 2014). This relaxation is critical for extending the filter equations to signal processes driven by, e.g., jump processes with only linear growth, or to weak solutions in the sense of the general martingale problem.

Comparative analysis demonstrates that the new criterion certifies the well-posedness of the filter in cases where earlier criteria would fail, directly impacting the breadth of nonlinear filtering applications.
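The transformed energy criterion can at least be probed numerically before attempting a proof: simulate $Z_s$ along sample paths and accumulate $\int_0^t Z_s |h(X_s)|^2\,ds$. The sketch below takes $X$ and the driving noise $W$ to be independent Brownian motions with a bounded $h$; all of this, including the function name, is an illustrative assumption, and a finite Monte Carlo average is evidence, not a proof, that the condition holds.

```python
import numpy as np

def transformed_energy(h, x0, dt, n_steps, n_paths=20000, seed=0):
    """Monte Carlo estimate of E[int_0^t Z_s |h(X_s)|^2 ds],
    the transformed energy appearing in the martingale criterion.
    X and W are simulated as independent Brownian motions (an assumption)."""
    rng = np.random.default_rng(seed)
    x = np.full(n_paths, x0, dtype=float)
    log_z = np.zeros(n_paths)
    acc = np.zeros(n_paths)  # running pathwise integral of Z_s |h(X_s)|^2
    for _ in range(n_steps):
        hx = h(x)
        acc += np.exp(log_z) * hx**2 * dt
        dw = rng.normal(0.0, np.sqrt(dt), n_paths)
        log_z += -hx * dw - 0.5 * hx**2 * dt  # exponential-martingale increment
        x += rng.normal(0.0, np.sqrt(dt), n_paths)
    return float(acc.mean())
```

For bounded $h$ the integrand is dominated by $Z_s \|h\|_\infty^2$, so since $\mathbb{E}[Z_s] = 1$ the estimate should stay below $\|h\|_\infty^2\, t$ up to sampling error.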

3. Derivation of Filtering Equations

By utilizing the changed measure (under which the observation is a Brownian motion), coupled with the martingale problem formulation of the signal, one derives linear and nonlinear SPDEs for the evolution of the unnormalized and normalized filters:
$$dp_t(\varphi) = p_t(A\varphi)\,dt + p_t(\varphi h)\,dY_t,$$

$$dT_t(\varphi) = T_t(A\varphi)\,dt + \bigl(T_t(\varphi h) - T_t(\varphi)T_t(h)\bigr)\bigl(dY_t - T_t(h)\,dt\bigr),$$

where $A$ denotes the infinitesimal generator of the Markov signal process. The first (unnormalized) equation, the Zakai equation, is linear, while the normalized version is nonlinear and constitutes the Kushner–Stratonovich equation.
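For a finite-state signal the generator $A$ reduces to a rate matrix $Q$, and the Zakai equation becomes a vector-valued recursion that can be discretized directly. The splitting below (an Euler prediction step followed by a likelihood correction) is an illustrative scheme, not one taken from the cited papers:

```python
import numpy as np

def zakai_step(rho, Q, h, dy, dt):
    """One explicit step of the unnormalized (Zakai) recursion for a
    finite-state signal: rho is the unnormalized filter vector, Q the
    generator (rate matrix) of the chain, h the vector of observation
    levels, and dy the observation increment over [t, t + dt]."""
    rho = rho + dt * (rho @ Q)                    # prediction: p_t(A phi) dt term
    rho = rho * np.exp(h * dy - 0.5 * h**2 * dt)  # correction: observation update
    return rho

def normalized_filter(rho):
    """Kallianpur-Striebel normalization T_t(phi) = p_t(phi) / p_t(1)."""
    return rho / rho.sum()
```

The linear recursion is iterated as-is and normalized only when the posterior is read off; this avoids discretizing the nonlinear Kushner–Stratonovich dynamics directly, which is exactly the practical benefit of the unnormalized formulation.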

In contemporary developments, analogous formulations hold for correlated Lévy-driven systems where the likelihood process includes both Brownian and jump components, and in rough-path settings, where the integral representations are constructed using rough path theory and the respective Zakai and Kushner–Stratonovich equations emerge as deterministic or random filtering equations (Bugini et al., 15 Sep 2025, Cass et al., 11 Jun 2025, Qiao, 2019).

4. Extensions: Innovations, Jump Processes, Volterra and Rough Path Filtering

The formula’s core is invariant under several modern extensions:

  • Correlated Lévy and Jump Filtering: For signal-observation systems driven by Lévy noise, the likelihood process encompasses the Girsanov correction for both continuous and jump parts. The Kallianpur–Striebel formula continues to serve as the normalization bridge, facilitating the derivation of both Zakai and Kushner–Stratonovich equations (Qiao, 2019).
  • Markov Chain Observations: In continuous-time hidden Markov models and models where the observation itself is a Markov chain (e.g., regime-switching or disease progression models), the unnormalized posterior is given by integrating with respect to a likelihood ratio process that compensates the change of measure for the observation’s jump rates. The filtering equations—DMZ (Duncan–Mortensen–Zakai) for unnormalized, FKK (Fujisaki–Kallianpur–Kunita) for normalized—are rigorously connected via the Kallianpur–Striebel formula (Kouritzin, 2023).
  • Rough Path and Volterra Gaussian Filtering: Recent work has established analogues of the formula in rough path theory (Bugini et al., 15 Sep 2025, Cass et al., 11 Jun 2025). Here, the signal and weight processes are defined via rough SDEs driven by deterministic rough paths or Volterra-type Gaussian noises. The resulting (rough) Kallianpur–Striebel formula,

$$\vartheta_t^Y(\varphi) = \frac{\mu_t^Y(\varphi)}{\mu_t^Y(1)},$$

employs robust pathwise representations, ensuring continuity with respect to the observation and yielding well-posed (rough) Zakai equations with deterministic or stochastic correction terms.

  • Innovations and Nonparametric Forecasting: In nonparametric time series and machine learning contexts, the innovations representation provides a modern reinterpretation. By learning mappings (via deep generative models) that transform the time series into an independent sequence of “innovations” (i.i.d. uniform random variables), the Kallianpur–Striebel formula’s whitening effect is invoked for probabilistic forecasting, generalizing classical filter normalization to highly flexible, data-driven regimes (Wang et al., 2023).

5. Large Deviation Asymptotics and Variational Principles

The Kallianpur–Striebel formula underpins the analysis of filtering in small-noise regimes where Laplace asymptotics apply. As noise vanishes, the normalized likelihood process in the formula becomes sharply concentrated, and the conditional law is governed by a variational (minimum energy) principle. Specifically,

$$-\epsilon^2 \log \mathbb{E}\Bigl[\exp\Bigl\{-\frac{1}{\epsilon^2}\varphi(X)\Bigr\} \,\Big|\, \mathcal{Y}_t\Bigr] \to \inf_{\eta} \Bigl\{ \varphi(\eta) + \frac{1}{2} \int_0^T \|h(\eta(s)) - h(\xi^*(s))\|^2 \, ds + J(\eta) \Bigr\},$$

where $J(\eta)$ encodes prior costs (the Freidlin–Wentzell rate function), and the infimum is realized at the most probable trajectory $\xi^*$. This variational connection enables the computation of large-deviation rate functions for conditional expectations and links filtering to stochastic control (Reddy et al., 2021).
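As a toy illustration of the variational principle, the limiting functional can be minimized numerically over a restricted trajectory class. The sketch below searches over constant paths $\eta(s) \equiv c$ with user-supplied stand-ins for $\varphi$ and $J$; the restriction to constants and every function name here are illustrative assumptions.

```python
import numpy as np

def minimum_energy_constant(phi, h, xi_star, J, T, grid, n_quad=400):
    """Grid search for the minimizer of
        phi(c) + 0.5 * int_0^T |h(c) - h(xi*(s))|^2 ds + J(c)
    over constant trajectories eta(s) = c (a toy restriction).
    The time integral is approximated by a midpoint rule."""
    s = (np.arange(n_quad) + 0.5) * (T / n_quad)  # midpoints of [0, T]
    best_c, best_val = None, np.inf
    for c in grid:
        misfit = 0.5 * np.sum((h(c) - h(xi_star(s))) ** 2) * (T / n_quad)
        val = phi(c) + misfit + J(c)
        if val < best_val:
            best_c, best_val = c, val
    return best_c, best_val
```

With $h$ the identity, $\xi^* \equiv 0$, $\varphi \equiv 0$, and $J(c) = c^2$, the functional reduces to $(\tfrac{T}{2} + 1)c^2$ and the search recovers $c = 0$, matching the intuition that the filter concentrates on the prior-and-observation-optimal path.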

6. Significance for Applications and Practical Inference

The Kallianpur–Striebel formula consolidates the normalization, representation, and evolution of conditional laws in stochastic dynamical systems. Its impact is seen in:

  • Robustness and Continuity: Rough-path analogues and transformed energy conditions ensure pathwise stability under both model and observational perturbations (Bugini et al., 15 Sep 2025, Cass et al., 11 Jun 2025).
  • Algorithmic Design: The formula underlies particle and sequential Monte Carlo filtering, model selection, and estimation in contexts with continuous, jump, or rough observations (Kouritzin, 2023).
  • Statistical and Numerical Implementation: Its explicit structure permits efficient computation of conditional distributions, crucial for trend, parameter, and volatility estimation in financial time series, as well as prognosis in disease progression models.
  • Theoretical Unification: The formula’s generality unites classical SPDE-based filtering, jump process filtering, rough path theory, and modern nonparametric innovations approaches within a single theoretical framework.

7. Concrete Applications and Examples

Table 1 presents representative applications of the Kallianpur–Striebel formula in diverse filtering contexts.

| Setting | Unnormalized Process | Normalized Filter |
| --- | --- | --- |
| Brownian diffusions | $Z_t$ (exponential martingale) | $T_t(\varphi) = \mathbb{E}[Z_t \varphi(X_t) \mid \mathcal{V}_t] / \mathbb{E}[Z_t \mid \mathcal{V}_t]$ |
| Lévy-driven SDEs | $A_t$ (with jump terms) | $\pi_t(F) = \tilde{P}_t(F)/\tilde{P}_t(1)$ |
| Volterra Gaussian rough paths | rough-integral-based $Z_t$ | $\vartheta_t^Y(\varphi) = \mu_t^Y(\varphi)/\mu_t^Y(1)$ |
| Markov chains (jump observations) | $A_t$ (likelihood ratio) | $\pi_t(f) = \sigma_t(f)/\sigma_t(1)$ |

In all cases, the formula facilitates the normalization required to transform linear (unnormalized) SPDEs or equations into nonlinear (normalized) filtering equations. The resulting framework encompasses general Markov processes, models with memory and roughness, jump dynamics, and data-driven innovations models.


In summary, the Kallianpur–Striebel formula provides a universal, explicit, and structurally robust representation for the conditional law in nonlinear filtering. By enabling a passage from unnormalized to normalized filtering equations, relaxing earlier restrictive conditions, and supporting extensions to rough and nonparametric filtering, the formula continues to be central in both foundational theory and cutting-edge applications of inference in stochastic systems.
