Kallianpur–Striebel Formula in Nonlinear Filtering
- The Kallianpur–Striebel formula is a fundamental tool in nonlinear filtering that normalizes the conditional distribution of a signal process via change-of-measure techniques.
- It underpins both Zakai and Kushner–Stratonovich filtering equations, enabling reliable inference in models with diffusions, jumps, and rough paths.
- Its relaxed technical conditions and broad extensions support practical applications in finance, engineering, and machine learning.
The Kallianpur–Striebel formula is a foundational result in nonlinear filtering theory, providing an explicit characterization of the conditional distribution of a signal process given noisy observations. Central to both the classical and modern stochastic filtering frameworks, the formula expresses the normalized conditional law—the filter—as the appropriately normalized conditional expectation of a weighted (unnormalized) process, often arising via a change-of-measure or as the solution to a linear stochastic (or rough) equation. Its generality spans Markov and non-Markov signals, diffusions, jump processes, Volterra Gaussian rough paths, and even abstract innovations-driven machine learning contexts, underlining its indispensable role in both theoretical developments and practical algorithms for inference in stochastic dynamical systems.
1. Fundamental Structure and Change-of-Measure Approach
The Kallianpur–Striebel formula asserts that the conditional distribution of a signal $X$ given observations $Y$ up to time $t$ (with observation filtration $\mathcal{Y}_t = \sigma(Y_s : s \le t)$) can be written as

$$\pi_t(\varphi) = \mathbb{E}\left[\varphi(X_t) \mid \mathcal{Y}_t\right] = \frac{\tilde{\mathbb{E}}\left[\varphi(X_t)\, Z_t \mid \mathcal{Y}_t\right]}{\tilde{\mathbb{E}}\left[Z_t \mid \mathcal{Y}_t\right]},$$

where $Z$ is an exponential martingale derived from the observation process $Y$ and a link (sensor) function $h$:

$$Z_t = \exp\left(\int_0^t h(X_s)^\top\, \mathrm{d}Y_s - \frac{1}{2}\int_0^t |h(X_s)|^2\, \mathrm{d}s\right).$$

This martingale is constructed via a change of measure, typically one under which the observation process becomes a standard Brownian motion (by Girsanov’s theorem), thereby simplifying the conditional law (Cass et al., 2014).
A transformed average energy condition ensures the integrability and martingale property of $Z$, broadening the method’s scope beyond classical Novikov- or Kazamaki-type assumptions. The formula’s abstract form is

$$\pi_t(\varphi) = \frac{\rho_t(\varphi)}{\rho_t(\mathbf{1})}, \qquad \rho_t(\varphi) := \tilde{\mathbb{E}}\left[\varphi(X_t)\, Z_t \mid \mathcal{Y}_t\right],$$

with $\rho_t$ the unnormalized conditional distribution.
This structure is robust to general Markov signal processes characterized via the martingale problem, allowing coverage of diffusions, jump-diffusions, and broad signal classes.
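The normalization above can be implemented directly by Monte Carlo: sample signal paths from the prior (the reference measure under which $Y$ is a Brownian motion), weight each path by a discretized exponential martingale, and take the ratio of weighted averages. The sketch below assumes a one-dimensional Ornstein–Uhlenbeck signal and the linear sensor $h(x) = x$; these modeling choices are illustrative, not taken from the cited works.

```python
import math, random

random.seed(0)

# Monte Carlo sketch of the Kallianpur-Striebel normalization
# (assumed toy model: 1-d Ornstein-Uhlenbeck signal, sensor h(x) = x).
# Under the reference measure, signal paths are drawn from the prior and
# each path carries the discretized exponential weight
#   Z_t = exp( sum h(X_s) dY_s - 0.5 * sum h(X_s)^2 ds ).
def simulate(n_paths=2000, n_steps=100, dt=0.01):
    # One "true" signal and its observation increments dY = h(X) dt + dW
    x_true, dY = 0.5, []
    for _ in range(n_steps):
        dY.append(x_true * dt + math.sqrt(dt) * random.gauss(0, 1))
        x_true += -x_true * dt + math.sqrt(dt) * random.gauss(0, 1)

    # Weighted prior samples: estimate of pi_t(phi) with phi(x) = x
    num = den = 0.0
    for _ in range(n_paths):
        x, logZ = random.gauss(0, 1), 0.0
        for k in range(n_steps):
            logZ += x * dY[k] - 0.5 * x * x * dt
            x += -x * dt + math.sqrt(dt) * random.gauss(0, 1)
        w = math.exp(logZ)
        num += w * x     # numerator: rho_t(phi)
        den += w         # denominator: rho_t(1)
    return num / den     # normalized filter pi_t(phi)

print(simulate())
```

This is exactly the importance-sampling view of the formula that particle filters discretize in time.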
2. Relaxation of Technical Conditions and Martingale Criteria
Establishing the formula in its original form required strong technical conditions on the integrability and martingale properties of the likelihood weight $Z$. The transformed energy criterion, an average-energy bound under the transformed measure of the form

$$\tilde{\mathbb{E}}\left[\int_0^t |h(X_s)|^2\, \mathrm{d}s\right] < \infty \quad \text{for every } t \ge 0,$$

serves to guarantee $Z$’s true martingale property and is shown to be strictly weaker than Kazamaki’s condition in certain regimes (Cass et al., 2014). This relaxation is critical for extending the filter equations to signal processes driven by, e.g., jump processes with only linear growth, or to weak solutions in the sense of the general martingale problem.
Comparative analysis demonstrates that the new criterion certifies the well-posedness of the filter in cases where earlier criteria would fail, directly impacting the breadth of nonlinear filtering applications.
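The property being certified is precisely $\mathbb{E}[Z_t] = 1$: a supermartingale stochastic exponential is a true martingale exactly when its mean stays at one. For a bounded integrand this is easy to check numerically; a minimal Monte Carlo sanity check with the assumed toy integrand $h \equiv 1$:

```python
import math, random

random.seed(1)

# Sanity-check sketch: for a bounded integrand h (here h = 1), the stochastic
# exponential Z_t = exp( int h dW - 0.5 * int h^2 ds ) is a true martingale,
# so E[Z_t] = 1. A naive Monte Carlo average should be close to 1.
def mean_Z(n_paths=20000, n_steps=50, dt=0.02, h=1.0):
    total = 0.0
    for _ in range(n_paths):
        logZ = 0.0
        for _ in range(n_steps):
            dW = math.sqrt(dt) * random.gauss(0, 1)
            logZ += h * dW - 0.5 * h * h * dt
        total += math.exp(logZ)
    return total / n_paths

print(mean_Z())  # close to 1
```

For unbounded $h$ (e.g. linear growth with a jump signal) this mean can drop below one, which is exactly the failure mode the transformed energy criterion rules out.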
3. Derivation of Filtering Equations
By working under the changed measure (where the observation is a Brownian motion), coupled with the martingale problem formulation of the signal, one derives linear and nonlinear SPDEs for the evolution of the unnormalized and normalized filters:

$$\mathrm{d}\rho_t(\varphi) = \rho_t(A\varphi)\, \mathrm{d}t + \rho_t(h\varphi)^\top\, \mathrm{d}Y_t,$$

$$\mathrm{d}\pi_t(\varphi) = \pi_t(A\varphi)\, \mathrm{d}t + \left(\pi_t(h\varphi) - \pi_t(h)\,\pi_t(\varphi)\right)^\top \left(\mathrm{d}Y_t - \pi_t(h)\, \mathrm{d}t\right),$$

where $A$ denotes the infinitesimal generator of the Markov signal. The first (unnormalized) equation, the Zakai equation, is linear in $\rho$, while the normalized version is nonlinear and constitutes the Kushner–Stratonovich equation.
In contemporary developments, analogous formulations hold for correlated Lévy-driven systems where the likelihood process includes both Brownian and jump components, and in rough-path settings, where the integral representations are constructed using rough path theory and the respective Zakai and Kushner–Stratonovich equations emerge as deterministic or random filtering equations (Bugini et al., 15 Sep 2025, Cass et al., 11 Jun 2025, Qiao, 2019).
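For a finite-state signal the Zakai equation reduces to a linear ODE system driven by the observation, which makes the unnormalized-to-normalized passage visible in a few lines. The sketch below uses an assumed two-state generator and sensor with a naive Euler discretization (a simplification in the spirit of the Wonham filter, not a scheme from the cited papers):

```python
import math, random

random.seed(2)

# Euler sketch of the Zakai equation for a two-state Markov chain signal,
# followed by the Kallianpur-Striebel normalization (assumed toy rates).
Q = [[-1.0, 1.0], [2.0, -2.0]]   # generator of the signal chain
h = [0.0, 1.0]                   # sensor values h(state)
dt, n_steps = 0.01, 200

state, rho = 0, [0.5, 0.5]       # rho: unnormalized filter, uniform prior
for _ in range(n_steps):
    # observation increment dY = h(X) dt + dW
    dY = h[state] * dt + math.sqrt(dt) * random.gauss(0, 1)
    # Zakai step: d rho_i = (Q^T rho)_i dt + h_i rho_i dY
    rho = [rho[i] + sum(Q[j][i] * rho[j] for j in range(2)) * dt
                  + h[i] * rho[i] * dY
           for i in range(2)]
    # propagate the true chain
    if random.random() < -Q[state][state] * dt:
        state = 1 - state

pi = [r / sum(rho) for r in rho]  # normalized filter via the KS formula
print(pi)
```

The unnormalized recursion is linear in `rho`; all the nonlinearity of the Kushner–Stratonovich filter enters only through the final division.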
4. Extensions: Innovations, Jump Processes, Volterra and Rough Path Filtering
The formula’s core is invariant under several modern extensions:
- Correlated Lévy and Jump Filtering: For signal-observation systems driven by Lévy noise, the likelihood process encompasses the Girsanov correction for both continuous and jump parts. The Kallianpur–Striebel formula continues to serve as the normalization bridge, facilitating the derivation of both Zakai and Kushner–Stratonovich equations (Qiao, 2019).
- Markov Chain Observations: In continuous-time hidden Markov models and models where the observation itself is a Markov chain (e.g., regime-switching or disease progression models), the unnormalized posterior is given by integrating with respect to a likelihood ratio process that compensates the change of measure for the observation’s jump rates. The filtering equations—DMZ (Duncan–Mortensen–Zakai) for unnormalized, FKK (Fujisaki–Kallianpur–Kunita) for normalized—are rigorously connected via the Kallianpur–Striebel formula (Kouritzin, 2023).
- Rough Path and Volterra Gaussian Filtering: Recent work has established analogues of the formula in rough path theory (Bugini et al., 15 Sep 2025, Cass et al., 11 Jun 2025). Here, the signal and weight processes are defined via rough SDEs driven by deterministic rough paths or Volterra-type Gaussian noises. The resulting (rough) Kallianpur–Striebel formula, $\pi_t(\varphi) = \rho_t(\varphi)/\rho_t(\mathbf{1})$ with $\rho_t$ constructed through pathwise rough integrals against the lifted observation, employs robust pathwise representations, ensuring continuity with respect to the observation and yielding well-posed (rough) Zakai equations with deterministic or stochastic correction terms.
- Innovations and Nonparametric Forecasting: In nonparametric time series and machine learning contexts, the innovations representation provides a modern reinterpretation. By learning mappings (via deep generative models) that transform the time series into an independent sequence of “innovations” (i.i.d. uniform random variables), the Kallianpur–Striebel formula’s whitening effect is invoked for probabilistic forecasting, generalizing classical filter normalization to highly flexible, data-driven regimes (Wang et al., 2023).
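The whitening effect behind the innovations view can be illustrated without any learning: for a Gaussian AR(1) series the conditional CDF is known exactly, and applying it (a probability integral transform) turns the dependent series into approximately i.i.d. Uniform(0,1) innovations. A toy sketch with assumed parameters:

```python
import math, random

random.seed(3)

# Toy sketch of the "innovations" idea (not the deep generative models of
# Wang et al.): for a Gaussian AR(1) series X_k = a X_{k-1} + eps_k, the
# conditional probability integral transform
#   U_k = Phi((X_k - a X_{k-1}) / sigma)
# maps the dependent series to i.i.d. Uniform(0,1) innovations.
def Phi(z):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

a, sigma, n = 0.8, 1.0, 5000
x, series = 0.0, []
for _ in range(n):
    x = a * x + sigma * random.gauss(0, 1)
    series.append(x)

u = [Phi((series[k] - a * series[k - 1]) / sigma) for k in range(1, n)]
print(sum(u) / len(u))  # sample mean of the uniform innovations
```

Deep innovations models learn such a transform (and its inverse) when no closed-form conditional CDF is available.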
5. Large Deviation Asymptotics and Variational Principles
The Kallianpur–Striebel formula underpins the analysis of filtering in small-noise regimes where Laplace asymptotics apply. As the noise vanishes, the normalized likelihood process in the formula becomes sharply concentrated, and the conditional law is governed by a variational (minimum-energy) principle. Specifically, the conditional law concentrates around minimizers of a cost of the form

$$J(x) = I(x) + \frac{1}{2}\int_0^t \left|\dot{y}_s - h(x_s)\right|^2 \mathrm{d}s,$$

where $I$ encodes prior costs (the Freidlin–Wentzell rate function of the signal), and the infimum $\inf_x J(x)$ is attained at the most probable trajectory $x^\ast$. This variational connection enables the computation of large-deviation rate functions for conditional expectations and links filtering to stochastic control (Reddy et al., 2021).
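For an assumed toy model with Brownian signal and linear sensor $h(x) = x$, minimizing the discretized cost reduces to a tridiagonal linear system (the discrete Euler–Lagrange equations), which the Thomas algorithm solves exactly. The sketch below uses assumed constant observation increments consistent with $\dot{y} \equiv 1$, for which the most probable trajectory is the constant path $x \equiv 1$:

```python
# Sketch: most probable trajectory for a toy small-noise filtering model
# (assumed setup: Brownian signal, sensor h(x) = x). The discretized cost
#   J(x) = sum_k (x[k+1]-x[k])^2 / (2*dt) + sum_k (dy[k]/dt - x[k])^2 * dt/2
# combines the Freidlin-Wentzell prior cost with the observation misfit.
n, dt = 50, 0.02
dy = [1.0 * dt] * n          # assumed observation increments: y' = 1

# Stationarity gives the tridiagonal system
#   -x[k-1]/dt + (2/dt + dt) x[k] - x[k+1]/dt = dy[k]
lower = [-1.0 / dt] * n
diag = [(2.0 / dt if 0 < k < n - 1 else 1.0 / dt) + dt for k in range(n)]
upper = [-1.0 / dt] * n
rhs = dy[:]

# Thomas algorithm: forward elimination, then back substitution
for k in range(1, n):
    m = lower[k] / diag[k - 1]
    diag[k] -= m * upper[k - 1]
    rhs[k] -= m * rhs[k - 1]
x = [0.0] * n
x[-1] = rhs[-1] / diag[-1]
for k in range(n - 2, -1, -1):
    x[k] = (rhs[k] - upper[k] * x[k + 1]) / diag[k]

print(x[n // 2])  # the minimizer sits at the observed drift level, 1.0
```

The same quadratic structure is what makes minimum-energy (MAP) estimation tractable in linear-Gaussian limits.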
6. Significance for Applications and Practical Inference
The Kallianpur–Striebel formula consolidates the normalization, representation, and evolution of conditional laws in stochastic dynamical systems. Its impact is seen in:
- Robustness and Continuity: Rough-path analogues and transformed energy conditions ensure pathwise stability under both model and observational perturbations (Bugini et al., 15 Sep 2025, Cass et al., 11 Jun 2025).
- Algorithmic Design: The formula underlies particle and sequential Monte Carlo filtering, model selection, and estimation in contexts with continuous, jump, or rough observations (Kouritzin, 2023).
- Statistical and Numerical Implementation: Its explicit structure permits efficient computation of conditional distributions, crucial for trend, parameter, and volatility estimation in financial time series, as well as prognosis in disease progression models.
- Theoretical Unification: The formula’s generality unites classical SPDE-based filtering, jump process filtering, rough path theory, and modern nonparametric innovations approaches within a single theoretical framework.
7. Concrete Applications and Examples
Table 1 presents representative applications of the Kallianpur–Striebel formula in diverse filtering contexts.
| Setting | Unnormalized weight $Z_t$ | Normalized filter |
|---|---|---|
| Brownian diffusions | $\exp\!\big(\int_0^t h(X_s)^\top \mathrm{d}Y_s - \tfrac{1}{2}\int_0^t |h(X_s)|^2\, \mathrm{d}s\big)$ (exponential martingale) | $\pi_t(\varphi) = \rho_t(\varphi)/\rho_t(\mathbf{1})$ |
| Lévy-driven SDEs | exponential martingale with additional jump terms | $\pi_t(\varphi) = \rho_t(\varphi)/\rho_t(\mathbf{1})$ |
| Volterra Gaussian rough paths | weight defined via rough integrals | pathwise $\pi_t(\varphi) = \rho_t(\varphi)/\rho_t(\mathbf{1})$ |
| Markov chains (jump observations) | likelihood-ratio process compensating jump rates | $\pi_t(\varphi) = \rho_t(\varphi)/\rho_t(\mathbf{1})$ |
In all cases, the formula facilitates the normalization required to transform linear (unnormalized) SPDEs or equations into nonlinear (normalized) filtering equations. The resulting framework encompasses general Markov processes, models with memory and roughness, jump dynamics, and data-driven innovations models.
In summary, the Kallianpur–Striebel formula provides a universal, explicit, and structurally robust representation for the conditional law in nonlinear filtering. By enabling a passage from unnormalized to normalized filtering equations, relaxing earlier restrictive conditions, and supporting extensions to rough and nonparametric filtering, the formula continues to be central in both foundational theory and cutting-edge applications of inference in stochastic systems.