
Complex Gaussian Processes

Updated 31 December 2025
  • Complex Gaussian Processes are stochastic processes over complex domains defined by Hermitian covariance and pseudo-covariance functions, allowing robust nonparametric modeling.
  • Their kernel design enforces conjugate symmetry, stationarity, and analyticity, which is essential for accurately modeling real LTI systems and robust control applications.
  • They are applied in system identification, time series analysis, and quantum noise simulation, with efficient algorithms such as circulant embedding for rapid sampling.

Complex Gaussian processes (CGPs) are stochastic processes defined over a complex domain, where every finite collection of evaluation points yields a vector distributed according to a multivariate complex normal law. These processes are characterized by mean, Hermitian covariance, and pseudo-covariance functions. CGPs underlie a wide array of applications in time series analysis, communications, quantum dynamics, and system identification, providing principled nonparametric inference for signals and functions with both real and imaginary dynamics. The theory carefully distinguishes between proper (circular) and improper (noncircular) complex processes, with conjugate symmetry, stationarity, and analyticity being crucial for connecting to real-world linear time-invariant (LTI) systems and robust control.

1. Mathematical Formulation and Covariance Structure

A complex-valued Gaussian process $f(z)$, with $z \in D \subseteq \mathbb{C}$, satisfies for any collection $z_1, \ldots, z_n$ that $(f(z_1), \ldots, f(z_n))^\top$ is multivariate complex normal, fully specified by:

  • Mean: $m(z) = \mathbb{E}[f(z)]$
  • Hermitian covariance: $k(z, w) = \mathbb{E}[f(z) f^*(w)]$, where $k(w, z) = k^*(z, w)$ and $k$ is positive definite
  • Pseudo-covariance (complementary): $\tilde{k}(z, w) = \mathbb{E}[f(z) f(w)]$

Properness is the property $\tilde{k} \equiv 0$, corresponding to circular symmetry (rotational invariance in the $\mathbb{C}$ plane). Proper processes have independent real and imaginary parts with the same covariance, whereas $k$ and $\tilde{k}$ together provide a complete second-order characterization of general (possibly improper) processes (Boloix-Tortosa et al., 2015, Boloix-Tortosa et al., 2015, Devonport et al., 2022).

Conjugate symmetry is required to model transfer functions of real LTI systems: $f(z^*) = f^*(z)$ almost surely, with the constraints $k(z, z) = k(z^*, z^*)$ and $k(z, z) = \tilde{k}(z, z^*)$.
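The properness distinction can be checked empirically from samples. The sketch below (plain NumPy; the two sampling distributions are illustrative choices, not from the cited papers) estimates $k$ and $\tilde{k}$ at a single point for a proper and an improper complex Gaussian variable:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Proper (circularly symmetric): i.i.d. real and imaginary parts with
# variance 1/2 each, so E[z z*] = 1 and E[z z] = 0.
z_proper = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

# Improper: fully correlated real and imaginary parts give a nonzero
# pseudo-covariance E[z z] = (1 + 0.5j)^2 = 0.75 + 1j.
z_improper = rng.standard_normal(n) * (1 + 0.5j)

def second_order_stats(z):
    """Empirical Hermitian covariance E[z z*] and pseudo-covariance E[z z]."""
    return np.mean(z * np.conj(z)), np.mean(z * z)

k_p, kt_p = second_order_stats(z_proper)    # ≈ (1, 0): proper
k_i, kt_i = second_order_stats(z_improper)  # pseudo-covariance far from 0: improper
```

The Hermitian covariance alone is identical in distribution only for the proper case; the nonzero pseudo-covariance is exactly what the widely linear machinery of later sections exploits.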

2. Kernel Design, Stationarity, and Analytic Structure

Covariance functions for CGPs require careful design:

  • General construction: any positive-definite Hermitian kernel $k$ paired with a symmetric pseudo-kernel $\tilde{k}$, potentially in convolutional form to encode prescribed real–imaginary correlations (Boloix-Tortosa et al., 2015, Boloix-Tortosa et al., 2015).
  • Hermitian stationarity: for frequency arguments $z = e^{j\theta}$, stationarity means $k(e^{j\theta}, e^{j\varphi}) = K_0(\theta - \varphi)$, which admits the spectral decomposition

$$k(e^{j\theta}, e^{j\varphi}) = \sum_{n=0}^{\infty} \gamma_n e^{-jn(\theta - \varphi)}$$

with $\gamma_n \ge 0$, where $\gamma_n$ is the variance of the $n$th impulse-response coefficient. Summability $\sum_n \gamma_n < \infty$ ensures BIBO stability and boundedness on the unit circle (Devonport et al., 2022).
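A minimal numerical sketch of this spectral construction (truncating the sum at a finite order, with an illustrative geometric choice $\gamma_n = \rho^n$) confirms that the resulting Gram matrix is Hermitian and positive semidefinite:

```python
import numpy as np

def stationary_kernel(theta, phi, gammas):
    """Truncated spectral form k(e^{jθ}, e^{jφ}) = Σ_n γ_n e^{-jn(θ-φ)}."""
    delta = np.subtract.outer(theta, phi)          # matrix of θ_i - φ_j
    K = np.zeros(delta.shape, dtype=complex)
    for n, g in enumerate(gammas):
        K += g * np.exp(-1j * n * delta)
    return K

# γ_n = ρ^n: exponentially decaying impulse-response variances;
# Σ_n γ_n = 1/(1-ρ) < ∞, consistent with BIBO stability.
rho = 0.6
gammas = rho ** np.arange(50)
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
K = stationary_kernel(theta, theta, gammas)
```

Each term $\gamma_n u_n u_n^H$ (with $u_n$ the complex exponential evaluated on the grid) is rank-one positive semidefinite, so the sum is as well.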

  • Analyticity/$H_\infty$ property: sufficient conditions for sample paths in $H_\infty$ (analytic and bounded on $|z| > 1$):
    • (i) The covariances $k_r, k_i$ are continuous and satisfy Kolmogorov-type bounds giving a.e. boundedness on the unit circle.
    • (ii) RKHS trace bounds guarantee membership in the Hardy space $H_2$.
    • (iii) Together, these yield $f \in H_\infty$ almost surely.

Kernel hyperparameters can be learned via marginal-likelihood maximization using Wirtinger calculus, enabling efficient data-driven adaptation (Boloix-Tortosa et al., 2015, Boloix-Tortosa et al., 2015).

3. Regression, Inference, and Posterior Structure

Complex GP regression (CGPR) generalizes real-valued regression to accommodate pseudo-covariance and complex kernels. Key formulas:

  • Widely linear prediction: both $y$ and $y^*$ are conditioned on, leveraging Schur complements of the augmented covariance blocks (Boloix-Tortosa et al., 2015, Devonport et al., 2022).
  • Strictly linear prediction: only $y$ is used for the posterior mean and variance, suitable for proper cases and conjugate-symmetric priors.

Convolutional kernel construction enables explicit design for prescribed real–imaginary cross-covariances, covering both proper and improper scenarios. CGPR yields posterior means and variances for both real and imaginary parts; in the proper case the predictive mean and variance are

$$\mu_* = K_{*X}(K_{XX} + \sigma^2 I)^{-1} y,$$

$$\sigma^2_* = k(x_*, x_*) - K_{*X}(K_{XX} + \sigma^2 I)^{-1} K_{X*}.$$

For improper cases, the pseudo-covariances are retained throughout (Boloix-Tortosa et al., 2015, Boloix-Tortosa et al., 2015).
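For the proper case, the predictive equations reduce to standard GP regression applied to complex observations. A minimal sketch (plain NumPy; the real RBF kernel and the test function $e^{jx}$ are illustrative choices, valid because a real positive-definite kernel is in particular Hermitian):

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Real RBF kernel — a valid Hermitian covariance for a proper CGP."""
    return np.exp(-0.5 * np.subtract.outer(a, b) ** 2 / ell**2)

def cgpr_predict(X, y, Xstar, noise_var=1e-2):
    """Strictly linear CGPR posterior, valid when the prior is proper (pseudo-covariance ≡ 0)."""
    Kxx = rbf(X, X) + noise_var * np.eye(len(X))
    Ksx = rbf(Xstar, X)
    mu = Ksx @ np.linalg.solve(Kxx, y)            # K_*X (K_XX + σ²I)^{-1} y
    v = np.linalg.solve(Kxx, Ksx.T)
    var = np.diag(rbf(Xstar, Xstar)) - np.einsum("ij,ji->i", Ksx, v)
    return mu, var.real

rng = np.random.default_rng(1)
X = np.linspace(0, 4, 25)
y = np.exp(1j * X) + 0.03 * (rng.standard_normal(25) + 1j * rng.standard_normal(25))
Xstar = np.array([1.0, 2.5])
mu, var = cgpr_predict(X, y, Xstar)               # mu ≈ e^{j x*}
```

Note that the posterior mean is complex while the predictive variance stays real, exactly as the proper-case formulas dictate; the improper case would additionally carry pseudo-covariance blocks through both solves.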

Recursive CGPR and dictionary-based updating schemes allow efficient online kernel-based system identification, particularly in communications and adaptive filtering (Boloix-Tortosa et al., 2015).

4. Simulation and Sampling Algorithms

Efficient simulation of CGPs, especially in the stationary case, is achieved via circulant embedding:

  • Embed the Toeplitz covariance matrix as the leading block of a larger circulant matrix.
  • Compute eigenvalues via FFT; sample spectral-domain Gaussian vectors; transform back via inverse FFT.
  • Exactness requires all eigenvalues to be nonnegative, with theoretical conditions (Craigmile, Dietrich–Newsam) guaranteeing validity for broad kernel classes (Coeurjolly et al., 2016, Sykulski et al., 2016).
  • For vector-valued or multivariate signals, the embedding method generalizes via block-circulant structure with analogous PSD requirements.
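The steps above can be sketched as follows for a proper complex stationary process (the exponentially decaying complex AR(1)-type covariance, the real padding lag, and the clipping of tiny negative eigenvalues are illustrative choices):

```python
import numpy as np

def circulant_embedding_sample(r, n_samples, rng):
    """Draw proper complex Gaussian vectors with E[z_j z_k*] = r[j-k]
    (r given for lags 0..n-1) via circulant embedding and the FFT."""
    n = len(r)
    # Hermitian circulant extension of length 2n: c[2n-k] = conj(c[k]).
    # The padding value at lag n must be real; r[-1].real is a convenient choice.
    c = np.concatenate([r, [r[-1].real], np.conj(r[:0:-1])])
    lam = np.fft.fft(c).real               # circulant eigenvalues (real by symmetry)
    lam = np.clip(lam, 0.0, None)          # clip tiny negatives if embedding is inexact
    m = len(c)
    w = (rng.standard_normal((n_samples, m))
         + 1j * rng.standard_normal((n_samples, m))) / np.sqrt(2)
    z = np.fft.ifft(np.sqrt(lam) * w, axis=1) * np.sqrt(m)
    return z[:, :n]                        # keep the leading Toeplitz block

rng = np.random.default_rng(0)
rho, omega, n = 0.8, 0.5, 64
lags = np.arange(n)
r = rho**lags * np.exp(1j * omega * lags)  # Hermitian covariance, exponential decay
z = circulant_embedding_sample(r, 20_000, rng)
```

The cost is $O(m \log m)$ per batch via the FFT, versus $O(n^3)$ for a dense factorization, which is what makes the method attractive for long stationary records.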

Sampling via direct covariance decomposition (Cholesky or eigendecomposition) is standard for non-stationary scenarios or general covariance forms, and is crucial for quantum noise synthesis and open-system dynamics (Chen et al., 2013).
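A decomposition-based sampler for the proper case is only a few lines (the modulated-RBF covariance on an irregular grid and the `jitter` regularizer below are illustrative choices, not from the cited work):

```python
import numpy as np

def sample_proper_cgp(K, n_samples, rng, jitter=1e-10):
    """Draw proper complex Gaussian vectors with Hermitian covariance K:
    z = L w with K = L L^H and w circularly symmetric, so E[z z^H] = K."""
    L = np.linalg.cholesky(K + jitter * np.eye(K.shape[0]))
    w = (rng.standard_normal((K.shape[0], n_samples))
         + 1j * rng.standard_normal((K.shape[0], n_samples))) / np.sqrt(2)
    return (L @ w).T                       # one sample per row

# Hermitian covariance on an irregular grid: RBF envelope × complex phase.
rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 1, 30))
d = np.subtract.outer(x, x)
K = np.exp(-10 * d**2) * np.exp(2j * d)
z = sample_proper_cgp(K, 50_000, rng)
```

An improper process would instead require factorizing the augmented covariance of $(z, z^*)$, which carries the pseudo-covariance in its off-diagonal blocks.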

5. Applications in System Identification and Robust Control

Complex GPs are widely used as nonparametric priors in Bayesian system identification and uncertainty modeling:

  • Frequency-domain system identification: CGPs (especially those with sample paths in $H_\infty$) serve as prior models for unknown transfer functions, yielding posterior estimators that remain analytic and BIBO stable after conditioning.
  • Probabilistic robust control: the $H_\infty$ property allows random draws from the GP to be interpreted as causal, stable transfer functions, integrating directly into robust-control frameworks (μ-analysis, disk margins, IQC certification), with explicit probabilistic bounds on worst-case system gain via tail and excursion-probability inequalities (Devonport et al., 2022, Devonport et al., 2023).
  • Bayesian uncertainty bounds: applying Belyaev's formula and boundary level-crossing theory yields computable upper bounds on the probability that a random transfer function violates given constraints (circle IQCs), supporting certified robust inference from data (Devonport et al., 2023, Devonport et al., 2022).

Mixtures of stationary $H_\infty$ kernels and resonance-tuned kernels are often employed, with hyperparameters fit via empirical Bayes (Devonport et al., 2022).

6. Extensions: Topology, Higher-Order Structure, and Multidimensional Domains

CGPs naturally extend to topologically rich domains, such as graphs and cellular complexes, via generalizations of Laplacian and Dirac operators:

  • Cellular/graph Matérn kernels: defined using the spectrum of the Hodge Laplacian for $k$-cochains, enabling inference over vertices, edges, and higher cells simultaneously (Alain et al., 2023).
  • Reaction-diffusion kernels: Incorporate cross-dimensional (vertex–edge–face) covariance links, facilitating learning from polyadic interactions, finite element simulations, hypergraph social/biological data, and brain network motifs.

Spectral formulations and kernel constructions allow interpretable priors sensitive to underlying topology and homology, supporting large-scale high-dimensional inference (Alain et al., 2023).
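As a minimal sketch of such spectral kernel constructions (here using an ordinary graph Laplacian of a 5-vertex path for concreteness; the Hodge Laplacian of a cellular complex would slot in identically, and the Matérn spectral filter $\Phi(\lambda) = (2\nu/\kappa^2 + \lambda)^{-\nu}$ follows the graph-Matérn literature):

```python
import numpy as np

def graph_matern_kernel(L, kappa=1.0, nu=1.5, sigma2=1.0):
    """Matérn-type kernel from a (graph/Hodge) Laplacian L: K = σ² Φ(L),
    with Φ(λ) = (2ν/κ² + λ)^{-ν}, via the eigendecomposition of L."""
    evals, evecs = np.linalg.eigh(L)
    phi = (2 * nu / kappa**2 + evals) ** (-nu)    # positive spectral filter → PSD kernel
    K = (evecs * phi) @ evecs.T
    return sigma2 * K / np.mean(np.diag(K))       # normalize average prior variance to σ²

# Path-graph Laplacian on 5 vertices (illustrative domain).
A = np.diag(np.ones(4), 1) + np.diag(np.ones(4), -1)
L = np.diag(A.sum(axis=1)) - A
K = graph_matern_kernel(L)
```

Because the filter $\Phi$ is strictly positive on the Laplacian spectrum, the resulting Gram matrix is positive semidefinite by construction, and smoothness is controlled by $\nu$ exactly as in the Euclidean Matérn family.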

7. Signal Processing, Quantum Noise, and Time Series Analysis

Complex GPs provide foundational Bayesian generalizations of analytic signal processing:

  • Quadrature kernels: covariances enforcing $\pm\pi/2$ phase relationships via the Hilbert transform, yielding analytic signals for time–frequency analysis (Ambrogioni et al., 2016).
  • Quasi-quadrature kernels: Relaxed analyticity for signals with approximate quadrature structure, offering advantages in amplitude and frequency estimation under realistic noise models.
  • Quantum environments: Construction of complex GPs whose covariances encode arbitrary spectral densities, directly reproducing thermal quantum phonon baths in open system dynamics (via the Feynman–Vernon influence functional) (Chen et al., 2013).
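The quadrature-kernel idea can be sketched numerically: retaining only nonnegative frequencies of a real stationary kernel's spectrum reproduces the real kernel exactly as the real part and pairs it with its Hilbert transform as the imaginary part (the damped-cosine kernel below is an illustrative choice):

```python
import numpy as np

def analytic_covariance(k_lag0_first):
    """Analytic-signal covariance: zero negative frequencies, double positive
    ones — equivalently k_a = k + j·H[k], with H the Hilbert transform."""
    n = len(k_lag0_first)
    S = np.fft.fft(k_lag0_first)
    mask = np.zeros(n)
    mask[0] = 1.0
    mask[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        mask[n // 2] = 1.0
    return np.fft.ifft(S * mask)

# Real stationary kernel on symmetric lags: Gaussian envelope × carrier cos(ω0 τ).
tau = np.arange(-256, 256)
k = np.exp(-0.005 * tau**2) * np.cos(0.3 * tau)
ka = np.fft.fftshift(analytic_covariance(np.fft.ifftshift(k)))

# Re(ka) recovers k exactly; Im(ka) ≈ envelope × sin(ω0 τ),
# the ±π/2 quadrature component.
```

For a narrowband kernel like this one, the imaginary part is close to the ideal quadrature component; quasi-quadrature kernels relax exactly this approximation.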

Enhanced accuracy over classical Hilbert/wavelet approaches and flexibility in prior design are demonstrated across oscillatory and nonlinear channel regimes (Ambrogioni et al., 2016, Boloix-Tortosa et al., 2015).

