Hidden Quantum Markov Models (HQMMs)

Updated 18 February 2026
  • HQMMs are latent-variable quantum stochastic processes that generalize classical HMMs by encoding internal states as quantum density operators and evolving them via Kraus operators.
  • They offer significant advantages in representational power and state compression, enabling the modeling of complex dynamics beyond the capabilities of classical HMMs.
  • State-of-the-art learning algorithms and tensor network methods address computational challenges, making HQMMs practical for quantum simulation, control, and robust sequential modeling.

A Hidden Quantum Markov Model (HQMM) is a latent-variable quantum stochastic process that generalizes the classical hidden Markov model (HMM) by encoding the system's internal state as a quantum state and evolving it through quantum measurement channels. HQMMs provide a universal framework for modeling, simulating, and learning stochastic processes with quantum resources, strictly exceeding classical HMMs in representational power, state compression, and physical applicability—particularly in quantum biology, quantum information, and quantum machine learning.

1. Mathematical Formalism of HQMMs

An HQMM is specified by a finite-dimensional Hilbert space $\mathcal{H}$ (the "hidden" memory), an output alphabet $\mathcal{A}$, and a family of quantum operations, one per output symbol. The machine's internal state at time $t$ is a density operator $\rho_t \in \mathcal{B}(\mathcal{H})$ with $\rho_t \geq 0$ and $\mathrm{Tr}[\rho_t] = 1$. For each $x \in \mathcal{A}$ there is a quantum instrument (a completely positive, trace-nonincreasing map), written as a set of Kraus operators $\{K_{x,\mu}\}_{\mu=1}^{w_x}$ satisfying the completeness relation $\sum_{x \in \mathcal{A}} \sum_{\mu=1}^{w_x} K_{x,\mu}^\dagger K_{x,\mu} = \mathbb{I}_d$, which makes the symbol-averaged evolution trace-preserving. After emitting $x$ given $\rho_{t-1}$, the emission probability and post-measurement state are

$$P(x \mid \rho_{t-1}) = \mathrm{Tr}\left(\sum_{\mu} K_{x,\mu}\, \rho_{t-1}\, K_{x,\mu}^\dagger\right), \qquad \rho_t = \frac{\sum_{\mu} K_{x,\mu}\, \rho_{t-1}\, K_{x,\mu}^\dagger}{P(x \mid \rho_{t-1})}.$$

Iterating this procedure over a symbol sequence $x_{1:T}$ gives the joint probability

$$P(x_{1:T}) = \operatorname{Tr}\left(\sum_{\mu_T} K_{x_T,\mu_T} \cdots \sum_{\mu_1} K_{x_1,\mu_1}\, \rho_0\, K_{x_1,\mu_1}^\dagger \cdots K_{x_T,\mu_T}^\dagger\right).$$

This construct strictly generalizes classical HMMs: any stochastic process that can be generated classically also admits an HQMM realization with diagonal Kraus operators, but HQMMs can realize strictly richer dynamics (e.g., oscillatory, complex-valued, or long-range correlated outputs) and compress certain processes more efficiently (Monras et al., 2010, Clark et al., 2014, Adhikary et al., 2019).
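The filtering recursion above can be sketched in a few lines of NumPy. This is an illustrative toy (the function and variable names are ours, not from any HQMM library): a one-qubit HQMM with a two-symbol alphabet whose Kraus operators are diagonal projectors, so the completeness relation holds by construction.

```python
import numpy as np

def hqmm_step(rho, kraus_ops):
    """One filter step: map rho to {symbol: (P(x|rho), post-measurement state)}."""
    out = {}
    for x, Ks in kraus_ops.items():
        sigma = sum(K @ rho @ K.conj().T for K in Ks)    # CP map for symbol x
        p = np.trace(sigma).real                          # P(x | rho)
        out[x] = (p, sigma / p if p > 1e-12 else sigma)   # renormalized state
    return out

def sequence_probability(rho0, kraus_ops, symbols):
    """Joint probability P(x_1, ..., x_T) by iterating the filter."""
    rho, logp = rho0, 0.0
    for x in symbols:
        p, rho = hqmm_step(rho, kraus_ops)[x]
        logp += np.log(p)
    return float(np.exp(logp))

# Toy model: one diagonal Kraus operator per symbol; K0†K0 + K1†K1 = I.
K0 = np.diag([1.0, 0.0]).astype(complex)   # emit 0 from basis state |0>
K1 = np.diag([0.0, 1.0]).astype(complex)   # emit 1 from basis state |1>
kraus = {0: [K0], 1: [K1]}
rho0 = 0.5 * np.eye(2, dtype=complex)      # maximally mixed initial state
```

Under this toy model the first emission collapses the state onto the matching basis projector, so the sequence 0, 0, 0 has probability 1/2.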

2. Quantum–Classical Comparisons and Expressivity

HQMMs admit a continuous quantum memory (the spectraplex of PSD matrices) instead of discrete hidden states. For latent space dimension $n$:

  • Classical HMMs can be embedded into an HQMM of dimension $n$ by an appropriate choice of commuting Kraus operators (O'Neill et al., 2012, Cholewa et al., 2015).
  • For certain processes, a quantum model of dimension $d_Q$ can generate statistics that require a classical model of dimension $d_C = d_Q^2$, a quadratic separation in state compression (Zonnios et al., 2024, Sundar et al., 22 Jan 2026).
  • HQMMs form a trace-preserving subclass of observable operator models (OOMs) that never assign negative probabilities (Adhikary et al., 2019).
  • Quantum memory compression is operationally controlled by the entanglement (Schmidt spectrum) of the corresponding process tensor (Sundar et al., 22 Jan 2026).
  • Explicit constructions exist where a 2-dimensional HQMM matches the statistics of a 3-state classical HMM, demonstrating strict quantum memory advantage (Zonnios et al., 2024).
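The classical-embedding bullet can be made concrete. One standard construction (sketched below with illustrative names of our own) takes a column-stochastic transition matrix $T$ and an emission matrix $O$ and builds one rank-one Kraus operator $\sqrt{T_{ij} O_{xj}}\,|i\rangle\langle j|$ per (symbol, transition) triple; the density matrix then stays diagonal throughout, and its diagonal is exactly the classical belief vector.

```python
import numpy as np

def classical_hmm_to_kraus(T, O):
    """Embed a classical HMM as Kraus operators.

    T[i, j] = P(next state i | current state j)  (column-stochastic)
    O[x, j] = P(emit symbol x | current state j)
    Returns {symbol: [Kraus operators]} with K = sqrt(T[i,j] O[x,j]) |i><j|.
    """
    n, n_sym = T.shape[0], O.shape[0]
    kraus = {x: [] for x in range(n_sym)}
    for x in range(n_sym):
        for i in range(n):
            for j in range(n):
                K = np.zeros((n, n), dtype=complex)
                K[i, j] = np.sqrt(T[i, j] * O[x, j])
                kraus[x].append(K)
    return kraus

# A 2-state, 2-symbol example (numbers are arbitrary illustrations).
T = np.array([[0.9, 0.2], [0.1, 0.8]])
O = np.array([[0.7, 0.4], [0.3, 0.6]])
kraus = classical_hmm_to_kraus(T, O)
```

One filter step $\sum_K K\,\mathrm{diag}(p)\,K^\dagger$ then reproduces the classical forward recursion $p'(i) = \sum_j T_{ij} O_{xj}\, p(j)$ on the diagonal.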

Expressivity also encompasses compact representations of non-Markovian processes, “probability clocks,” and symbolic dynamics of measured quantum many-body states (e.g., the AKLT chain) (Souissi et al., 21 Dec 2025, Monras et al., 2010).

3. Learning Algorithms and Computational Aspects

The learning task is to estimate the Kraus operators (which, stacked, form a single matrix on the complex Stiefel manifold) from observed symbol sequences. Practical algorithms include:

  • Retraction-based constrained gradient descent on the Stiefel manifold, which maintains physical (CPTP) constraints on Kraus parameters and achieves rapid convergence with favorable scaling (Adhikary et al., 2019, Adhikary et al., 2019).
  • Evolutionary and ansatz-based optimization procedures aligned with HQMM circuit representations via Stinespring dilation (Markov et al., 2022).
  • Robust extensions (e.g., RILA) integrate entropy filtering and L1-penalization to counteract adversarial data corruption while preserving CPTP constraints (Ning, 27 Oct 2025).
  • Nonparametric Hilbert-space-embedding HQMMs (HSE-HQMMs) extend the formalism to continuous-valued features and kernel methods, supporting regression and Bayesian inference in RKHS (Srinivasan et al., 2018).
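The retraction step in the first bullet can be sketched as follows (function names are ours, assuming NumPy). Stacking all Kraus operators vertically into one matrix $\kappa$, the CPTP condition $\sum K^\dagger K = I$ becomes $\kappa^\dagger \kappa = I$, i.e. $\kappa$ lies on the complex Stiefel manifold; after each Euclidean gradient step one retracts back onto the manifold, here via the polar decomposition (the nearest feasible point in Frobenius norm).

```python
import numpy as np

def retract(kappa):
    """Polar retraction: nearest matrix satisfying kappa† kappa = I."""
    U, _, Vh = np.linalg.svd(kappa, full_matrices=False)
    return U @ Vh   # drop the singular values, keep the "angular" part

def stiefel_gradient_step(kappa, euclid_grad, lr=0.05):
    """Euclidean gradient step followed by retraction onto the manifold."""
    return retract(kappa - lr * euclid_grad)

# Example: a random stacked-Kraus matrix for w = 3 operators of size 2x2.
rng = np.random.default_rng(0)
kappa = retract(rng.standard_normal((6, 2)) + 1j * rng.standard_normal((6, 2)))
```

Slicing `kappa` back into $w$ blocks of shape $n \times n$ recovers a valid Kraus set after every step, so the learned model is always physical.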

The computational bottleneck is matrix multiplication ($O(w n^3)$ per sequence step) for the standard Kraus-parametrized HQMM, but scaling can be improved by exploiting tensor network and MPS structures, especially for high-order cyclic or circular HQMMs (c-HQMMs) (Javidian et al., 2021, Sundar et al., 22 Jan 2026).
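The tensor-network connection can be illustrated in the vectorized (Liouville) picture: each symbol's CP map becomes a transfer matrix $E_x = \sum_\mu \bar{K}_{x,\mu} \otimes K_{x,\mu}$ acting on the column-stacked state, and the sequence probability is a single matrix-product contraction. The dense $n^2 \times n^2$ sketch below (our own illustration) is for clarity only; the MPS/c-LPS methods cited above exploit this product structure without forming dense superoperators.

```python
import numpy as np

def superoperator(kraus_list):
    """E = sum_mu conj(K) ⊗ K, acting on column-stacked density matrices."""
    return sum(np.kron(K.conj(), K) for K in kraus_list)

def sequence_probability_vec(rho0, kraus_ops, symbols):
    """P(x_1..x_T) as the contraction <<I| E_{x_T} ... E_{x_1} |rho0>>."""
    E = {x: superoperator(Ks) for x, Ks in kraus_ops.items()}
    v = rho0.flatten(order='F')        # |rho0>>, column-stacked
    for x in symbols:
        v = E[x] @ v                   # one transfer-matrix step
    n = rho0.shape[0]
    return float((np.eye(n).flatten() @ v).real)   # <<I|v>> = trace
```

With column stacking, $\mathrm{vec}(K \rho K^\dagger) = (\bar{K} \otimes K)\,\mathrm{vec}(\rho)$, so this contraction agrees term by term with the Kraus-form filter.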

4. Physical Realizations and Tensor Network Structures

HQMMs offer a direct mapping to quantum physical systems:

  • Open quantum systems with instantaneous feedback and monitored Lindblad dynamics instantiate HQMMs; measurement records correspond to symbol output and internal state evolution to Kraus maps (Clark et al., 2014).
  • Sequential nonadaptive projective measurements of a matrix product state (e.g., 1D cluster or AKLT state) realize the HQMM model, elucidating the connection between quantum spin chains, entanglement, and Markovian statistics (Monras et al., 2010, Souissi et al., 21 Dec 2025).
  • Tensor network generalizations, such as c-HQMMs, represent stationary models as circular locally purified states (c-LPS), unifying HQMMs with tensor ring/MPS models and providing efficient learning procedures (Javidian et al., 2021).

The HQMM formalism thus underpins both physical implementations (quantum transport/feedback, monitored open systems) and abstract representations (tensorized machine learning models, stochastic language generators).

5. Applications and Problem Domains

HQMMs target domains where classical HMMs are limited by memory or correlation constraints. Applications include:

  • Quantum simulation and control, capturing measurement-induced phase transitions and inference limits in monitored dynamics (e.g., phase boundaries in quantum error correction, monitored circuits, and tree-structured systems) (Kim et al., 11 Apr 2025).
  • Probabilistic safety analysis and scenario generation, where HQMMs yield higher description accuracy (DA) and more compact models than classical HMMs on real failure data (Zaiou et al., 2022).
  • Sequence modeling and density estimation in language, robotics, and genomics, leveraging the efficient representation of quantum and classical correlations (Srinivasan et al., 2017, Srinivasan et al., 2018).
  • Explicit modeling of symmetry-protected topological order and entanglement structure in many-body physics through the HQMM lens (Souissi et al., 21 Dec 2025).
  • Quantum compression of arbitrary (ergodic) HMMs via deterministic dilation, tensor network conversion, and variational bond-dimension truncation, reproducing sequence-level statistics at reduced memory cost (Sundar et al., 22 Jan 2026).

Additionally, HQMMs furnish tools for robust quantum sequential learning under adversarial and non-differentiable conditions (Ning, 27 Oct 2025).

6. Variants, Generalizations, and Open Problems

Multiple HQMM variants extend the foundational model:

  • Circular HQMMs (c-HQMMs) and tensor-network HQMMs handle stationary and periodic boundaries, providing state-of-the-art sequence modeling with the lowest negative log-likelihood (NLL) on empirical datasets (Javidian et al., 2021).
  • Split HQMMs (SHQMMs), inspired by quantum conditional master equations, implement quantum Markov processes with environment-induced subspaces and explicit physical mappings to quantum transport systems (Li et al., 2023).
  • HQMM learning generalizes to nonparametric settings (HSE-HQMMs) and robust, adversarially secure optimization landscapes (Srinivasan et al., 2018, Ning, 27 Oct 2025).

Open research directions span:

  • Full characterization of the inclusion relations and gap between HQMM expressivity and OOMs.
  • Spectral invariants and algebraic characterizations of minimal quantum memory for a target process (Zonnios et al., 2024).
  • Integration with physical quantum hardware for native deployment and quantum-accelerated learning (Markov et al., 2022).
  • Algorithmic improvements for global nonconvex landscapes, e.g., second-order or Riemannian manifold methods; robust generalization in the presence of noise or data corruption (Ning, 27 Oct 2025).
  • Physical and mathematical exploration of measurement-induced and learnability phase transitions in quantum stochastic models (Kim et al., 11 Apr 2025).
