Computable Robustness Bounds to Transition and Measurement Kernel Perturbations and Approximations in Partially Observable Stochastic Control (2508.10658v1)
Abstract: Studying the stability of partially observed Markov decision processes (POMDPs) with respect to perturbations in either transition or observation kernels is a mathematically and practically important problem. While asymptotic robustness/stability results have previously been reported, showing stability as approximate transition kernels and/or measurement kernels converge to the true ones in appropriate senses, explicit and uniform bounds on value differences and mismatch costs have not, to our knowledge, been studied. In this paper, we provide such explicit bounds under both discounted and average cost criteria. The bounds are given in terms of Wasserstein and total variation distances between the original and approximate transition kernels, and total variation distances between the observation channels. In particular, we show that control policies optimized for approximate models yield explicit performance guarantees when applied to the true model. As a particular application, we consider the case where the state and measurement spaces are quantized to obtain finite models, and we obtain explicit error bounds which decay to zero as the approximations become finer. This provides explicit performance guarantees for model reduction in POMDPs.
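For reference, the Wasserstein and total variation distances invoked in the abstract are the standard ones; a brief reminder of their dual (variational) forms is given below. The notation ($\mu$, $\nu$, $\mathrm{Lip}_1$) is generic and not taken from the paper, and the normalization convention for total variation varies by a factor of two across references.

\[
W_1(\mu,\nu) \;=\; \sup_{f \in \mathrm{Lip}_1}\left|\int f\,d\mu - \int f\,d\nu\right|,
\qquad
\|\mu-\nu\|_{TV} \;=\; \sup_{\|f\|_\infty \le 1}\left|\int f\,d\mu - \int f\,d\nu\right|,
\]
where $\mathrm{Lip}_1$ denotes the set of 1-Lipschitz functions on the underlying metric space and the supremum in the second expression is over measurable functions bounded by one.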