Calibrated Semantic Diffusion: A p-Laplacian Synthesis with Learnable Dissipation, Quantified Constants, and Graph-Aware Calibration (2508.13658v1)
Abstract: We develop a calibrated diffusion framework by synthesizing three established concepts: linear Laplacian smoothing, nonlinear graph p-Laplacian flows, and a learnable dissipation term derived from a strongly convex potential. This synthesis provides a general model for graph-based diffusion with controllable dynamics. Our key theoretical results include a quantified two-regime decay analysis for $p>2$, which provides stronger, $p$-dependent transient bounds not captured by standard ISS templates, and the first formalization of a "non-synonymy" impossibility principle, which proves that fixed-parameter models cannot meet universal performance targets across graphs with varying spectral properties. To address this, we propose a constructive calibration algorithm (SGPS) with formal guarantees for achieving target rates and mass. We derive explicit, closed-form lower bounds for the graph $p$-gap on canonical graphs (a notable improvement over prior implicit estimates) and provide sharp constants for discrete-time and stochastic stability, including a contextualized restatement of the necessary and sufficient Euler step-size and a strengthened analysis of the stochastic noise floor. Illustrative, small-scale empirical validations confirm the tightness of key theoretical bounds.
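To make the combined model concrete, here is a minimal, hypothetical sketch (not the authors' code) of one explicit Euler step of a flow that mixes a linear Laplacian term, a graph p-Laplacian term, and a dissipation term from the simplest strongly convex potential $\psi(u) = \tfrac{\mu}{2}\|u\|^2$; the weights `alpha`, `beta`, `gamma`, `mu` and the step size `h` are illustrative assumptions, not the paper's calibrated (SGPS) parameters.

```python
# Sketch, assuming the flow  du/dt = -alpha * L u - beta * L_p u - gamma * grad_psi(u),
# where L is the combinatorial graph Laplacian,
# (L_p u)_i = sum_j w_ij |u_i - u_j|^{p-2} (u_i - u_j)  is the graph p-Laplacian,
# and psi(u) = (mu/2) ||u||^2 is a strongly convex dissipation potential.
import numpy as np

def p_laplacian(W, u, p):
    """Graph p-Laplacian applied to a node signal u (W: symmetric weight matrix)."""
    diff = u[:, None] - u[None, :]                  # pairwise differences u_i - u_j
    return np.sum(W * np.abs(diff) ** (p - 2) * diff, axis=1)

def euler_step(W, u, h, alpha=1.0, beta=1.0, gamma=0.1, mu=1.0, p=3.0):
    """One explicit Euler step of the combined flow; h must respect a step-size bound."""
    deg = W.sum(axis=1)
    L_u = deg * u - W @ u                           # linear Laplacian term (L u)_i
    Lp_u = p_laplacian(W, u, p)                     # nonlinear p-Laplacian term
    grad_psi = mu * u                               # gradient of psi(u) = (mu/2)||u||^2
    return u - h * (alpha * L_u + beta * Lp_u + gamma * grad_psi)

# Toy example: dissipative diffusion on a weighted 4-cycle.
W = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
u = np.array([1.0, 0.0, -1.0, 0.0])
for _ in range(50):
    u = euler_step(W, u, h=0.05)
print(u)  # the signal contracts toward zero under the dissipative flow
```

This is only a schematic instance of the general model described in the abstract; the paper's contributions concern the decay rates, step-size conditions, and calibration guarantees for such flows rather than any particular implementation.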