Weakest Preexpectation Semantics for Bayesian Inference
Abstract: We present a semantics for a probabilistic while-language with soft conditioning and continuous distributions that handles programs which diverge with positive probability. To this end, we extend the probabilistic guarded command language (pGCL) with draws from continuous distributions and a score operator. The main contribution is an extension of the standard weakest preexpectation semantics to support these constructs. As a sanity check of our semantics, we define an alternative trace-based semantics of the language and show that the two semantics are equivalent. Various examples illustrate the applicability of the semantics.
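To give a flavour of the weakest preexpectation style the abstract refers to, the following is a minimal, hypothetical sketch (not the paper's formalism) of a wp transformer for a tiny discrete pGCL fragment with a score operator. A postexpectation `f` maps program states (dicts) to non-negative reals; `score` multiplies the expectation by its argument, modelling soft conditioning. The AST encoding and function names are illustrative assumptions.

```python
# Hypothetical sketch: weakest preexpectation wp(C, f) for a tiny
# discrete pGCL fragment. A postexpectation f maps states (dicts of
# variable values) to non-negative reals.

def wp(cmd, f):
    kind = cmd[0]
    if kind == "skip":
        return f
    if kind == "assign":            # ("assign", var, expr): x := e
        _, var, expr = cmd
        return lambda s: f({**s, var: expr(s)})
    if kind == "seq":               # ("seq", c1, c2): c1; c2
        _, c1, c2 = cmd
        return wp(c1, wp(c2, f))
    if kind == "choice":            # ("choice", p, c1, c2): c1 w.p. p, else c2
        _, p, c1, c2 = cmd
        g1, g2 = wp(c1, f), wp(c2, f)
        return lambda s: p * g1(s) + (1 - p) * g2(s)
    if kind == "score":             # ("score", expr): soft conditioning,
        _, expr = cmd               # scales the expectation by expr(state)
        return lambda s: expr(s) * f(s)
    raise ValueError(f"unknown command: {kind}")

# Example: flip a fair coin into x, then apply soft evidence 2*x
# (favouring x = 1), and ask for the preexpectation of x.
prog = ("seq",
        ("choice", 0.5,
         ("assign", "x", lambda s: 1),
         ("assign", "x", lambda s: 0)),
        ("score", lambda s: 2 * s["x"]))

print(wp(prog, lambda s: s["x"])({}))  # → 1.0
```

Unrolling the rules by hand: `wp(score 2x, x) = 2x·x`, and averaging over the fair choice gives `0.5·(2·1·1) + 0.5·(2·0·0) = 1.0`, matching the printed value. Loops (the actual subject of the paper, since they introduce divergence) would be handled by a least-fixed-point rule, which this sketch omits.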