
Probability Bracket Notation: Markov Sequence Projector of Visible and Hidden Markov Models in Dynamic Bayesian Networks (1212.3817v2)

Published 16 Dec 2012 in cs.AI and math.PR

Abstract: With the symbolic framework of Probability Bracket Notation (PBN), the Markov Sequence Projector (MSP) is introduced to expand the evolution formula of Homogeneous Markov Chains (HMCs). The well-known weather example, a Visible Markov Model (VMM), illustrates that the full joint probability of a VMM corresponds to a specifically projected Markov state sequence in the expanded evolution formula. In a Hidden Markov Model (HMM), the probability basis (P-basis) of the hidden Markov state sequence and the P-basis of the observation sequence exist in the sequential event space. The full joint probability of an HMM is the product of the (unknown) projected hidden sequence of Markov states and their transformations into the observation P-bases. The Viterbi algorithm is applied to the famous Weather-Stone HMM example to determine the most likely weather-state sequence given the observed stone-state sequence, and our results are verified using the Elvira software package. Using the PBN, we unify the evolution formulas of Markov models such as VMMs, HMMs, and factorial HMMs (with discrete time). We also briefly investigate the extended HMM, which addresses the feedback issue, and the continuous-time VMM and HMM (with discrete or continuous states). All of these models are subclasses of Dynamic Bayesian Networks (DBNs), which are essential to Machine Learning (ML) and AI.
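The Viterbi decoding mentioned in the abstract can be sketched in standard matrix notation rather than the paper's PBN formalism. Below is a minimal Python sketch of the Viterbi algorithm for a weather-stone style HMM; the state names, initial, transition, and emission probabilities are illustrative placeholders, not the values used in the paper or in its Elvira verification.

```python
# Minimal Viterbi sketch for a weather-stone style HMM.
# All probabilities below are illustrative placeholders, NOT the paper's values.
import numpy as np

states = ["Sunny", "Rainy"]            # hidden weather states (hypothetical)
observations = ["Dry", "Damp", "Wet"]  # observed stone states (hypothetical)

pi = np.array([0.6, 0.4])              # initial distribution P(x_1)
A = np.array([[0.7, 0.3],              # transition matrix P(x_t | x_{t-1})
              [0.4, 0.6]])
B = np.array([[0.6, 0.3, 0.1],         # emission matrix P(y_t | x_t)
              [0.1, 0.4, 0.5]])

def viterbi(obs_idx, pi, A, B):
    """Return the most likely hidden-state sequence for the given observation indices."""
    T, N = len(obs_idx), len(pi)
    # delta[t, j]: max log-probability of any state path ending in state j at time t
    delta = np.full((T, N), -np.inf)
    psi = np.zeros((T, N), dtype=int)   # back-pointers
    delta[0] = np.log(pi) + np.log(B[:, obs_idx[0]])
    for t in range(1, T):
        for j in range(N):
            scores = delta[t - 1] + np.log(A[:, j])
            psi[t, j] = np.argmax(scores)
            delta[t, j] = scores[psi[t, j]] + np.log(B[j, obs_idx[t]])
    # Backtrack from the best final state
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t, path[-1]]))
    return [states[i] for i in reversed(path)]

obs = [observations.index(o) for o in ["Dry", "Wet", "Wet"]]
print(viterbi(obs, pi, A, B))   # ['Sunny', 'Rainy', 'Rainy'] for these placeholder numbers
```

Working in log space avoids numerical underflow for longer observation sequences; the recursion itself is the standard dynamic-programming form of Viterbi decoding.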

Citations (3)

Authors (1)