Random-step Markov processes (1410.1457v1)
Abstract: We explore two notions of stationary processes. The first is called a random-step Markov process, in which the stationary process of states, $(X_i)_{i \in \mathbb{Z}}$, has a stationary coupling with an independent process on the positive integers, $(L_i)_{i \in \mathbb{Z}}$, of `random look-back distances'. That is, $L_0$ is independent of the `past states', $(X_i, L_i)_{i<0}$, and for every positive integer $n$, the probability distribution on the `present', $X_0$, conditioned on the event $\{L_0 = n\}$ and on the past is the same as the probability distribution on $X_0$ conditioned on the `$n$-past', $(X_i)_{-n\leq i <0}$, and $\{L_0 = n\}$. A random Markov process is a generalization of a Markov chain of order $n$ and has the property that the distribution on the present given the past can be uniformly approximated given the $n$-past, for $n$ sufficiently large. Processes with the latter property are called uniform martingales and are closely related to the notion of a `continuous $g$-function'. We show that every stationary process on a countable alphabet that is a uniform martingale and is dominated by a finite measure is also a random Markov process, and that the random variables $(L_i)_{i \in \mathbb{Z}}$ and the associated coupling can be chosen so that the distribution on the present given the $n$-past and the event $\{L_0 = n\}$ is `deterministic': all probabilities are in $\{0,1\}$. In the case of finite alphabets, those random-step Markov processes for which $L_0$ can be chosen with finite expected value are characterized. For stationary processes on an uncountable alphabet, a stronger condition is also considered which is sufficient to imply that a process is a random Markov process. In addition, a number of examples are given throughout to show the sharpness of the results.
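The defining identity can be written out symbolically. The first display below restates the condition given in the abstract; the second is a sketch of the uniform martingale property under the assumed reading that `uniformly approximated' means convergence in total variation, uniformly over pasts, which may differ in detail from the paper's precise formulation.
\[
\mathbb{P}\bigl(X_0 \in A \mid \{L_0 = n\},\ (X_i, L_i)_{i<0}\bigr)
= \mathbb{P}\bigl(X_0 \in A \mid \{L_0 = n\},\ (X_i)_{-n \le i < 0}\bigr)
\quad \text{for every } n \ge 1,
\]
\[
\sup_{(x_i)_{i<0}} \bigl\| \mathbb{P}\bigl(X_0 \in \cdot \mid (X_i)_{i<0} = (x_i)_{i<0}\bigr)
- \mathbb{P}\bigl(X_0 \in \cdot \mid (X_i)_{-n \le i < 0} = (x_i)_{-n \le i < 0}\bigr) \bigr\|_{\mathrm{TV}}
\;\longrightarrow\; 0 \quad \text{as } n \to \infty.
\]
In the main countable-alphabet result, the coupling can moreover be chosen so that the conditional probabilities in the first display take only the values $0$ and $1$.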