An Information Theoretic Analysis of Sequential Decision-Making (1510.08952v3)
Abstract: We provide a novel analysis of Wald's sequential probability ratio test based on information-theoretic measures for symmetric thresholds, symmetric noise, and equally likely hypotheses, under the assumption that the test terminates exactly at one of the thresholds. This test is optimal in the sense that it yields the minimum mean decision time. To analyze the decision-making process we consider information densities, which represent the stochastic information content of the observations and yield a stochastic termination time of the test. Based on this, we show that the conditional probability of deciding for hypothesis $\mathcal{H}_1$ (or the counter-hypothesis $\mathcal{H}_0$), given that the test terminates at time instant $k$, is independent of $k$. An analogous property has been found for a continuous-time first-passage problem with two absorbing boundaries in the contexts of non-equilibrium statistical physics and communication theory. Moreover, we study the evolution of the mutual information between the binary variable to be tested and the output of the Wald test. Notably, we show that the decision time of the Wald test contains no information about which hypothesis is true beyond that already conveyed by the decision outcome.
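To make the setting concrete, the following is a minimal simulation sketch of the symmetric binary SPRT described above: equally likely hypotheses, symmetric Gaussian noise, and symmetric thresholds $\pm A$ on the accumulated log-likelihood ratio (the information-density increments). It empirically tabulates the conditional probability of deciding for $\mathcal{H}_1$ given termination at time $k$, which the abstract states is independent of $k$ under the idealization that the test terminates exactly at a threshold. The Gaussian observation model, the parameter names (`mu`, `sigma`, `A`), and their values are illustrative assumptions, not the paper's notation; in discrete time the log-likelihood ratio overshoots the threshold, so the empirical probabilities will only approximately match the idealized result.

```python
# Illustrative sketch (not the paper's code): symmetric binary SPRT with Gaussian
# observations. H1: mean +mu, H0: mean -mu, equal priors, thresholds +/-A on the
# accumulated log-likelihood ratio.
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 0.5, 1.0   # assumed symmetric Gaussian model: means +/-mu, std sigma
A = 3.0                # assumed symmetric decision thresholds: +A -> H1, -A -> H0
n_trials = 50_000

decisions, times = [], []
for _ in range(n_trials):
    h1 = rng.random() < 0.5                  # equally likely hypotheses
    mean = mu if h1 else -mu
    llr = 0.0                                # accumulated log-likelihood ratio
    k = 0
    while -A < llr < A:
        k += 1
        y = rng.normal(mean, sigma)
        # information-density increment: log p(y|H1) - log p(y|H0) = 2*mu*y/sigma^2
        llr += 2.0 * mu * y / sigma**2
    decisions.append(llr >= A)               # upper threshold crossed -> decide H1
    times.append(k)

decisions = np.array(decisions)
times = np.array(times)

# Empirical P(decide H1 | termination at time k): under the no-overshoot
# idealization this conditional probability does not depend on k.
for k in np.unique(times)[:10]:
    mask = times == k
    print(f"k={k:3d}  P(H1 decided | T=k) ~ {decisions[mask].mean():.3f}  (n={mask.sum()})")
```

In this sketch the near-constancy of the printed conditional probabilities across $k$ also illustrates the final claim of the abstract: conditioned on the decision outcome, the decision time carries essentially no further information about which hypothesis is true.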