A Confirmation of a Conjecture on the Feldman's Two-armed Bandit Problem (2206.00821v1)
Published 2 Jun 2022 in math.ST, stat.ML, and stat.TH
Abstract: The myopic strategy is one of the most important strategies in the study of bandit problems. In this paper, we consider the two-armed bandit problem proposed by Feldman. For general distributions and utility functions, we obtain a necessary and sufficient condition for the optimality of the myopic strategy. As an application, we resolve Nouiehed and Ross's conjecture for Bernoulli two-armed bandit problems, namely that the myopic strategy stochastically maximizes the number of wins.
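For concreteness, below is a minimal simulation sketch of the myopic strategy in Feldman's classical Bernoulli setting: two arms whose success probabilities α and β are swapped under two competing hypotheses, with a prior ξ on the first hypothesis, and the player always pulling the arm with the larger posterior expected one-step reward. The function name, parameters, and this symmetric two-hypothesis setup are illustrative assumptions, not code or notation taken from the paper.

```python
import random


def myopic_feldman(alpha, beta, xi, horizon, true_hypothesis, seed=0):
    """Simulate the myopic strategy for Feldman's two-armed Bernoulli bandit.

    Hypothesis H1: arm 0 succeeds with prob alpha, arm 1 with prob beta.
    Hypothesis H2: the two parameters are swapped.
    xi is the prior probability of H1; returns the number of wins over `horizon` pulls.
    """
    rng = random.Random(seed)
    wins = 0
    for _ in range(horizon):
        # Posterior expected success probability of each arm under the current belief xi.
        p0 = xi * alpha + (1 - xi) * beta
        p1 = xi * beta + (1 - xi) * alpha
        # Myopic choice: maximize the one-step expected reward.
        arm = 0 if p0 >= p1 else 1

        # True success probability of the chosen arm under the actual hypothesis.
        if true_hypothesis == 1:
            p_true = alpha if arm == 0 else beta
        else:
            p_true = beta if arm == 0 else alpha
        success = rng.random() < p_true
        wins += success

        # Bayes update of xi given the outcome observed on the chosen arm.
        if arm == 0:
            like_h1, like_h2 = (alpha, beta) if success else (1 - alpha, 1 - beta)
        else:
            like_h1, like_h2 = (beta, alpha) if success else (1 - beta, 1 - alpha)
        xi = xi * like_h1 / (xi * like_h1 + (1 - xi) * like_h2)
    return wins


# Example: alpha = 0.8, beta = 0.3, uninformative prior, H1 true, 20 pulls.
print(myopic_feldman(0.8, 0.3, 0.5, horizon=20, true_hypothesis=1))
```

The paper's result concerns a stronger property than maximizing the expected number of wins: under the stated condition, the myopic rule stochastically dominates any other strategy in the distribution of the win count, which the simulation above can only illustrate empirically.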