Minimax Robust Hypothesis Testing
Abstract: The minimax robust hypothesis testing problem for the case where the nominal probability distributions are subject to both modeling errors and outliers is studied in two parts. First, a robust hypothesis testing scheme based on a relative entropy distance is designed. This approach provides robustness with respect to modeling errors and generalizes an earlier test proposed by Levy. It is then shown that this scheme can be combined with Huber's robust test through a composite uncertainty class, for which the existence of a saddle value is also proven. Both the composite version of the robust hypothesis testing scheme and the individual robust tests are extended to fixed sample size and sequential probability ratio tests. The composite model is also shown to extend to robust estimation problems. Simulation results are provided to validate the theoretical findings.
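As a point of reference for how a Huber-style robust test of the kind referenced in the abstract operates, the sketch below implements a fixed sample size decision rule based on a clipped log-likelihood ratio, which limits the influence of outliers on the test statistic. The Gaussian hypotheses, clipping thresholds, contamination model, and sample sizes are illustrative assumptions, not the setup used in the paper.

```python
import numpy as np

def nominal_llr(y, mu0=0.0, mu1=1.0, sigma=1.0):
    """Log-likelihood ratio log p1(y)/p0(y) for N(mu1, sigma^2) vs N(mu0, sigma^2)."""
    return ((mu1 - mu0) * y + 0.5 * (mu0**2 - mu1**2)) / sigma**2

def clipped_llr(y, lower=-2.0, upper=2.0, **kwargs):
    """Huber-style robust statistic: the nominal LLR clipped to [lower, upper].

    Clipping bounds the contribution of any single (possibly outlying) sample.
    The clipping levels here are assumed values chosen for illustration.
    """
    return np.clip(nominal_llr(y, **kwargs), lower, upper)

def fixed_sample_test(samples, threshold=0.0):
    """Fixed sample size test: decide H1 (return 1) if the accumulated
    clipped LLR exceeds the threshold, otherwise decide H0 (return 0)."""
    return int(np.sum(clipped_llr(samples)) > threshold)

# Example: data drawn mostly from H1, with a few gross outliers mixed in.
rng = np.random.default_rng(0)
clean = rng.normal(1.0, 1.0, size=95)            # nominal H1 samples
outliers = rng.normal(0.0, 1.0, size=5) + 50.0   # contaminating outliers
decision = fixed_sample_test(np.concatenate([clean, outliers]))
print("decision:", decision)  # 1 -> H1, 0 -> H0
```

In this sketch the outliers would dominate an unclipped likelihood ratio test; clipping keeps their contribution bounded, which is the basic mechanism behind the robustness properties discussed in the paper.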