High-Rate Quantization for the Neyman-Pearson Detection of Hidden Markov Processes (1003.2914v1)

Published 15 Mar 2010 in cs.IT and math.IT

Abstract: This paper investigates the decentralized detection of Hidden Markov Processes using the Neyman-Pearson test. We consider a network formed by a large number of distributed sensors. The sensors' observations are noisy snapshots of a Markov process to be detected. Each (real) observation is quantized to log2(N) bits before being transmitted to a fusion center, which makes the final decision. For any false alarm level, it is shown that the miss probability of the Neyman-Pearson test converges to zero exponentially as the number of sensors tends to infinity. The error exponent is provided using recent results on Hidden Markov Models. In order to obtain informative expressions for the error exponent as a function of the quantization rule, we further investigate the case where the number N of quantization levels tends to infinity, following the approach developed in [Gupta & Hero, 2003]. In this regime, we provide the quantization rule maximizing the error exponent. Our results are illustrated on the detection of a Gauss-Markov signal in noise. In terms of error exponent, the proposed quantization rule significantly outperforms the one proposed by [Gupta & Hero, 2003] for i.i.d. observations.
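The sketch below is a minimal numerical illustration of the setup described in the abstract, not the paper's construction: sensors observe a Gauss-Markov (AR(1)) signal in additive Gaussian noise, each observation is quantized, and a fusion center runs a Neyman-Pearson likelihood-ratio test on the quantized samples. Everything not stated in the abstract is an assumption for illustration only: the uniform quantizer (the paper derives an optimized rule), the grid (point-mass) approximation of the continuous hidden state used to evaluate the likelihood under H1 with the forward algorithm, the Monte-Carlo calibration of the threshold, and all parameter values.

```python
# Hypothetical sketch: Neyman-Pearson detection of an AR(1) (Gauss-Markov) signal
# in Gaussian noise from quantized sensor observations.
# Simplifications not from the paper: uniform quantizer, grid approximation of the
# hidden state, Monte-Carlo threshold calibration, illustrative parameter values.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Illustrative model parameters (not taken from the paper)
a, sigma_state, sigma_noise = 0.9, 1.0, 1.0   # AR(1): x_k = a*x_{k-1} + w_k
n_sensors = 200                                # number of sensor observations
N = 8                                          # quantization levels (log2(N) = 3 bits)
edges = np.linspace(-4.0, 4.0, N - 1)          # uniform quantizer cell edges

def quantize(y):
    """Map real observations to quantizer cell indices 0..N-1."""
    return np.searchsorted(edges, y)

# Grid (point-mass) approximation of the continuous hidden AR(1) state.
grid = np.linspace(-5.0, 5.0, 101)
stat_std = sigma_state / np.sqrt(1 - a**2)
pi0 = norm.pdf(grid, scale=stat_std); pi0 /= pi0.sum()            # stationary distribution
A = norm.pdf(grid[None, :], loc=a * grid[:, None], scale=sigma_state)
A /= A.sum(axis=1, keepdims=True)                                  # state transition matrix

# Emission probabilities under H1: P(cell j | state s) with observation = s + noise.
full_edges = np.concatenate(([-np.inf], edges, [np.inf]))
B = (norm.cdf(full_edges[None, 1:], loc=grid[:, None], scale=sigma_noise)
     - norm.cdf(full_edges[None, :-1], loc=grid[:, None], scale=sigma_noise))

# Under H0 the quantized samples are i.i.d.: cell probabilities of pure noise.
p0 = norm.cdf(full_edges[1:], scale=sigma_noise) - norm.cdf(full_edges[:-1], scale=sigma_noise)

def loglik_h1(z):
    """Forward algorithm on the discretized HMM: log P(z | H1)."""
    alpha = pi0 * B[:, z[0]]
    ll = np.log(alpha.sum()); alpha /= alpha.sum()
    for zk in z[1:]:
        alpha = (alpha @ A) * B[:, zk]
        ll += np.log(alpha.sum()); alpha /= alpha.sum()
    return ll

def llr(z):
    """Log-likelihood ratio log P(z|H1) - log P(z|H0) at the fusion center."""
    return loglik_h1(z) - np.log(p0[z]).sum()

def simulate(h1):
    """Generate one quantized observation sequence under H1 (signal) or H0 (noise)."""
    x = np.zeros(n_sensors)
    if h1:
        x[0] = rng.normal(scale=stat_std)
        for k in range(1, n_sensors):
            x[k] = a * x[k - 1] + rng.normal(scale=sigma_state)
    y = x + rng.normal(scale=sigma_noise, size=n_sensors)
    return quantize(y)

# Neyman-Pearson threshold calibrated by Monte Carlo for a given false-alarm level.
alpha_fa = 0.05
null_llrs = np.array([llr(simulate(h1=False)) for _ in range(200)])
threshold = np.quantile(null_llrs, 1 - alpha_fa)

h1_llrs = np.array([llr(simulate(h1=True)) for _ in range(200)])
print("empirical false-alarm rate:", np.mean(null_llrs > threshold))
print("empirical miss probability:", np.mean(h1_llrs <= threshold))
```

In this toy setting the miss probability shrinks as n_sensors grows, mirroring the exponential decay (error exponent) analyzed in the paper; the choice of quantizer cells is exactly the degree of freedom the paper optimizes in the high-rate regime.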

Citations (3)
