
Compressive Link Acquisition in Multiuser Communications (1209.3804v2)

Published 17 Sep 2012 in cs.IT and math.IT

Abstract: An important receiver operation is to detect the presence of specific preamble signals with unknown delays in the presence of scattering, Doppler effects, and carrier offsets. This task, referred to as "link acquisition", is typically a sequential search over the transmitted signal space. Recently, many authors have suggested applying sparse recovery algorithms in the context of similar estimation or detection problems. These works typically focus on the benefits of sparse recovery, but generally not on the cost incurred by compressive sensing. Thus, our goal is to examine the trade-off in complexity and performance that is possible when using sparse recovery. To do so, we propose a sequential sparsity-aware compressive sampling (C-SA) acquisition scheme, where a compressive multi-channel sampling (CMS) front-end is followed by a sparsity-regularized likelihood ratio test (SR-LRT) module. The proposed C-SA acquisition scheme borrows insights from the models studied in the context of sub-Nyquist sampling, where a minimal number of samples is captured to reconstruct signals with Finite Rate of Innovation (FRI). In particular, we propose an A/D conversion front-end that maximizes a well-known probability divergence measure, the average Kullback-Leibler distance, across all the hypotheses of the SR-LRT performed on the samples. We compare the proposed acquisition scheme vis-à-vis conventional alternatives with relatively low computational cost, such as the Matched Filter (MF), in terms of performance and complexity.
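As a rough illustration of the design criterion named in the abstract (not the paper's implementation), the average Kullback-Leibler distance over a set of delay hypotheses can be sketched for the Gaussian case: each delay hypothesis maps the preamble to a distinct mean vector after a compressive front-end, and the KL distance between two equal-covariance Gaussian hypotheses reduces to a scaled squared distance between means. All dimensions, the random front-end matrix, and the AWGN model below are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: K delay hypotheses observed through an M x N
# compressive front-end Phi (M << N); AWGN with variance sigma2 assumed.
N, M, K = 64, 16, 8
Phi = rng.standard_normal((M, N)) / np.sqrt(M)  # random compressive sampler
sigma2 = 0.5

# Hypothesis k: the preamble circularly shifted by an assumed delay of 4*k.
preamble = rng.standard_normal(N)
means = np.stack([Phi @ np.roll(preamble, 4 * k) for k in range(K)])

def avg_kl_gaussian(means, sigma2):
    """Average pairwise KL divergence between N(mu_i, sigma2*I) and
    N(mu_j, sigma2*I), which equals ||mu_i - mu_j||^2 / (2*sigma2)."""
    K = means.shape[0]
    total, pairs = 0.0, 0
    for i in range(K):
        for j in range(K):
            if i != j:
                diff = means[i] - means[j]
                total += diff @ diff / (2 * sigma2)
                pairs += 1
    return total / pairs

print(f"average KL distance: {avg_kl_gaussian(means, sigma2):.2f}")
```

A front-end design in this spirit would search over (or optimize) Phi to make this average as large as possible, so that the hypotheses fed to the likelihood ratio test are maximally separated; the random Phi here is only a placeholder for that optimization.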

Citations (20)
