Two-Dimensional Pattern-Coupled Sparse Bayesian Learning via Generalized Approximate Message Passing (1505.06270v1)

Published 23 May 2015 in cs.IT and math.IT

Abstract: We consider the problem of recovering two-dimensional (2-D) block-sparse signals with *unknown* cluster patterns. Two-dimensional block-sparse patterns arise naturally in many practical applications such as foreground detection and inverse synthetic aperture radar imaging. To exploit the block-sparse structure, we introduce a 2-D pattern-coupled hierarchical Gaussian prior model to characterize the statistical pattern dependencies among neighboring coefficients. Unlike the conventional hierarchical Gaussian prior model, where each coefficient is associated independently with a unique hyperparameter, the pattern-coupled prior for each coefficient involves not only its own hyperparameter but also its immediate neighboring hyperparameters. Thus the sparsity patterns of neighboring coefficients are related to each other, and the hierarchical model has the potential to encourage 2-D structured-sparse solutions. An expectation-maximization (EM) strategy is employed to obtain the maximum a posteriori (MAP) estimate of the hyperparameters, along with the posterior distribution of the sparse signal. In addition, the generalized approximate message passing (GAMP) algorithm is embedded into the EM framework to efficiently compute an approximation of the posterior distribution of hidden variables, which results in a significant reduction in computational complexity. Numerical results are provided to illustrate the effectiveness of the proposed algorithm.
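The central modeling idea is that the prior precision of each coefficient is tied not only to its own hyperparameter but also to the hyperparameters of its immediate neighbors, so that a coefficient driven toward zero also discourages nonzero values in its neighborhood. The sketch below illustrates one plausible form of this coupling in Python; the additive combination with a coupling weight `beta`, the four-neighbor (up/down/left/right) stencil, and the function name are assumptions made for illustration, not details taken from the abstract.

```python
import numpy as np

def pattern_coupled_variances(alpha, beta=0.5):
    """Effective prior variances under a 2-D pattern-coupled Gaussian prior.

    Assumed form (for illustration): each coefficient x[i, j] is zero-mean
    Gaussian whose precision combines its own hyperparameter alpha[i, j]
    with those of its four immediate neighbors, weighted by beta.
    """
    padded = np.pad(alpha, 1, mode="constant")               # zero-pad boundary
    neighbor_sum = (padded[:-2, 1:-1] + padded[2:, 1:-1]     # up + down
                    + padded[1:-1, :-2] + padded[1:-1, 2:])  # left + right
    precision = alpha + beta * neighbor_sum
    return 1.0 / precision

# Example: draw one 8x8 signal from the coupled prior for a random
# hyperparameter grid (purely illustrative values).
rng = np.random.default_rng(0)
alpha = rng.gamma(shape=1.0, scale=1.0, size=(8, 8))
var = pattern_coupled_variances(alpha, beta=0.5)
x = rng.normal(0.0, np.sqrt(var))
```

Because a large hyperparameter inflates the precision of its neighbors as well as its own coefficient, zeros tend to appear in contiguous 2-D blocks rather than in isolation, which is the structured-sparsity effect the EM/GAMP procedure described in the abstract then exploits.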

Citations (77)
