
Iterative Cauchy Thresholding: Regularisation with a heavy-tailed prior (2003.12507v1)

Published 27 Mar 2020 in eess.IV

Abstract: In the machine learning era, sparsity continues to attract significant interest due to the benefits it provides to learning models. Algorithms that optimise the $\ell_0$- and $\ell_1$-norms are the common choices for achieving sparsity. In this work, an alternative algorithm is proposed, derived under the assumption that the coefficients in sparse domains follow a Cauchy distribution. The Cauchy distribution is known to capture heavy tails in the data, which are linked to sparse processes. We begin by deriving the Cauchy proximal operator and subsequently propose an algorithm for optimising a cost function that includes a Cauchy penalty term; we name this contribution Iterative Cauchy Thresholding (ICT). Results indicate that sparser solutions can be achieved using ICT in conjunction with a fixed over-complete discrete cosine transform dictionary under a sparse coding methodology.
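The scheme the abstract describes can be sketched as an ISTA-style proximal-gradient iteration in which the usual soft-thresholding step is replaced by the proximal map of a Cauchy penalty. The sketch below is an illustration under assumptions, not the authors' implementation: it assumes the penalty takes the standard Cauchy-prior form $\lambda \log(\gamma^2 + x^2)$, so the elementwise proximal map is a root of the cubic stationarity condition $x^3 - ux^2 + (\gamma^2 + 2\lambda)x - u\gamma^2 = 0$; the parameter names (`gamma`, `lam`, `step`) and the root-selection rule are choices made here for clarity.

```python
import numpy as np

def cauchy_prox(u, gamma, lam):
    """Proximal map of the (assumed) Cauchy penalty lam * log(gamma^2 + x^2).

    Per element, minimises 0.5*(x - u)**2 + lam*log(gamma**2 + x**2);
    setting the derivative to zero gives the cubic
        x^3 - u*x^2 + (gamma^2 + 2*lam)*x - u*gamma^2 = 0,
    whose real roots we enumerate, keeping the one with smallest objective.
    """
    u = np.asarray(u, dtype=float)
    out = np.empty_like(u)
    for i, ui in np.ndenumerate(u):
        roots = np.roots([1.0, -ui, gamma**2 + 2.0*lam, -ui * gamma**2])
        real = roots[np.abs(roots.imag) < 1e-7].real
        obj = 0.5*(real - ui)**2 + lam*np.log(gamma**2 + real**2)
        out[i] = real[np.argmin(obj)]
    return out

def ict(A, y, gamma=0.1, lam=0.05, step=None, n_iter=200):
    """Iterative Cauchy Thresholding sketch: a gradient step on the data
    fidelity 0.5*||y - A x||^2 followed by the Cauchy proximal map
    (the analogue of soft thresholding in ISTA)."""
    if step is None:
        step = 1.0 / np.linalg.norm(A, 2)**2  # 1/L, L = Lipschitz const.
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = cauchy_prox(x - step * A.T @ (A @ x - y), gamma, lam * step)
    return x
```

In a sparse-coding setting, `A` would be the fixed over-complete DCT dictionary mentioned in the abstract; the shrinkage behaviour of `cauchy_prox` is milder than soft thresholding for large coefficients, which is how the heavy-tailed prior preserves strong activations while suppressing small ones.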
