
Speckle Reduction using Stochastic Distances (1207.0704v1)

Published 3 Jul 2012 in cs.IT, cs.CV, cs.GR, math.IT, stat.AP, and stat.ML

Abstract: This paper presents a new approach to filter design based on stochastic distances and tests between distributions. A window is defined around each pixel, samples are compared, and only those that pass a goodness-of-fit test are used to compute the filtered value. The technique is applied to intensity Synthetic Aperture Radar (SAR) data using the Gamma model with a varying number of looks, thus allowing for changes in heterogeneity. Modified Nagao-Matsuyama windows are used to define the samples. The proposal is compared with Lee's filter, which is considered a standard, using a simulation-based protocol. Among the criteria used to quantify filter quality, we employ the equivalent number of looks (related to the signal-to-noise ratio), line contrast, and edge preservation. We also assess the filters with the Universal Image Quality Index and Pearson's correlation between edges.
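The filtering scheme the abstract describes (compare neighbourhood samples around each pixel, keep only those that pass a goodness-of-fit test, average the survivors) can be sketched as follows. This is a minimal illustration, not the authors' method: the paper derives stochastic distances between Gamma distributions and uses modified Nagao-Matsuyama windows, whereas this sketch substitutes a two-sample Kolmogorov-Smirnov test as the goodness-of-fit criterion and uses four simple overlapping side windows. The function name `gof_speckle_filter` and all parameters are hypothetical.

```python
import numpy as np
from scipy.stats import ks_2samp

def gof_speckle_filter(img, radius=2, alpha=0.1):
    """Hypothetical sketch: for each pixel, test side sub-windows
    against the central 3x3 sample; average only samples whose
    distribution matches (goodness-of-fit test not rejected)."""
    h, w = img.shape
    out = img.astype(float).copy()
    r = radius
    for i in range(r, h - r):
        for j in range(r, w - r):
            center = img[i - 1:i + 2, j - 1:j + 2].ravel()
            accepted = [center]
            # four overlapping side windows; a stand-in for the
            # modified Nagao-Matsuyama windows used in the paper
            subs = [
                img[i - r:i + 1, j - r:j + r + 1],  # top
                img[i:i + r + 1, j - r:j + r + 1],  # bottom
                img[i - r:i + r + 1, j - r:j + 1],  # left
                img[i - r:i + r + 1, j:j + r + 1],  # right
            ]
            for s in subs:
                # KS test as a simple goodness-of-fit surrogate for
                # the paper's stochastic-distance hypothesis test
                _, p = ks_2samp(center, s.ravel())
                if p > alpha:  # same distribution: keep the sample
                    accepted.append(s.ravel())
            out[i, j] = np.concatenate(accepted).mean()
    return out
```

On a homogeneous region with Gamma-distributed speckle, most sub-windows pass the test and the filter averages aggressively; near an edge, sub-windows straddling the discontinuity are rejected, which is what preserves line contrast and edges in the paper's evaluation.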

Citations (16)
