A Denoising Loss Bound for Neural Network based Universal Discrete Denoisers (1709.03657v2)

Published 12 Sep 2017 in cs.LG, cs.IT, and math.IT

Abstract: We obtain a denoising loss bound for the recently proposed neural network-based universal discrete denoiser, Neural DUDE, which adaptively learns its parameters solely from the noise-corrupted data by minimizing the empirical estimated loss. The resulting bound resembles the generalization error bound for standard empirical risk minimizers (ERM) in supervised learning, and we show that the well-known bias-variance tradeoff also appears in our loss bound. The key tool we develop is a concentration result showing that the unbiased estimated loss concentrates on the true denoising loss uniformly over all bounded network parameters and all underlying clean sequences. To prove our main results, we make a novel application of tools from statistical learning theory. Finally, we show that the hyperparameters of Neural DUDE can be chosen from a small validation set to significantly improve the denoising performance, as predicted by the theoretical results of this paper.
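
The abstract's central object is the "unbiased estimated loss": a per-symbol loss that can be computed from the noisy sequence alone, yet matches the true denoising loss in expectation under the channel. As a minimal sketch of that construction in the DUDE line of work, the following assumes a binary symmetric channel with crossover probability delta and Hamming loss; all names (Lhat, rules, etc.) are illustrative, not the authors' code.

```python
import numpy as np

delta = 0.1
Pi = np.array([[1 - delta, delta],
               [delta, 1 - delta]])      # channel transition matrix Pi(x, z)
Lambda = np.array([[0.0, 1.0],
                   [1.0, 0.0]])          # Hamming loss Lambda(x, x_hat)

# All single-symbol denoising rules s: Z -> X_hat. For binary alphabets
# there are four: always-0, always-1, say-what-you-see, and flip.
rules = [np.array(s) for s in [(0, 0), (1, 1), (0, 1), (1, 0)]]

# Unbiased estimated loss: find Lhat(z, s), depending only on the noisy
# symbol z, such that E_{Z ~ Pi(x,.)}[Lhat(Z, s)] equals the true expected
# loss E_{Z ~ Pi(x,.)}[Lambda(x, s(Z))] for every clean symbol x.
# Writing h_s(x) = sum_z Pi(x, z) * Lambda(x, s(z)), this is the linear
# system Pi @ Lhat[:, s] = h_s, so Lhat[:, s] = Pi^{-1} h_s
# (Pi must be invertible).
Pi_inv = np.linalg.inv(Pi)
Lhat = np.zeros((2, len(rules)))
for j, s in enumerate(rules):
    h = (Pi * Lambda[:, s]).sum(axis=1)  # h_s(x) = sum_z Pi(x,z) Lambda(x, s(z))
    Lhat[:, j] = Pi_inv @ h

# A network p(s | context) can then be trained by minimizing the empirical
# estimated loss (1/n) sum_i sum_s p(s | c_i) * Lhat[z_i, s], which needs
# only the noisy sequence z -- no clean data. Here a uniform rule
# distribution stands in for the network output.
noisy = np.random.randint(0, 2, size=1000)     # stand-in noisy sequence
p = np.full((noisy.size, len(rules)), 0.25)    # uniform over the 4 rules
empirical_estimated_loss = (p * Lhat[noisy]).sum(axis=1).mean()
print(empirical_estimated_loss)
```

The paper's main bound is, in essence, a statement that this empirical estimated loss concentrates on the true denoising loss uniformly over all bounded network parameters and all clean sequences, which is what licenses minimizing it as a training objective.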
