
Trimmed Minimum Error Entropy for Robust Online Regression

Published 11 May 2021 in cs.IT and math.IT (arXiv:2105.05321v1)

Abstract: This paper addresses online linear regression in environments corrupted by non-Gaussian, and especially heavy-tailed, noise. In such environments the error between the system output and the label does not follow a Gaussian distribution either, and abnormally large error samples (outliers) can mislead the learning process. The main challenge is to keep the supervised learning problem as unaffected as possible by these misleading outliers. In recent years, an information-theoretic algorithm based on Renyi's entropy, called minimum error entropy (MEE), has been employed for this purpose. However, minimizing the error entropy alone may not yield the desired estimator, because entropy is shift-invariant: the error samples need not be concentrated around zero at the minimum. This paper proposes a quantization technique that both addresses this need to center the errors at the origin and rejects major outliers from MEE-based learning, improving MEE performance in terms of convergence rate, steady-state misalignment, and testing error.
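To make the idea concrete, below is a minimal sketch of online linear regression driven by an MEE-style update with error trimming. The paper's exact quantization rule is not reproduced here; instead, window errors whose magnitude exceeds a hypothetical threshold `trim` are simply discarded before the update, which both rejects outliers and anchors the retained errors near the origin. All hyperparameter names and values (`lr`, `sigma`, `window`, `trim`) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def trimmed_mee_regression(X, y, lr=0.2, sigma=1.0, window=20, trim=3.0):
    """Sketch: online linear regression with a trimmed MEE-style update.

    MEE minimizes the Renyi quadratic entropy of the errors, equivalent to
    maximizing the Parzen-estimated information potential
        V(e) = (1/L^2) * sum_{i,j} G_sigma(e_i - e_j)
    over a sliding window of L recent errors. Before each update, errors
    with |e| > trim are discarded (outlier rejection / centering).
    Hyperparameters here are illustrative, not from the paper.
    """
    n, d = X.shape
    w = np.zeros(d)
    buf = []                                    # recent (x, y) samples
    for t in range(n):
        buf.append((X[t], y[t]))
        if len(buf) > window:
            buf.pop(0)
        Xb = np.array([b[0] for b in buf])
        yb = np.array([b[1] for b in buf])
        e = yb - Xb @ w                         # window errors under current w
        keep = np.abs(e) <= trim                # trim abnormally large errors
        if keep.sum() < 2:
            continue
        E, Xk = e[keep], Xb[keep]
        diff = E[:, None] - E[None, :]          # pairwise e_i - e_j
        G = np.exp(-diff**2 / (2 * sigma**2))   # Gaussian kernel values
        # Gradient of V w.r.t. w: (1/L^2) sum_ij (diff/sigma^2) G (x_i - x_j)
        grad = ((G * diff / sigma**2)[:, :, None]
                * (Xk[:, None, :] - Xk[None, :, :])).sum(axis=(0, 1))
        w += lr * grad / keep.sum()**2          # ascend V = descend entropy
    return w
```

Because the update depends only on pairwise error differences, the shift-invariance the abstract mentions is visible directly in the gradient; it is the trimming step (keeping only errors near zero) that pins the error distribution to the origin in this sketch.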

Citations (1)

Authors (2)
