Trimmed Minimum Error Entropy for Robust Online Regression
Abstract: This paper addresses online linear regression in environments corrupted by non-Gaussian, and especially heavy-tailed, noise. In such environments, the error between the system output and the label is also non-Gaussian, and abnormally large error samples (outliers) may mislead the learning process. The main challenge is to keep the supervised learning problem as unaffected as possible by these misleading outliers. In recent years, an information-theoretic algorithm based on Renyi's entropy, called minimum error entropy (MEE), has been employed to address this issue. However, minimizing the error entropy alone may not yield the desired estimator, because entropy is shift-invariant: error samples that minimize the entropy are not necessarily concentrated around zero. This paper proposes a quantization technique that both anchors the MEE errors at the origin and rejects major outliers from MEE-based learning, improving MEE in terms of convergence rate, steady-state misalignment, and testing error.
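The shift-invariance mentioned in the abstract can be illustrated with a minimal sketch of the quadratic information potential used in MEE (minimizing Renyi's quadratic error entropy is equivalent to maximizing this potential); the function name, kernel choice, and bandwidth here are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def information_potential(errors, sigma=1.0):
    """Quadratic information potential V(e) estimated with a Gaussian
    Parzen kernel; minimizing Renyi's quadratic error entropy amounts
    to maximizing V(e). (Illustrative sketch, not the paper's code.)"""
    e = np.asarray(errors, dtype=float)
    diff = e[:, None] - e[None, :]  # all pairwise error differences
    kernel = np.exp(-diff**2 / (2 * sigma**2)) / (np.sqrt(2 * np.pi) * sigma)
    return kernel.mean()

# Shift-invariance: adding a constant to every error leaves all pairwise
# differences, and hence V(e), unchanged -- so minimizing error entropy
# alone cannot anchor the errors at zero.
rng = np.random.default_rng(0)
e = rng.standard_t(df=2, size=200)  # heavy-tailed error samples
print(np.isclose(information_potential(e), information_potential(e + 5.0)))
```

Because V(e) depends only on differences e_i - e_j, the printed result is True: the shifted errors score identically, which is exactly why MEE needs an extra mechanism to center the errors at the origin.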