Extending Label Smoothing Regularization with Self-Knowledge Distillation

Published 11 Sep 2020 in cs.LG and stat.ML | arXiv:2009.05226v1

Abstract: Inspired by the strong correlation between Label Smoothing Regularization (LSR) and Knowledge Distillation (KD), we propose LsrKD, an algorithm that boosts training by extending LSR to the KD regime and applying a softer temperature. We then improve LsrKD with a Teacher Correction (TC) method, which manually assigns a constant, larger proportion to the correct class in the uniform-distribution teacher. To further improve LsrKD, we develop a self-distillation method named Memory-replay Knowledge Distillation (MrKD), which replaces the uniform-distribution teacher with a knowledgeable one: MrKD penalizes the KD loss between the current model's output distributions and those of its earlier copies along the training trajectory. By keeping the model from drifting too far from its historical output distributions, MrKD stabilizes learning and finds a more robust minimum. Our experiments show that LsrKD improves on LSR consistently at no extra cost, especially on several deep neural networks where LSR is ineffectual. MrKD also significantly improves single-model training. The results confirm that TC helps LsrKD and MrKD boost training, especially on the networks where they otherwise fail. Overall, LsrKD, MrKD, and their TC variants match or outperform LSR, suggesting the broad applicability of these KD methods.
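To make the abstract's two loss constructions concrete, below is a minimal PyTorch sketch of what an LsrKD-with-TC loss and an MrKD loss could look like. It is an illustration based only on the abstract, not the paper's reference implementation: the function names, hyperparameter names (`temperature`, `alpha`, `correction`), and default values are all assumptions.

```python
import torch
import torch.nn.functional as F

def lsrkd_tc_loss(logits, targets, temperature=2.0, alpha=0.1, correction=0.5):
    """Sketch of LsrKD with Teacher Correction (TC), assuming the usual
    CE + KD mixing scheme. The teacher is a uniform distribution over
    classes, except that TC manually assigns a constant, larger
    proportion (`correction`) to the correct class."""
    num_classes = logits.size(-1)

    # Standard hard-label cross-entropy term.
    ce = F.cross_entropy(logits, targets)

    # Corrected uniform teacher: probability `correction` on the true
    # class, remainder spread uniformly over the other classes.
    teacher = torch.full_like(logits, (1.0 - correction) / (num_classes - 1))
    teacher.scatter_(1, targets.unsqueeze(1), correction)

    # KD term: KL divergence from the temperature-softened student to the
    # fixed teacher, with the usual T^2 scaling.
    log_student = F.log_softmax(logits / temperature, dim=-1)
    kd = F.kl_div(log_student, teacher, reduction="batchmean") * temperature ** 2

    return (1.0 - alpha) * ce + alpha * kd

def mrkd_loss(logits, replay_logits, temperature=2.0):
    """Sketch of the Memory-replay KD term: KL divergence between the
    current model's outputs and the outputs of its stored copies
    (checkpoints) along the training trajectory. `replay_logits` is
    assumed to be a list of logit tensors from those earlier copies."""
    log_student = F.log_softmax(logits / temperature, dim=-1)
    terms = []
    for old_logits in replay_logits:
        teacher = F.softmax(old_logits.detach() / temperature, dim=-1)
        terms.append(F.kl_div(log_student, teacher, reduction="batchmean"))
    return temperature ** 2 * torch.stack(terms).mean()
```

In this reading, MrKD reduces to LsrKD's formulation with the fixed uniform teacher swapped for the model's own historical predictions; how the checkpoints are selected and how the two terms are weighted are details the abstract does not specify.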
