
ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance

Published 11 Dec 2020 in cs.LG and cs.NE | arXiv:2012.07564v2

Abstract: Despite the unresolved 'dying ReLU problem', the classical ReLU activation function (AF) has been extensively applied in Deep Neural Networks (DNN), in particular Convolutional Neural Networks (CNN), for image classification. ReLU's well-known gradient issues pose challenges in both academic and industrial applications. Recent improvement efforts move in a similar direction, merely proposing variations of the AF, such as Leaky ReLU (LReLU), while remaining subject to the same unresolved gradient problems. In this paper, the Absolute Leaky ReLU (ALReLU) AF, a variation of LReLU, is proposed as an alternative method for resolving the common 'dying ReLU problem' in NN-based supervised learning algorithms. The experimental results demonstrate that taking the absolute value of LReLU's small negative gradient yields a significant improvement over LReLU and ReLU on image classification of diseases such as COVID-19, as well as on text and tabular data classification tasks, across five different datasets.
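Based on the abstract's description, ALReLU replaces the negative branch of LReLU with its absolute value, so negative inputs produce small positive outputs instead of small negative ones. The following is a minimal NumPy sketch of that idea; the slope value alpha = 0.01 is an assumption (the standard LReLU default), as the abstract does not state it.

```python
import numpy as np

ALPHA = 0.01  # assumed slope; the common LReLU default, not stated in the abstract

def lrelu(x, alpha=ALPHA):
    """Leaky ReLU: passes positives unchanged, scales negatives by alpha."""
    return np.where(x >= 0, x, alpha * x)

def alrelu(x, alpha=ALPHA):
    """ALReLU as described in the abstract: the absolute value of LReLU's
    negative branch, so outputs on x < 0 are small but positive."""
    return np.where(x >= 0, x, np.abs(alpha * x))

x = np.array([-3.0, -1.0, 0.0, 2.0])
print(lrelu(x))   # [-0.03 -0.01  0.    2.  ]
print(alrelu(x))  # [ 0.03  0.01  0.    2.  ]
```

Note that flipping the sign of the negative branch also flips the sign of its gradient (-alpha becomes +alpha for x < 0), which is the mechanism the abstract credits for mitigating the 'dying ReLU problem'.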

Citations (35)
