MorphoActivation: Generalizing ReLU activation function by mathematical morphology (2207.06413v1)
Published 13 Jul 2022 in cs.LG, cs.DM, eess.IV, eess.SP, and stat.AP
Abstract: This paper analyses both nonlinear activation functions and spatial max-pooling for Deep Convolutional Neural Networks (DCNNs) through the algebraic framework of mathematical morphology. Additionally, a general family of activation functions is proposed that combines max-pooling and nonlinear operators within morphological representations. The experimental section validates the approach on classical supervised-learning benchmarks for DCNNs.
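The morphological reading behind this generalization is that ReLU(x) = max(x, 0) is itself a two-branch maximum, i.e. a dilation-like operator in the (max, +) algebra, so a learnable family of max-of-affine maps contains ReLU as a special case. The sketch below is a minimal illustration of that idea under stated assumptions, not the paper's exact MorphoActivation layer: the function name `morpho_activation` and the max-of-affine parametrization are hypothetical.

```python
import numpy as np

def morpho_activation(x, weights, biases):
    """Max-of-affine ((max, +)-style) activation:
        y = max_j (weights[j] * x + biases[j])

    A dilation-like nonlinearity in the max-plus sense. With
    weights = [1, 0] and biases = [0, 0] it reduces exactly to
    ReLU(x) = max(x, 0). (Hypothetical sketch, not the paper's layer.)
    """
    x = np.asarray(x, dtype=float)
    # Each affine branch j produces weights[j] * x + biases[j];
    # the activation is the pointwise maximum over all branches.
    branches = np.stack([w * x + b for w, b in zip(weights, biases)])
    return branches.max(axis=0)

# ReLU recovered as the special case max(1*x + 0, 0*x + 0) = max(x, 0)
x = np.linspace(-2.0, 2.0, 5)
assert np.allclose(morpho_activation(x, [1.0, 0.0], [0.0, 0.0]),
                   np.maximum(x, 0.0))

# A broader member of the family: more branches with (learnable) parameters
y = morpho_activation(x, weights=[1.0, 0.5, 0.0], biases=[0.0, 0.3, 0.0])
```

In a DCNN the branch weights and biases would be trained jointly with the convolutions; the same max-over-branches structure is also what spatial max-pooling computes, which is the connection the paper develops.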