PReLU: Yet Another Single-Layer Solution to the XOR Problem (2409.10821v1)
Abstract: This paper demonstrates that a single-layer neural network using the Parametric Rectified Linear Unit (PReLU) activation can solve the XOR problem, a simple fact that has been overlooked so far. We compare this solution to the multi-layer perceptron (MLP) and the Growing Cosine Unit (GCU) activation function and explain why PReLU enables this capability. Our results show that the single-layer PReLU network can achieve a 100% success rate over a wider range of learning rates than these alternatives, while using only three learnable parameters.
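The abstract does not spell out the construction, but one closed-form solution consistent with its claim of three learnable parameters is a single output unit y = PReLU(w1*x1 + w2*x2) with a learnable negative-side slope a and no bias: with w = (1, -1) and a = -1, PReLU reduces to the absolute value of x1 - x2, which equals XOR on binary inputs. The NumPy sketch below is an illustrative verification of that construction (the variable names and setup are assumptions, not taken from the paper):

```python
import numpy as np

# XOR truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def prelu(z, a):
    """Parametric ReLU: identity for z > 0, slope a for z <= 0."""
    return np.where(z > 0, z, a * z)

# Three parameters: w1, w2, and the PReLU slope a.
# With w = (1, -1) and a = -1, PReLU(x1 - x2) = |x1 - x2|,
# which matches XOR exactly on {0, 1}^2.
w = np.array([1.0, -1.0])
a = -1.0

pred = prelu(X @ w, a)
print("prediction:", pred.tolist())  # [0.0, 1.0, 1.0, 0.0]
print("target:    ", y.tolist())     # [0.0, 1.0, 1.0, 0.0]
```

The key design point is that a negative PReLU slope makes the single unit non-monotonic in its pre-activation, which is what a plain ReLU or linear unit lacks and why XOR is otherwise out of reach for one layer; the paper reports that gradient descent over these three parameters reaches this kind of solution reliably across learning rates.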