Any Deep ReLU Network is Shallow (2306.11827v1)
Published 20 Jun 2023 in cs.LG, cs.AI, and stat.ML
Abstract: We constructively prove that every deep ReLU network can be rewritten as a functionally identical three-layer network with weights valued in the extended reals. Based on this proof, we provide an algorithm that, given a deep ReLU network, finds the explicit weights of the corresponding shallow network. The resulting shallow network is transparent and can be used to generate explanations of the model's behaviour.
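The construction rests on the fact that a deep ReLU network is piecewise affine: on each region where the ReLU activation pattern is fixed, the whole network collapses to a single affine map, and the extended-real weights are what allow a three-layer network to select the correct region exactly. The following is a minimal NumPy sketch of that underlying fact only, not the paper's algorithm; the network shapes, weights, and helper names (`deep_forward`, `local_affine`) are invented here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny deep ReLU network: two hidden layers as (W, b) pairs,
# followed by a linear output layer. All sizes are arbitrary.
layers = [(rng.standard_normal((5, 3)), rng.standard_normal(5)),
          (rng.standard_normal((4, 5)), rng.standard_normal(4))]
W_out, b_out = rng.standard_normal((1, 4)), rng.standard_normal(1)

def deep_forward(x):
    """Evaluate the deep network and record its ReLU activation pattern."""
    pattern, h = [], x
    for W, b in layers:
        z = W @ h + b
        pattern.append(z > 0)        # which units are active at x
        h = np.maximum(z, 0.0)
    return W_out @ h + b_out, pattern

def local_affine(pattern):
    """Collapse the deep net to the affine map x -> A x + c it computes
    on the linear region identified by `pattern`."""
    A, c = np.eye(3), np.zeros(3)
    for (W, b), mask in zip(layers, pattern):
        D = np.diag(mask.astype(float))  # ReLU acts as a 0/1 diagonal matrix
        A, c = D @ W @ A, D @ (W @ c + b)
    return W_out @ A, W_out @ c + b_out

x = rng.standard_normal(3)
y_deep, pat = deep_forward(x)
A, c = local_affine(pat)
assert np.allclose(y_deep, A @ x + c)  # identical on this region
```

The sketch verifies that, at any given input, the deep network agrees with a single affine map determined by its activation pattern; the paper's contribution is a constructive way to stitch all such regional maps into one explicit three-layer network.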