Conjecture on the effective search space of linear regions in ReLU networks
Determine whether, when optimizing a linear objective over a bounded input domain of a trained feedforward ReLU network, the effective search space (the set of linear regions actually encountered during optimization) is smaller and simpler than worst-case bounds suggest, which would explain the observed scalability of lightweight LP-based local search methods.
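A minimal sketch of the phenomenon the conjecture concerns, using an illustrative toy setup (random network, gradient-ascent walk) rather than the paper's actual algorithm: a ReLU network with h hidden units partitions its input domain into linear regions indexed by activation patterns, of which there can be up to 2^h in the worst case; within each region the network is affine, so a local-search walk has a constant gradient there and typically passes through only a tiny fraction of the regions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer ReLU network with random weights (purely illustrative).
d, h = 5, 20                        # input dimension, hidden units
W1, b1 = rng.normal(size=(h, d)), rng.normal(size=h)
w2 = rng.normal(size=h)             # scalar output layer

def pattern(x):
    # The activation pattern identifies the linear region containing x.
    return tuple((W1 @ x + b1 > 0).astype(int))

def grad(x):
    # Within a fixed region the network is affine, so its gradient is constant.
    active = (W1 @ x + b1 > 0).astype(float)
    return (w2 * active) @ W1

# Gradient ascent on the network output over the box [-1, 1]^d,
# recording which linear regions the walk actually visits.
x = rng.uniform(-1.0, 1.0, size=d)
visited = {pattern(x)}
for _ in range(200):
    x = np.clip(x + 0.05 * grad(x), -1.0, 1.0)
    visited.add(pattern(x))

print(f"regions visited: {len(visited)} of at most 2^{h} = {2**h}")
```

A 200-step walk can visit at most 201 regions, far below the 2^20 worst-case count; the conjecture is that the regions reached this way are also structurally simpler, making LP-based local search cheap in practice.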
References
Hence, we may conjecture that the search space is actually smaller and simpler than expected, and thus that a leaner algorithm may produce good results faster.
— Optimization Over Trained Neural Networks: Taking a Relaxing Walk
(arXiv:2401.03451, Tong et al., 7 Jan 2024), Section 1, Introduction