Identify weaker-than-continuity conditions ensuring non-universality for deep CVNNs
Determine natural assumptions on the activation function σ: C → C that are strictly weaker than continuity but still yield the necessary direction of the deep-network universality theorem. Specifically, prove that if σ satisfies those assumptions and either (i) σ agrees almost everywhere with a polynomial p(z, z̄) ∈ C[z, z̄], or (ii) σ agrees almost everywhere with an entire holomorphic function g or with its complex conjugate ḡ, then for every input dimension d ∈ N and every depth L ∈ N, the class NN_{σ,L}^d of complex-valued feedforward neural networks fails to have the universal approximation property.
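For concreteness, the implication to be proved can be written out as below. Reading the universal approximation property as uniform density in C(K; C) on compact sets K ⊂ C^d is our assumption about the intended meaning, not a definition taken from the source; this is a sketch of the target statement, not its official formulation.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Target statement (sketch). The density reading of the universal
% approximation property is an assumption, not taken from the source.
\noindent\textbf{Goal.} Find assumptions on $\sigma\colon\mathbb{C}\to\mathbb{C}$,
strictly weaker than continuity, such that whenever $\sigma$ satisfies them and
\[
  \sigma \overset{\mathrm{a.e.}}{=} p(z,\bar z)
  \quad\text{or}\quad
  \sigma \overset{\mathrm{a.e.}}{=} g
  \quad\text{or}\quad
  \sigma \overset{\mathrm{a.e.}}{=} \overline{g}
\]
for some $p \in \mathbb{C}[z,\bar z]$ or some entire $g$, it follows that for all
$d, L \in \mathbb{N}$ and every compact set $K \subset \mathbb{C}^d$,
\[
  \overline{\bigl\{\, \Phi|_K : \Phi \in \mathcal{NN}^{d}_{\sigma,L} \,\bigr\}}^{\,\|\cdot\|_{L^\infty(K)}}
  \neq C(K;\mathbb{C}),
\]
i.e.\ $\mathcal{NN}^{d}_{\sigma,L}$ fails the universal approximation property.
\end{document}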
In the words of the source paper: "We leave it as future work to determine natural conditions on σ that are weaker than continuity, but under which a necessary condition as in the above theorem still holds."
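For orientation, what follows is a minimal sketch, in our own words, of the standard obstruction when σ is continuous, and of the measure-theoretic issue that any conditions weaker than continuity must handle; it is our paraphrase, not the source's proof.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch of the known obstruction for continuous activations; our
% paraphrase of the standard argument, not the source's proof.
\noindent\textbf{Holomorphic case.} If $\sigma = g$ for an entire $g$, then every
network in $\mathcal{NN}^{d}_{\sigma,L}$ is a composition of affine maps and $g$,
hence holomorphic on $\mathbb{C}^d$. Locally uniform limits of holomorphic
functions are holomorphic (Weierstrass), so the closure of the class cannot
contain a non-holomorphic target such as $z \mapsto \overline{z_1}$. The
anti-holomorphic case $\sigma = \overline{g}$ is symmetric.

\medskip
\noindent\textbf{Polynomial case.} If $\sigma = p(z,\bar z)$ with $\deg p = m$, then
every depth-$L$ network is a polynomial in $z_1,\dots,z_d,\overline{z_1},\dots,
\overline{z_d}$ of degree at most $m^L$. These form a finite-dimensional, hence
closed, proper subspace of $C(K;\mathbb{C})$ for infinite compact $K$, so density fails.

\medskip
\noindent\textbf{Why continuity matters.} If $\sigma$ only agrees almost everywhere
with $p(z,\bar z)$ or $g$, these identities need not survive composition: an inner
layer can map a set of positive measure into a Lebesgue-null set (for instance, a
degenerate affine layer with a constant image), on which $\sigma$ is unconstrained.
Any condition replacing continuity must rule out this effect.
\end{document}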