Parseval Proximal Neural Networks (1912.10480v2)
Abstract: The aim of this paper is twofold. First, we show that a certain concatenation of a proximity operator with an affine operator is again a proximity operator on a suitable Hilbert space. Second, we use our findings to establish so-called proximal neural networks (PNNs) and stable tight frame proximal neural networks. Let $\mathcal H$ and $\mathcal K$ be real Hilbert spaces, $b\in\mathcal K$, and let $T\in\mathcal{B}(\mathcal H,\mathcal K)$ have closed range and Moore-Penrose inverse $T^\dagger$. Based on the well-known characterization of proximity operators by Moreau, we prove that for any proximity operator $\text{Prox}\colon\mathcal K\to\mathcal K$ the operator $T^\dagger\,\text{Prox}(T\cdot + b)$ is a proximity operator on $\mathcal H$ equipped with a suitable norm. In particular, it follows for the frequently applied soft shrinkage operator $\text{Prox} = S_{\lambda}\colon\ell_2 \rightarrow\ell_2$ and any frame analysis operator $T\colon\mathcal H\to\ell_2$ that the frame shrinkage operator $T^\dagger\, S_\lambda\, T$ is a proximity operator on a suitable Hilbert space. The concatenation of proximity operators on $\mathbb R^d$ equipped with different norms establishes a PNN. If the network arises from tight frame analysis or synthesis operators, then it forms an averaged operator. Hence, it has Lipschitz constant 1 and belongs to the class of so-called Lipschitz networks, which were recently applied to defend against adversarial attacks. Moreover, due to their averaging property, PNNs can be used within so-called Plug-and-Play algorithms with convergence guarantees. In the case of Parseval frames, we call the networks Parseval proximal neural networks (PPNNs). Then, the involved linear operators lie in a Stiefel manifold and corresponding minimization methods can be applied for training. Finally, some proof-of-concept examples demonstrate the performance of PPNNs.
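To make the frame shrinkage operator $T^\dagger S_\lambda T$ concrete, the following sketch (not from the paper; the Mercedes-Benz frame and all function names are illustrative choices) builds a small Parseval frame of three vectors in $\mathbb R^2$, where $T^\dagger = T^T$, applies soft shrinkage in the frame coefficients, and numerically checks the nonexpansiveness (Lipschitz constant at most 1) claimed in the abstract:

```python
import math

# Mercedes-Benz frame: a Parseval frame of 3 vectors in R^2, so the
# analysis operator T (3x2) satisfies T^T T = I_2 and hence T^dagger = T^T.
angles = [math.pi / 2 + k * 2 * math.pi / 3 for k in range(3)]
s = math.sqrt(2.0 / 3.0)
T = [[s * math.cos(a), s * math.sin(a)] for a in angles]  # rows = frame vectors

def analysis(x):           # T x : R^2 -> R^3 (frame coefficients)
    return [row[0] * x[0] + row[1] * x[1] for row in T]

def synthesis(c):          # T^T c : R^3 -> R^2
    return [sum(T[i][j] * c[i] for i in range(3)) for j in range(2)]

def soft_shrink(c, lam):   # S_lambda, the prox of lam * ||.||_1
    return [math.copysign(max(abs(v) - lam, 0.0), v) for v in c]

def frame_shrink(x, lam):  # T^dagger S_lambda (T x); T^dagger = T^T here
    return synthesis(soft_shrink(analysis(x), lam))

def dist(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

# Parseval property: T^T T applied to the standard basis gives it back.
for j in range(2):
    e = [1.0 if i == j else 0.0 for i in range(2)]
    assert all(abs(v - e[i]) < 1e-12 for i, v in enumerate(synthesis(analysis(e))))

# Nonexpansiveness check on a sample pair of points.
x, y = [1.3, -0.7], [-0.2, 2.1]
print(dist(frame_shrink(x, 0.3), frame_shrink(y, 0.3)) <= dist(x, y))
```

Since $T$ has orthonormal columns, both $T$ and $T^T$ have operator norm 1, and $S_\lambda$ is componentwise nonexpansive, so the composition contracts distances, consistent with the averagedness result stated above.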