Quadratic neural networks for solving inverse problems (2401.09445v1)

Published 20 Dec 2023 in math.NA and cs.NA

Abstract: In this paper we investigate the solution of inverse problems with neural network ansatz functions that have generalized decision functions. The key observation for this work is that such functions can approximate typical test cases, such as the Shepp-Logan phantom, better than standard neural networks. Moreover, we show that the convergence analysis of numerical methods for solving inverse problems with shallow generalized neural network functions leads to more intuitive convergence conditions than for deep affine-linear neural networks.
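
The central object here is a shallow network whose units use generalized (quadratic) decision functions: each neuron applies its activation to a quadratic, rather than affine-linear, function of the input. The sketch below is a minimal, hypothetical PyTorch illustration of that idea, not the paper's implementation; the class names, unit counts, and toy circular target are assumptions chosen only to mimic fitting a piecewise-constant, phantom-like image.

```python
import torch
import torch.nn as nn

class QuadraticLayer(nn.Module):
    """Units with quadratic decision functions: each unit computes
    sigma(x^T A_i x + w_i^T x + b_i) instead of the affine-linear
    sigma(w_i^T x + b_i) of a standard neuron."""
    def __init__(self, in_dim: int, n_units: int):
        super().__init__()
        self.A = nn.Parameter(0.01 * torch.randn(n_units, in_dim, in_dim))
        self.w = nn.Parameter(torch.randn(n_units, in_dim))
        self.b = nn.Parameter(torch.zeros(n_units))

    def forward(self, x):  # x: (batch, in_dim)
        quad = torch.einsum('bi,nij,bj->bn', x, self.A, x)  # x^T A_i x per unit
        return torch.sigmoid(quad + x @ self.w.T + self.b)

class ShallowQuadraticNet(nn.Module):
    """Shallow ansatz f(x) = sum_i a_i * sigma(q_i(x)) with quadratic q_i."""
    def __init__(self, in_dim: int = 2, n_units: int = 64):
        super().__init__()
        self.hidden = QuadraticLayer(in_dim, n_units)
        self.out = nn.Linear(n_units, 1, bias=False)  # the coefficients a_i

    def forward(self, x):
        return self.out(self.hidden(x))

# Toy usage (an assumption, not the paper's experiment): fit the indicator
# of a disk on [-1, 1]^2 as a stand-in for a phantom-like image.
if __name__ == "__main__":
    xy = torch.rand(4096, 2) * 2 - 1
    target = (xy.pow(2).sum(dim=1, keepdim=True) < 0.5).float()
    model = ShallowQuadraticNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for step in range(500):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(xy), target)
        loss.backward()
        opt.step()
```

Because the quadratic form x^T A_i x lets a single unit carve out an ellipse-like decision region, a shallow network of such units can represent piecewise-constant ellipse compositions like the Shepp-Logan phantom far more economically than affine-linear neurons, which is the approximation advantage the abstract refers to.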
