A neural network approach for solving the Monge-Ampère equation with transport boundary condition

Published 25 Oct 2024 in cs.LG (arXiv:2410.19496v1)

Abstract: This paper introduces a novel neural network-based approach to solving the Monge-Ampère equation with the transport boundary condition, specifically targeted towards optical design applications. We leverage multilayer perceptron networks to learn approximate solutions by minimizing a loss function that encompasses the equation's residual, boundary conditions, and convexity constraints. Our main results demonstrate the efficacy of this method, optimized using L-BFGS, through a series of test cases spanning symmetric and asymmetric circle-to-circle, square-to-circle, and circle-to-flower reflector mapping problems. Comparative analysis with a conventional least-squares finite-difference solver reveals the competitive, and often superior, performance of our neural network approach on the test cases examined here. A comprehensive hyperparameter study further illuminates the impact of factors such as sampling density, network architecture, and optimization algorithm. While promising, further investigation is needed to verify the method's robustness for more complicated problems and to ensure consistent convergence. Nonetheless, the simplicity and adaptability of this neural network-based approach position it as a compelling alternative to specialized partial differential equation solvers.
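The loss described in the abstract combines three terms: the Monge-Ampère residual det(D²u) − f/g on interior points, the transport boundary condition (∇u maps the source boundary onto the target boundary), and a convexity penalty. The following is a hedged NumPy sketch of that loss, not the authors' code: the tiny MLP, the penalty weights, and the finite-difference derivatives are illustrative stand-ins, and the boundary term uses the symmetric unit-disc-to-unit-disc case, where the transport condition reduces to |∇u| = 1 on the unit circle.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP u_theta : R^2 -> R (one tanh hidden layer) -- an illustrative
# stand-in for the multilayer perceptron described in the abstract.
def init_params(width=16):
    return {"W1": rng.normal(size=(2, width)) * 0.5, "b1": np.zeros(width),
            "W2": rng.normal(size=(width, 1)) * 0.5, "b2": np.zeros(1)}

def u(params, xy):  # xy: (N, 2) -> (N,)
    h = np.tanh(xy @ params["W1"] + params["b1"])
    return (h @ params["W2"] + params["b2"]).ravel()

def grad_fd(params, xy, eps=1e-4):
    """Central-difference gradient of u, one column per coordinate."""
    G = np.empty_like(xy)
    for i in range(2):
        e = np.zeros(2); e[i] = eps
        G[:, i] = (u(params, xy + e) - u(params, xy - e)) / (2 * eps)
    return G

def hess_fd(params, xy, eps=1e-3):
    """Central-difference Hessian of u: (N, 2, 2), symmetric by construction."""
    H = np.empty((xy.shape[0], 2, 2))
    for i in range(2):
        for j in range(2):
            ei = np.zeros(2); ei[i] = eps
            ej = np.zeros(2); ej[j] = eps
            H[:, i, j] = (u(params, xy + ei + ej) - u(params, xy + ei - ej)
                          - u(params, xy - ei + ej)
                          + u(params, xy - ei - ej)) / (4 * eps**2)
    return H

def loss(params, xy_int, f_over_g, xy_bnd, w_bnd=10.0, w_cvx=1.0):
    H = hess_fd(params, xy_int)
    det_H = H[:, 0, 0] * H[:, 1, 1] - H[:, 0, 1] * H[:, 1, 0]
    pde = det_H - f_over_g  # Monge-Ampere equation residual
    # Transport boundary condition, unit-disc-to-unit-disc case:
    # grad(u) must map the unit circle onto the unit circle, i.e. |grad u| = 1.
    bnd = np.linalg.norm(grad_fd(params, xy_bnd), axis=1) - 1.0
    # Convexity: penalize the negative part of the smallest eigenvalue
    # of the symmetric 2x2 Hessian.
    tr = H[:, 0, 0] + H[:, 1, 1]
    lam_min = 0.5 * (tr - np.sqrt(np.maximum(tr**2 - 4.0 * det_H, 0.0)))
    cvx = np.maximum(-lam_min, 0.0)
    return np.mean(pde**2) + w_bnd * np.mean(bnd**2) + w_cvx * np.mean(cvx**2)
```

In practice one would flatten the parameters and hand `loss` to an L-BFGS routine (e.g. `scipy.optimize.minimize(..., method="L-BFGS-B")`), and replace the finite-difference derivatives with automatic differentiation; both substitutions are straightforward and closer to what the paper describes.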
