Quantum-Inspired Tensor Neural Networks for Option Pricing (2212.14076v2)

Published 28 Dec 2022 in q-fin.PR, cs.CE, cs.LG, and quant-ph

Abstract: Recent advances in deep learning have enabled us to address the curse of dimensionality (COD) by solving problems in higher dimensions. A subset of these approaches tackles the COD by solving high-dimensional PDEs, opening the door to a variety of real-world problems ranging from mathematical finance to stochastic control for industrial applications. Although feasible, these deep learning methods are still constrained by training time and memory. To address these shortcomings, we show that Tensor Neural Networks (TNN) can provide significant parameter savings while attaining the same accuracy as a classical Dense Neural Network (DNN). We also show that a TNN can be trained faster than a DNN of the same accuracy. In addition, we introduce the Tensor Network Initializer (TNN Init), a weight initialization scheme that leads to faster convergence with smaller variance than a DNN of equivalent parameter count. We benchmark TNN and TNN Init by applying them to solve the parabolic PDE associated with the Heston model, which is widely used in financial pricing theory.
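The parameter savings the abstract refers to come from replacing dense weight matrices with tensor-network (tensor-train) factorizations. The NumPy sketch below is illustrative only: the mode sizes, TT ranks, and contraction order are assumptions for exposition, not the paper's exact architecture. It shows how a 256-by-256 dense layer with 65,536 weights can be represented by four small TT cores totalling 2,304 parameters.

```python
import numpy as np

# Illustrative sketch (assumed shapes and ranks, not the paper's architecture):
# a dense weight matrix acting on a vector of length prod(in_modes) is replaced
# by a chain of tensor-train (TT) cores G_k of shape (r_{k-1}, in_k, out_k, r_k).
in_modes = [4, 4, 4, 4]    # factorization of the input dimension 256
out_modes = [4, 4, 4, 4]   # factorization of the output dimension 256
ranks = [1, 8, 8, 8, 1]    # assumed TT ranks (bond dimensions)

cores = [np.random.randn(ranks[k], in_modes[k], out_modes[k], ranks[k + 1]) * 0.1
         for k in range(len(in_modes))]

def tt_matvec(cores, x, in_modes):
    """Apply the TT-factorized weight matrix to a vector x of length prod(in_modes)."""
    t = x.reshape([1] + list(in_modes))                   # (1, in_1, ..., in_d)
    for core in cores:
        # Contract the current bond index and input mode with the next core.
        t = np.tensordot(core, t, axes=([0, 1], [0, 1]))  # (out_k, r_k, remaining...)
        t = np.moveaxis(t, 0, -1)                         # (r_k, remaining..., out_k)
    return t.reshape(-1)                                  # length prod(out_modes)

dense_params = int(np.prod(in_modes)) * int(np.prod(out_modes))  # 65536
tt_params = sum(core.size for core in cores)                     # 2304
print(f"dense: {dense_params} params, TT: {tt_params} params")

y = tt_matvec(cores, np.random.randn(int(np.prod(in_modes))), in_modes)
print(y.shape)  # (256,)
```

During training, gradients flow through the small cores and the full dense matrix is never materialized, which is where the memory and training-time savings claimed in the abstract would come from.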
