Modified memoryless spectral-scaling Broyden family on Riemannian manifolds (2307.08986v3)

Published 18 Jul 2023 in math.NA and cs.NA

Abstract: This paper presents modified memoryless quasi-Newton methods based on the spectral-scaling Broyden family on Riemannian manifolds. The method involves adding one parameter to the search direction of the memoryless self-scaling Broyden family on the manifold. Moreover, it uses a general map instead of vector transport. This idea has already been proposed within a general framework of Riemannian conjugate gradient methods where one can use vector transport, scaled vector transport, or an inverse retraction. We show that the search direction satisfies the sufficient descent condition under some assumptions on the parameters. In addition, we show global convergence of the proposed method under the Wolfe conditions. We numerically compare it with existing methods, including Riemannian conjugate gradient methods and the memoryless spectral-scaling Broyden family. The numerical results indicate that the proposed method with the BFGS formula is suitable for solving an off-diagonal cost function minimization problem on an oblique manifold.
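To make the abstract concrete, below is a minimal, self-contained sketch of the general flavor of a memoryless scaled quasi-Newton method on a Riemannian manifold. It is not the paper's exact modified algorithm: the demo uses the unit sphere rather than an oblique manifold, orthogonal projection as the vector transport (the paper permits a more general map), the common spectral-scaling choice γ_k = ⟨y_k, y_k⟩ / ⟨s_k, y_k⟩, and Armijo backtracking in place of the Wolfe line search the convergence analysis relies on. All function names and the eigenvector test problem are assumptions made for this illustration.

```python
import numpy as np

def proj(x, v):
    """Orthogonal projection of v onto the tangent space of the sphere at x."""
    return v - np.dot(x, v) * x

def retract(x, v):
    """Retraction: move along the tangent vector v, then normalize back onto the sphere."""
    w = x + v
    return w / np.linalg.norm(w)

def memoryless_bfgs_direction(g, s, y, gamma):
    """d = -H g, where H is the BFGS update of the scaled identity (1/gamma) I.

    Being memoryless, H is never stored: the product H g is expanded in terms
    of the tangent vectors s (transported step) and y (gradient difference).
    """
    rho = 1.0 / np.dot(s, y)
    sg, yg, yy = np.dot(s, g), np.dot(y, g), np.dot(y, y)
    Hg = (g - rho * sg * y - rho * yg * s + rho**2 * yy * sg * s) / gamma \
         + rho * sg * s
    return -Hg

def backtracking(f, x, d, g, alpha=1.0, c=1e-4, tau=0.5):
    """Armijo backtracking (a stand-in for the Wolfe conditions in the paper)."""
    fx, slope = f(x), np.dot(g, d)
    while f(retract(x, alpha * d)) > fx + c * alpha * slope and alpha > 1e-12:
        alpha *= tau
    return alpha

# Toy problem: minimize f(x) = x^T A x on the sphere, i.e. find a unit
# eigenvector for the smallest eigenvalue of the symmetric matrix A.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M + M.T
f = lambda x: x @ A @ x
grad = lambda x: proj(x, 2.0 * A @ x)   # Riemannian gradient on the sphere

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
d = -grad(x)                            # first step: steepest descent
for k in range(200):
    g = grad(x)
    alpha = backtracking(f, x, d, g)
    x_new = retract(x, alpha * d)
    g_new = grad(x_new)
    # Projection onto the tangent space at x_new plays the role of the
    # "general map" (here: an ordinary projection-based vector transport).
    s = proj(x_new, alpha * d)
    y = g_new - proj(x_new, g)
    sy = np.dot(s, y)
    if sy > 1e-12:                      # curvature safeguard
        gamma = np.dot(y, y) / sy       # a common spectral-scaling choice
        d = memoryless_bfgs_direction(g_new, s, y, gamma)
    else:
        d = -g_new                      # fall back to steepest descent
    x = x_new

print("f(x*):", f(x), " smallest eigenvalue:", np.linalg.eigvalsh(A).min())
```

The safeguard ⟨s_k, y_k⟩ > 0 keeps the implicit memoryless approximation positive definite, which makes d a descent direction; the paper's contribution is to establish the analogous sufficient descent property, and global convergence under the Wolfe conditions, for its one-parameter modification of the spectral-scaling Broyden family.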

