An Adaptive Regularized Proximal Newton-Type Methods for Composite Optimization over the Stiefel Manifold (2404.11112v3)
Abstract: Recently, the proximal Newton-type method and its variants have been generalized to solve composite optimization problems over the Stiefel manifold whose objective function is the sum of a smooth function and a nonsmooth function. In this paper, we propose an adaptive quadratically regularized proximal quasi-Newton method, named ARPQN, to solve this class of problems. Under some mild assumptions, the global convergence, the local linear convergence rate and the iteration complexity of ARPQN are established. Numerical experiments and comparisons with other state-of-the-art methods indicate that ARPQN is very promising. We also propose an adaptive quadratically regularized proximal Newton method, named ARPN. It is shown that the ARPN method has a local superlinear convergence rate under certain reasonable assumptions, which demonstrates the attractive convergence properties of regularized proximal Newton methods.
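For context, the problem class described in the abstract admits a standard formulation, sketched below in LaTeX. The notation is illustrative: f, h, St(n,p), B_k, sigma_k, alpha_k and the retraction R are assumptions for exposition and need not match the paper's exact definitions.

% Composite optimization over the Stiefel manifold: minimize a smooth term f
% plus a nonsmooth term h (e.g., an l1 penalty) subject to orthonormality.
\begin{equation*}
  \min_{X \in \mathrm{St}(n,p)} \; F(X) := f(X) + h(X),
  \qquad
  \mathrm{St}(n,p) := \{\, X \in \mathbb{R}^{n \times p} : X^{\top} X = I_p \,\}.
\end{equation*}
% A generic quadratically regularized proximal Newton-type iteration first solves,
% at the current iterate X_k with a (quasi-)Newton approximation B_k and an
% adaptive regularization parameter sigma_k > 0, the tangent-space subproblem
\begin{equation*}
  V_k \in \arg\min_{V \in \mathrm{T}_{X_k}\mathrm{St}(n,p)}
  \; \langle \nabla f(X_k), V \rangle
  + \tfrac{1}{2} \langle B_k[V], V \rangle
  + \tfrac{\sigma_k}{2} \| V \|_F^2
  + h(X_k + V),
\end{equation*}
% and then returns to the manifold via a retraction, X_{k+1} = R_{X_k}(alpha_k V_k),
% where alpha_k is a step size chosen by a (possibly nonmonotone) line search.

In this sketch, taking B_k to be the identity would recover a proximal-gradient-type step, while a Hessian or quasi-Newton approximation for B_k gives the Newton-type variants; the adaptive parameter sigma_k corresponds to the quadratic regularization referred to in the method names.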
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. 
Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. 
SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. 
Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. 
SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. 
SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. 
[2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. 
Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. 
SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). 
https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. 
In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) 
Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. 
SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. 
Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. 
Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. 
SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. 
[2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) 
Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. 
Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. 
[2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. 
Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. 
Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. 
Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM Journal on Scientific Computing 41(3), 1949–1983 (2019) https://doi.org/10.1137/18M1221679 https://doi.org/10.1137/18M1221679 Ghanbari and Scheinberg [2018] Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. 
SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. 
Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. 
[2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. 
Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. 
Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). 
https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. 
[2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. 
Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. 
ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. 
Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. 
[2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. 
[2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. 
Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. 
Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. 
Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. 
In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. 
SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. 
[2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. 
SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. 
Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. 
Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. 
Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. 
Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. 
Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. 
ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. 
[2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. 
Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 
14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Chen, X., Guo, L., Lu, Z., Ye, J.J.: An augmented Lagrangian method for non-Lipschitz nonconvex programming. SIAM Journal on Numerical Analysis 55(1), 168–193 (2017) https://doi.org/10.1137/15M1052834 Dai [2002] Dai, Y.-H.: A nonmonotone conjugate gradient algorithm for unconstrained optimization. Journal of Systems Science and Complexity 15, 139–145 (2002) Ferreira and Oliveira [1998] Ferreira, O., Oliveira, P.: Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications 97(1), 93–104 (1998) https://doi.org/10.1023/A:1022675100677 Gao et al. [2019] Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM Journal on Scientific Computing 41(3), 1949–1983 (2019) https://doi.org/10.1137/18M1221679 https://doi.org/10.1137/18M1221679 Ghanbari and Scheinberg [2018] Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. 
Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Dai, Y.-H.: A nonmonotone conjugate gradient algorithm for unconstrained optimization. Journal of Systems Science and Complexity 15, 139–145 (2002) Ferreira and Oliveira [1998] Ferreira, O., Oliveira, P.: Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications 97(1), 93–104 (1998) https://doi.org/10.1023/A:1022675100677 Gao et al. 
[2019] Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM Journal on Scientific Computing 41(3), 1949–1983 (2019) https://doi.org/10.1137/18M1221679 https://doi.org/10.1137/18M1221679 Ghanbari and Scheinberg [2018] Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. 
Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ferreira, O., Oliveira, P.: Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications 97(1), 93–104 (1998) https://doi.org/10.1023/A:1022675100677 Gao et al. [2019] Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM Journal on Scientific Computing 41(3), 1949–1983 (2019) https://doi.org/10.1137/18M1221679 https://doi.org/10.1137/18M1221679 Ghanbari and Scheinberg [2018] Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. 
SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM Journal on Scientific Computing 41(3), 1949–1983 (2019) https://doi.org/10.1137/18M1221679 https://doi.org/10.1137/18M1221679 Ghanbari and Scheinberg [2018] Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. 
Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. 
In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. 
SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. 
- Dai, Y.-H.: A nonmonotone conjugate gradient algorithm for unconstrained optimization. Journal of Systems Science and Complexity 15, 139–145 (2002) Ferreira and Oliveira [1998] Ferreira, O., Oliveira, P.: Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications 97(1), 93–104 (1998) https://doi.org/10.1023/A:1022675100677 Gao et al. [2019] Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM Journal on Scientific Computing 41(3), 1949–1983 (2019) https://doi.org/10.1137/18M1221679 https://doi.org/10.1137/18M1221679 Ghanbari and Scheinberg [2018] Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. 
SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ferreira, O., Oliveira, P.: Subgradient algorithm on Riemannian manifolds. Journal of Optimization Theory and Applications 97(1), 93–104 (1998) https://doi.org/10.1023/A:1022675100677 Gao et al. [2019] Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM Journal on Scientific Computing 41(3), 1949–1983 (2019) https://doi.org/10.1137/18M1221679 https://doi.org/10.1137/18M1221679 Ghanbari and Scheinberg [2018] Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. 
Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Gao, B., Liu, X., Yuan, Y.-X.: Parallelizable algorithms for optimization problems with orthogonality constraints. SIAM Journal on Scientific Computing 41(3), 1949–1983 (2019) https://doi.org/10.1137/18M1221679 https://doi.org/10.1137/18M1221679 Ghanbari and Scheinberg [2018] Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. 
Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. 
In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. 
SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. 
[2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. 
SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. 
Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. 
Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. 
[2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. 
Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. 
SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. 
[2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. 
[2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. 
[2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. 
Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. 
[2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. 
Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. 
[2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Ghanbari, H., Scheinberg, K.: Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates. Computational Optimization and Applications 69(3), 597–627 (2018) https://doi.org/10.1007/s10589-017-9964-z Grapiglia and Nesterov [2017] Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. 
Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Regularized Newton methods for minimizing functions with Hölder continuous Hessians. SIAM Journal on Optimization 27(1), 478–506 (2017) https://doi.org/10.1137/16M1087801 Grapiglia and Nesterov [2019] Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. 
SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. 
[2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077 Grippo et al. [1986] Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. 
Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046 Grohs and Hosseini [2016] Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. 
Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Grohs, P., Hosseini, S.: ε𝜀\varepsilonitalic_ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z Hosseini and Pouryayevali [2011] Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 
14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. 
[2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. 
Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Grapiglia, G.N., Nesterov, Y.: Accelerated regularized Newton methods for minimizing composite convex functions. SIAM Journal on Optimization 29(1), 77–99 (2019) https://doi.org/10.1137/17M1142077
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods &\&& Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023 Hu et al. [2018] Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. 
Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. 
SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. 
In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. 
Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. 
Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. 
Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. 
[2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Grippo, L., Lampariello, F., Lucidi, S.: A nonmonotone line search technique for Newton’s method. SIAM Journal on Numerical Analysis 23(4), 707–716 (1986) https://doi.org/10.1137/0723046
Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. 
Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Hu, J., Milzarek, A., Wen, Z., Yuan, Y.: Adaptive quadratically regularized Newton method for Riemannian optimization. SIAM Journal on Matrix Analysis and Applications 39(3), 1181–1207 (2018) https://doi.org/10.1137/17M1142478 Huang and Wei [2022] Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3 Huang and Wei [2023] Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w Huang et al. [2022] Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. 
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. 
Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Grohs, P., Hosseini, S.: ε-subgradient algorithms for locally Lipschitz functions on Riemannian manifolds. Advances in Computational Mathematics 42(2), 333–360 (2016) https://doi.org/10.1007/s10444-015-9426-z
- Hosseini, S., Pouryayevali, M.R.: Generalized gradients and characterization of epi-Lipschitz sets in Riemannian manifolds. Nonlinear Analysis: Theory, Methods & Applications 74(12), 3884–3895 (2011) https://doi.org/10.1016/j.na.2011.02.023
- Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods.
Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Kovnatsky et al. [2016] Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. 
[2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41 Lai and Osher [2014] Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. 
[2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Lee et al. [2014] Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. 
Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428 Liu et al. [2019] Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. 
[2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. 
Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. 
Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1 Nesterov and Polyak [2006] Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Nocedal and Wright [2006] Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. 
[2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Huang, W., Wei, K.: Riemannian proximal gradient methods. Mathematical Programming 194, 371–413 (2022) https://doi.org/10.1007/s10107-021-01632-3
- Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w
- Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858
- Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41
- Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x
- Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428
- Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1
- Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8
- Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006)
- Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110
- Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Mathematical Programming 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967
- Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x
- Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141
- Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1
- Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3
- Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014)
- Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1
- Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Huang, W., Wei, K.: An inexact Riemannian proximal gradient method. Computational Optimization and Applications 85(1), 1–32 (2023) https://doi.org/10.1007/s10589-023-00451-w
- Huang, W., Wei, M., Gallivan, K.A., Dooren, P.V.: A Riemannian optimization approach to clustering problems. ArXiv (2022) arXiv:2208.03858 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics.
Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. 
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Kovnatsky, A., Glashoff, K., Bronstein, M.M.: MADMM: A generic algorithm for non-smooth optimization on manifolds. In: Leibe, B., Matas, J., Sebe, N., Welling, M. (eds.) Computer Vision – ECCV 2016, pp. 680–696. Springer, Cham (2016). https://doi.org/10.1007/978-3-319-46454-1_41
- Lai, R., Osher, S.: A splitting method for orthogonality constrained problems. Journal of Scientific Computing 58(2), 431–449 (2014) https://doi.org/10.1007/s10915-013-9740-x Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization.
SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. SIAM Journal on Optimization 24(3), 1420–1443 (2014) https://doi.org/10.1137/130921428
- Liu, H., Wu, W., So, A.M.-C.: Quadratic optimization with orthogonality constraints: Explicit Łojasiewicz exponent and linear convergence of retraction-based line-search and stochastic variance-reduced gradient methods. Mathematical Programming 178, 215–262 (2019) https://doi.org/10.1007/s10107-018-1285-1
- Nesterov, Y., Polyak, B.T.: Cubic regularization of Newton method and its global performance. Mathematical Programming 108, 177–205 (2006) https://doi.org/10.1007/s10107-006-0706-8 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis.
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Nocedal, J., Wright, S.J.: Numerical Optimization, 2nd edn. Springer Series in Operations Research and Financial Engineering. Springer, New York (2006) Ozoliņš et al. [2013] Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. 
[2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. 
Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Ozoliņš, V., Lai, R., Caflisch, R., Osher, S.: Compressed modes for variational problems in mathematics and physics. Proceedings of the National Academy of Sciences of the United States of America 110(46), 18368–18373 (2013) https://doi.org/10.1073/pnas.1318679110 Powell [1978] Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. 
Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. 
[2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Powell, M.J.: Algorithms for nonlinear constraints that use Lagrangian functions. Math. Program. 14(1), 224–248 (1978) https://doi.org/10.1007/BF01588967 Si et al. [2023] Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. 
Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. 
[2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Si, W., Absil, P.-A., Huang, W., Jiang, R., Vary, S.: A Riemannian proximal Newton method. ArXiv (2023) arXiv:2304.04032 Wang and Yang [2023] Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. 
[2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. 
Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Wang, Q., Yang, W.H.: Proximal quasi-Newton method for composite optimization over the Stiefel manifold. Journal of Scientific Computing 95 (2023) https://doi.org/10.1007/s10915-023-02165-x Wang et al. [2017] Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141 Wen and Yin [2013] Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1 Xiao et al. [2018] Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. 
Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3 Yang et al. [2014] Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014) Zhou et al. [2022] Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1 Zou et al. [2006] Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430 Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430
- Wang, X., Ma, S., Goldfarb, D., Liu, W.: Stochastic quasi-Newton methods for nonconvex stochastic optimization. SIAM Journal on Optimization 27(2), 927–956 (2017) https://doi.org/10.1137/15M1053141
- Wen, Z., Yin, W.: A feasible method for optimization with orthogonality constraints. Mathematical Programming 142, 397–434 (2013) https://doi.org/10.1007/s10107-012-0584-1
- Xiao, X., Li, Y., Wen, Z., Zhang, L.W.: A regularized semi-smooth Newton method with projection steps for composite convex programs. Journal of Scientific Computing 76(1), 364–389 (2018) https://doi.org/10.1007/s10915-017-0624-3
- Yang, W.H., Zhang, L., Song, R.: Optimality conditions for the nonlinear programming problems on Riemannian manifolds. Pacific Journal of Optimization 10, 415–434 (2014)
- Zhou, Y., Bao, C., Ding, C., Zhu, J.-L.: A semismooth Newton based augmented Lagrangian method for nonsmooth optimization on matrix manifolds. Mathematical Programming (2022) https://doi.org/10.1007/s10107-022-01898-1
- Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. Journal of Computational and Graphical Statistics 15(2), 265–286 (2006) https://doi.org/10.1198/106186006X113430