Nonsmooth Rate-of-Convergence Analyses of Algorithms for Eigenvalue Optimization (1805.04393v1)
Abstract: Nonsmoothness at optimal points is a common phenomenon in many eigenvalue optimization problems. We consider two recent algorithms to minimize the largest eigenvalue of a Hermitian matrix that depends on one parameter, both proven to be globally convergent regardless of nonsmoothness. One of these algorithms models the eigenvalue function with a piecewise quadratic function and is effective for nonconvex problems. The other projects the Hermitian matrix onto subspaces spanned by eigenvectors and is effective for large-scale problems. We generalize the latter slightly to cope with nonsmoothness. For both algorithms, we analyze the rate of convergence in the nonsmooth setting, when the largest eigenvalue is multiple at the minimizer and zero lies strictly in the interior of the generalized Clarke derivative, and prove that both algorithms converge rapidly. The algorithms are applied to, and the deduced results are illustrated on, the computation of the inner numerical radius, the modulus of the point on the boundary of the field of values closest to the origin, which is significant, for instance, for the numerical solution of a definite generalized symmetric eigenvalue problem.
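The objective described in the abstract, minimizing the largest eigenvalue of a one-parameter Hermitian matrix family, can be made concrete with the inner numerical radius application. Below is a minimal NumPy sketch, not the paper's algorithms: it assumes the standard support-function parametrization H(θ) = (e^{-iθ}C + e^{iθ}C*)/2, whose largest eigenvalue λ_max(H(θ)) is the support function of the field of values of C, and it locates min_θ λ_max(H(θ)) by a naive grid search plus golden-section refinement. The function names and the crude one-dimensional search are illustrative choices only; the paper's two algorithms (piecewise quadratic models, eigenvector-subspace projection) are what actually address nonsmoothness and large scale.

```python
import numpy as np

def lam_max(C, theta):
    """Largest eigenvalue of H(theta) = (e^{-i theta} C + e^{i theta} C*) / 2."""
    H = (np.exp(-1j * theta) * C + np.exp(1j * theta) * C.conj().T) / 2
    return np.linalg.eigvalsh(H)[-1]  # eigvalsh returns ascending eigenvalues

def inner_numerical_radius(C, n_grid=720, refine=60):
    """Naive sketch: minimize lam_max(C, theta) over [0, 2*pi).

    Grid search to bracket the minimizer, then golden-section refinement.
    This is an illustration, NOT the globally convergent algorithms
    analyzed in the paper.
    """
    step = 2 * np.pi / n_grid
    thetas = np.arange(n_grid) * step
    vals = np.array([lam_max(C, t) for t in thetas])
    k = int(np.argmin(vals))
    # Bracket of two grid steps around the best sample (2*pi-periodicity
    # makes the wrap-around at k = 0 harmless).
    a = thetas[(k - 1) % n_grid]
    b = a + 2 * step
    phi = (np.sqrt(5) - 1) / 2
    for _ in range(refine):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if lam_max(C, c) < lam_max(C, d):
            b = d
        else:
            a = c
    t_star = (a + b) / 2
    return lam_max(C, t_star), t_star

# Hypothetical test matrix, chosen only to exercise the sketch.
C = np.array([[3.0, 1.0 + 1.0j],
              [0.5j, 2.0 - 1.0j]])
r, theta_star = inner_numerical_radius(C)
print(f"min_theta lam_max(H(theta)) = {r:.6f} at theta = {theta_star:.6f}")
```

When the origin lies inside the field of values, the computed minimum is the distance from the origin to its boundary, i.e., the inner numerical radius. At a minimizer where the largest eigenvalue of H(θ) is multiple, the objective is nonsmooth, which is exactly the setting whose convergence rates the paper analyzes.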