
Convergence of gradient-based block coordinate descent algorithms for non-orthogonal joint approximate diagonalization of matrices

Published 28 Sep 2020 in math.NA and cs.NA (arXiv:2009.13377v2)

Abstract: In this paper, we propose a gradient-based block coordinate descent (BCD-G) framework to solve the joint approximate diagonalization of matrices defined on the product of the complex Stiefel manifold and the special linear group. Instead of updating the blocks in a cyclic fashion, we select the block to update based on the Riemannian gradient. To update the first block variable, in the complex Stiefel manifold, we use the well-known line search descent method. To update the second block variable, in the special linear group, we construct three classes of updates, GLU, GQU and GU, from four kinds of elementary transformations, which yield three BCD-G algorithms: BCD-GLU, BCD-GQU and BCD-GU. We establish the global and weak convergence of these three algorithms using the Łojasiewicz gradient inequality, under the assumption that the iterates are bounded. We also propose a gradient-based Jacobi-type framework to solve the joint approximate diagonalization of matrices defined on the special linear group. As in the BCD-G case, using the GLU and GQU classes of elementary transformations, we focus on the Jacobi-GLU and Jacobi-GQU algorithms and establish their global and weak convergence. All the algorithms and convergence results described in this paper also apply to the real case.
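The first block update described above, line search descent along the Riemannian gradient on the complex Stiefel manifold, can be sketched as follows. This is a minimal illustration of that general technique, not the paper's algorithm: the off-diagonal cost, the gradient formulas, the QR retraction, and the Armijo line search parameters below are standard choices assumed for the sketch.

```python
import numpy as np

def off(M):
    """Off-diagonal part of a square matrix."""
    return M - np.diag(np.diag(M))

def jad_cost(U, mats):
    """Joint-diagonalization cost: sum_k ||off(U^H A_k U)||_F^2."""
    return sum(np.linalg.norm(off(U.conj().T @ A @ U)) ** 2 for A in mats)

def euclidean_grad(U, mats):
    """Euclidean gradient of jad_cost: sum_k 2 (A_k U O_k^H + A_k^H U O_k),
    where O_k = off(U^H A_k U)."""
    G = np.zeros_like(U)
    for A in mats:
        O = off(U.conj().T @ A @ U)
        G += 2 * (A @ U @ O.conj().T + A.conj().T @ U @ O)
    return G

def riemannian_grad(U, G):
    """Project the Euclidean gradient onto the tangent space of the
    Stiefel manifold at U: G - U * sym(U^H G)."""
    S = U.conj().T @ G
    return G - U @ (S + S.conj().T) / 2

def retract(Y):
    """QR-based retraction back onto the Stiefel manifold."""
    Q, _ = np.linalg.qr(Y)
    return Q

def stiefel_line_search_step(U, mats, step=1.0, beta=0.5, c=1e-4, max_back=30):
    """One gradient step with Armijo backtracking line search."""
    G = riemannian_grad(U, euclidean_grad(U, mats))
    f0 = jad_cost(U, mats)
    g2 = np.linalg.norm(G) ** 2
    for _ in range(max_back):
        U_new = retract(U - step * G)
        if jad_cost(U_new, mats) <= f0 - c * step * g2:
            return U_new
        step *= beta
    return U  # no sufficient decrease found; keep the current iterate
```

As a sanity check, for matrices that share an exact unitary diagonalizer V, the cost vanishes at U = V, and one line search step from a random starting point decreases the cost.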
