Learning Explicitly Conditioned Sparsifying Transforms (2403.03168v1)
Abstract: Sparsifying transforms have become, over the last few decades, widely used tools for finding structured sparse representations of signals in certain transform domains. Despite the popularity of classical transforms such as the DCT and wavelets, learning transforms that guarantee good representations of data in the sparse domain has been analyzed in a series of recent papers. Typically, the condition number and the representation ability are complementary key features of learned square transforms that may not be explicitly controlled in a given optimization model. Unlike existing approaches in the literature, in our paper we consider a new sparsifying transform model that enforces explicit control over both the quality of the data representation and the condition number of the learned transform. We confirm through numerical experiments that our model exhibits better numerical behavior than the state of the art.
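To make the trade-off concrete, here is a minimal sketch of a closely related baseline rather than the paper's exact model: alternating sparsifying transform learning in which the transform update is the orthogonal Procrustes solution (Schönemann, 1966). Restricting the learned square transform to be orthogonal is the extreme case of conditioning control, since an orthogonal matrix has condition number exactly 1. The function name and problem sizes below are illustrative assumptions.

```python
import numpy as np

def learn_orthogonal_transform(Y, s, iters=20, seed=0):
    """Alternating minimization of ||W Y - X||_F^2 over an orthogonal
    transform W and column-wise s-sparse codes X.

    Sparse coding step: hard-threshold each column of W @ Y to its s
    largest-magnitude entries.  Transform update step: the orthogonal
    Procrustes solution W = U V^T from the SVD of X Y^T, which keeps
    cond(W) = 1 by construction (illustrative baseline, not the
    paper's explicitly conditioned model)."""
    rng = np.random.default_rng(seed)
    n = Y.shape[0]
    # random orthogonal initialization via QR
    W = np.linalg.qr(rng.standard_normal((n, n)))[0]
    for _ in range(iters):
        Z = W @ Y
        # keep the s largest-magnitude entries in each column of Z
        X = np.zeros_like(Z)
        idx = np.argsort(-np.abs(Z), axis=0)[:s]
        np.put_along_axis(X, idx, np.take_along_axis(Z, idx, axis=0), axis=0)
        # orthogonal Procrustes update: argmin over orthogonal W of ||W Y - X||_F
        U, _, Vt = np.linalg.svd(X @ Y.T)
        W = U @ Vt
    return W

# the learned transform is orthogonal, so its condition number is 1
Y = np.random.default_rng(1).standard_normal((16, 200))
W = learn_orthogonal_transform(Y, s=4)
print(np.linalg.cond(W))
```

Relaxing the orthogonality constraint improves representation quality at the cost of a growing condition number, which is precisely the tension the abstract describes; the paper's contribution is to control that trade-off explicitly instead of fixing it at one extreme.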