SVDinsTN: A Tensor Network Paradigm for Efficient Structure Search from Regularized Modeling Perspective (2305.14912v6)
Abstract: Tensor network (TN) representation is a powerful technique for computer vision and machine learning. TN structure search (TN-SS) aims to find a customized structure that achieves a compact representation, a challenging NP-hard problem. Recent "sampling-evaluation"-based methods must sample an extensive collection of candidate structures and evaluate them one by one, incurring prohibitively high computational costs. To address this issue, we propose a novel TN paradigm, named SVD-inspired TN decomposition (SVDinsTN), which solves the TN-SS problem efficiently from a regularized modeling perspective and eliminates repeated structure evaluations. Specifically, by inserting a diagonal factor on each edge of the fully-connected TN, SVDinsTN computes the TN cores and diagonal factors simultaneously, with the sparsity of the diagonal factors revealing a compact TN structure. In theory, we prove a convergence guarantee for the proposed method. Experimental results demonstrate that the proposed method achieves an approximately 100x to 1000x speedup over state-of-the-art TN-SS methods while maintaining a comparable level of representation ability.
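The core idea can be illustrated with a minimal NumPy sketch (assumptions: an order-3 fully-connected TN; the function names `fctn_reconstruct` and `effective_rank` are illustrative, not the authors' code). Each edge of the fully-connected TN carries a diagonal factor, absorbed into the contraction; zeroed entries in a diagonal factor shrink that edge's effective rank, which is how sparsity reveals a compact structure:

```python
import numpy as np

def fctn_reconstruct(G1, G2, G3, d12, d13, d23):
    """Contract three FCTN cores with one diagonal factor per edge.

    Cores: G1 (I1, R12, R13), G2 (R12, I2, R23), G3 (R13, R23, I3).
    Diagonal factors d12, d13, d23 scale the shared edge indices.
    """
    return np.einsum('iab,a,b,ajc,c,bck->ijk', G1, d12, d13, G2, d23, G3)

def effective_rank(d, tol=1e-8):
    """Count nonzero diagonal entries: the edge's effective TN rank."""
    return int(np.sum(np.abs(d) > tol))

rng = np.random.default_rng(0)
I1, I2, I3 = 4, 5, 6          # tensor mode sizes
R12, R13, R23 = 3, 3, 3        # initial (over-parameterized) edge ranks
G1 = rng.standard_normal((I1, R12, R13))
G2 = rng.standard_normal((R12, I2, R23))
G3 = rng.standard_normal((R13, R23, I3))
d12 = np.array([1.0, 0.0, 0.5])  # a zero entry prunes edge rank 3 -> 2
d13 = np.ones(R13)
d23 = np.ones(R23)

X = fctn_reconstruct(G1, G2, G3, d12, d13, d23)
print(X.shape, effective_rank(d12))
```

In the actual method, the cores and diagonal factors are optimized jointly under a sparsity-inducing regularizer, so that the surviving diagonal entries determine the searched structure without evaluating candidate structures one by one.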