SVDinsTN: A Tensor Network Paradigm for Efficient Structure Search from Regularized Modeling Perspective (2305.14912v6)

Published 24 May 2023 in cs.LG

Abstract: Tensor network (TN) representation is a powerful technique for computer vision and machine learning. TN structure search (TN-SS) aims to search for a customized structure to achieve a compact representation, which is a challenging NP-hard problem. Recent "sampling-evaluation"-based methods require sampling an extensive collection of structures and evaluating them one by one, resulting in prohibitively high computational costs. To address this issue, we propose a novel TN paradigm, named SVD-inspired TN decomposition (SVDinsTN), which allows us to efficiently solve the TN-SS problem from a regularized modeling perspective, eliminating the repeated structure evaluations. To be specific, by inserting a diagonal factor for each edge of the fully-connected TN, SVDinsTN allows us to calculate TN cores and diagonal factors simultaneously, with the factor sparsity revealing a compact TN structure. In theory, we prove a convergence guarantee for the proposed method. Experimental results demonstrate that the proposed method achieves approximately 100 to 1000 times acceleration compared to the state-of-the-art TN-SS methods while maintaining a comparable level of representation ability.
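
As a rough illustration of the mechanism the abstract describes, below is a minimal NumPy sketch of a fully-connected tensor network (FCTN) for an order-3 tensor with a diagonal factor inserted on each edge: thresholding small diagonal entries truncates the corresponding edge ranks, which is how factor sparsity can "reveal" a compact structure. All shapes, names, the threshold `tol`, and the contraction pattern are illustrative assumptions; this is not the paper's actual algorithm, which fits the cores and diagonal factors jointly via regularized optimization with a convergence guarantee.

```python
# Conceptual sketch of the SVDinsTN idea (illustrative only, not the
# paper's solver): an order-3 fully-connected tensor network with one
# diagonal factor per edge. Sparsity in a diagonal factor exposes
# redundant edge dimensions, which are then pruned.
import numpy as np

rng = np.random.default_rng(0)

I1, I2, I3 = 6, 7, 8          # tensor mode sizes (assumed)
R12, R13, R23 = 4, 4, 4       # over-estimated initial edge ranks

# FCTN cores: one core per mode, one edge between every pair of cores.
G1 = rng.standard_normal((I1, R12, R13))
G2 = rng.standard_normal((R12, I2, R23))
G3 = rng.standard_normal((R13, R23, I3))

# Diagonal factors, one per edge, stored as vectors of diagonal entries.
# d23 is made mostly zero here to mimic a sparse learned solution.
d12 = np.array([1.0, 0.8, 0.5, 0.3])
d13 = np.array([1.0, 0.6, 0.2, 0.1])
d23 = np.array([1.0, 0.0, 0.0, 0.0])   # edge (2,3) is nearly redundant

def contract(G1, G2, G3, d12, d13, d23):
    """Contract the FCTN with the diagonal factors absorbed on each edge."""
    return np.einsum('iab,a,b,ajc,c,bck->ijk', G1, d12, d13, G2, d23, G3)

X = contract(G1, G2, G3, d12, d13, d23)

# Structure search by sparsity: keep only edge dimensions whose diagonal
# entry exceeds a threshold, then rebuild the smaller network.
tol = 1e-8
k12 = np.abs(d12) > tol
k13 = np.abs(d13) > tol
k23 = np.abs(d23) > tol

G1s = G1[:, k12, :][:, :, k13]
G2s = G2[k12][:, :, k23]
G3s = G3[k13][:, k23, :]

X_small = contract(G1s, G2s, G3s, d12[k12], d13[k13], d23[k23])
print('edge ranks after pruning:', k12.sum(), k13.sum(), k23.sum())
print('relative error:', np.linalg.norm(X - X_small) / np.linalg.norm(X))
```

Pruning a diagonal entry removes the matching slice of both cores incident to that edge; an edge whose rank drops to 1 effectively disappears, so the surviving edges and ranks define the discovered TN topology without any repeated sample-and-evaluate loop.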
