Cons-training tensor networks (2405.09005v2)
Abstract: In this study, we introduce a novel family of tensor networks, termed \textit{constrained matrix product states} (MPS), designed to incorporate arbitrary discrete linear constraints, including inequalities, exactly into sparse block structures. These tensor networks are tailored for modeling distributions whose support lies strictly within the feasible space, offering benefits such as reducing the search space in optimization problems, alleviating overfitting, improving training efficiency, and decreasing model size. Central to our approach is the concept of a quantum region, an extension of the quantum numbers traditionally used in U(1)-symmetric tensor networks, adapted to capture any linear constraint, including the unconstrained case. We further develop a novel canonical form for these MPS, which allows tensor blocks to be merged and factorized according to quantum-region fusion rules and permits optimal truncation schemes. Using this canonical form, we apply an unsupervised training strategy to optimize arbitrary objective functions subject to discrete linear constraints. The efficacy of our method is demonstrated by solving the quadratic knapsack problem, where it outperforms a leading nonlinear integer programming solver. Finally, we analyze the complexity and scalability of our approach, demonstrating its potential for addressing complex constrained combinatorial optimization problems.
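The block structure described in the abstract can be illustrated with a minimal sketch (not the paper's implementation; the helper names `bond_regions` and `count_feasible` are ours). For a knapsack-style constraint \(\sum_i w_i x_i \le C\) over binary variables with nonnegative weights, the label carried on each MPS bond, analogous to a quantum region, can be taken as the set of partial sums reachable after that site; only tensor blocks connecting compatible labels are nonzero, and a transfer-matrix-style pass over these labels enumerates the feasible space:

```python
def bond_regions(weights, capacity):
    """Partial sums reachable after each site (hypothetical helper).

    With nonnegative weights and a <= constraint, every reachable partial
    sum can be completed by setting the remaining bits to 0, so the
    reachable sets alone label the nonzero blocks on each bond.
    """
    reach = [{0}]
    for w in weights:
        reach.append({s + b * w
                      for s in reach[-1] for b in (0, 1)
                      if s + b * w <= capacity})
    return reach

def count_feasible(weights, capacity):
    """Count bitstrings with sum(w_i * x_i) <= capacity by propagating
    a count per bond label, mimicking a sparse transfer-matrix contraction."""
    counts = {0: 1}  # label -> number of configurations carrying it
    for w in weights:
        new = {}
        for s, c in counts.items():
            for b in (0, 1):
                t = s + b * w
                if t <= capacity:
                    new[t] = new.get(t, 0) + c
        counts = new
    return sum(counts.values())

print(bond_regions([2, 3], 5))      # → [{0}, {0, 2}, {0, 2, 3, 5}]
print(count_feasible([2, 3, 4], 5)) # → 5
```

The point of the sketch is that the per-bond label sets stay far smaller than the \(2^n\) configuration space whenever the constraint is binding, which is what makes the block-sparse MPS representation compact.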