NeuroBack: Improving CDCL SAT Solving using Graph Neural Networks (2110.14053v7)
Abstract: Propositional satisfiability (SAT) is an NP-complete problem that impacts many research fields, such as planning, verification, and security. Mainstream modern SAT solvers are based on the Conflict-Driven Clause Learning (CDCL) algorithm. Recent work has aimed to enhance CDCL SAT solvers with Graph Neural Networks (GNNs). So far, however, this approach has either not made solving more effective or has required substantial GPU resources for frequent online model inferences. Aiming to make GNN improvements practical, this paper proposes an approach called NeuroBack, which builds on two insights: (1) predicting the phases (i.e., values) of variables that appear in the majority (or even all) of the satisfying assignments is essential for CDCL SAT solving, and (2) it is sufficient to query the neural model only once for these predictions, before SAT solving starts. Once trained, the model performs its single offline inference entirely on the CPU, removing any reliance on GPU resources. To train NeuroBack, a new dataset called DataBack containing 120,286 data samples is created. NeuroBack is implemented as an enhancement to the state-of-the-art SAT solver Kissat. It allowed Kissat to solve up to 5.2% and 7.4% more problems on two recent SAT competition problem sets, SATCOMP-2022 and SATCOMP-2023, respectively. NeuroBack thus shows how machine learning can be harnessed to improve SAT solving in an effective and practical manner.
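The pipeline the abstract describes — parse a CNF formula, query a model exactly once before search, and hand the predicted phases to the solver — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the trained GNN is replaced here by a trivial polarity-count heuristic (a hypothetical stand-in), and the function names are invented for this sketch.

```python
# Sketch of an offline phase-prediction pipeline in the style of NeuroBack.
# NOTE: predict_phases() below is a toy polarity heuristic standing in for
# the paper's trained GNN; only the one-query-before-solving shape is real.
from collections import Counter

def parse_dimacs(text):
    """Parse DIMACS CNF text into a list of clauses (lists of nonzero ints)."""
    clauses = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("c", "p")):
            continue  # skip comments and the problem line
        lits = [int(tok) for tok in line.split()]
        assert lits[-1] == 0, "each DIMACS clause ends with 0"
        clauses.append(lits[:-1])
    return clauses

def predict_phases(clauses):
    """Stand-in for the single offline model query: guess each variable's
    phase (True/False) from the majority polarity of its occurrences."""
    polarity = Counter()
    for clause in clauses:
        for lit in clause:
            polarity[abs(lit)] += 1 if lit > 0 else -1
    return {var: count >= 0 for var, count in polarity.items()}

cnf = """c toy formula
p cnf 3 3
1 -2 0
1 3 0
-2 -3 0
"""
clauses = parse_dimacs(cnf)
phases = predict_phases(clauses)  # queried once, before solving begins
# In the actual system, these phases would seed the CDCL solver's
# saved-phase table (here, Kissat's) before search starts.
print(phases)
```

The key design point the abstract emphasizes is that this query happens once, offline, on the CPU; everything after it is ordinary CDCL search.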
Authors: Wenxi Wang, Yang Hu, Mohit Tiwari, Sarfraz Khurshid, Kenneth McMillan, Risto Miikkulainen