Host: The Japanese Society for Artificial Intelligence
Name: The 36th Annual Conference of the Japanese Society for Artificial Intelligence
Number: 36
Location: [in Japanese]
Date: June 14, 2022 - June 17, 2022
In this work, we aim to establish a novel technique for SAT solving that combines deep learning and optimization methods. In general, a deep learning model is trained by optimizing a differentiable cost function, so once such a cost function is defined for SAT solving, a model can be trained for it by back-propagation. We therefore define a differentiable cost function for SAT solving and minimize it by updating the assignment in a continuous vector space with Newton's method. Since Newton's method converges quadratically when the initial assignment is close to a solution, we use the prediction of a graph neural network (GNN) as the initial assignment. We call the proposed technique GNN-MatSat: it solves SAT instances by minimizing the differentiable cost function with Newton's method, starting from the GNN's prediction. We conducted experiments to evaluate the performance of GNN-MatSat on the random-track benchmarks of the SAT Competitions 2017 and 2018. The experimental results show that GNN-MatSat successfully found a solution for all benchmarks. We also compared GNN-MatSat with MatSat and observed that GNN-MatSat found a solution faster, with fewer iterations, than MatSat by starting the search from a predicted assignment close to a solution.
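The pipeline described above can be sketched in code. The following is a minimal illustration under stated assumptions, not the actual MatSat/GNN-MatSat formulation: we use a hypothetical product-form clause residual as the differentiable cost, a damped Gauss-Newton update as a stand-in for the Newton iteration, and a hand-picked initial vector `u0` in place of the GNN prediction.

```python
import numpy as np

def clause_residuals(u, clauses):
    """Residual r_c in [0, 1] per clause; r_c = 0 iff the clause is satisfied.

    u is a continuous assignment in [0, 1]^n. A positive literal i takes
    value u[i-1], a negative literal -i takes value 1 - u[i-1], and r_c is
    the product of (1 - value) over the clause's literals. (Illustrative
    cost only; the real MatSat cost function differs.)
    """
    r = []
    for clause in clauses:
        prod = 1.0
        for lit in clause:
            v = u[lit - 1] if lit > 0 else 1.0 - u[-lit - 1]
            prod *= 1.0 - v
        r.append(prod)
    return np.array(r)

def satisfies(assignment, clauses):
    # Check a Boolean assignment against every clause.
    return all(any(assignment[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses)

def solve(clauses, n_vars, u0, iters=100, damping=1e-3, eps=1e-6):
    """Minimize J(u) = ||r(u)||^2 with damped Gauss-Newton steps.

    u0 plays the role of the GNN prediction: the closer it is to a
    satisfying assignment, the fewer iterations are needed.
    """
    u = u0.astype(float).copy()
    for _ in range(iters):
        assignment = u > 0.5  # round the continuous vector to Booleans
        if satisfies(assignment, clauses):
            return assignment
        r = clause_residuals(u, clauses)
        # Forward-difference Jacobian of the residual vector.
        J = np.zeros((len(clauses), n_vars))
        for i in range(n_vars):
            up = u.copy()
            up[i] += eps
            J[:, i] = (clause_residuals(up, clauses) - r) / eps
        # Damped normal-equations step (Levenberg-style regularization).
        step = np.linalg.solve(J.T @ J + damping * np.eye(n_vars), J.T @ r)
        u = np.clip(u - step, 0.0, 1.0)
    return None
```

For example, for the formula (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3), starting from a vector close to the satisfying assignment x1=1, x2=0, x3=1 succeeds immediately, mirroring the paper's point that a good initial prediction shortens the search.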