Article overview
Title: Enhancing SAT solvers with glue variable predictions
Author: Jesse Michael Han
Date: 6 Jul 2020

Abstract: Modern SAT solvers routinely operate at scales that make it impractical to query a neural network for every branching decision. NeuroCore, proposed by Selsam and Bjørner, offered a proof-of-concept that neural networks can still accelerate SAT solvers by only periodically refocusing a score-based branching heuristic. However, that work suffered from several limitations: their modified solvers require GPU acceleration, further ablations showed that they were no better than a random baseline on the SATCOMP 2018 benchmark, and their training target of unsat cores required an expensive data pipeline which only labels relatively easy unsatisfiable problems. We address all these limitations, using a simpler network architecture that allows CPU inference even for large industrial problems with millions of clauses, and training instead to predict "glue variables", a target for which it is easier to generate labelled data, and which can also be formulated as a reinforcement learning task. We demonstrate the effectiveness of our approach by modifying the state-of-the-art SAT solver CaDiCaL, improving its performance on SATCOMP 2018 and SATRACE 2019 with supervised learning, and its performance on a dataset of SHA-1 preimage attacks with reinforcement learning.

Source: arXiv, 2007.02559
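For context on the prediction target: the "glue" of a learned clause is its literal block distance (LBD), i.e. the number of distinct decision levels among the clause's literals, a metric popularized by the Glucose solver; glue variables are roughly those that tend to appear in low-glue clauses. A minimal sketch of the LBD computation, using a hypothetical encoding of literals as signed integers and a hypothetical variable-to-decision-level map:

```python
def lbd(clause, level):
    """Literal block distance (LBD / 'glue') of a learned clause:
    the number of distinct decision levels among its literals.
    `clause` is a list of signed-integer literals (DIMACS-style);
    `level` maps each variable to the decision level at which it
    was assigned. Lower LBD marks a more valuable learned clause."""
    return len({level[abs(lit)] for lit in clause})

# Hypothetical example: four literals assigned at levels 2, 2, 5, 7.
clause = [1, -2, 3, -4]
level = {1: 2, 2: 2, 3: 5, 4: 7}
print(lbd(clause, level))  # three distinct levels: {2, 5, 7}
```

Clauses with LBD of 2 ("glue clauses") are kept indefinitely by Glucose-style clause-database reduction, which is what makes their variables a natural, cheap-to-label prediction target.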