
A Brief Analogy of NNs and Literal Selection Procedures within SAT Solvers
Neural Networks (NNs) are an increasingly important component in both the analytical solving of propositional logic statements and the heuristic decisions made along the way. Many architectures have been devised that, trained on large data sets (10,000-100,000 problems), classify the SAT/UNSAT outcome of given propositional statements to varying degrees of accuracy. This work is the first step towards designing, implementing, and testing a Transformer Neural Network capable of generating and solving First Order Propositional Logic in a parallel fashion, and one that can be modified for Higher Order Logic. It begins by examining the current standard for such systems, NeuroSAT, alongside less conventional approaches such as TGNs (Typed Graph Networks). Given the ultimate goal of theorizing and developing a mathematically sound structure for a TNN (Transformer Neural Network) employing parallel multi-head attention, analyzing closely related works provides concrete direction for the other tiers of the complete system. Furthermore, exploring literal-selection heuristics and their optimization within a leading NN architecture for SAT problems up to k levels will be an important step towards improving the run time and accuracy of Boolean Satisfiability solving with DPLL.
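Since the abstract centers on literal selection inside DPLL, a minimal sketch may help fix ideas: a plain recursive DPLL loop in Python whose branching literal comes from a pluggable heuristic, which is the hook a learned model could replace. This is an illustrative assumption, not code from this work; the names `dpll`, `simplify`, and `first_literal` are invented here.

```python
# Minimal DPLL sketch with a pluggable literal-selection heuristic.
# Illustrative only; not the implementation described in this work.
from typing import Callable, FrozenSet, List, Optional

Clause = FrozenSet[int]  # literals as signed ints, e.g. -3 means NOT x3


def simplify(clauses: List[Clause], literal: int) -> Optional[List[Clause]]:
    """Assign `literal` True: drop satisfied clauses, prune its negation."""
    result = []
    for clause in clauses:
        if literal in clause:
            continue                      # clause satisfied, drop it
        reduced = clause - {-literal}
        if not reduced:
            return None                   # empty clause: conflict
        result.append(reduced)
    return result


def first_literal(clauses: List[Clause]) -> int:
    """Baseline heuristic; a trained NN could rank literals here instead."""
    return next(iter(clauses[0]))


def dpll(clauses: List[Clause],
         choose: Callable[[List[Clause]], int] = first_literal) -> bool:
    if not clauses:
        return True                       # all clauses satisfied: SAT
    # Unit propagation: a single-literal clause forces an assignment.
    for clause in clauses:
        if len(clause) == 1:
            reduced = simplify(clauses, next(iter(clause)))
            return reduced is not None and dpll(reduced, choose)
    # Branch on the heuristic's chosen literal, then on its negation.
    lit = choose(clauses)
    for branch in (lit, -lit):
        reduced = simplify(clauses, branch)
        if reduced is not None and dpll(reduced, choose):
            return True
    return False


# (x1 OR x2) AND (NOT x1 OR x2) AND (NOT x2 OR x3)  ->  satisfiable
print(dpll([frozenset({1, 2}), frozenset({-1, 2}), frozenset({-2, 3})]))
```

Swapping `first_literal` for a ranking produced by a trained network is exactly the kind of drop-in change the literal-selection analysis described above targets: the search procedure is unchanged, only the branching decision improves.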
Metadata
| Work Title | A Brief Analogy of NNs and Literal Selection Procedures within SAT Solvers |
| --- | --- |
| License | MIT License |
| Work Type | Journal |
| Publication Date | January 20, 2023 |
| Deposited | January 26, 2023 |