A Brief Analogy of NNs and Literal Selection Procedures within SAT Solvers
Neural Networks (NNs) are an important component of current and emerging methods for analytically solving propositional logical statements and determining heuristics over them. Many architectures have been devised that, given a large data set of propositional statements (10,000-100,000 problems), predict solutions and categorize their SAT/UNSAT cores with varying degrees of accuracy. This work is the first step toward designing, implementing, and testing a Transformer Neural Network (TNN) capable of generating and solving First Order Propositional Logic in a parallel fashion, and adaptable to Higher Order Logic. It examines the current standards for these systems, such as NeuroSAT, and explores non-conventional avenues of approach, such as TGNs (Typed Graph Networks). Given the ultimate goal of theorizing and developing a mathematically sound structure for a TNN employing parallel multi-head attention processing queues, analyzing closely related works is important for gaining specific direction on the other tiers of the complete system. Furthermore, exploring literal-based choice analytics and optimization patterns within a prime NN architecture for solving SAT-based problems up to k levels will be an important step toward improving the run-time and accuracy of Boolean Satisfiability solving using DPLL.
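To make the role of literal selection concrete, the following is a minimal sketch of DPLL with a pluggable literal-selection heuristic. This is a hypothetical illustration of the classic algorithm, not the implementation described in the paper; the clause representation (sets of signed integers) and the `first_literal` heuristic are assumptions chosen for brevity. An NN-based selector would simply replace the `choose` callback.

```python
# Minimal DPLL sketch with a pluggable literal-selection heuristic.
# Hypothetical illustration, not the paper's implementation.
# A formula is a list of clauses; a clause is a set of integer literals
# (positive int = variable, negative int = its negation).

def simplify(clauses, lit):
    """Assign `lit` True: drop satisfied clauses, remove -lit elsewhere."""
    out = []
    for c in clauses:
        if lit in c:
            continue                      # clause satisfied, drop it
        reduced = c - {-lit}
        if not reduced:
            return None                   # empty clause: conflict
        out.append(reduced)
    return out

def first_literal(clauses):
    """Naive baseline heuristic: first literal of the first clause.
    An NN-guided selector would be swapped in here."""
    return next(iter(clauses[0]))

def dpll(clauses, choose=first_literal):
    if not clauses:
        return True                       # all clauses satisfied
    # Unit propagation: apply forced assignments first.
    for c in clauses:
        if len(c) == 1:
            lit = next(iter(c))
            reduced = simplify(clauses, lit)
            return reduced is not None and dpll(reduced, choose)
    # Branch on the heuristic's chosen literal, trying both polarities.
    lit = choose(clauses)
    for branch in (lit, -lit):
        reduced = simplify(clauses, branch)
        if reduced is not None and dpll(reduced, choose):
            return True
    return False

# (x1 v x2) & (~x1 v x2) & (~x2 v x3) is satisfiable
print(dpll([{1, 2}, {-1, 2}, {-2, 3}]))  # → True
# x1 & ~x1 is unsatisfiable
print(dpll([{1}, {-1}]))                 # → False
```

The branching step is where run-time is won or lost: a better `choose` function prunes the search tree earlier, which is exactly the optimization target of learned literal-selection procedures.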
This work is currently under review at a SCOPUS-indexed journal (SCOPUS is an Elsevier product) sponsored by the ILSET 2023 conference, and it was also successfully presented and published at the ILSET 2023 conference, held in an online format in April 2023.
|Note about publication and dissemination of work
|CC BY 4.0 (Attribution)
|January 20, 2023
|January 26, 2023