A Brief Analogy of NNs and Literal Selection Procedures within SAT Solvers

Neural Networks (NNs) are an increasingly important component in the analytical solving of propositional logic statements and in the heuristic decisions made while solving them. Many existing architectures, trained on large datasets (10,000-100,000 problems), predict solutions for given propositional statements and classify them as SAT or UNSAT with varying degrees of accuracy. This work is a first step towards designing, implementing, and testing a Transformer Neural Network (TNN) capable of generating and solving First Order Propositional Logic in a parallel fashion, with possible modification for Higher Order Logic. We examine the current standard for such systems, NeuroSAT, and explore less conventional approaches such as Typed Graph Networks (TGNs). Given the ultimate goal of theorizing a mathematically sound TNN structure employing parallel multi-head attention, analysis of closely related work gives specific direction for the other tiers of the complete system. Furthermore, exploring literal-selection analytics and optimization patterns within a suitable NN architecture for SAT problems up to k levels is a promising route to improving the runtime and accuracy of Boolean Satisfiability solving with DPLL, as sketched below.
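
The sketch below is a minimal illustration of where a literal-selection heuristic sits inside a DPLL-style solver; it is not the system described in this work. The clause encoding and the names (simplify, unit_propagate, dpll, score) are assumptions made for exposition. The score parameter marks the hook where a learned (e.g., NN-based) literal-scoring policy could replace a hand-coded heuristic, which is the branching decision the abstract targets.

```python
# Minimal DPLL sketch. Assumption: a formula is a list of clauses,
# each clause a list of signed ints, e.g. [[1, 2], [-1, 2]] means
# (x1 or x2) and (not x1 or x2). All names here are illustrative,
# not taken from the published work.

def simplify(clauses, literal):
    """Assign `literal` True: drop satisfied clauses, prune its negation."""
    out = []
    for clause in clauses:
        if literal in clause:
            continue                       # clause satisfied, drop it
        reduced = [l for l in clause if l != -literal]
        if not reduced:
            return None                    # empty clause: conflict
        out.append(reduced)
    return out

def unit_propagate(clauses):
    """Repeatedly assign literals forced by unit (single-literal) clauses."""
    while clauses:
        units = [c[0] for c in clauses if len(c) == 1]
        if not units:
            break
        clauses = simplify(clauses, units[0])
        if clauses is None:
            return None                    # propagation hit a conflict
    return clauses

def dpll(clauses, score=lambda lit: 0):
    """DPLL search; `score` is the pluggable literal-selection heuristic.
    An NN-derived scorer would replace the trivial constant default."""
    clauses = unit_propagate(clauses)
    if clauses is None:
        return False                       # conflict: UNSAT on this branch
    if not clauses:
        return True                        # all clauses satisfied: SAT
    lit = max({l for c in clauses for l in c}, key=score)
    for branch in (lit, -lit):             # branch on chosen literal first
        reduced = simplify(clauses, branch)
        if reduced is not None and dpll(reduced, score):
            return True
    return False

# Example: (x1 or x2) and (not x1 or x2) and (not x2 or x3) is satisfiable.
print(dpll([[1, 2], [-1, 2], [-2, 3]]))   # True
```

Note that the branching order does not change the SAT/UNSAT answer, only the size of the search tree, which is why a better score function can improve runtime without sacrificing soundness.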

Metadata

Work Title: A Brief Analogy of NNs and Literal Selection Procedures within SAT Solvers
Access: Open Access
Creators:
  1. Mahfuza Farooque
  2. Arya Keni
Keywords:
  1. Neural Networks
  2. Machine Learning
  3. SAT Solvers
  4. GNN
  5. TNN
  6. RNN
License: MIT License
Work Type: Conference Proceeding
Publication Date: January 20, 2023
Subjects:
  1. Computer Science
  2. Formal Verification
Language: English
Deposited: January 26, 2023

Collections

This resource is currently not in any collection.

Work History

Version 1 (published)

  • Created
  • Updated
  • Added Creator Arya Keni
  • Added Creator Mahfuza Farooque
  • Added A_Brief_Analogy_of_NNs_on_SAT_Solvers____Reinvention_Journal_Published_Version__wBooleanExplanations_.pdf
  • Updated License
    License
    • https://opensource.org/licenses/MIT
  • Published
  • Updated Geographic Area, Description
    Geographic Area
    • State College, PA, USA
    Description
    • Abstract text (reproduced at the top of this record)
  • Updated Creator Arya Keni
  • Updated Creator Mahfuza Farooque
  • Updated

Version 2 (published)

  • Created
  • Updated License, Subtitle, Publisher's Statement
    License
    • https://opensource.org/licenses/MIT
    • https://creativecommons.org/licenses/by/4.0/
    Subtitle
    • Note about publication and dissemination of work
    Publisher's Statement
    • This work is currently under review at a SCOPUS-indexed journal (SCOPUS is an Elsevier subsidiary) sponsored by the ILSET 2023 conference; the work was also presented and published at the ILSET 2023 conference, held online in April 2023.
  • Updated Publisher, Geographic Area, Description
    Publisher
    • ILSET 2023
    Geographic Area
    • United States
    Description
    • Abstract text (reproduced at the top of this record)
  • Published
  • Updated