Title
On the stability of Lagrange programming neural networks for satisfiability problems of propositional calculus
Abstract
Hopfield-type neural networks for difficult combinatorial optimization problems have used gradient descent algorithms to solve constrained optimization problems via penalty functions. However, it is well known that convergence to local minima is inevitable in these approaches. Boltzmann machines have used a simulated annealing technique and were theoretically proven able to find global minima, but they require large computational resources. Recently, Lagrange programming neural networks have been proposed. They differ from the gradient descent algorithms by using anti-descent terms in their dynamical differential equations. In this paper we theoretically analyze the stability and convergence properties of one of the Lagrange programming neural networks (LPPH) when it is applied to a satisfiability problem (SAT) of propositional calculus. We prove that (1) the solutions of the SAT are the equilibrium points of the LPPH and vice versa, and (2) if the given expression is satisfiable, there is at least one stable equilibrium point of the LPPH.
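The abstract's core idea, combining gradient descent on the variables with an anti-descent (gradient ascent) term on the Lagrange multipliers, can be illustrated with a minimal sketch. This is not the paper's exact LPPH network; it applies generic Lagrangian dynamics to an assumed toy problem (minimize f(x) = x² subject to h(x) = x − 1 = 0), where an equilibrium of the dynamics is exactly a constrained optimum:

```python
# Sketch of Lagrangian dynamics (hypothetical toy problem, not the LPPH
# equations from the paper). Lagrangian: L(x, lam) = x**2 + lam * (x - 1).
# Variables descend on L; the multiplier *ascends* on L (the anti-descent
# term), which drives the constraint h(x) = x - 1 toward zero.

def lagrangian_dynamics(steps=20000, dt=0.01, x=0.0, lam=0.0):
    for _ in range(steps):
        dx = -(2.0 * x + lam)   # descent on x:   dx/dt   = -dL/dx
        dlam = x - 1.0          # ascent on lam:  dlam/dt = +dL/dlam = h(x)
        x += dt * dx
        lam += dt * dlam
    return x, lam

x, lam = lagrangian_dynamics()
# At equilibrium, 2x + lam = 0 and x - 1 = 0, i.e. (x, lam) = (1, -2),
# which is the constrained minimizer together with its multiplier.
```

The key point mirrored from the abstract: unlike pure penalty-based gradient descent, the equilibrium points of these coupled dynamics coincide with the feasible solutions of the constrained problem.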
Year
DOI
Venue
1996
10.1016/0925-2312(95)00087-9
Neurocomputing
Keywords
Field
DocType
High-order neural network,Lagrangian method,Stability,Satisfiability problem,Propositional calculus
Convergence (routing),Satisfiability,Equilibrium point,Artificial intelligence,Artificial neural network,Simulated annealing,Gradient descent,Mathematical optimization,Boolean satisfiability problem,Maxima and minima,Machine learning,Calculus,Mathematics
Journal
Volume
Issue
ISSN
13
2-4
0925-2312
Citations 
PageRank 
References 
10
0.85
4
Authors
2
Name
Order
Citations
PageRank
M. Nagamatu  1  10  0.85
T. Yanaru  2  16  6.03