Title
Lagrangian Method for Satisfiability Problems of Propositional Calculus
Abstract
Hopfield-type neural networks for solving difficult combinatorial optimization problems have relied on gradient descent algorithms to solve constrained optimization problems via penalty functions. However, it is well known that convergence to local minima is inevitable in these approaches. Recently, Lagrange programming neural networks have been proposed; they differ from gradient descent algorithms by including anti-descent terms in their dynamical differential equations. In this paper we analyze the stability and the convergence properties of the Lagrangian method when it is applied to a satisfiability problem of propositional calculus.
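The approach summarized in the abstract can be illustrated with a minimal sketch (not the paper's exact formulation): each clause is turned into an equality constraint whose violation is zero when the clause is satisfied, the truth values are relaxed to [0, 1], and the dynamics descend on the variables while the Lagrange multipliers follow an anti-descent (ascent) update. All function names and the clause encoding below are illustrative assumptions.

```python
import numpy as np

def clause_violation(x, clause):
    # clause: list of (var_index, sign), sign +1 for a positive literal.
    # The violation is the product of "how false" each literal is;
    # it is 0 exactly when at least one literal is fully true.
    v = 1.0
    for i, s in clause:
        lit = x[i] if s > 0 else 1.0 - x[i]
        v *= (1.0 - lit)
    return v

def grad_violation(x, clause):
    # Gradient of the product above with respect to each variable.
    g = np.zeros_like(x)
    for i, s in clause:
        rest = 1.0
        for j, t in clause:
            if j == i:
                continue
            lit = x[j] if t > 0 else 1.0 - x[j]
            rest *= (1.0 - lit)
        g[i] += -s * rest  # d(1 - lit)/dx[i] = -s
    return g

def solve_sat(clauses, n, steps=5000, dt=0.05, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.2, 0.8, n)   # relaxed truth values in [0, 1]
    lam = np.ones(len(clauses))    # one multiplier per clause
    for _ in range(steps):
        dx = np.zeros(n)
        for k, c in enumerate(clauses):
            dx -= lam[k] * grad_violation(x, c)       # descent on variables
            lam[k] += dt * clause_violation(x, c)     # anti-descent on multipliers
        x = np.clip(x + dt * dx, 0.0, 1.0)
    return x > 0.5

# (x1 or x2) and (not x1 or x2): satisfiable, e.g. by x2 = True.
clauses = [[(0, +1), (1, +1)], [(0, -1), (1, +1)]]
assignment = solve_sat(clauses, 2)
```

The anti-descent term is what distinguishes this from a pure penalty method: a violated clause keeps growing its multiplier, so the system cannot settle in an assignment that leaves any clause unsatisfied.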
Year
1995
DOI
10.1109/ANNES.1995.499442
Venue
ANNES
Keywords
convergence property,hopfield type neural network,gradient descent algorithm,dynamical differential equation,satisfiability problems,anti-descent term,optimization problem,lagrange programming neural network,lagrangian method,propositional calculus,difficult combinatorial optimization problem,local minimum,calculus,satisfiability,neural network,constraint optimization,local minima,satisfiability problem,neural networks,convergence,differential equation,neural nets,numerical stability,computability,penalty function,stability,gradient descent,stability analysis,differential equations
Field
Gradient descent,Mathematical optimization,Computer science,Boolean satisfiability problem,Satisfiability,Propositional calculus,Augmented Lagrangian method,Lagrangian relaxation,Artificial neural network,Constrained optimization
DocType
Conference
ISBN
0-8186-7174-2
Citations
0
PageRank
0.34
References
3
Authors
2
Name                 Order   Citations   PageRank
Masahiro Nagamatu    1       5           4.64
T. Yanaru            2       16          6.03