Title
An efficient neural network for solving convex optimization problems with a nonlinear complementarity problem function
Abstract
In this paper, we present a one-layer recurrent neural network (NN) for solving convex optimization problems using the Mangasarian and Solodov (MS) implicit Lagrangian function. Using the Karush–Kuhn–Tucker conditions and the MS function, the NN model is derived from an unconstrained minimization problem. The proposed model has a single layer and, compared with available NNs for solving convex optimization problems, achieves better convergence time. The proposed NN model is stable in the sense of Lyapunov and globally convergent to the optimal solution of the original problem. Finally, simulation results on several numerical examples are presented to demonstrate the validity of the proposed NN model.
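As a rough illustration of the ingredients named in the abstract (not the paper's actual network model), the sketch below minimizes the Mangasarian–Solodov implicit Lagrangian of the KKT/NCP reformulation of a small convex quadratic program by integrating a simple gradient flow. The QP data Q and c, the parameter alpha, the step size, and the use of a numerical gradient are assumptions made here for illustration only.

```python
# Illustrative sketch (not the paper's exact model): a gradient-flow
# "neural network" that minimizes the Mangasarian-Solodov (MS) implicit
# Lagrangian for the KKT/NCP reformulation of a small convex QP
#
#     minimize 0.5*x'Qx + c'x   subject to  x >= 0,
#
# whose KKT conditions form the NCP:  x >= 0, F(x) = Qx + c >= 0, x'F(x) = 0.
# The dynamics dx/dt = -grad M_alpha(x) and all parameter values below
# (Q, c, alpha, step) are hypothetical choices for this illustration.
import numpy as np

Q = np.array([[4.0, 1.0],
              [1.0, 3.0]])        # positive definite -> convex objective
c = np.array([-1.0, -2.0])
alpha = 2.0                       # MS parameter, must satisfy alpha > 1


def F(x):
    """Gradient of the QP objective; the mapping of the associated NCP."""
    return Q @ x + c


def ms_lagrangian(x):
    """MS implicit Lagrangian M_alpha(x): nonnegative, zero exactly at a KKT point."""
    Fx = F(x)
    plus = lambda z: np.maximum(z, 0.0)
    return (x @ Fx
            + (np.sum(plus(x - alpha * Fx) ** 2) - np.sum(x ** 2)
               + np.sum(plus(Fx - alpha * x) ** 2) - np.sum(Fx ** 2)) / (2 * alpha))


def num_grad(f, x, h=1e-6):
    """Central-difference gradient, used here instead of the closed-form gradient."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g


# Forward-Euler integration of the gradient flow dx/dt = -grad M_alpha(x).
x = np.array([2.0, 2.0])          # arbitrary initial state
step = 0.02
for _ in range(20000):
    x = x - step * num_grad(ms_lagrangian, x)

print("x* ~", x, " M_alpha(x*) ~", ms_lagrangian(x))
```

The gradient flow is the simplest continuous-time "network" one can attach to an unconstrained merit function; for this strongly convex QP it drives M_alpha to zero, i.e., to the KKT point of the constrained problem.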
Year
2020
DOI
10.1007/s00500-019-04189-8
Venue
Soft Computing
Keywords
One-layer neural networks, Convex programming, Nonlinear complementarity problem
Field
Convergence (routing), Minimization problem, Lyapunov function, Mathematical optimization, Lagrangian, Computer science, Recurrent neural network, Artificial neural network, Convex optimization, Nonlinear complementarity problem
DocType
Journal
Volume
24
Issue
6
ISSN
1432-7643
Citations
0
PageRank
0.34
References
0
Authors
3
Name | Order | Citations | PageRank
Mahdi Ranjbar | 1 | 10 | 2.51
Effati Sohrab | 2 | 276 | 30.31
S. M. Miri | 3 | 0 | 0.68