Title
Stability Analysis of Gradient-Based Neural Networks for Optimization Problems
Abstract
The paper introduces a new approach for analyzing the stability of neural network models that does not rely on any Lyapunov function. With this approach, we investigate the stability properties of the general gradient-based neural network model for optimization problems. Our discussion covers both isolated equilibrium points and connected equilibrium sets, which may be unbounded. For a general optimization problem, if the objective function is bounded below and its gradient is Lipschitz continuous, we prove that (a) any trajectory of the gradient-based neural network converges to an equilibrium point, and (b) Lyapunov stability is equivalent to asymptotic stability for gradient-based neural networks. For a convex optimization problem, under the same assumptions, we show that any trajectory of the gradient-based neural network converges to an asymptotically stable equilibrium point of the network. For a general nonlinear objective function, we propose a refined gradient-based neural network whose trajectory from any initial point converges to an equilibrium point satisfying the second-order necessary optimality conditions. Promising simulation results for the refined gradient-based neural network on several problems are also reported.
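This record does not reproduce the network equations, but gradient-based neural networks for minimizing an objective f are commonly written as the gradient flow dx/dt = -∇f(x), whose equilibria are exactly the stationary points ∇f(x) = 0. The sketch below is a minimal illustration under that assumption, using a hypothetical gradient_flow helper with a simple forward-Euler discretization (not the paper's formulation or its refined network). It exercises the convex case described in the abstract: with f bounded below and ∇f Lipschitz, the trajectory converges to an equilibrium point from any starting point.

```python
import numpy as np

def gradient_flow(grad_f, x0, step=1e-2, tol=1e-8, max_iter=100_000):
    """Forward-Euler integration of the gradient flow dx/dt = -grad_f(x).

    Stops when ||grad_f(x)|| < tol, i.e. at an (approximate) equilibrium
    point of the network, which is a stationary point of the objective.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g  # move along the negative gradient
    return x

# Convex quadratic f(x) = 0.5 x^T A x - b^T x with A positive definite:
# the unique equilibrium solves A x = b, so the trajectory should reach it
# from any initial point, as the convex case in the abstract predicts.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = gradient_flow(lambda x: A @ x - b, x0=[5.0, -5.0])
print(x_star, np.linalg.solve(A, b))  # the two vectors should nearly coincide
```

For the discrete trajectory to track the continuous one, the step size must be small relative to the Lipschitz constant of ∇f; here 1/step comfortably exceeds the largest eigenvalue of A.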
Year
2001
DOI
10.1023/A:1011245911067
Venue
Journal of Global Optimization
Keywords
Gradient-based neural network, Equilibrium point, Equilibrium set, Asymptotic stability, Exponential stability
Field
Lyapunov function, Gradient descent, Mathematical optimization, Stochastic neural network, Equilibrium point, Recurrent neural network, Artificial neural network, Backpropagation, Optimization problem, Mathematics
DocType
Journal
Volume
19
Issue
4
ISSN
1573-2916
Citations
19
PageRank
1.46
References
11
Authors
4
Name | Order | Citations | PageRank
Qiaoming Han | 1 | 114 | 9.00
Li-Zhi Liao | 2 | 448 | 35.22
Houduo Qi | 3 | 437 | 32.91
Liqun Qi | 4 | 3155 | 284.52