Title
A new method for global stability analysis of delayed reaction–diffusion neural networks
Abstract
This paper presents improved criteria for the global exponential stability of reaction–diffusion neural networks with time-varying delays. A novel diffusion-dependent Lyapunov functional, directly linked to the diffusion terms, is proposed to analyze how each neuron's diffusivity affects the model dynamics. In the case of Dirichlet boundary conditions, an extended Wirtinger inequality is employed to exploit the stabilizing effect of the reaction–diffusion terms. Within the framework of the descriptor system approach, an augmented Lyapunov functional technique is used to reduce the conservatism of the time-delay bounds. As a result, the derived global stability criteria are less conservative than existing ones. Three numerical examples illustrate the effectiveness of the proposed methodology.
Year
2018
DOI
10.1016/j.neucom.2018.08.015
Venue
Neurocomputing
Keywords
Reaction–diffusion neural networks, Time-varying delays, Lyapunov method, Linear matrix inequality (LMI)
Field
Applied mathematics, Pattern recognition, Dirichlet boundary condition, Exponential stability, Artificial intelligence, Artificial neural network, Reaction–diffusion system, Lyapunov functional, Thermal diffusivity, Mathematics
DocType
Journal
Volume
317
ISSN
0925-2312
Citations
1
PageRank
0.35
References
23
Authors
4
Name            Order   Citations   PageRank
Xiaomei Lu      1       124         8.38
Wu-Hua Chen     2       869         58.24
Zhen Ruan       3       15          1.18
Tingwen Huang   4       5684        310.24