Title: Extended LaSalle's invariance principle for full-range cellular neural networks
Abstract
In several relevant real-time signal processing applications, a cellular neural network (CNN) is required to be convergent, that is, each solution should tend toward some equilibrium point. The paper develops a Lyapunov method, based on a generalized version of LaSalle's invariance principle, for studying convergence and stability of the differential inclusions modeling the dynamics of the full-range (FR) model of CNNs. The applicability of the method is demonstrated by obtaining a rigorous proof of convergence for symmetric FR-CNNs. The proof, which is a direct consequence of the fact that a symmetric FR-CNN admits a strict Lyapunov function, is much simpler than the corresponding proof of convergence for symmetric standard CNNs.
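The convergence property discussed in the abstract can be illustrated numerically. The sketch below uses the standard CNN model x' = -x + A sat(x) + b rather than the FR differential inclusion, with an assumed symmetric interconnection matrix `A`, bias `b`, and initial state chosen purely for illustration; it checks that a Chua-Yang-type candidate energy function is nonincreasing along the trajectory and that the state settles at an equilibrium, which is the behavior the paper's Lyapunov argument guarantees for symmetric networks.

```python
import numpy as np

def sat(x):
    # Piecewise-linear saturation, the standard CNN output function.
    return np.clip(x, -1.0, 1.0)

# Illustrative (assumed) symmetric interconnection matrix and bias.
A = np.array([[1.5, 0.2, 0.0],
              [0.2, 1.5, 0.2],
              [0.0, 0.2, 1.5]])
b = np.array([0.1, 0.0, -0.1])

def energy(x):
    # Chua-Yang-type candidate Lyapunov function; it is nonincreasing
    # along trajectories when A is symmetric.
    y = sat(x)
    return -0.5 * y @ A @ y + 0.5 * y @ y - b @ y

x = np.array([0.1, -0.2, 0.3])   # arbitrary initial state
dt, steps = 0.01, 5000           # forward-Euler integration up to t = 50
E0 = energy(x)
for _ in range(steps):
    x = x + dt * (-x + A @ sat(x) + b)

residual = np.linalg.norm(-x + A @ sat(x) + b)  # ~0 at an equilibrium
E1 = energy(x)
```

With self-feedback larger than 1 the states saturate, and after saturation the dynamics contract linearly, so the residual decays geometrically and the energy ends below its initial value.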
Year: 2009
DOI: 10.1155/2009/730968
Venue: EURASIP Journal on Advances in Signal Processing
Keywords: equilibrium point, symmetric fr-cnn, extended lasalle, symmetric fr-cnns, cellular neural network, differential inclusion, rigorous proof, direct consequence, symmetric standard cnns, full-range cellular neural network, invariance principle, generalized version, corresponding proof, stability analysis, symmetric matrices, very large scale integration, differential equations, hypercubes, invariance, convergence, stability, cellular neural networks, mathematical model, lyapunov function, set theory
Field: Differential inclusion, Convergence (routing), Lyapunov function, Applied mathematics, Invariance principle, Invariant (physics), Computer science, Symmetric matrix, Artificial intelligence, Cellular neural network, Machine learning, LaSalle's invariance principle
DocType: Journal
Volume: 2009
Issue: 1
ISSN: 1687-6180
Citations: 1
PageRank: 0.36
References: 9
Authors: 4

Name             | Order | Citations | PageRank
Mauro Di Marco   | 1     | 205       | 18.38
Mauro Forti      | 2     | 398       | 36.80
Massimo Grazzini | 3     | 131       | 11.01
Luca Pancioni    | 4     | 207       | 17.58