An additive diagonal-stability condition for absolute exponential stability of a general class of neural networks

This paper presents new results on the absolute exponential stability (AEST) of neural networks with a general class of partially Lipschitz continuous and monotone increasing activation functions, under a mild condition that the interconnection matrix T of the network system is additively diagonally stable; i.e., for any positive diagonal matrix D1, there exists a positive diagonal matrix D2 such that D2(T - D1) + (T - D1)^T D2 is negative definite. This result means that neural networks with additively diagonally stable interconnection matrices are guaranteed to be globally exponentially stable for any neuron activation functions in the above class, any constant input vectors, and any other network parameters. The additively diagonally stable interconnection matrices include diagonally semistable ones and H-matrices with nonpositive diagonal elements as special cases. The obtained AEST result substantially extends the existing ones in the literature on absolute stability (ABST) of neural networks. The additive diagonal stability condition is shown to be necessary and sufficient for AEST of neural networks with two neurons. A summary and discussion of the known results on ABST and AEST of neural networks are also given.
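The additive diagonal-stability condition is directly checkable for a concrete candidate pair (D1, D2): form S = D2(T - D1) + (T - D1)^T D2 and test whether S is negative definite. The following is an illustrative sketch for the two-neuron (2x2) case discussed in the abstract; the matrices T, D1, and D2 below are hand-picked examples, not taken from the paper.

```python
# Illustrative check (example matrices are assumptions, not from the paper):
# verify that S = D2 (T - D1) + (T - D1)^T D2 is negative definite
# for one candidate certificate D2, in the 2x2 (two-neuron) case.

def is_negative_definite_2x2(S):
    """Sylvester's criterion for a symmetric 2x2 matrix:
    negative definite iff S[0][0] < 0 and det(S) > 0."""
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    return S[0][0] < 0 and det > 0

def additive_condition_matrix(T, D1, D2):
    """Form S = D2 (T - D1) + (T - D1)^T D2 for 2x2 T,
    with D1, D2 given as diagonals [d_11, d_22]."""
    # A = T - D1 (D1 subtracted on the diagonal only)
    A = [[T[i][j] - (D1[i] if i == j else 0.0) for j in range(2)]
         for i in range(2)]
    # S[i][j] = D2[i] * A[i][j] + A[j][i] * D2[j]  (S is symmetric)
    return [[D2[i] * A[i][j] + A[j][i] * D2[j] for j in range(2)]
            for i in range(2)]

T = [[-1.0, 0.5], [0.5, -1.0]]   # example interconnection matrix
D1 = [1.0, 1.0]                  # one positive diagonal perturbation
D2 = [1.0, 1.0]                  # candidate certificate matrix

S = additive_condition_matrix(T, D1, D2)
print(S)                           # [[-4.0, 1.0], [1.0, -4.0]]
print(is_negative_definite_2x2(S)) # True for this choice
```

Note that the definition quantifies over all positive diagonal D1; the check above certifies only a single (D1, D2) pair, so establishing additive diagonal stability in general requires exhibiting a suitable D2 for every D1 (e.g., via the criteria the paper gives for diagonally semistable matrices and H-matrices with nonpositive diagonals).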

Publication Source (Journal or Book title)

IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
