Elsevier

Physica D: Nonlinear Phenomena

Volume 170, Issue 2, 1 September 2002, Pages 162-173

Harmless delays in Cohen–Grossberg neural networks

https://doi.org/10.1016/S0167-2789(02)00544-4

Abstract

Without assuming monotonicity or differentiability of the activation functions or any symmetry of the interconnections, we establish some sufficient conditions for the global asymptotic stability of a unique equilibrium of the Cohen–Grossberg neural network with multiple delays. Lyapunov functionals and functions combined with the Razumikhin technique are employed. The criteria are all independent of the magnitudes of the delays, and thus the delays under these conditions are harmless.

Introduction

Cohen and Grossberg [3] proposed and studied an artificial feedback neural network, which is described by a system of ordinary differential equations

$$\dot{x}_i = -a_i(x_i)\left[b_i(x_i) - \sum_{j=1}^{n} t_{ij}\,s_j(x_j)\right],\qquad i=1,\ldots,n,\tag{1.1}$$

where n ≥ 2 is the number of neurons in the network, x_i denotes the state variable associated with the ith neuron, a_i represents an amplification function, and b_i is an appropriately behaved function. The n×n connection matrix T = (t_ij) tells how the neurons are connected in the network, and the activation function s_j shows how the jth neuron reacts to the input. The functions a_i, b_i and s_i are subject to certain conditions to be specified later. It is seen that (1.1) includes the Hopfield neural network as a special case, which is of the form

$$C_i\dot{x}_i = -\frac{x_i}{R_i} + \sum_{j=1}^{n} t_{ij}\,s_j(x_j) + J_i,\qquad i=1,2,\ldots,n,\tag{1.2}$$

where the positive constants C_i and R_i are the neuron amplifier input capacitances and resistances, respectively; J_i is the constant input from outside the network; and x_i, s_j and T = (t_ij) are the same as in (1.1).
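As a quick numerical illustration (not part of the original paper), system (1.1) can be integrated with a forward Euler scheme. All choices below — identity amplification a_i ≡ 1, linear b_i(x) = x, tanh activations, and a weak symmetric coupling matrix — are hypothetical; with such a contractive coupling the trajectory settles at the origin.

```python
import numpy as np

def cohen_grossberg_rhs(x, T, a=lambda x: 1.0, b=lambda x: x, s=np.tanh):
    # Right-hand side of (1.1): dx_i/dt = -a_i(x_i) [ b_i(x_i) - sum_j t_ij s_j(x_j) ]
    return -a(x) * (b(x) - T @ s(x))

def euler_simulate(x0, T, h=0.01, steps=5000):
    # Forward Euler integration (illustrative only; the paper's analysis is not numerical)
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = x + h * cohen_grossberg_rhs(x, T)
    return x

T = np.array([[0.0, 0.3], [0.3, 0.0]])   # hypothetical weak coupling
x_final = euler_simulate([1.0, -0.5], T)
print(x_final)
```

With this choice the effective dynamics are ẋ = −x + T tanh(x), a contraction toward the origin, so after 50 time units the state is numerically zero.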

Due to their promising potential for the tasks of classification, associative memory and parallel computation, and their ability to solve difficult optimization problems, networks of the forms (1.1) and (1.2) have greatly attracted the attention of the scientific community. Various generalizations and modifications of (1.1) and (1.2) have also been proposed and studied, among which is the incorporation of time delay into the model. In fact, due to the finite speeds of the switching and transmission of signals in a network, time delays do exist in a working network and thus should be incorporated into the model equations of the network. For more detailed justifications for introducing delays into model equations of neural networks, see [8] and the recent book [14].

Marcus and Westervelt [8] first introduced a single delay into (1.2) and considered the following system of delay differential equations:

$$C_i\dot{x}_i = -\frac{x_i}{R_i} + \sum_{j=1}^{n} t_{ij}\,s_j(x_j(t-\tau)) + J_i,\qquad i=1,2,\ldots,n.\tag{1.3}$$

It was observed both experimentally and numerically in [8] that delay could destroy an otherwise stable network and cause sustained oscillations, and thus could be harmful. System (1.3) has also been studied by Wu [15] and Wu and Zou [16]. Gopalsamy and He [6] and van den Driessche and Zou [13] studied a further generalized version with multiple delays,

$$\dot{x}_i = -b_i x_i + \sum_{j=1}^{n} t_{ij}\,s_j(x_j(t-\tau_{ij})) + J_i,\qquad i=1,2,\ldots,n.\tag{1.4}$$

For the Cohen–Grossberg model (1.1), Ye et al. [18] also introduced delays by considering the following system of delay differential equations:

$$\dot{x}_i(t) = -a_i(x_i)\left[b_i(x_i) - \sum_{k=0}^{K}\sum_{j=1}^{n} t_{ij}^{(k)}\,s_j(x_j(t-\tau_k))\right],\qquad i=1,2,\ldots,n,\tag{1.5}$$

where the n×n matrices T_k = (t_ij^(k)) represent the interconnections associated with delay τ_k, and the delays τ_k, k = 0,1,…,K, are arranged such that 0 = τ_0 < τ_1 < ⋯ < τ_K.
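The destabilizing effect observed in [8] can be reproduced with a toy scalar instance of (1.3). The values below (C_1 = R_1 = 1, a single inhibitory self-connection t_11 = −2, s = tanh, J_1 = 0) are illustrative choices, not taken from the paper: the origin is stable for a small delay, but a sustained oscillation appears once the delay is large enough (the linearized Hopf threshold for this choice is τ ≈ 1.21).

```python
import numpy as np

def simulate_scalar_delay(t11, tau, x0=0.5, h=0.01, t_end=100.0):
    """Euler scheme for x'(t) = -x(t) + t11 * tanh(x(t - tau)),
       a scalar instance of (1.3), with constant initial history x0."""
    d = int(round(tau / h))          # delay measured in Euler steps
    buf = [x0] * (d + 1)             # buf[-1] is the current state
    trace = []
    for _ in range(int(t_end / h)):
        x = buf[-1]
        x_new = x + h * (-x + t11 * np.tanh(buf[-1 - d]))
        buf.append(x_new)
        buf = buf[-(d + 1):]
        trace.append(x_new)
    return np.array(trace)

late_small = simulate_scalar_delay(-2.0, tau=0.1)[-2000:]  # last 20 time units
late_large = simulate_scalar_delay(-2.0, tau=2.0)[-2000:]
print(np.abs(late_small).max(), np.abs(late_large).max())
# small delay: decay to 0; large delay: sustained oscillation
```

Since x = 0 is the only equilibrium of x' = −x − 2 tanh(x(t − τ)), the bounded non-decaying trajectory at τ = 2 is an oscillation, matching the observation quoted above.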

Established in the pioneering work of Cohen and Grossberg [3] and Hopfield [7] was the "global asymptotic stability" of systems (1.1) and (1.2), respectively. It was proved that, given any initial conditions, the solution of system (1.1) (or (1.2)) will converge to some equilibrium of the corresponding system. Such a "global stability" in [3], [7] was obtained by considering some potential functions under the assumption that the connection matrix T is symmetric. When it comes to the delayed systems (1.3), (1.4) and (1.5), it is natural to expect that this global stability remains if the delays are sufficiently small. Indeed, such an expectation was confirmed in [17], [18] under a certain type of symmetry requirement. When a network is designed for the purpose of associative memory, it is required that the system have a set of stable equilibria, each of which corresponds to an addressable memory. The global stability confirmed in [3], [7], [17], [18] is necessary and crucial for associative memory networks. However, an obvious drawback of the above work is the lack of a description, or even estimates, of the basin of attraction of each stable equilibrium. In other words, given a set of initial conditions, one knows that the solution will converge to some equilibrium, but does not know exactly to which equilibrium it will converge. In terms of associative memories, one does not know precisely what initial conditions are needed in order to retrieve a particular pattern stored in the network. Also, the work of Ye et al. [17], [18] cannot tell what would happen when the delays increase. We have mentioned above that a large delay could destroy the stability of an equilibrium in a network. Even if the delay does not change the stability of an equilibrium, it could affect the basin of attraction of the stable equilibrium. For such a topic, see the recent work of Pakdaman et al. [9], [10], or Wu [14].

On the other hand, in applications of neural networks to parallel computation, signal processing and other problems involving optimization, it is required that there be a well-defined computable solution for all possible initial states. In other words, it is required that the network have a unique equilibrium that is globally attractive. In fact, earlier applications of neural networks to optimization problems suffered from the existence of a complicated set of equilibria (see [12]). Thus, the global attractivity of a unique equilibrium for the model system is of great importance for both practical and theoretical purposes, and has been the major concern of many authors. We refer to Bélair [1], Cao and Wu [2], Gopalsamy and He [6], and van den Driessche and Zou [13] for the delayed Hopfield model (1.3) or (1.4). As for the delayed Cohen–Grossberg model (1.5), to the best of the authors' knowledge, no similar result has been established yet, and this fact motivates this work. Thus, the purpose of this paper is to obtain some criteria for the global attractivity of a unique equilibrium of the following system:

$$\dot{x}_i(t) = -a_i(x_i)\left[b_i(x_i) - \sum_{k=0}^{K}\sum_{j=1}^{n} t_{ij}^{(k)}\,s_j(x_j(t-\tau_k)) + J_i\right],\qquad i=1,2,\ldots,n,\tag{1.6}$$

where J_i, i = 1,2,…,n, denote the constant inputs from outside the system. We do not confine ourselves to symmetric connections and thus allow much broader connection topologies for the network. Moreover, unlike most of the previous authors, we do not assume monotonicity or differentiability of the activation functions. Our main results show that under some conditions on the connection strengths and structures, the delays can be harmless in the sense that the solutions of system (1.6) always converge to the unique equilibrium, irrespective of the magnitudes of the delays.
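The "harmless delay" conclusion can be illustrated numerically. The sketch below integrates a small instance of (1.6) with a history buffer; every parameter value (the coupling matrix, the inputs J, the constant initial history, and the two delay magnitudes) is a hypothetical choice satisfying a contraction-type smallness condition, not a value from the paper. Runs with very different delays settle at the same equilibrium.

```python
import numpy as np

def simulate_cg_delay(T_list, taus, J, phi, h=0.01, t_end=60.0,
                      a=lambda x: 1.0, b=lambda x: x, s=np.tanh):
    """Euler scheme for system (1.6) with constant initial history phi:
       x_i'(t) = -a_i(x_i) [ b_i(x_i) - sum_k sum_j t_ij^(k) s_j(x_j(t - tau_k)) + J_i ]."""
    d = [int(round(tau / h)) for tau in taus]   # delays as Euler step counts
    dmax = max(d)
    buf = [np.array(phi, float)] * (dmax + 1)   # past states; buf[-1] is current
    for _ in range(int(t_end / h)):
        x = buf[-1]
        coupling = sum(T @ s(buf[-1 - dk]) for T, dk in zip(T_list, d))
        x_new = x + h * (-a(x) * (b(x) - coupling + J))
        buf.append(x_new)
        buf = buf[-(dmax + 1):]
    return buf[-1]

T_list = [np.array([[0.0, 0.3], [0.3, 0.0]])]   # hypothetical weak coupling
J = np.array([0.5, -0.3])
x_short = simulate_cg_delay(T_list, [0.5], J, [1.0, -1.0])
x_long = simulate_cg_delay(T_list, [5.0], J, [1.0, -1.0])
print(x_short, x_long)  # both runs approach the same equilibrium
```

Both trajectories converge to the unique solution of x = T tanh(x) − J, independently of the delay magnitude, which is exactly the behavior the main results describe.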

This paper is organized as follows. In Section 2, we introduce some notation and assumptions. In Section 3, we establish our main results on the global asymptotic stability of (1.6). Some examples and numerical simulations are given in Section 4 to demonstrate the main results, and a summary is given in Section 5.


Preliminaries

Let R denote the set of real numbers and let R^n = R × R × ⋯ × R (n factors). If x ∈ R^n, then x^T = (x_1,…,x_n) denotes the transpose of x and ‖x‖_2 = (x^T x)^{1/2}. Let R^{n×n} denote the set of n×n real matrices. For Z ∈ R^{n×n}, the spectral norm of Z is defined as

$$\|Z\|_2 = \left(\max\{|\lambda| : \lambda \text{ is an eigenvalue of } Z^{T}Z\}\right)^{1/2}.$$

The initial conditions associated with (1.6) are given in the form

$$x_i(s) = \varphi_i(s) \in C([-\tau,0],\mathbb{R}),\qquad i=1,2,\ldots,n,$$

where τ = max{τ_k : 0 ≤ k ≤ K} = τ_K.
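The spectral norm defined above can be checked directly against NumPy's built-in matrix 2-norm (the matrix Z below is an arbitrary example):

```python
import numpy as np

def spectral_norm(Z):
    # ||Z||_2 = ( max { |lambda| : lambda an eigenvalue of Z^T Z } )^{1/2}
    eigvals = np.linalg.eigvalsh(Z.T @ Z)   # Z^T Z is symmetric positive semidefinite
    return float(np.sqrt(eigvals.max()))

Z = np.array([[1.0, 2.0], [0.0, 3.0]])
print(spectral_norm(Z), np.linalg.norm(Z, 2))  # the two values agree
```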

For functions ai(x) and bi(x), i=1,2,…,n, we have the following assumptions:

  • (H1)

    For each i∈{1,2,…,n}, ai is bounded,

Main results

First, we show that system (1.6) does have an equilibrium.

Proposition 3.1

If (H1), (H2) and (H4) hold, then for every input J, there exists an equilibrium for system (1.6).

Proof

Let the input J be given. By (H1), x is an equilibrium of (1.6) if and only if x = (x_1,…,x_n)^T is a solution of the system

$$b_i(x_i) - \sum_{k=0}^{K}\sum_{j=1}^{n} t_{ij}^{(k)}\,s_j(x_j) + J_i = 0,\qquad i=1,\ldots,n.\tag{3.1}$$

From (H4), we obtain

$$\left|\sum_{k=0}^{K}\sum_{j=1}^{n} t_{ij}^{(k)}\,s_j(x_j) - J_i\right| \le \sum_{k=0}^{K}\sum_{j=1}^{n} |t_{ij}^{(k)}|\,M_j + |J_i| =: P_i.$$

By (H2), we know that b_i^{-1} exists and is increasing. Now consider the equivalent (to (3.1)) system x…
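The construction in the proof can be mimicked numerically. Under the hypothetical choices b_i(x) = x (so b_i^{-1} is the identity) and s_j = tanh, with connection matrices small enough that the map x ↦ b^{-1}(∑_k T_k s(x) − J) is a contraction, simple fixed-point iteration locates the equilibrium of (3.1). This is only an illustrative sketch: the existence argument itself needs only boundedness of s_j and the properties of b_i^{-1}, not a contraction.

```python
import numpy as np

def find_equilibrium(T_list, J, s=np.tanh, b_inv=lambda u: u,
                     tol=1e-12, max_iter=10000):
    """Iterate x <- b^{-1}( sum_k T_k s(x) - J ), the fixed-point form of (3.1)."""
    x = np.zeros(len(J))
    for _ in range(max_iter):
        x_new = b_inv(sum(T @ s(x) for T in T_list) - J)
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x

# Hypothetical connection matrices (K = 1, i.e. two delay channels) and inputs
T_list = [np.array([[0.0, 0.2], [0.2, 0.0]]), np.array([[0.1, 0.0], [0.0, 0.1]])]
J = np.array([0.4, -0.2])
x_star = find_equilibrium(T_list, J)
print(x_star)  # residual of (3.1) at x_star is numerically zero
```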

Examples and simulations

In this section, we give some examples to demonstrate our criteria. Example 4.2, Example 4.3 and Example 4.4 also show that the three criteria do not include one another: none of them implies the other two.

Example 4.1

Consider the all-excitatory doubly stochastic connection matrix studied in [8], [15], i.e., a_i(u)=1, b_i(u)=u, k=0, t_ii=0, t_ij=1/(n−1) for i≠j, and s_j(u)=s(u), j=1,…,n, an increasing sigmoid function with neuron gain s′(0)=sup_{x∈R} s′(x)>0. Then ‖T_0‖_2=1, L=s′(0), and all three criteria reduce to

$$s'(0) < 1.\tag{4.1}$$

Note that (4.1) has been proved by Wu [15] to be a…
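The claim ‖T_0‖_2 = 1 for this connection matrix can be verified numerically (n = 4 is an arbitrary illustrative choice; the matrix is symmetric and doubly stochastic, so its spectral norm equals its largest eigenvalue magnitude, attained at the all-ones vector):

```python
import numpy as np

n = 4
# Example 4.1's connection matrix: t_ii = 0, t_ij = 1/(n-1) for i != j
T0 = (np.ones((n, n)) - np.eye(n)) / (n - 1)

print(T0.sum(axis=0), T0.sum(axis=1))  # every row and column sums to 1
print(np.linalg.norm(T0, 2))           # spectral norm equals 1
```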

Summary

As is widely known, time delays do exist in a neural network, due to the finite speeds of switching and transmission of signals in the network. Although delays do not change the structure of the set of equilibria of the network, they can destroy the stability of an otherwise stable equilibrium. Even if the delays do not change the stability of an equilibrium, they can affect the basin of attraction of a stable equilibrium. As far as a unique equilibrium is concerned, seeking conditions under which…

Acknowledgements

The authors would like to thank two referees for their helpful suggestions and comments.



Research supported by NCE-MITACS and NSERC of Canada.
