
Neurocomputing

Volume 142, 22 October 2014, Pages 326-334

Asymptotic almost automorphic solutions of impulsive neural network with almost automorphic coefficients

https://doi.org/10.1016/j.neucom.2014.04.028

Abstract

In this paper the existence and asymptotic stability of asymptotically almost automorphic solutions of impulsive neural networks with delay are discussed. The results are established using various fixed point theorems and a Lyapunov-like functional. To the best of our knowledge, this is the first paper to discuss such solutions for impulsive neural networks. At the end, we give a few examples to illustrate our theoretical findings; the numerical simulation results show the asymptotically almost automorphic behaviour of the solutions.

Introduction

Since the introduction of almost periodic functions by Bohr [8], there have been various important generalizations of this concept. One important generalization is the concept of asymptotically almost automorphic functions, which was introduced in the literature by N'Guérékata [18]. One of the most natural questions in the field of differential equations is whether, if the forcing function possesses a special characteristic, the solution possesses the same characteristic. This concept is no exception, and many mathematicians have applied it in the field of differential equations; for more details one may see [10], [11], [15], [28] and the references therein. It can be argued that many phenomena exhibit regular behaviour which is not exactly periodic. These kinds of phenomena can be modelled using more general notions such as almost periodic, almost automorphic, pseudo-almost automorphic and asymptotically almost automorphic functions.

Recurrent neural networks (RNNs) maintain an internal state, which allows them to exhibit dynamic temporal behaviour. The most widely used RNNs are Hopfield neural networks and cellular neural networks, and RNNs have many applications in signal processing, pattern recognition, optimization and associative memories. There are many qualitative results on neural networks; some of them are [4], [12], [14], [20] and the references therein. It is very important to study the mathematical properties of RNNs, as they give long-term predictions of the behaviour. The mathematical topics of interest are the nature of the solutions, stability, periodicity, almost periodicity, etc. It has been shown that RNNs are universal approximators of dynamical systems (see [21]). Asymptotically almost automorphic functions are more general than almost periodic and almost automorphic functions, and hence cover a wider class of behaviours. If the observed output does not show periodic, almost periodic or almost automorphic behaviour, one can check whether its behaviour is asymptotically almost automorphic.

On the other hand, impulsive differential equations combine differential equations on continuous time intervals with difference equations on a discrete set of times. They provide a realistic framework for modelling systems which undergo abrupt changes such as shocks, earthquakes and harvesting. There are several published monographs and papers on impulsive differential equations [6], [7], [17]. Impulses are sudden interruptions in a system; in the neural case, one could say that they are abrupt changes in the neural state, whose effect depends on the intensity of the change. In signal processing, faulty elements in the corresponding artificial network may produce sudden changes in the state voltages and thereby affect the normal transient behaviour in processing signals or information; for more details one can see [20] and the references therein. Neural networks have been studied extensively, but the mathematical modelling of dynamical systems with impulses is a relatively recent area of research [1], [2], [3], [21], [22]. Impulsive neural networks have been extensively studied by Stamova [26], and there are many works on the almost periodicity of impulsive neural networks (for example [22], [23], [24], [25], [26], [27] and the references therein). One cannot avoid time delay while working with many real phenomena. In neural networks, time delay refers to delays in the information processing of neurons due to various reasons; it can be caused, for example, by the finite switching speed of amplifier circuits. While it is natural to introduce a time delay, it makes the dynamics more complex: the system may lose its stability and show almost periodic, almost automorphic, pseudo-almost automorphic or asymptotically almost automorphic motion, bifurcation or chaotic behaviour. These kinds of solutions are more general and cover a bigger class of dynamics.
Motivated by Stamova [22], in this work we study the existence and stability of asymptotically almost automorphic solutions of the following impulsive differential equations:
$$
\begin{aligned}
\frac{dx_i(t)}{dt} &= \sum_{j=1}^{n} a_{ij}(t)\,x_j(t) + \sum_{j=1}^{n} \alpha_{ij}(t)\,f_j(x_j(t)) + \sum_{j=1}^{n} \beta_{ij}(t)\,f_j(x_j(t-\alpha)) + \gamma_i(t), \quad t \neq \tau_k,\ \alpha > 0,\\
\Delta x(\tau_k) &= A_k x(\tau_k) + I_k(x(\tau_k)) + \gamma_k, \quad k \in \mathbb{Z},\ t \in \mathbb{R},\\
x(\tau_k - 0) &= x(\tau_k), \qquad x(\tau_k + 0) = x(\tau_k) + \Delta x(\tau_k),\\
x(t) &= \Psi_0(t), \quad t \in [-\alpha, 0],
\end{aligned}
\tag{1.1}
$$
where $a_{ij}, \alpha_{ij}, \beta_{ij}, f_j, \gamma_i \in C(\mathbb{R},\mathbb{R})$ for $i,j = 1,2,\dots,n$. Also $A_k \in \mathbb{R}^{n \times n}$, $I_k(x) \in C(\Omega, \mathbb{R}^n)$ and $\gamma_k \in \mathbb{R}^n$, where $\Omega$ denotes a domain in $\mathbb{R}^n$. The symbol $C(X,Y)$ denotes the set of all continuous functions from $X$ to $Y$.
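To make the setting concrete, the system above can be integrated numerically with a simple forward-Euler scheme that also applies the jumps at the impulse times. All coefficients, impulse times and impulse maps below are illustrative choices of ours, not parameters from the paper.

```python
import numpy as np

# Forward-Euler sketch of the impulsive delayed network for n = 2.
n = 2
alpha = 1.0                     # delay
h = 0.01                        # step size
T = 20.0
steps = int(T / h)
delay_steps = int(alpha / h)

def A(t):                       # time-varying linear part a_ij(t) (illustrative)
    return np.array([[-2.0 + 0.1 * np.sin(t), 0.1],
                     [0.1, -2.0 + 0.1 * np.sin(np.sqrt(2.0) * t)]])

alpha_mat = 0.3 * np.ones((n, n))    # alpha_ij
beta_mat = 0.2 * np.ones((n, n))     # beta_ij
f = np.tanh                          # activation f_j
gamma = lambda t: 0.5 * np.array([np.cos(t), np.cos(np.sqrt(2.0) * t)])

tau = np.arange(1.0, T, 1.0)         # impulse times tau_k (illustrative)
Ak = -0.1 * np.eye(n)                # impulsive linear part
Ik = lambda x: 0.05 * np.tanh(x)     # impulsive nonlinearity
gamma_k = np.zeros(n)

# constant history Psi_0 on [-alpha, 0]
x = np.zeros((steps + delay_steps + 1, n))
x[:delay_steps + 1] = 0.1

next_imp = 0
for m in range(steps):
    t = m * h
    idx = m + delay_steps            # current index in the padded array
    xc = x[idx]                      # x(t)
    xd = x[idx - delay_steps]        # x(t - alpha)
    dx = A(t) @ xc + alpha_mat @ f(xc) + beta_mat @ f(xd) + gamma(t)
    x[idx + 1] = xc + h * dx
    # apply the jump when the step crosses tau_k
    if next_imp < len(tau) and t + h >= tau[next_imp]:
        x[idx + 1] += Ak @ x[idx + 1] + Ik(x[idx + 1]) + gamma_k
        next_imp += 1

print(x[-1])                         # state near t = T
```

Plotting the trajectory against time makes the near-recurrent (but not periodic) behaviour described in the paper visible.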

The organization of the paper is as follows: in Section 2, we give some basic definitions and results. In Section 3, we establish the existence of asymptotically almost automorphic solutions of Eq. (1.1). Finally, in Section 4, we give examples with numerical simulations to illustrate our analytical findings.

Section snippets

Preliminaries

The symbol $\mathbb{R}^n$ denotes the $n$-dimensional Euclidean space with norm $|x| = \max\{|x_i| : i = 1,2,\dots,n\}$. We denote by $PC(J, \mathbb{R}^n)$, $J \subset \mathbb{R}$, the space of piecewise continuous functions from $J$ to $\mathbb{R}^n$ with points of discontinuity of the first kind at $\tau_k$, at which they are left continuous.

For the reader's convenience, we define the following classes of spaces:

  • $PC_0(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n) = \{\phi \in PC(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n) : \lim_{t \to \infty} |\phi(t,x)| = 0 \text{ uniformly in } x \in \mathbb{R}^n\}$;

  • $AA(\mathbb{R}, \mathbb{R}^n) = \{\phi \in PC(\mathbb{R}, \mathbb{R}^n) : \phi \text{ is an almost automorphic function}\}$;

  • $AA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) = \{\phi \in PC(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) : \phi(\cdot, x) \text{ is almost automorphic uniformly in } x \in \mathbb{R}^n\}$;

  • $AAA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) = \{\phi \in PC(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) : \phi = g + h, \text{ where } g \in AA(\mathbb{R} \times \mathbb{R}^n, \mathbb{R}^n) \text{ and } h \in PC_0(\mathbb{R}^+ \times \mathbb{R}^n, \mathbb{R}^n)\}$.

Existence and stability

Consider the following linear system corresponding to the system (1.1):
$$
\begin{aligned}
\frac{dx(t)}{dt} &= A(t)x(t), \quad t \neq \tau_k,\\
\Delta x(\tau_k) &= A_k x(\tau_k), \quad k \in \mathbb{Z},\ t \in \mathbb{R},
\end{aligned}
\tag{3.1}
$$
where $A(t) = (a_{ij}(t))$, $i,j = 1,2,\dots,n$. In order to prove our results, we need the following assumptions:

(H1) The function $A(t) \in C(\mathbb{R}, \mathbb{R}^{n \times n})$ is asymptotically almost automorphic.

(H2) $\det(I + A_k) \neq 0$ and the sequences $\{A_k\}$ and $\{\tau_k\}$ are asymptotically almost automorphic.

If $U_k(t,s)$ is the Cauchy matrix associated with the system
$$
\frac{dx(t)}{dt} = A(t)x(t), \quad \tau_{k-1} \le t \le \tau_k,
$$
then the Cauchy matrix of the system (3.1) is of
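Although the snippet truncates here, the Cauchy matrix of an impulsive linear system can be approximated numerically: propagate the identity matrix through $dx/dt = A(t)x$ between impulses and multiply by $(I + A_k)$ at each $\tau_k$ crossed. The sketch below does this with forward Euler; the concrete $A$, $\tau_k$ and $A_k$ are illustrative choices of ours.

```python
import numpy as np

def euler_propagate(A, U, a, b, h):
    """Propagate U across [a, b] by forward Euler for dU/dt = A(t) U."""
    m = max(1, int(round((b - a) / h)))
    dt = (b - a) / m
    for i in range(m):
        U = U + dt * A(a + i * dt) @ U
    return U

def cauchy_matrix(A, tau, Ak, s, t, h=1e-3):
    """U(t, s) for s <= t, with impulse matrices Ak[k] at times tau[k]."""
    n = A(s).shape[0]
    U = np.eye(n)
    cur = s
    for k, tk in enumerate(tau):
        if s < tk <= t:
            U = euler_propagate(A, U, cur, tk, h)
            U = (np.eye(n) + Ak[k]) @ U      # jump at tau_k
            cur = tk
    return euler_propagate(A, U, cur, t, h)

# sanity check: constant diagonal A and no impulses give U(1, 0) ~ exp(A)
A = lambda t: np.diag([-1.0, -2.0])
U = cauchy_matrix(A, tau=[], Ak=[], s=0.0, t=1.0)
print(U.diagonal())                          # close to (e^{-1}, e^{-2})
```

With one impulse, say $\tau_1 = 0.5$ and $A_1 = -\tfrac{1}{2}I$, the result picks up the expected factor $(I + A_1) = \tfrac{1}{2}I$.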

Examples

The classical model of the Hopfield neural network is the following:
$$
\begin{aligned}
\frac{dx_i(t)}{dt} &= -a_i(t)\,x_i(t) + \sum_{j=1}^{n} \alpha_{ij} f_j(x_j(t)) + \sum_{j=1}^{n} \beta_{ij} f_j(x_j(t-\alpha)) + \gamma_i(t), \quad t \neq \tau_k,\ \alpha > 0,\\
\Delta x(\tau_k) &= A_k x(\tau_k) + I_k(x(\tau_k)) + \gamma_k, \quad k \in \mathbb{Z},\ t \in \mathbb{R},\\
x(\tau_k - 0) &= x(\tau_k), \qquad x(\tau_k + 0) = x(\tau_k) + \Delta x(\tau_k),\\
x(t) &= \phi_0(t), \quad t \in [-\alpha, 0],
\end{aligned}
$$
where $a_i, f_j, \gamma_i \in C(\mathbb{R},\mathbb{R})$ and $\alpha_{ij}, \beta_{ij} \in \mathbb{R}$ for $i,j = 1,2,\dots,n$. Also $A_k \in \mathbb{R}^{n \times n}$, $I_k(x) \in C(\Omega, \mathbb{R}^n)$ and $\gamma_k \in \mathbb{R}^n$, where $\Omega$ denotes a domain in $\mathbb{R}^n$. In this case our matrix $A(t)$ is a diagonal matrix with diagonal entries $-a_1(t),\dots,-a_n(t)$. We assume that each $a_i(t)$ is asymptotically almost automorphic
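As a concrete illustration of an asymptotically almost automorphic coefficient $a_i(t)$, one can take an almost automorphic part plus a part that vanishes at infinity. The specific functions below are our own illustrative choices, not the paper's examples: $g$ is the classical almost automorphic (but not almost periodic) function $\sin\!\big(1/(2+\cos t+\cos\sqrt{2}\,t)\big)$, and $\varphi(t)=e^{-t/2}$ belongs to $PC_0$.

```python
import numpy as np

def g(t):
    # almost automorphic but not almost periodic (denominator never vanishes)
    return np.sin(1.0 / (2.0 + np.cos(t) + np.cos(np.sqrt(2.0) * t)))

def phi(t):
    # decays to zero, hence lies in PC_0
    return np.exp(-0.5 * t)

def a(t):
    # asymptotically almost automorphic: a = g + phi
    return g(t) + phi(t)

t = np.linspace(0.0, 50.0, 5001)
tail = t[t > 40.0]
# on the tail, a(t) is numerically indistinguishable from its
# almost automorphic part g(t)
print(np.max(np.abs(a(tail) - g(tail))))
```

Plotting $a(t)$ over a long window shows the recurrent-but-not-periodic shape that the paper's simulations exhibit.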

Discussion

It is well known that periodicity is a very well studied behaviour of many physical and natural systems. In the recent past, many mathematicians and scientists have argued that more general classes of functions are more suitable for explaining complicated processes which show near-periodic behaviour. This kind of behaviour of a physical system is called almost periodic; hence almost periodic functions are the most suitable to explain these kinds of phenomena. The asymptotically almost

Acknowledgement

We are thankful to the anonymous reviewers for their constructive comments and suggestions, which helped us improve the manuscript. The work of the first author is partially supported by NBHM Grant “SRIC/IITMANDI/2013/NBHM/SYA/45/02”.


References (28)

  • S. Abbas, Pseudo almost periodic sequence solutions of discrete time cellular neural networks, Nonlinear Anal. Model. Control (2009).

  • B. Ammar et al., Existence and uniqueness of pseudo almost-periodic solutions of recurrent neural networks with time-varying coefficients and mixed delays, IEEE Trans. Neural Netw. Learn. Syst. (2012).

  • D. Araya, R. Castro, C. Lizama, Almost automorphic solutions of difference equations, Adv. Differ. Equ. (2009) Art. ID...

  • D.D. Bainov et al., Systems with Impulsive Effects (1989).

    Syed Abbas received his M.Sc. and Ph.D. degrees from Indian Institute of Technology Kanpur, India, in 2004 and 2009 respectively. He is currently working as an assistant professor at the School of Basic Sciences, Indian Institute of Technology Mandi, India, since August 2010. He worked as a research associate at the University of Fribourg, Switzerland, during July–September 2009 and postdoctoral fellow at the University of Bologna, Italy, during June–August 2010 and May–July 2011. From November 2012 to December 2012, he was a visiting guest scientist at the Department of Mathematics, TU Dresden, Germany. His main research interests are abstract, delay and fractional differential equations, almost periodic solutions, neural networks, and ecological modelling.

    Lakshman Mahto received his M.Sc. degree in Mathematics from the Department of Mathematics, Ranchi University, Ranchi, Jharkhand, India, in 2009. He is currently pursuing his Ph.D. degree in Mathematics at Indian Institute of Technology Mandi, India. His current research interests include stability theory of neural networks and fractional differential equations.

    Mokhtar Hafayed was born in Biskra, Algeria. He is currently working as an assistant professor in the Laboratory of Applied Mathematics, Biskra University, Algeria. His main research interests are differential equations, stochastic control, the maximum principle, forward–backward stochastic differential equations and mathematical finance.

    Adel M. Alimi was born in Sfax, Tunisia, in 1966. He graduated in Electrical Engineering in 1990, obtained a Ph.D. and then an HDR both in Electrical & Computer Engineering in 1995 and 2000 respectively. He is now a professor in Electrical & Computer Engineering at the University of Sfax. His research interest includes applications of intelligent methods (neural networks, fuzzy logic, evolutionary algorithms) to pattern recognition, robotic systems, vision systems, and industrial processes. He focuses his research on intelligent pattern recognition, learning, analysis and intelligent control of large scale complex systems.

    He is an associate editor and a member of the editorial board of many international scientific journals (e.g. “Pattern Recognition Letters”, “Neurocomputing”, “Neural Processing Letters”, “International Journal of Image and Graphics”, “Neural Computing and Applications”, “International Journal of Robotics and Automation”, “International Journal of Systems Science”, etc.). He is an IEEE senior member and a member of IAPR, INNS and PRS. He is the 2009–2010 IEEE Tunisia Section Treasurer, the 2009–2010 IEEE Computational Intelligence Society Tunisia Chapter Chair, the 2011 IEEE Sfax Subsection Chair, the 2010–2011 IEEE Computer Society Tunisia Chair, the 2011 IEEE Systems, Man, and Cybernetics Tunisia Chapter Chair, the SMCS corresponding member of the IEEE Committee on Earth Observation, and the IEEE Counselor of the ENIS Student Branch.
