Controllability of nonlinear stochastic neutral fractional dynamical systems

Abstract. In this paper, we derive a nonlinear integral equation equivalent to the stochastic neutral fractional system with a bounded operator. Using this integral equation, we obtain sufficient conditions ensuring the complete controllability of stochastic fractional neutral systems driven by Wiener and Lévy noise. Banach's fixed point theorem is used to establish the results. Examples are provided to illustrate the theory.


Introduction
Controllability is a qualitative property of dynamical systems and is of particular importance in control theory. The theory of controllability originates from the famous work of Kalman in 1960, where the concept of controllability was defined for finite dimensional deterministic linear systems. The natural extension of this concept to infinite dimensional systems has been studied by many authors. A discussion of the controllability concepts for infinite dimensional systems can be found in [2,6,8].
In recent years, fractional differential equations (FDEs) have attracted considerable interest due to their ability to model complex phenomena by capturing nonlocal relations in space and time. At the same time, the fluctuations found in nature can be captured only by adding random elements to the differential equations, which then become stochastic differential equations (SDEs). Also, in many applications, one assumes that the system under consideration is governed by a principle of causality; that is, the future state of the system is independent of the past states and is determined solely by the present. Under closer scrutiny, however, it becomes apparent that the principle of causality is often only a first approximation to the true situation and that a more realistic model would include some of the past states of the system. There are also a number of applications in which the delayed argument occurs in the derivative of the state variable as well as in the independent variable, the so-called neutral differential difference equations. Such problems are more difficult to motivate but often arise in the study of two or more simple oscillatory systems with interconnections between them. In some cases, the connection can be replaced by differential equations involving delays in the highest order derivatives. Neutral differential equations are encountered in the description of various physical scenarios such as the lossless transmission connection, the stunted transmission connection [7], vibrating masses attached to an elastic bar [11], and the collision problem in electrodynamics [10]. The controllability of such systems is studied in [12] and the references therein. Therefore, the investigation of fractional neutral differential equations of stochastic nature attracts great attention, especially with regard to controllability.
The controllability of fractional and of stochastic dynamical systems has been studied by many authors separately. The controllability of linear and nonlinear fractional dynamical systems is studied in [4] and the references therein. The direct extension of the controllability concepts from deterministic to stochastic control systems is not meaningful; these concepts must first be suitably weakened in order to extend them to stochastic control systems. For the controllability of SDEs, one can refer to [3,5,18,19,23]. It is worth pointing out that most works on controllability of stochastic systems focus on the case of SDEs driven by a Brownian motion [9]. Unfortunately, the fluctuations in financial markets, sudden changes in the environment, and many other real systems cannot be described by Brownian motion, which leads to the use of Lévy noise to model such discontinuous systems. Lévy processes have stationary and independent increments; their sample paths are right continuous with a countable number of discontinuities at random times; and they form special classes of semimartingales and Markov processes. Along with these advantages, Lévy processes have applications in diverse fields such as mathematical finance, financial economics, stochastic control, and quantum field theory. These facts make the study of SDEs with Lévy noise important in spite of its increased mathematical complexity. A detailed study of Lévy processes in finite and infinite dimensions can be found in [1,20] and the references therein.
In [4], controllability of the linear fractional system

  C D^α x(t) = Ax(t) + Bu(t), t ∈ J, x(0) = x_0,

where 1/2 < α ≤ 1 and A and B are bounded linear operators, is investigated. The controllability of the stochastic counterpart of the above fractional integro-differential system is studied in [15]. In this paper, our aim is to extend these results to a stochastic neutral fractional dynamical system driven by Wiener and Lévy noise. The Lévy-Itô decomposition of an arbitrary Lévy process into Brownian and Poisson parts is used to study the stochastic fractional system with Lévy noise. Examples are provided to support the developed theory.

Preliminaries
Let X, U, and K be separable Hilbert spaces; for convenience, we use the same notation ‖·‖ to denote their norms. L(X, U) is the space of all bounded linear operators from X to U, L_p(X) is the Lebesgue space of p-integrable functions on X, B(X) is the Borel σ-algebra of subsets of X, and J denotes the interval [0, T].
We assume that a filtered probability space (Ω, F, {F_t}_{t≥0}, P) with probability measure P on Ω satisfies the usual hypotheses: (i) F_0 contains all A ∈ F such that P(A) = 0; (ii) F_t = F_{t+} for all t ∈ J, where F_{t+} is the intersection of all F_s, s > t, i.e., the filtration is right continuous.
Let us consider the following space settings:
• H_2 := L_2^F(J, X) is the Banach space of all square integrable and F_t-measurable processes with values in X, identifying processes that are modifications of each other, endowed with the norm ‖x‖²_{H_2} := sup_{t∈J} E‖x(t)‖², where E denotes expectation with respect to P.
• U_ad := L_2^F(J, U) is the Hilbert space of all square integrable and F_t-measurable processes with values in U.
• H_2^0 := L_2(Ω, F_0, X) is the Hilbert space of all F_0-measurable square integrable random variables with values in X.
Let us recall some basic definitions from fractional calculus. Let α, β > 0 with n − 1 < α, β < n, n ∈ N.

Definition 1. (See [13].) The Riemann-Liouville fractional integral of order α of a function f is defined as

  (I^α f)(t) = (1/Γ(α)) ∫_0^t (t − s)^{α−1} f(s) ds,

where the function f(t) has absolutely continuous derivatives up to order n − 1.
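As a small numerical illustration (ours, not from the paper), the Riemann-Liouville integral above can be approximated by midpoint quadrature, which avoids evaluating the weakly singular kernel at s = t. The function name `rl_integral` and the step count are our own choices; for f ≡ 1 the exact value is t^α/Γ(α + 1).

```python
import math

def rl_integral(f, t, alpha, n=4000):
    """Midpoint-rule approximation of the Riemann-Liouville integral
    (I^alpha f)(t) = 1/Gamma(alpha) * int_0^t (t - s)^(alpha - 1) f(s) ds.
    Midpoints keep the singular kernel (t - s)^(alpha - 1) finite."""
    h = t / n
    total = 0.0
    for k in range(n):
        s = (k + 0.5) * h            # midpoint of the k-th subinterval
        total += (t - s) ** (alpha - 1.0) * f(s) * h
    return total / math.gamma(alpha)

# For f(s) = 1 the exact value is (I^alpha 1)(t) = t^alpha / Gamma(alpha + 1).
approx = rl_integral(lambda s: 1.0, t=1.0, alpha=0.5)
exact = 1.0 / math.gamma(1.5)
```

The quadrature error comes mostly from the integrable singularity at s = t and shrinks as n grows.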

Controllability results for systems with Wiener noise
In this section, we obtain sufficient conditions for the controllability of the nonlinear stochastic fractional neutral differential system

  C D^α [x(t) − g(t, x(t))] = Ax(t) + Bu(t) + f(t, x(t)) + σ(t, x(t)) dW(t)/dt, t ∈ J, x(0) = x_0,    (2)

where 0 < α ≤ 1, α ≠ 1/2, A : X → X is a bounded linear operator, W(t) is a K-valued Wiener process with positive symmetric trace class covariance operator, σ : J × X → L_2^0(K, X) (where L_2^0 is the space of Hilbert-Schmidt operators [22]), the functions f, g : J × X → X are continuous with g continuously differentiable, u ∈ U_ad, a Hilbert space of admissible control functions, and B : U → X is a bounded linear operator.
Lemma 1. (See [14].) Suppose that A is a bounded linear operator defined on a Banach space, and assume that ‖A‖ < 1. Then (I − A)^{-1} is linear, bounded, and given by the Neumann series

  (I − A)^{-1} = Σ_{k=0}^∞ A^k.

The convergence of the above series is in the operator norm, and ‖(I − A)^{-1}‖ ≤ (1 − ‖A‖)^{-1}.

Let us assume the following hypothesis:

(H1) ‖A‖ T^α < Γ(α + 1).

Let x ∈ H_2. Then by (H1) we have

  ‖(I^α A x)(t)‖ ≤ (‖A‖ T^α / Γ(α + 1)) sup_{s∈J} ‖x(s)‖,

Nonlinear Anal. Model. Control, 22(5):702-718

which implies that ‖I^α A‖ < 1. Hence, by Lemma 1 we conclude that (I − I^α A)^{-1} is a bounded linear operator satisfying

  ‖(I − I^α A)^{-1}‖ ≤ (1 − ‖A‖ T^α / Γ(α + 1))^{-1}.

On the other hand, taking I^α on both sides of (2) and using Lemma 1 together with the fact that I^α commutes with A, we obtain

  x(t) = (I − I^α A)^{-1} [ x_0 − g(0, x_0) + g(t, x(t))
       + (1/Γ(α)) ∫_0^t (t − s)^{α−1} (Bu(s) + f(s, x(s))) ds
       + (1/Γ(α)) ∫_0^t (t − s)^{α−1} σ(s, x(s)) dW(s) ].    (3)

Thus, the solution of (2) is the solution of the nonlinear integral equation (3).
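A quick numerical sketch of Lemma 1 (our own illustration, not from the paper): for a 2×2 matrix with norm below 1, the partial sums of the Neumann series converge to the true inverse of I − A. The helper names are hypothetical.

```python
def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def neumann_inverse(A, terms=60):
    """Approximate (I - A)^(-1) by the partial Neumann series sum_k A^k,
    valid when ||A|| < 1 (Lemma 1)."""
    S = [[1.0, 0.0], [0.0, 1.0]]   # running partial sum, starts at A^0 = I
    P = [[1.0, 0.0], [0.0, 1.0]]   # current power A^k
    for _ in range(terms):
        P = matmul(P, A)
        S = [[S[i][j] + P[i][j] for j in range(2)] for i in range(2)]
    return S

A = [[0.2, 0.1], [0.0, 0.3]]       # small norm, so the series converges
S = neumann_inverse(A)
# (I - A) * S should be (approximately) the identity
I_minus_A = [[0.8, -0.1], [0.0, 0.7]]
check = matmul(I_minus_A, S)
```

Since (I − A)·Σ_{k≤n} A^k = I − A^{n+1}, the residual decays geometrically in the number of terms.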
Similarly to the conventional controllability concept, the controllability of the stochastic fractional dynamical system is defined as follows: the set of all states attainable from x_0 in time t > 0 is given by the set

  R_t(x_0) = {x(t): u ∈ U_ad},

where x(t) is given in (3).

Definition 3. (See [18].) The stochastic fractional system (2) is said to be completely controllable on the interval J if for every x_1 ∈ Y := L_2(Ω, F_T, X), there exists a control u ∈ U_ad such that the solution x(t) given in (3) satisfies x(T) = x_1.
In other words, R_T(x_0) = Y.
Define the operator L_T : U_ad → X as (see [18])

  L_T u = (I − I^α A)^{-1} (1/Γ(α)) ∫_0^T (T − s)^{α−1} B u(s) ds.

Clearly, the adjoint operator L_T^* of L_T, satisfying L_T^* ∈ L(X, U_ad), is obtained as

  (L_T^* z)(t) = (1/Γ(α)) (T − t)^{α−1} B^* [(I − I^α A)^{-1}]^* E[z | F_t].

Definition 4. (See [21].) The controllability Grammian operator W_T : X → X is defined as

  W_T = L_T L_T^*,

where * denotes the adjoint operator.
The corresponding deterministic operators Γ_s^T : X → X (we write Γ_T := Γ_0^T) are given by

  Γ_s^T z = (1/Γ(α)²) ∫_s^T (T − t)^{2(α−1)} (I − I^α A)^{-1} B B^* [(I − I^α A)^{-1}]^* z dt.

The linear system corresponding to (2) is

  C D^α [x(t) − g(t)] = Ax(t) + Bu(t) + f(t) + σ(t) dW(t)/dt, t ∈ J, x(0) = x_0,    (4)

where the functions f, g : J → X are continuous and g is continuously differentiable.
Theorem 1. The fractional system (4) is controllable on J if and only if, for some γ > 0,

  E⟨W_T z, z⟩ ≥ γ E‖z‖² for all z ∈ Y.

The proof is similar to that of the integer order case given in [17], provided that the relation between W_T and Γ_s^T [17, Lemma 5] remains the same for the fractional order case. The following lemma asserts that this relation indeed carries over to fractional order systems.
Lemma 2. For every z ∈ Y, there exists a process φ(·) ∈ L_2^F(J, L(K, X)) such that:

(i) z = Ez + ∫_0^T φ(s) dW(s);
(ii) W_T z = Γ_0^T Ez + ∫_0^T Γ_s^T φ(s) dW(s).

Proof. Part (i) can be obtained as in [17]. Now, we prove (ii). Let z ∈ L_2(Ω, F_T, X). Then, from the first equality, we have

  W_T z = W_T Ez + W_T ∫_0^T φ(s) dW(s).

Now, the definition of the operator W_T and the stochastic Fubini theorem lead to the desired representation. This completes the proof of the lemma.
For simplicity, we abbreviate the constants appearing in the estimates below. Let us further assume the following conditions:

(H2) g : J × X → X is continuous, and there exist constants N_1, N_2 > 0 such that

  ‖g(t, x) − g(t, y)‖² ≤ N_1 ‖x − y‖²,  ‖g(t, x)‖² ≤ N_2 (1 + ‖x‖²);

(H3) f : J × X → X is continuous, and there exist constants N_3, N_4 > 0 such that

  ‖f(t, x) − f(t, y)‖² ≤ N_3 ‖x − y‖²,  ‖f(t, x)‖² ≤ N_4 (1 + ‖x‖²);

(H4) σ : J × X → L_2^0 is continuous, and there exist constants N_5, N_6 > 0 such that

  ‖σ(t, x) − σ(t, y)‖² ≤ N_5 ‖x − y‖²,  ‖σ(t, x)‖² ≤ N_6 (1 + ‖x‖²);

(H5) the constant ρ_1, built from the constants above and T, satisfies 0 ≤ ρ_1 < 1.

Theorem 2. If hypotheses (H1)-(H5) are satisfied and the linear stochastic fractional neutral system corresponding to (2) is completely controllable, then the nonlinear stochastic fractional neutral system (2) is completely controllable.
Proof. Let x_1 be an arbitrary random variable in Y. Define the operator Φ on H_2 as the right-hand side of the integral equation (3) with the control u substituted. Since the linear system corresponding to the nonlinear system (2) is controllable, W_T is invertible (see [15]). Define the control variable u through L_T^* and W_T^{-1} as in [18]. We now show that Φ has a fixed point; this fixed point is then a solution of the control problem. Clearly, (Φx)(T) = x_1, which means that the control u steers the nonlinear system from the initial state x_0 to x_1 in time T, provided we can obtain a fixed point of the nonlinear operator Φ. First, we show that Φ maps H_2 into itself. From the assumptions and (5), it follows that there exists a constant C_1 > 0 such that

  sup_{t∈J} E‖(Φx)(t)‖² ≤ C_1 (1 + sup_{t∈J} E‖x(t)‖²).

Thus, Φ maps H_2 into itself. Now, for x, y ∈ H_2, we estimate E‖(Φx)(t) − (Φy)(t)‖². Using (H5), we conclude that Φ is a contraction mapping, and hence there exists a unique fixed point x ∈ H_2 of Φ. This fixed point satisfies x(T) = x_1 for the arbitrary x_1 ∈ Y. Therefore, system (2) is completely controllable on J.
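The fixed point argument above can be visualized numerically (our own sketch, unrelated to the specific operator Φ): for any contraction with constant ρ < 1, successive approximations converge geometrically, which is exactly how Banach's theorem produces the solution of the control problem.

```python
def fixed_point(phi, x0, tol=1e-12, max_iter=1000):
    """Successive approximations x_{n+1} = phi(x_n) (Banach fixed point
    theorem). Converges whenever phi is a contraction on a complete space."""
    x = x0
    for _ in range(max_iter):
        x_new = phi(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# A scalar contraction with constant rho = 0.5: phi(x) = 0.5*x + 1.
# Its unique fixed point is x* = 2, since x* = 0.5*x* + 1.
x_star = fixed_point(lambda x: 0.5 * x + 1.0, x0=0.0)
```

The error contracts by the factor ρ at every step, so roughly log(tol)/log(ρ) iterations suffice.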
Controllability results for systems with Lévy noise

Consider the nonlinear stochastic neutral fractional differential system driven by Lévy noise of the form

  C D^α [x(t) − g(t, x(t))] = Ax(t) + Bu(t) + f(t, x(t)) + σ(t, x(t)) dW(t)/dt + ∫_Z h(t, x(t), z) dÑ(t, z)/dt, t ∈ J, x(0) = x_0.    (6)

Here dÑ(t, z) = Ñ(dt, dz) = N(dt, dz) − ν(dz) dt is a compensated Poisson random measure, where N(dt, dz) denotes the Poisson random measure associated with a Poisson point process on Z ∈ B(X), ν(dz) is a σ-finite Lévy measure on (Z, B(Z)), and h : J × X × Z → X. If hypothesis (H1) is satisfied, then by Lemma 1 the solution of system (6) coincides with the solution of the following nonlinear integral equation, which is obtained similarly to (3):

  x(t) = (I − I^α A)^{-1} [ x_0 − g(0, x_0) + g(t, x(t))
       + (1/Γ(α)) ∫_0^t (t − s)^{α−1} (Bu(s) + f(s, x(s))) ds
       + (1/Γ(α)) ∫_0^t (t − s)^{α−1} σ(s, x(s)) dW(s)
       + (1/Γ(α)) ∫_0^t ∫_Z (t − s)^{α−1} h(s, x(s), z) Ñ(ds, dz) ].    (7)

We assume the following conditions:

(H6) h : J × X × Z → X is continuous, and there exist constants N_7, N_8 > 0 such that

  ∫_Z ‖h(t, x, z) − h(t, y, z)‖² ν(dz) ≤ N_7 ‖x − y‖²,  ∫_Z ‖h(t, x, z)‖² ν(dz) ≤ N_8 (1 + ‖x‖²);

(H7) the constant ρ_2, built from the constants above and T, satisfies 0 ≤ ρ_2 < 1.

Theorem 3. If hypotheses (H1)-(H4), (H6), and (H7) are satisfied and the linear system corresponding to (6) is completely controllable, then the nonlinear stochastic fractional neutral system (6) is completely controllable.
Proof. Let x_1 be an arbitrary random variable in Y. Define the operator Φ on H_2 as the right-hand side of the integral equation (7) with the control u substituted. Since the linear system corresponding to the nonlinear system (6) is completely controllable, W_T is invertible [16]. Define the control variable u through L_T^* and W_T^{-1} as in [18]. We now show that Φ has a fixed point; this fixed point is then a solution of the control problem. Clearly, (Φx)(T) = x_1, which means that the control u steers the nonlinear system from the initial state x_0 to x_1 in time T, provided we can obtain a fixed point of the nonlinear operator Φ. First, we show that Φ maps H_2 into itself. From the assumptions and (8), it follows that there exists a constant C_2 > 0 such that

  sup_{t∈J} E‖(Φx)(t)‖² ≤ C_2 (1 + sup_{t∈J} E‖x(t)‖²).

Thus, Φ maps H_2 into itself. Now, for x, y ∈ H_2, we estimate E‖(Φx)(t) − (Φy)(t)‖². Using (H7), we conclude that Φ is a contraction mapping, and hence there exists a unique fixed point x ∈ H_2 of Φ. This fixed point satisfies x(T) = x_1 for the arbitrary x_1 ∈ Y. Therefore, system (6) is completely controllable on J.
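As a small simulation sketch (our own illustration): the compensated Poisson process Ñ(t) = N(t) − λt underlying the jump part of (6) is a mean-zero martingale, so its sample average across independent paths should be close to zero. The intensity λ, the path count, and the seed are arbitrary choices.

```python
import random

random.seed(0)

def compensated_poisson(t, lam):
    """Sample N(t) - lam*t, where N is a Poisson process with intensity lam.
    N(t) is sampled by counting exponential interarrival times up to t."""
    count, s = 0, 0.0
    while True:
        s += random.expovariate(lam)   # time of the next jump
        if s > t:
            break
        count += 1
    return count - lam * t

# The compensated process has mean zero: its sample average over many
# independent paths should be near 0 (standard error ~ sqrt(lam*t/n)).
samples = [compensated_poisson(t=1.0, lam=2.0) for _ in range(20000)]
mean = sum(samples) / len(samples)
```

This is exactly the centering ν(dz) dt performed by the compensator in Ñ(dt, dz) = N(dt, dz) − ν(dz) dt.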

Example
In this section, we provide examples to support the theory developed in the previous sections.
To show that the nonlinear system (10) is controllable, it is enough to check that the hypotheses of Theorem 3 are satisfied. We first verify that the linear system corresponding to (10) is controllable: the controllability condition of Theorem 1 holds for every 0 < γ ≤ 0.7869. We see that g(t, x(t)), σ(t, x(t)), and h(t, x(t), z) are Lipschitz continuous with Lipschitz constant 1/225. We also obtain the value of ρ_2 in hypothesis (H7) to be ρ_2 = 0.6295 < 1. All the hypotheses of Theorem 3 are thus verified, and hence, system (10) is controllable.
The linear system is shown to be controllable by verifying that the controllability Grammian operator is invertible.