Robust strong duality for nonconvex optimization problem under data uncertainty in constraint

This paper deals with robust strong duality for nonconvex optimization problems with data uncertainty in the constraints. A new weak conjugate function, which is abstract convex, is introduced, and three kinds of robust dual problems are constructed for the primal optimization problem by employing this weak conjugate function: the robust augmented Lagrange dual, the robust weak Fenchel dual and the robust weak Fenchel-Lagrange dual problem. Characterizations of inequality (1.1) in terms of robust abstract perturbational weak conjugate duality are established by using abstract convexity. The results are used to obtain robust strong duality between the nonconvex uncertain optimization problem and its robust dual problems mentioned above; optimality conditions for this nonconvex uncertain optimization problem are also investigated.


Introduction
Robust optimization problems [4,5,7,8,22,23,29-31] and robust duality theory [3,6,10-13,15,16,18,19,28] have attracted much attention from mathematical researchers. Many works in this area considered convex robust optimization problems: in [6,15] robust Lagrangian strong duality was established for convex optimization, and in [16] a robust Lagrangian strong duality theorem was given whenever the Lagrangian function is convex. Moreover, duality theory based on the conjugate function plays an important role in optimization. In convex analysis, the dual problem is constructed in terms of conjugate functions by using the well-known Legendre-Fenchel transform.
Robust classical conjugate duality was presented for convex optimization problems in [18]. Furthermore, in [13], characterizations of the inequality (1.1) below, in terms of robust abstract perturbational duality, were established, where X, Y are locally convex Hausdorff topological vector spaces, V ≠ ∅ is an uncertainty set, F_v : X × Y → R̄ = R ∪ {±∞} for each v ∈ V, and l : X → R̄ is a lower semicontinuous proper convex function. The results were then applied to robust DC and robust convex optimization problems, and strong Fenchel duality and strong Lagrangian duality for these classes of robust problems were also obtained. It is well known that dual problems constructed by using general augmented Lagrangian functions or weak conjugate functions, together with strong duality conditions for nonconvex optimization problems, were comprehensively studied in [1,2,13,14,17,20,21,25,27,33]. In particular, the conjugate function theory developed by Azimov and Gasimov in [1] uses superlinear functions of the form ⟨x*, x⟩ − c‖x‖ instead of the linear functions ⟨x*, x⟩ used in convex analysis. They extended the usual definition of the subdifferential using this class of functions, and established duality relations in terms of the so-called weak subdifferentiability of the perturbation function associated with the problem under consideration. By using the weak conjugate function and weak subdifferential given in [1], Küçük et al. in [17] constructed the weak Fenchel conjugate dual problem and the weak Fenchel-Lagrange conjugate dual problem, and presented necessary and sufficient conditions for strong duality between these dual problems and the nonconvex scalar optimization problem; in [33], a duality scheme and strong duality theorems for nonconvex optimization problems were presented, based on the weak conjugate function and the weak subdifferential concept given in [1].
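For the reader's convenience, the constructions of [1] can be recalled as follows (a summary stated in normed-space form, under the convention σ(x) = ‖x‖ of Remark 2.2; the precise setting is in [1]):

```latex
% Weak conjugate and weak subdifferential of Azimov and Gasimov [1],
% built from the superlinear supports x -> <x*, x> - c||x||:
\[
  f^{w}(x^{*},c)=\sup_{x\in X}\bigl\{\langle x^{*},x\rangle-c\|x\|-f(x)\bigr\},
  \qquad (x^{*},c)\in X^{*}\times\mathbb{R}_{+},
\]
\[
  \partial^{w}f(\bar{x})=\bigl\{(x^{*},c)\in X^{*}\times\mathbb{R}_{+}:
  \langle x^{*},x-\bar{x}\rangle-c\|x-\bar{x}\|\le f(x)-f(\bar{x})
  \ \text{for all }x\in X\bigr\}.
\]
```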
Nevertheless, there are few duality results on nonconvex robust optimization problems in the literature, since it is very hard not only to verify the zero duality gap conditions formulated in terms of perturbation and/or dualizing parameterization functions, but also to derive conditions formulated in terms of the objective and constraint functions. Motivated by [13,17,33], the aim of this paper is to formulate robust dual problems by using the weak conjugate function we introduce (see Definition 2.1) and to establish robust strong duality results for the nonconvex uncertain optimization problem. A characterization of the general inequality (1.1) above with uncertainty is established in terms of robust perturbational weak conjugate duality, where we only assume that the right-hand function l in (1.1) is abstract convex [9,24], which covers very broad classes of nonconvex functions. The results are then used as key tools to obtain strong duality for the robust augmented Lagrange dual (RD_L), the robust weak Fenchel dual (RD^w_F) and the robust weak Fenchel-Lagrange dual (RD^w_FL) problems, all defined by means of the weak conjugate function, and are also applied to investigate optimality conditions for the nonconvex robust optimization problem.
The paper is organized as follows. In Section 2, we recall some notation and introduce preliminary results which will be used in the rest of the paper. In Section 3, we construct three types of robust dual problems for the primal optimization problem by using the weak conjugate function and obtain strong duality for each of them by establishing the inequality (1.1) via robust perturbational weak conjugate duality. In Section 4, we investigate the relations among the optimal objective values of (RD_L), (RD^w_F), (RD^w_FL) and the robust counterpart (RP) of (UP). Finally, in Section 5, we present necessary and sufficient optimality conditions for (RD_L), (RD^w_F), (RD^w_FL) and (RP).

Preliminary results
In this section, we introduce the definitions of the weak conjugate function, the weak biconjugate function and weak subdifferentials, together with some basic theorems and lemmas about these notions.
Throughout this paper, let X, Y be two locally convex vector spaces with topological dual spaces X* and Y*, endowed with the weak* topologies w(X*, X) and w(Y*, Y), respectively. Let D ⊂ Y be a nonempty closed convex cone; the dual cone of D is defined by D* = {y* ∈ Y* : ⟨y*, y⟩ ≥ 0 for all y ∈ D}, where we use the notation ⟨·, ·⟩ for the value of the continuous linear functional y* ∈ Y* at y ∈ Y. We use the notation R+ = {x ∈ R : x ≥ 0}. We also recall the corresponding concepts and results on (extended) real-valued functions. Let f : G → R̄ and g : G → R̄ be functions defined on a set G ⊆ X; the inequality f ≤ g means that f(x) ≤ g(x) for all x ∈ G. The domain and the epigraph of f are dom f = {x ∈ G : f(x) < +∞} and epi f = {(x, r) ∈ G × R : f(x) ≤ r}, respectively. The strict epigraph of f : X → R̄ is the set epi_s f = {(x, r) ∈ X × R : f(x) < r}. We now introduce the definitions of the new weak conjugate and weak biconjugate functions. For these definitions we need a function σ: it is assumed that σ is a continuous R+-valued function with the properties (2.1). Definition 2.1. Let f : X → R̄. (a) The function f^w : X* × R+ → R̄ defined by f^w(x*, c) = sup_{x∈X} {⟨x*, x⟩ − cσ(x) − f(x)} is called the weak conjugate function of f; this function is H_X-convex. (b) The function f^{ww} : X → R̄ defined by f^{ww}(x) = sup_{(x*,c)∈X*×R+} {⟨x*, x⟩ − cσ(x) − f^w(x*, c)} is called the weak biconjugate function of f. Definition 2.2. Let X be a locally convex vector space and let f : X → R be a single-valued function. The set ∂^w f(x̄) = {(x*, c) ∈ X* × R+ : ⟨x*, x − x̄⟩ − cσ(x − x̄) ≤ f(x) − f(x̄) for all x ∈ X} is called the weak subdifferential of f at x̄. Remark 2.2. If σ(x) = ‖x‖, then Definition 2.2 reduces to the corresponding definition in [1].
Consider the following optimization problem with an uncertain parameter in the constraint: (UP) min f(x) subject to x ∈ Q, g(x, v) ∈ −D, where f : X → R̄ and g : X × Z → Y are given functions, Z is another locally convex vector space, Q ⊂ X is a nonempty closed set, and the uncertain parameter v belongs to V ⊆ Z.
In this paper, the robust optimization approach is applied to (UP). We associate with (UP) its robust counterpart (RP) min f(x) subject to x ∈ Q, g(x, v) ∈ −D for all v ∈ V. We denote the feasible set of (RP) by S = {x ∈ Q : g(x, v) ∈ −D for all v ∈ V}. The problem (RP) is called the robust primal problem of (UP). The infimum of problem (RP) is denoted by inf(RP), and every element x ∈ S such that f(x) = inf(RP) is called a robust solution of (UP) (or a solution of (RP)). The Lagrange perturbation function of (UP) is F : V × X × Y → R̄ defined by (2.2). Remark 2.3. It follows immediately from the definition of the weak biconjugate function that f^{ww} ≤ f for any function f : X → R̄. Remark 2.4. Considering (1.1) and the definition of F_v(x, y), we can conclude that inf(RP) = inf_{x∈X} sup_{v∈V} F_v(x, 0_Y). Let q : X* × R+ → R̄ be the function defined by Lemma 2.1. Let p^w : X* × R+ → R̄ be a weak conjugate function; then p^w is lower semicontinuous and convex on X* × R+.
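As a sketch of the construction, one standard form of the Lagrange perturbation function (an assumed reading of (2.2), chosen to be consistent with the feasible sets S_v = {x ∈ Q : g(x, v) ∈ −D} used later and with Remark 2.4) is:

```latex
% A standard Lagrange perturbation function for (UP); the explicit
% cases form below is an assumed reading of (2.2).
\[
  F_{v}(x,y)=
  \begin{cases}
    f(x), & x\in Q,\ g(x,v)\in y-D,\\
    +\infty, & \text{otherwise},
  \end{cases}
  \qquad (x,y)\in X\times Y.
\]
% Then F_v(x,0_Y) equals f(x) on S_v and +infinity elsewhere, so that
\[
  \sup_{v\in V}F_{v}(x,0_{Y})=
  \begin{cases}
    f(x), & x\in S,\\
    +\infty, & \text{otherwise},
  \end{cases}
  \qquad\text{and hence}\qquad
  \inf(\mathrm{RP})=\inf_{x\in X}\,\sup_{v\in V}F_{v}(x,0_{Y}).
\]
```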
Proof. By the definition of the weak conjugate function, p^w is the pointwise supremum of the family of functions (x*, c) ↦ ⟨x*, x⟩ − cσ(x) − p(x), x ∈ dom p, each of which is affine and continuous in (x*, c); hence p^w is lower semicontinuous and convex on X* × R+. The proof is complete.
Proof. For any x ∈ X, from the definition of q^w one has an estimate which, together with the inclusion above, proves that (iii) holds.
Proof of (iv). Since Π is surjective, one then has epi q^{ww} = co Λ. Moreover, the following statements are equivalent: Proof. Observe that q^w = sup_{v∈V} F_v^{ww}(·, 0_Y), and so, by assumption, one obtains dom q^w ≠ ∅. According to [34], epi q^{ww} = co(epi q), which, together with Lemma 2.1 (iii), implies the desired relation. For the equivalence of (i) and (ii), note that, in light of Lemma 2.1, (i) is equivalent to p^{ww} = q^w, which also means that q^{ww} = p^w. This last equality and epi q^{ww} = co Λ show that epi p^w = epi q^{ww} = co Λ, which is (ii). The proof is complete.

Robust strong duality for nonconvex uncertain optimization problem
The aim of this section is to construct three types of robust dual problems for (UP) by using the weak conjugate function, namely the robust augmented Lagrange dual, the robust weak Fenchel dual and the robust weak Fenchel-Lagrange dual problem, to establish a characterization of the inequality (1.1) in terms of robust abstract perturbational weak conjugate duality, and finally, by employing these results, to obtain robust strong duality results for (UP).

Robust augmented Lagrange duality
To define an augmented Lagrange function for (UP), we need the augmenting function σ to be a continuous function with the properties (2.1). For each fixed v ∈ V and for x ∈ X, y ∈ Y, y* ∈ Y* and d ∈ R+, the uncertain augmented Lagrange function associated with (UP) is defined in terms of the function F_v(x, y) given in (2.2). Using the definition of F_v(x, y), we can write the augmented Lagrangian associated with (UP) explicitly. The uncertain dual function of (UP) and the uncertain augmented Lagrange dual problem (UD_L) of (UP) are then defined accordingly. The optimistic counterpart of the uncertain augmented Lagrange dual (UD_L) is a deterministic maximization problem. Now, when x* = 0_{X*} and c = 0 in (2.3), the value F_v^w(0, 0, y*, d) gives rise to the robust augmented Lagrange dual problem (RD_L) for (UP) with respect to F_v. The supremum of problem (RD_L) is denoted by sup(RD_L), and any element (v, y*, d) ∈ V × Y* × R+ such that −F_v^w(0, 0, y*, d) = sup(RD_L) is termed a solution of (RD_L).
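The dual construction can be sketched explicitly (our reading of (2.3); we use the property σ(0) = 0 assumed in (2.1), and the fact that F_v(x, 0_Y) = f(x) for every robust feasible x, as in Remark 2.4):

```latex
% Weak conjugate of the Lagrange perturbation function (reading of (2.3)):
\[
  F^{w}_{v}(x^{*},c,y^{*},d)=\sup_{x\in X,\;y\in Y}
  \bigl\{\langle x^{*},x\rangle+\langle y^{*},y\rangle
         -c\sigma(x)-d\sigma(y)-F_{v}(x,y)\bigr\}.
\]
% Setting x* = 0_{X*} and c = 0 gives the augmented-Lagrangian dual value
\[
  -F^{w}_{v}(0_{X^{*}},0,y^{*},d)
  =\inf_{x\in X}\inf_{y\in Y}
   \bigl\{F_{v}(x,y)-\langle y^{*},y\rangle+d\sigma(y)\bigr\},
\]
% and choosing y = 0_Y (with sigma(0_Y) = 0) yields, for every x
% feasible for (RP),
\[
  -F^{w}_{v}(0_{X^{*}},0,y^{*},d)\le F_{v}(x,0_{Y})=f(x),
  \qquad\text{whence}\qquad
  \sup(\mathrm{RD}_{L})\le\inf(\mathrm{RP}).
\]
```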
so we conclude that sup(RD_L) ≤ inf(RP). In the following sections we always assume that Γ is a set of functions defined on X and that H is a set of functions with H ≠ ∅. Theorem 3.2. (Robust abstract perturbational weak conjugate duality) Consider the following statements: (a_1) epi p^w = Λ; (b_1) for any l ∈ Γ_H(X), the assertions (b_1^1) and (b_1^2) are equivalent. One has (a_1) ⇔ (b_1).
Conversely, if (b_1^2) holds, then for any (x*, c) ∈ dom l^w there exists (v, c), and hence l = l^{ww} ≤ p^{ww} ≤ p, where the equality follows from l being H-convex. (b_1) ⇒ (a_1). Assume that (b_1) holds; we will show that (a_1) holds. To this aim, taking Lemma 2.1 (iv) into account, it suffices to prove that epi p^w ⊂ Λ.
Take any (x*, c, r) ∈ epi p^w. This shows that (x*, c, r) ∈ Π(epi F_v^w) and hence (x*, c, r) ∈ Λ. Remark 3.1. Theorem 3.2 generalizes [13, Theorem 3.1]. In [13], the authors used the classical conjugate function and assumed that the right-hand function l(x) in inequality (1.1) is convex and lower semicontinuous, whereas in Theorem 3.2 of this paper we employ the weak conjugate function and only assume that l(x) is abstract convex, which covers very broad classes of nonconvex functions.
Following from (3.1), we obtain (3.3). The first equality above follows from Remark 2.1. Taking (3.3) and the definition of p^{ww}(x) into account, one has p^{ww} = sup_{v∈V} F_v^{ww}(·, 0). The proof is complete.

Remark 3.4.
In the proof of the strong duality Theorem 3.3, our sufficient condition epi p^w = Λ differs from existing conditions. In Duality Theorem 11.59 of [26], the dualizing parameterization φ(x, y) was assumed to be level-bounded in x locally uniformly in y; in the duality theorems of [1,32], the perturbation function h(y) = inf_x φ(x, y) was assumed to be proper and weakly subdifferentiable at the origin 0_Y. All these conditions are formulated in terms of the dualizing parameterization or perturbation functions associated with the given problem.
We recall that the assumption epi p^w = Λ in Theorem 3.3 employs the epigraph of the function F_v^w defined by (2.3); it is also related to the dualizing parameterization F_v(x, y), and we know from Lemma 2.3, Proposition 3.1 and Remark 3.1 that epi p^w = co Λ is easy to satisfy. Moreover, Theorem 3.4 gives an equivalent condition, namely that Λ is a closed convex set, so it is worth exploring further sufficient conditions which ensure that Λ is not only a closed convex set but is also related only to the objective function and the constraint function.

Robust weak Fenchel conjugate duality
The Fenchel perturbation function of (UP) is F : V × X × X → R̄. Its weak conjugate F_v^w(x*, c, u*, d) is defined for x*, u* ∈ X* and c, d ∈ R+, where γ = x + u. By choosing x* = 0_{X*} and c = 0, we obtain the robust weak Fenchel dual problem (RD^w_F) with respect to F_v. The supremum of problem (RD^w_F) is denoted by sup(RD^w_F), and any element (v, u*, d) ∈ V × X* × R+ such that −F_v^w(0, 0, u*, d) = sup(RD^w_F) is termed a solution of (RD^w_F). Remark 3.5. sup(RD^w_F) ≤ inf(RP) follows immediately from the definition of F_v^w(0, 0, u*, d). Let the projection Π_1 : (x*, c, u*, d, r) ∈ X* × R+ × X* × R+ × R → (x*, c, r) ∈ X* × R+ × R and let the corresponding set be defined analogously to Λ. Then there is a solution of (RD^w_F) and sup(RD^w_F) = inf(RP). Proof. The proof is similar to that of Theorem 3.3.
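For this subsection, one consistent reading of the Fenchel perturbation function (an assumed form, suggested by the substitution γ = x + u; not a verbatim reproduction of the original display) is:

```latex
% An assumed Fenchel perturbation function for (UP):
\[
  F_{v}(x,u)=
  \begin{cases}
    f(x+u), & x\in S_{v},\\
    +\infty, & \text{otherwise},
  \end{cases}
  \qquad u\in X.
\]
% Taking x* = 0_{X*} and c = 0 in its weak conjugate gives
\[
  -F^{w}_{v}(0_{X^{*}},0,u^{*},d)
  =\inf_{x\in S_{v},\,u\in X}
   \bigl\{f(x+u)-\langle u^{*},u\rangle+d\sigma(u)\bigr\}
  \;\le\; f(x)\qquad\text{for all }x\in S
\]
% (choose u = 0_X and use sigma(0_X) = 0), which is the weak duality
% inequality of Remark 3.5.
```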

Fenchel-Lagrange weak conjugate duality
The Fenchel-Lagrange perturbation function of (UP) is F : V × X × X × Y → R̄. The weak conjugate function of F_v(x, u, y) is defined as F_v^w(x*, c, u*, d, y*, e), where γ = x + u. By choosing x* = 0_{X*}, c = 0 and d = e, we obtain the robust weak Fenchel-Lagrange dual problem (RD^w_FL) with respect to F_v. The supremum of problem (RD^w_FL) is denoted by sup(RD^w_FL), and any element (v, u*, d, y*, e) ∈ V × X* × R+ × Y* × R+ such that −F_v^w(0, 0, u*, d, y*, e) = sup(RD^w_FL) is termed a solution of (RD^w_FL). Theorem 3.6. (Weak duality) sup(RD^w_FL) ≤ inf(RP). Proof. For any (v, u*, d, y*, e), the claim follows from the definition of F_v^w. Let the projection Π_2 : (x*, c, u*, d, y*, e, r) ∈ X* × R+ × X* × R+ × Y* × R+ × R → (x*, c, r) ∈ X* × R+ × R and let the corresponding set be defined analogously to Λ. Proof. Let (v, u*, y*, d) be an arbitrary element of V × X* × Y* × R+. It is known that 0 ∈ −D − g(x, v) for all x ∈ S_v, so we have the desired inequality; hence, taking the supremum on both sides over (v, u*, y*, d), this completes the proof. Proof. Under these assumptions, and considering Theorem 3.7, it is known that sup(RD^w_FL) = inf(RP); by Propositions 4.1 and 4.2, we obtain the required equalities. Now we present an example of a robust optimization problem which illustrates the relationships between the optimal values of the three proposed dual problems. Example 1. Consider the following one-dimensional optimization problem with data uncertainty in the constraint, for all x ∈ R and Q = R, where the data v ∈ [−1, 1] is uncertain.
In this example, we always assume that the function σ satisfying (2.1) is σ(x) = |x|. Let us calculate the weak conjugate function of the Lagrange perturbation; then we have sup(RD_L) = −1.
Let us calculate the weak conjugate function of the Fenchel perturbation. Then we obtain sup(RD^w_F) = −1. We also get sup(RD^w_FL) = −∞. So we obtain −∞ = sup(RD^w_FL) < sup(RD^w_F) = sup(RD_L) = −1, which shows that the weak conjugate function is more likely to guarantee zero duality gaps than the classical conjugate function.
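To make the role of the augmenting term concrete, the following self-contained numerical sketch computes the weak conjugate and weak biconjugate of Definition 2.1 on a grid, with σ(x) = |x| as in Example 1. The test function f(x) = −|x| is an illustrative choice of ours, not the (omitted) data of Example 1: in exact arithmetic its classical conjugate is identically +∞ (so the classical biconjugate is −∞ everywhere), while its weak biconjugate recovers f, illustrating abstract convexity with respect to the superlinear supports.

```python
# Numerical sketch of the weak conjugate (Definition 2.1) with
# sigma(x) = |x|.  The test function f(x) = -|x| is an illustrative
# choice, not the data of Example 1.

GRID = [i / 100.0 for i in range(-500, 501)]   # x-grid on [-5, 5]

def f(x):
    return -abs(x)          # concave, hence nonconvex

def weak_conj(xstar, c):
    """f^w(x*, c) = sup_x { x*.x - c*sigma(x) - f(x) }, taken over the grid."""
    return max(xstar * x - c * abs(x) - f(x) for x in GRID)

def weak_biconj(x, pairs):
    """f^ww(x) = sup_{(x*,c)} { x*.x - c*sigma(x) - f^w(x*, c) }."""
    return max(xs * x - c * abs(x) - weak_conj(xs, c) for xs, c in pairs)

# In exact arithmetic the classical conjugate (c = 0) of this f is
# identically +infinity.  With the superlinear supports, f^w(x*, c)
# is finite (in fact 0) whenever c >= 1 + |x*|:
pairs = [(k / 10.0, 1 + abs(k / 10.0)) for k in range(-30, 31)]
vals = [weak_biconj(x, pairs) for x in (-2.0, 0.0, 1.5)]
print(vals)   # approx [-2.0, 0.0, -1.5]: the weak biconjugate recovers f
```

The design choice here mirrors the paper's construction: the pairs (x*, c) play the role of the abstract linear elements H_X, and restricting to c ≥ 1 + |x*| is exactly the region where the weak conjugate of this particular f stays finite.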