Comparing Abs‐Normal NLPs to MPECs

We show that the class of unconstrained NLPs in abs‐normal form is a subclass of the class of MPECs and that the class of NLPs with general constraints in abs‐normal form is equivalent to the class of MPECs. Moreover, we compare constraint qualifications and stationarity concepts of these problem classes and observe close relations between them.


Introduction
Nonsmoothness arises in many practical optimization problems, for example in engineering and economics. Typical problem classes are MPECs and abs-normal NLPs. In this paper we briefly consider the relations between these two classes. An overview of MPECs can be found in [1]; for prerequisites of the abs-normal form see [2,3].

Unconstrained Abs-Normal NLP
We take interest in problems of the form

    min_x ϕ(x),   x ∈ D_x,                                        (1)

with ϕ given in abs-normal form [2,3]. Then, these problems can be formulated as unconstrained abs-normal NLPs.

Definition 2.1 (Unconstrained Abs-Normal NLP). Let D_x be an open subset of R^n. A nonsmooth unconstrained optimization problem of the form (1) is called an unconstrained abs-normal NLP if functions f ∈ C^d(D_{x,|z|}, R) and c_Z ∈ C^d(D_{x,|z|}, R^s) with D_{x,|z|} = D_x × D_{|z|} and d ≥ 1 exist such that the NLP can equivalently be stated as

    min_{x,z} f(x, |z|)   s.t.   z = c_Z(x, |z|),                 (2)

where 0 ∈ D_{|z|} and the partial Jacobian ∂_2 c_Z(x, |z|) with respect to the second argument |z| is strictly lower triangular. The variables z_i, i = 1, ..., s, are called switching variables.
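To illustrate the definition, consider the following minimal Python sketch (our own toy example, not from the paper): the piecewise linear function ϕ(x) = |x| + |1 − |x|| is written in abs-normal form with switching variables z_1 = x and z_2 = 1 − |z_1|. The strict lower triangularity of ∂_2 c_Z allows the switching system z = c_Z(x, |z|) to be solved by forward substitution.

```python
# Toy abs-normal form (names are ours): phi(x) = |x| + |1 - |x||
# with switching variables z1 = x, z2 = 1 - |z1|, so phi(x) = |z1| + |z2|.

def c_Z(x, abs_z):
    # Strictly lower triangular in |z|: z1 uses no |z|, z2 uses only |z1|.
    return [x, 1.0 - abs_z[0]]

def solve_switching(x, s=2):
    # Forward substitution: strict lower triangularity makes z well defined.
    abs_z = [0.0] * s
    z = [0.0] * s
    for i in range(s):
        z = c_Z(x, abs_z)
        abs_z[i] = abs(z[i])
    return z, abs_z

def f(x, abs_z):
    return abs_z[0] + abs_z[1]

def phi(x):
    # Direct evaluation of phi for comparison.
    return abs(x) + abs(1.0 - abs(x))

for x in [-2.0, -0.3, 0.0, 0.5, 3.0]:
    z, abs_z = solve_switching(x)
    assert abs(f(x, abs_z) - phi(x)) < 1e-12
```

Here s = 2 and d = ∞; any piecewise smooth function built from smooth functions and absolute values admits such a representation [2,3].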
Further, we can rewrite problem (2) as an MPEC. To this end we define variable vectors u = [z]_+ = max(z, 0) and v = [z]_- = max(−z, 0), and replace |z| by u + v and z by u − v. Moreover, we need to enforce complementarity of u and v so that these representations of |z| and z hold:

    min_{x,u,v} f(x, u + v)   s.t.   u − v = c_Z(x, u + v),   u ≥ 0,   v ≥ 0,   u^T v = 0.    (3)
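The sign split can be checked numerically; the following short sketch (our own, with hypothetical helper names) verifies componentwise that z = u − v, |z| = u + v, and u ⊥ v.

```python
# Sign split of the switching vector z into nonnegative parts u and v.

def split(z):
    u = [max(zi, 0.0) for zi in z]
    v = [max(-zi, 0.0) for zi in z]
    return u, v

z = [1.5, -2.0, 0.0]
u, v = split(z)
assert all(ui - vi == zi for ui, vi, zi in zip(u, v, z))       # z = u - v
assert all(ui + vi == abs(zi) for ui, vi, zi in zip(u, v, z))  # |z| = u + v
assert all(ui * vi == 0.0 for ui, vi in zip(u, v))             # complementarity
```

Without the complementarity condition u^T v = 0, the split would not be unique and u + v would overestimate |z|; this is exactly why (3) is an MPEC rather than a smooth NLP.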
Thus, unconstrained abs-normal NLPs are a subclass of MPECs. In the following we compare regularity conditions and transfer stationarity concepts from MPECs. It turns out that LIKQ and MPEC-LICQ are equivalent. Note that Abadie's constraint qualification (MPEC-ACQ) holds in this setting without any prerequisites. Key are the strictly lower triangular structure of ∂_2 c_Z(x, |z|) and the absence of additional constraints in (3).

General Abs-Normal NLP
We say that a nonsmooth NLP is in abs-normal form if functions f ∈ C^1(D_{x,|z|}, R), g ∈ C^1(D_{x,|z|}, R^{n_g}), h ∈ C^1(D_{x,|z|}, R^{n_h}), and c_Z ∈ C^1(D_x × R^s_{≥0}, R^s) with ∂_2 c_Z(x, |z|) strictly lower triangular exist such that the problem reads

    min_{x,z} f(x, |z|)   s.t.   g(x, |z|) ≤ 0,   h(x, |z|) = 0,   z = c_Z(x, |z|).    (4)

This class and the class of MPECs are equivalent: with 0 = min(u, v) = (1/2)(u + v − |u − v|), the complementarity condition can be posed in abs-normal form.
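The reverse direction rests on the identity min(u, v) = (1/2)(u + v − |u − v|), which turns a complementarity condition into an equation involving an absolute value; a quick numerical sketch (ours, not from the paper) checks the identity.

```python
# Check of the identity min(u, v) = (1/2)(u + v - |u - v|), which lets
# the complementarity condition 0 = min(u, v) be posed in abs-normal form.

def min_via_abs(u, v):
    return 0.5 * (u + v - abs(u - v))

# Complementary pairs (min is 0) and a generic pair (identity still holds).
for u, v in [(0.0, 3.0), (2.0, 0.0), (0.0, 0.0), (1.0, 4.0)]:
    assert min_via_abs(u, v) == min(u, v)
```

With |u − v| treated as a further switching variable, the MPEC constraint 0 = min(u, v) becomes a smooth equation in the data of an abs-normal NLP.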

Conclusion and Outlook
We have considered unconstrained abs-normal NLPs and studied their relations to MPECs; more details can be found in [5]. In [6], LIKQ and optimality conditions for the general abs-normal NLP (4) are studied. The comparison of these concepts with the theory of MPECs is a subject of ongoing research.