Abstract: We exhibit a suitable conjugacy for which we show that the l0 pseudonorm is "convex" in the sense of generalized convexity. Then, we provide a systematic way to design norms and lower-bounding convex minimization programs for generalized sparse optimization.
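As background, the following is a minimal sketch of the objects the abstract refers to: the l0 pseudonorm and the standard notion of convexity with respect to a coupling function (the specific coupling used in the talk is not specified here, so the coupling $c$ below is a generic placeholder).

```latex
% The l0 pseudonorm counts the nonzero entries of a vector:
\ell_0(x) \;=\; \#\{\, i \in \{1,\dots,n\} \;:\; x_i \neq 0 \,\},
\qquad x \in \mathbb{R}^n .

% Generalized convexity w.r.t. a coupling c : X \times Y \to \overline{\mathbb{R}}:
% the c-conjugate of f and the reverse conjugate of g are
f^{c}(y) \;=\; \sup_{x \in X} \bigl( c(x,y) - f(x) \bigr),
\qquad
g^{c'}(x) \;=\; \sup_{y \in Y} \bigl( c(x,y) - g(y) \bigr).

% f is said to be c-convex when it coincides with its biconjugate:
f \;=\; \bigl( f^{c} \bigr)^{c'} .
```

With the usual Fenchel coupling $c(x,y) = \langle x, y\rangle$, the l0 pseudonorm is not convex; the point of the abstract is that a different coupling can be chosen under which it is.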
Abstract: In this seminar, we first describe the theory of reformulation and numerical solution of generalized disjunctive programming (GDP) problems, which are expressed in terms of Boolean and continuous variables, and involve algebraic constraints, disjunctions, and propositional logic statements. We propose a framework to generate alternative mixed-integer nonlinear programming (MINLP) formulations for convex nonlinear GDPs that lead to stronger relaxations.
Next, using the above theory, we address the global optimization of nonconvex nonlinear GDP problems. To obtain tighter lower bounds on the global optimum, we apply a sequence of basic steps that strengthen the convex relaxation, and finally we describe several solution methods.
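For reference, a generic GDP problem has the following well-known form (the symbols below are the standard ones from the GDP literature, not notation taken from the talk itself):

```latex
\min_{x,\,Y} \quad & f(x) \\
\text{s.t.} \quad & g(x) \le 0, \\
& \bigvee_{i \in D_k}
  \begin{bmatrix} Y_{ik} \\ r_{ik}(x) \le 0 \end{bmatrix},
  \qquad k \in K, \\
& \Omega(Y) = \mathrm{True}, \\
& x \in \mathbb{R}^n, \qquad
  Y_{ik} \in \{\mathrm{True}, \mathrm{False}\}.
```

Here each disjunction $k$ selects exactly one term $i \in D_k$: the Boolean variable $Y_{ik}$ being True activates the constraints $r_{ik}(x) \le 0$, while $\Omega(Y)$ collects the propositional logic statements linking the Boolean variables. MINLP reformulations such as big-M or hull relaxations replace the disjunctions with algebraic constraints, and the tightness of the resulting continuous relaxation is what the proposed framework aims to improve.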