Ghost penalties in nonconvex constrained optimization: Diminishing stepsizes and iteration complexity

We consider, for the first time, general diminishing-stepsize methods for nonconvex, constrained optimization problems. We show that, by using directions obtained in an SQP-like fashion, convergence to generalized stationary points can be proved. We then consider the iteration complexity of this method and of some variants in which the stepsize is either kept constant or decreased according to very simple rules. We establish convergence to a $\delta$-approximate stationary point in at most $O(\delta^{-2})$, $O(\delta^{-3})$, or $O(\delta^{-4})$ iterations, depending on the assumptions made on the problem. These complexity results nicely complement the very few existing results in the field.
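The key idea of a diminishing-stepsize scheme is a step sequence $\gamma_k$ with $\sum_k \gamma_k = \infty$ and $\sum_k \gamma_k^2 < \infty$, which drives the iterates to stationarity without a line search. The sketch below is not the ghost-penalty SQP method of the talk; it is a minimal, assumed illustration using plain gradient steps on a simple nonconvex one-dimensional function, with $\gamma_k = 0.1/(k+1)^{0.6}$ (the function, the initial point, and the stepsize constants are all chosen for illustration only).

```python
def grad(x):
    # Gradient of the nonconvex function f(x) = x**4 - 3*x**2,
    # whose stationary points are x = 0 and x = +/- sqrt(3/2).
    return 4 * x**3 - 6 * x

def diminishing_stepsize_descent(x0, iters=2000):
    # Generic diminishing-stepsize iteration (illustrative, not the
    # ghost-penalty method): x_{k+1} = x_k - gamma_k * grad(x_k),
    # with gamma_k = 0.1 / (k+1)**0.6, so that sum(gamma_k) diverges
    # while sum(gamma_k**2) converges.
    x = x0
    for k in range(iters):
        gamma = 0.1 / (k + 1) ** 0.6
        x = x - gamma * grad(x)
    return x

x_star = diminishing_stepsize_descent(x0=0.5)
```

From the starting point 0.5 the iterates drift toward the nearby stationary point $\sqrt{3/2} \approx 1.2247$; the constrained case adds a feasibility-restoring direction (here, SQP-like), which is what the talk's analysis handles.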

Type of Seminar:
IfA Seminar
Prof. Francisco Facchinei
University of Rome
Jun 16, 2017   11:00 h

ETL K 25
Biographical Sketch:
Francisco Facchinei received a Ph.D. degree in system engineering from the University of Rome La Sapienza, Rome, Italy. He is a full professor of Operations Research in the Engineering Faculty of the University of Rome La Sapienza. His research interests focus on theoretical and algorithmic issues related to nonlinear optimization, variational inequalities, complementarity problems, equilibrium programming, and computational game theory. He is the author of the two-volume research monograph "Finite-Dimensional Variational Inequalities and Complementarity Problems".