Ghost penalties in nonconvex constrained optimization: Diminishing stepsizes and iteration complexity

Abstract:
We consider, for the first time, general diminishing stepsize methods for nonconvex, constrained optimization problems. We show that by using directions obtained in an SQP-like fashion, convergence to generalized stationary points can be proved. We then consider the iteration complexity of this method and of some variants in which the stepsize is either kept constant or decreased according to very simple rules. We establish convergence to a $\delta$-approximate stationary point in at most $O(\delta^{-2})$, $O(\delta^{-3})$, or $O(\delta^{-4})$ iterations, depending on the assumptions made on the problem. These complexity results nicely complement the very few existing results in the field.
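The following is a hypothetical, minimal sketch of the general idea of a diminishing stepsize method for constrained problems; it is not the speaker's algorithm. It uses plain projected gradient steps (rather than SQP-like directions) with the classical stepsize rule $\gamma_k = \gamma_0/(k+1)$, whose terms vanish while their sum diverges, on a toy problem $\min \|x\|^2$ s.t. $x_1 + x_2 \ge 1$, where projection onto the half-space is available in closed form. All names and parameters are illustrative assumptions.

```python
import numpy as np

def project_halfspace(z, a, b):
    """Euclidean projection of z onto the half-space {x : a @ x >= b}."""
    violation = b - a @ z
    if violation <= 0:
        return z  # already feasible
    return z + (violation / (a @ a)) * a

def diminishing_step_pg(x0, grad, a, b, gamma0=1.0, iters=500):
    """Projected gradient method with diminishing stepsizes gamma0/(k+1).

    Illustrative only: the seminar's method uses SQP-like directions and
    handles general nonconvex constraints, which this sketch does not.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(iters):
        gamma = gamma0 / (k + 1)  # diminishing: gamma_k -> 0, sum gamma_k = inf
        x = project_halfspace(x - gamma * grad(x), a, b)
    return x

grad = lambda x: 2.0 * x                # gradient of ||x||^2
a, b = np.array([1.0, 1.0]), 1.0        # constraint x1 + x2 >= 1
x_star = diminishing_step_pg(np.array([2.0, -1.0]), grad, a, b)
# the constrained minimizer is (0.5, 0.5)
```

The diminishing stepsize removes the need for a line search or for knowing a Lipschitz constant, at the price of the slower worst-case iteration complexities quoted in the abstract.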

Type of Seminar:
IfA Seminar
Speaker:
Prof. Francisco Facchinei
University of Rome
Date/Time:
Jun 16, 2017   11:00 h
Location:
ETL K 25
Contact Person: