

Benchmarking large-scale distributed convex quadratic programming algorithms


Attila Kozma, C. Conte, M. Diehl

Optimization Methods and Software (accepted)

This paper collects, implements and benchmarks state-of-the-art decomposable convex quadratic programming (QP) methods that employ duality. To decouple the original problem, these methods relax some constraints by introducing dual variables and apply a hierarchical optimization scheme. On the lower level of this scheme, a sequence of parametric QPs is solved in parallel, while on the upper level a gradient-based method is applied to find an optimal dual solution. Finding the optimal dual variables is a hard problem, since the dual function is neither twice continuously differentiable nor strongly convex. We investigate and compare several gradient-based methods on a set of convex QP benchmarks, discussing their theoretical worst-case convergence properties as well as their practical convergence behaviour. Both the benchmark set and the suite of implemented algorithms are released as open-source software. Our experiments indicate that the alternating direction method of multipliers and the restarted version of the fast gradient method are the best methods for solving decomposable QPs in terms of the number of lower-level QP solutions required.
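The hierarchical scheme described above can be illustrated on a toy problem. The sketch below is not the paper's benchmark code; it is a minimal, hypothetical example of dual decomposition: the QP minimize (x-1)^2 + (y-3)^2 subject to the coupling constraint x = y is relaxed with one dual variable, the two resulting scalar QPs are solved independently on the lower level (in closed form here, in parallel in the paper's setting), and the upper level runs plain gradient ascent on the dual variable.

```python
# Hedged sketch of dual decomposition on a tiny coupled QP.
# Relaxing x - y = 0 with a dual variable lam gives the Lagrangian
# (x-1)^2 + (y-3)^2 + lam*(x - y), which separates into two scalar QPs.

def solve_subproblems(lam):
    # Lower level: each parametric QP has a closed-form minimizer here;
    # in general these would be solved by a QP solver, in parallel.
    x = 1.0 - lam / 2.0   # argmin_x (x-1)^2 + lam*x
    y = 3.0 + lam / 2.0   # argmin_y (y-3)^2 - lam*y
    return x, y

def dual_ascent(lam=0.0, step=0.5, iters=60):
    # Upper level: the gradient of the (concave) dual function is the
    # constraint residual x - y evaluated at the subproblem minimizers.
    for _ in range(iters):
        x, y = solve_subproblems(lam)
        lam += step * (x - y)
    return lam, x, y

lam, x, y = dual_ascent()
print(lam, x, y)  # converges to lam = -2, x = y = 2
```

At the dual optimum the coupling constraint is satisfied and both subproblem minimizers agree with the solution of the original constrained QP, x = y = 2. The methods benchmarked in the paper replace the plain ascent step with faster schemes such as the fast gradient method or ADMM.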


Type of Publication:

Article

% Autogenerated BibTeX entry
@Article { KozCon:2014:IFA_4808,
    author={Attila Kozma and C. Conte and M. Diehl},
    title={{Benchmarking large-scale distributed convex quadratic programming algorithms}},
    journal={Optimization Methods and Software},
    year={2014}
}