
Short course on Convex Optimization and Applications, 21, 22, 23, and 27 March 2012
Lecture 4: Alternating Direction Method of Multipliers

Abstract:
Problems in areas such as machine learning and dynamic optimization on a large network lead to extremely large convex optimization problems, with problem data stored in a decentralized way, and processing elements distributed across a network. We argue that the alternating direction method of multipliers is well suited to such problems. The method was developed in the 1970s, with roots in the 1950s, and is equivalent or closely related to many other algorithms, such as dual decomposition, the method of multipliers, Douglas-Rachford splitting, Spingarn's method of partial inverses, Dykstra's alternating projections, Bregman iterative algorithms for $\ell_1$ problems, proximal methods, and others. After briefly surveying the theory and history of the algorithm, we discuss applications to statistical and machine learning problems such as the lasso and support vector machines, and to dynamic energy management problems arising in the smart grid.
Based on joint work by: Stephen Boyd, Neal Parikh, Eric Chu, Borja Peleato, and Jon Eckstein.
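For reference, the standard ADMM iteration for a problem of the form minimize $f(x) + g(z)$ subject to $Ax + Bz = c$ is sketched below in the usual textbook notation (which need not match the notation used in the lecture slides):

\[
\begin{aligned}
x^{k+1} &= \operatorname*{argmin}_x \; L_\rho(x, z^k, y^k),\\
z^{k+1} &= \operatorname*{argmin}_z \; L_\rho(x^{k+1}, z, y^k),\\
y^{k+1} &= y^k + \rho\,\bigl(A x^{k+1} + B z^{k+1} - c\bigr),
\end{aligned}
\]

where $L_\rho$ is the augmented Lagrangian and $\rho > 0$ is a penalty parameter. For the lasso mentioned in the abstract, the $x$-update reduces to a ridge-regression-type linear solve and the $z$-update to elementwise soft thresholding.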

Published recordings: On-demand video.

http://control.ee.ethz.ch/~valice/Boyd_Lec4_Mar2012.pdf
Type of Seminar:
IfA Seminar
Speaker:
Prof. Stephen Boyd
Electrical Engineering, Information Systems Laboratory, Stanford University
Date/Time:
Mar 27, 2012, 16:15
Location:
ETF E 1, Sternwartstr. 7
Contact Person:
John Lygeros
File Download:
Request a copy of this publication.
Biographical Sketch:
See Lecture 1