

Distributed Privacy


A. Papasavvas

Master Thesis, SS17

In many distributed optimization problems, agents have individual objectives and constraints that they do not want to share. In fact, privacy is often cited as an argument for distributed algorithms, since the agents need not share their private problem data; instead, they perform local computations and share only the results of these computations with the other agents in order to converge to a joint solution. In this study we consider the alternating direction method of multipliers (ADMM), and we argue that for ADMM this view of privacy is incomplete. Based on a case study of plug-in electric vehicle charging, we demonstrate that privacy can be compromised when a standard implementation of ADMM is used. We investigate several scenarios in which an attacker is able to extract private information. Finally, we propose simple extensions to ADMM that guard against some of these attacks on privacy.
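The privacy concern described above can be illustrated with a minimal, generic sketch of consensus ADMM (not the thesis's EV-charging formulation; all names and the quadratic objective here are illustrative assumptions). Each agent keeps its private data a[i] local, but the iterates it must broadcast at every round are exactly the values an eavesdropper could exploit:

```python
# Illustrative sketch: consensus ADMM for distributed averaging.
# Agent i privately minimizes 0.5*(x_i - a[i])**2 subject to x_i = z.
# The private data a[i] never leaves the agent, but the iterates x[i]
# (and duals u[i]) are exchanged each round -- which is the information
# an attacker can observe in a standard implementation.

def consensus_admm(a, rho=1.0, iters=200):
    n = len(a)
    x = [0.0] * n   # local primal iterates (shared every round)
    u = [0.0] * n   # scaled dual variables
    z = 0.0         # global consensus variable
    for _ in range(iters):
        # Local step: uses the private a[i]; the result x[i] is shared.
        x = [(a[i] + rho * (z - u[i])) / (1.0 + rho) for i in range(n)]
        # Coordination step: needs only the shared quantities x[i] + u[i].
        z = sum(x[i] + u[i] for i in range(n)) / n
        # Dual update.
        u = [u[i] + x[i] - z for i in range(n)]
    return z

print(consensus_admm([1.0, 2.0, 6.0]))  # converges to the average, 3.0
```

Note that from the broadcast x[i], z, and u[i], the local update equation can be inverted to recover a[i] exactly, which is the kind of leakage the thesis examines.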

Supervisors: Felix Rey, Damian Frick, John Lygeros, C. Jones


Type of Publication:

Diploma/Master Thesis

% Autogenerated BibTeX entry
@PhdThesis { Xxx:2017:IFA_5695,
    author = {A. Papasavvas},
    title = {Distributed Privacy},
    year = {2017}
}