

Distributionally Robust Logistic Regression


S. Shafieezadeh-Abadeh, P. Mohajerin Esfahani, D. Kuhn

Neural Information Processing Systems (NIPS), selected for a spotlight presentation (top 4%) (arXiv:1509.09259)

This paper proposes a distributionally robust approach to logistic regression. We use the Wasserstein distance to construct a ball in the space of probability distributions centered at the uniform distribution on the training samples. If the radius of this ball is chosen judiciously, we can guarantee that it contains the unknown data-generating distribution with high confidence. We then formulate a distributionally robust logistic regression model that minimizes a worst-case expected log-loss function, where the worst case is taken over all distributions in the Wasserstein ball. We prove that this optimization problem admits a tractable reformulation and encapsulates the classical as well as the popular regularized logistic regression problems as special cases. We further propose a distributionally robust approach based on Wasserstein balls to compute upper and lower confidence bounds on the misclassification probability of the resulting classifier. These bounds are given by the optimal values of two highly tractable linear programs. We validate our theoretical out-of-sample guarantees through simulated and empirical experiments.
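One special case of the tractable reformulation mentioned in the abstract is that, when the transport cost penalizes only feature perturbations (measured by a norm), the worst-case expected log-loss reduces to the empirical log-loss plus the Wasserstein radius times a norm penalty on the weight vector, i.e. a regularized logistic regression. The following is a minimal sketch of that regularized form; the synthetic data, the radius eps = 0.1, and the use of `scipy.optimize.minimize` are illustrative assumptions, not details taken from the paper:

```python
import numpy as np
from scipy.optimize import minimize

def dro_logistic_loss(beta, X, y, eps):
    # Empirical log-loss plus eps * ||beta||_2: the regularized form that
    # the Wasserstein DRO problem reduces to when the transport cost is the
    # Euclidean norm on features (labels kept fixed). Hedged sketch only.
    margins = y * (X @ beta)                       # y in {-1, +1}
    empirical = np.mean(np.log1p(np.exp(-margins)))
    return empirical + eps * np.linalg.norm(beta)

# Illustrative synthetic data: the first feature drives the label.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=200))

eps = 0.1  # Wasserstein-ball radius (chosen arbitrarily here)
res = minimize(dro_logistic_loss, np.zeros(2), args=(X, y, eps))
beta_hat = res.x
```

Larger radii eps shrink `beta_hat` toward zero, reflecting the intuition that a bigger ambiguity set forces a more conservative classifier.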


% Autogenerated BibTeX entry
@InProceedings { ShaEsf:2015:IFA_5260,
    author={S. Shafieezadeh-Abadeh and P. Mohajerin Esfahani and D. Kuhn},
    title={{Distributionally Robust Logistic Regression}},
    booktitle={Neural Information Processing Systems (NIPS)},
    year={2015}
}