
The Many Faces of Degeneracy in Conic Optimization

By Dmitriy Drusvyatskiy, University of Washington, USA, ddrusv@uw.edu | Henry Wolkowicz, University of Waterloo, Canada, hwolkowicz@uwaterloo.ca

 
Suggested Citation
Dmitriy Drusvyatskiy and Henry Wolkowicz (2017), "The Many Faces of Degeneracy in Conic Optimization", Foundations and Trends® in Optimization: Vol. 3: No. 2, pp 77-170. http://dx.doi.org/10.1561/2400000011

Publication Date: 20 Dec 2017
© 2017 D. Drusvyatskiy and H. Wolkowicz
 
Subjects
Optimization,  Computational geometry,  Computational biology,  Operations research,  Denoising,  Information theory and computer science,  Information theory and statistics,  Pattern recognition and learning,  Modeling and analysis
 

In this article:
1. What this monograph is about 
Part I: Theory 
2. Convex geometry 
3. Virtues of strict feasibility 
4. Facial reduction 
Part II: Applications and illustrations 
5. Introduction 
6. Matrix completions 
7. Hard combinatorial problems 
Acknowledgements 
Index 
References 

Abstract

Slater’s condition – existence of a “strictly feasible solution” – is a common assumption in conic optimization. Without strict feasibility, first-order optimality conditions may be meaningless, the dual problem may yield little information about the primal, and small changes in the data may render the problem infeasible. Failure of strict feasibility can therefore degrade the performance of off-the-shelf numerical methods, primal-dual interior point methods in particular. New optimization modeling techniques and convex relaxations for hard nonconvex problems have shown that the loss of strict feasibility is a more pronounced phenomenon than was previously realized. In this text, we describe various reasons for the loss of strict feasibility, whether due to poor modeling choices or (more interestingly) rich underlying structure, discuss ways to cope with it, and in many cases show how to turn it to an advantage. In large part, we emphasize the facial reduction preprocessing technique for its mathematical elegance, geometric transparency, and computational potential.
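As a minimal illustration of how strict feasibility can fail (a standard textbook-style example, not quoted from the monograph), consider the semidefinite feasibility problem of finding a positive semidefinite 2×2 matrix X with X₁₁ = 0:

```latex
% Feasible set of a 2x2 semidefinite program with the single constraint X_{11} = 0.
\[
\mathcal{F}
  = \bigl\{\, X \in \mathbb{S}^2_+ : \langle e_1 e_1^{\top}, X \rangle = 0 \,\bigr\}
  = \left\{ \begin{pmatrix} 0 & 0 \\ 0 & t \end{pmatrix} : t \ge 0 \right\},
\]
% since X \succeq 0 together with X_{11} = 0 forces the entire first row and
% column of X to vanish. Hence \mathcal{F} contains no positive definite matrix
% and Slater's condition fails; a single facial reduction step restricts the
% problem to the face \{ \mathrm{diag}(0, t) : t \ge 0 \} of \mathbb{S}^2_+,
% on which strict feasibility is restored.
```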

DOI:10.1561/2400000011
ISBN: 978-1-68083-390-4 (print)
ISBN: 978-1-68083-391-1 (e-book, .pdf)
114 pp.


The Many Faces of Degeneracy in Conic Optimization is divided into two parts. Part I presents the necessary theoretical grounding in conic optimization, including basic optimality and duality theory, connections of Slater’s condition to the distance to infeasibility and sensitivity theory, the facial reduction procedure, and the singularity degree. Part II focuses on illustrative examples and applications, including matrix completion problems (semidefinite, low-rank, and Euclidean distance), relaxations of hard combinatorial problems (quadratic assignment and max-cut), and sum of squares relaxations of polynomial optimization problems.
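As a toy numerical sketch of the facial reduction idea treated in Part I (illustrative only; the matrices and names below are this sketch’s own, not taken from the monograph):

```python
import numpy as np

# Feasible set F = {X psd : <A, X> = 0} with A = e1 e1^T, i.e. X[0,0] = 0.
# A is itself psd and <A, X> = 0 for every feasible X, so A exposes a face:
# all feasible X satisfy A X = 0, i.e. X = diag(0, t) with t >= 0.
A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# A basis for the null space of A spans the reduced cone.
V = np.array([[0.0],
              [1.0]])

# Facial reduction: parametrize X = V Y V^T with Y psd of size 1x1.
# The reduced problem is strictly feasible: Y = [[1]] gives X = diag(0, 1),
# which lies in the relative interior of the exposed face.
Y = np.array([[1.0]])
X = V @ Y @ V.T

assert np.allclose(A @ X, 0)                # X lies in the exposed face
assert np.all(np.linalg.eigvalsh(X) >= 0)   # X is positive semidefinite
print(X)
```

The one-line null-space computation here stands in for the general procedure, which finds an exposing vector by solving an auxiliary conic problem and may require several reduction steps.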

 
OPT-011