
Acceleration Methods

By Alexandre d’Aspremont, CNRS & Ecole Normale Supérieure, France, aspremon@ens.fr | Damien Scieur, Samsung SAIT AI Lab & Mila, Canada, damien.scieur@gmail.com | Adrien Taylor, INRIA & Ecole Normale Supérieure, France, adrien.taylor@inria.fr

 
Suggested Citation
Alexandre d’Aspremont, Damien Scieur and Adrien Taylor (2021), "Acceleration Methods", Foundations and Trends® in Optimization: Vol. 5: No. 1-2, pp 1-245. http://dx.doi.org/10.1561/2400000036

Publication Date: 15 Dec 2021
© 2021 A. d’Aspremont et al.
 
Subjects
Optimization
 


Abstract

This monograph covers some recent advances in a range of acceleration techniques frequently used in convex optimization. We first use quadratic optimization problems to introduce two key families of methods, namely momentum and nested optimization schemes. They coincide in the quadratic case to form the Chebyshev method.
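
To make the quadratic case concrete, the following minimal NumPy sketch (our illustration, not code from the monograph) implements the classical Chebyshev iteration for minimizing f(x) = (1/2) x'Ax - b'x when the spectrum of A is assumed to lie in [mu, L]; the toy problem at the end is likewise our own.

    import numpy as np

    def chebyshev(A, b, x0, mu, L, iters=100):
        # Chebyshev iteration for f(x) = 0.5*x'Ax - b'x, assuming the
        # eigenvalues of A lie in [mu, L] (classical three-term recurrence).
        d, c = (L + mu) / 2.0, (L - mu) / 2.0
        x = x0.astype(float)
        r = b - A @ x                          # residual, i.e. -grad f(x)
        for i in range(iters):
            if i == 0:
                p, alpha = r.copy(), 1.0 / d
            else:
                beta = 0.5 * (c * alpha) ** 2 if i == 1 else (c * alpha / 2.0) ** 2
                alpha = 1.0 / (d - beta / alpha)
                p = r + beta * p
            x = x + alpha * p
            r = b - A @ x
        return x

    # Toy usage: a random positive-definite quadratic (our own example).
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 20))
    A = M @ M.T + np.eye(20)                   # eigenvalues in [1, L]
    mu, L = 1.0, np.linalg.norm(A, 2)
    x = chebyshev(A, rng.standard_normal(20), np.zeros(20), mu, L)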

We discuss momentum methods in detail, starting with the seminal work of Nesterov [1], and structure their convergence proofs using a few master templates, such as that for optimized gradient methods, which provide the key benefit of showing how momentum methods optimize convergence guarantees. We further cover proximal acceleration, at the heart of the Catalyst and Accelerated Hybrid Proximal Extragradient frameworks, using similar algorithmic patterns.
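
As an illustration of such a momentum method, here is a short sketch (ours, under standard assumptions) of Nesterov's accelerated gradient method for an L-smooth convex function: a gradient step taken from an extrapolated point, followed by a momentum update of the extrapolation; the least-squares instance below is only a toy example.

    import numpy as np

    def nesterov_agd(grad, x0, L, iters=500):
        # Nesterov's accelerated gradient method for an L-smooth convex f:
        # a gradient step from the extrapolated point y, then a momentum update.
        x, y, t = x0.copy(), x0.copy(), 1.0
        for _ in range(iters):
            x_next = y - grad(y) / L                           # gradient step at y_k
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum sequence
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation
            x, t = x_next, t_next
        return x

    # Toy usage on least squares, f(x) = 0.5*||Ax - b||^2 (our own example).
    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((40, 20)), rng.standard_normal(40)
    L = np.linalg.norm(A, 2) ** 2                              # smoothness constant
    sol = nesterov_agd(lambda x: A.T @ (A @ x - b), np.zeros(20), L)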

Common acceleration techniques rely directly on the knowledge of some of the regularity parameters in the problem at hand. We conclude by discussing restart schemes, a set of simple techniques for reaching nearly optimal convergence rates while adapting to unobserved regularity parameters.
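
As a sketch of one such technique (our illustration, in the spirit of function-value restart schemes), the variant below resets the momentum of the accelerated method whenever the objective increases, so it needs no knowledge of the strong-convexity parameter.

    import numpy as np

    def agd_restart(f, grad, x0, L, iters=500):
        # Nesterov's method with a function-value restart: whenever the
        # objective increases, drop the momentum and continue from the
        # current iterate; this adapts to unknown strong convexity.
        x, y, t = x0.copy(), x0.copy(), 1.0
        f_x = f(x)
        for _ in range(iters):
            x_next = y - grad(y) / L                           # step from extrapolated point
            if f(x_next) > f_x:                                # objective went up: restart
                y, t = x.copy(), 1.0
                x_next = y - grad(y) / L
            t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_next + ((t - 1.0) / t_next) * (x_next - x)
            x, t, f_x = x_next, t_next, f(x_next)
        return x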

DOI: 10.1561/2400000036
ISBN (paperback): 978-1-68083-928-9, 262 pp.
ISBN (e-book, PDF): 978-1-68083-929-6, 262 pp.

Table of contents:
1. Introduction
2. Chebyshev Acceleration
3. Nonlinear Acceleration
4. Nesterov Acceleration
5. Proximal Acceleration and Catalysts
6. Restart Schemes
Appendices
Acknowledgements
References

Acceleration Methods

This monograph covers recent advances in a range of acceleration techniques frequently used in convex optimization. Using quadratic optimization problems, the authors introduce two key families of methods, namely momentum and nested optimization schemes. These methods are covered in detail and include Chebyshev Acceleration, Nonlinear Acceleration, Nesterov Acceleration, Proximal Acceleration and Catalysts, and Restart Schemes.

This book provides the reader with an in-depth description of the developments in acceleration methods since the early 2000s, whilst referring the reader back to the earlier work that underpins them. The topic is important for the modern-day application of convex optimization techniques in many areas.

This book is an introduction to the topic that enables the reader to quickly understand the important principles and apply the techniques to their own research.

 