
User-friendly Introduction to PAC-Bayes Bounds

By Pierre Alquier, ESSEC Business School, Asia-Pacific Campus, Singapore, pierre.alquier.stat@gmail.com

 
Suggested Citation
Pierre Alquier (2024), "User-friendly Introduction to PAC-Bayes Bounds", Foundations and Trends® in Machine Learning: Vol. 17: No. 2, pp 174-303. http://dx.doi.org/10.1561/2200000100

Publication Date: 22 Jan 2024
© 2024 P. Alquier
 
Subjects
Deep learning, Model choice, Online learning, Variational inference, Bayesian learning, Classification and prediction, Statistical learning theory, Learning and statistical methods, Information theory and statistics
 

In this article:
1. Introduction
2. First Step in the PAC-Bayes World
3. Tight and Non-vacuous PAC-Bayes Bounds
4. PAC-Bayes Oracle Inequalities and Fast Rates
5. Beyond “Bounded Loss” and “i.i.d. Observations”
6. Related Approaches in Statistics and Machine Learning Theory
7. Conclusion
Acknowledgements
References

Abstract

Aggregated predictors are obtained by making a set of basic predictors vote according to some weights, that is, according to some probability distribution. Randomized predictors are obtained by sampling from a set of basic predictors, according to some prescribed probability distribution.

Thus, aggregated and randomized predictors have in common that their definition relies on a probability distribution on the set of predictors. In statistical learning theory, there is a set of tools designed to understand the generalization ability of such predictors: PAC-Bayesian or PAC-Bayes bounds.
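
To fix ideas, here is a minimal sketch (not taken from the monograph; the three basic predictors and the weights are hypothetical) contrasting the two notions: the aggregated predictor averages the predictions of the basic predictors with weights given by a distribution rho, while the randomized predictor draws a single basic predictor from rho and applies it.

    import numpy as np

    rng = np.random.default_rng(0)

    # Three hypothetical basic predictors mapping an input x to a real-valued prediction.
    predictors = [lambda x: -1.0, lambda x: 0.5 * x, lambda x: float(np.sign(x))]

    # A probability distribution rho on the set of predictors (the "weights" of the vote).
    rho = np.array([0.2, 0.5, 0.3])

    def aggregated_predict(x):
        # Aggregated predictor: deterministic weighted average of all basic predictors under rho.
        return sum(w * h(x) for w, h in zip(rho, predictors))

    def randomized_predict(x):
        # Randomized predictor: sample one basic predictor according to rho, then apply it.
        h = predictors[rng.choice(len(predictors), p=rho)]
        return h(x)

    x = 2.0
    print(aggregated_predict(x))  # always the same weighted vote
    print(randomized_predict(x))  # the output of one randomly drawn predictor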

Since the original PAC-Bayes bounds (Shawe-Taylor and Williamson, 1997; McAllester, 1998), these tools have been considerably improved in many directions. We will, for example, describe a simplified version of the localization technique (Catoni, 2003; Catoni, 2007) that was missed by the community and later rediscovered as “mutual information bounds”. Very recently, PAC-Bayes bounds have received considerable attention. There was a workshop on PAC-Bayes at NIPS 2017, (Almost) 50 Shades of Bayesian Learning: PAC-Bayesian trends and insights, organized by B. Guedj, F. Bach and P. Germain. One of the reasons for this recent interest is the successful application of these bounds to neural networks (Dziugaite and Roy, 2017). Since then, this has been a recurring topic of workshops at the major machine learning conferences.
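
As a point of reference (this particular statement is not quoted from the monograph, but is a standard form of such results), a bound in the spirit of McAllester's original result can be stated as follows: for a loss bounded in [0, 1], a prior $\pi$ on the set of predictors fixed before seeing the data, and any $\delta \in (0, 1)$, with probability at least $1-\delta$ over the i.i.d. sample of size $n$, simultaneously for all posterior distributions $\rho$,

    \mathbb{E}_{\theta\sim\rho}[R(\theta)] \;\le\; \mathbb{E}_{\theta\sim\rho}[r(\theta)] \;+\; \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \log\frac{2\sqrt{n}}{\delta}}{2n}},

where $R(\theta)$ is the expected (out-of-sample) risk, $r(\theta)$ the empirical risk on the sample, and $\mathrm{KL}$ the Kullback-Leibler divergence. The Kullback-Leibler term is the price paid for choosing the weights $\rho$ after seeing the data.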

The objective of these notes is to provide an elementary introduction to PAC-Bayes bounds.

DOI: 10.1561/2200000100
ISBN: 978-1-63828-326-3 (print), 144 pp., $95.00
ISBN: 978-1-63828-327-0 (e-book, PDF), 144 pp., $155.00

User-friendly Introduction to PAC-Bayes Bounds

Probably approximately correct (PAC) bounds have been an active area of research over the last two decades. Hundreds of papers have been published, and much progress has been made, resulting in PAC-Bayes bounds becoming an important technique in machine learning.

The proliferation of research can make the field somewhat daunting for a newcomer. In this tutorial, the author guides the reader through the topic’s complexity and its large body of publications. Covering both empirical and oracle PAC-Bayes bounds, this book serves as a primer for students and researchers who want to get to grips with the subject quickly. It provides a friendly introduction that illuminates the basic theory and points to the most important publications for gaining a deeper understanding of any particular aspect.

 
MAL-100