Foundations and Trends® in Machine Learning, Vol. 9, Issue 2-3

Patterns of Scalable Bayesian Inference

Elaine Angelino, University of California, Berkeley, USA, elaine@eecs.berkeley.edu
Matthew James Johnson, Harvard University, USA, mattjj@csail.mit.edu
Ryan P. Adams, Harvard University and Twitter, USA, rpa@seas.harvard.edu
 
Suggested Citation
Elaine Angelino, Matthew James Johnson and Ryan P. Adams (2016), "Patterns of Scalable Bayesian Inference", Foundations and Trends® in Machine Learning: Vol. 9: No. 2-3, pp. 119-247. http://dx.doi.org/10.1561/2200000052

Published: 17 Nov 2016
© 2016 E. Angelino, M. J. Johnson and R. P. Adams
 
Subjects
Bayesian learning, Markov chain Monte Carlo, variational inference, parallel algorithms
 

In this article:
1. Introduction
2. Background
3. MCMC with data subsets
4. Parallel and distributed MCMC
5. Scaling variational algorithms
6. Challenges and questions
Acknowledgements
References

Abstract

Datasets are growing not just in size but in complexity, creating a demand for rich models and quantification of uncertainty. Bayesian methods are an excellent fit for this demand, but scaling Bayesian inference is a challenge. In response to this challenge, there has been considerable recent work based on varying assumptions about model structure, underlying computational resources, and the importance of asymptotic correctness. As a result, there is a zoo of ideas with a wide range of assumptions and applicability. In this paper, we seek to identify unifying principles, patterns, and intuitions for scaling Bayesian inference. We review existing work on utilizing modern computing resources with both MCMC and variational approximation techniques. From this taxonomy of ideas, we characterize the general principles that have proven successful for designing scalable inference procedures and comment on the path forward.

DOI: 10.1561/2200000052
ISBN: 978-1-68083-218-1 (print); 978-1-68083-219-8 (e-book)
144 pp.
