
Graphical Models, Exponential Families, and Variational Inference

By Martin J. Wainwright, Department of Statistics and Department of Electrical Engineering and Computer Science, University of California, Berkeley, USA, wainwrig@stat.berkeley.edu | Michael I. Jordan, Department of Statistics and Department of Electrical Engineering and Computer Science, University of California, Berkeley, USA, jordan@stat.berkeley.edu

 
Suggested Citation
Martin J. Wainwright and Michael I. Jordan (2008), "Graphical Models, Exponential Families, and Variational Inference", Foundations and Trends® in Machine Learning: Vol. 1: No. 1–2, pp. 1–305. http://dx.doi.org/10.1561/2200000001

Publication Date: 18 Nov 2008
© 2008 M. J. Wainwright and M. I. Jordan
 
Subjects
Graphical models
 

In this article:
1 Introduction 
2 Background 
3 Graphical Models as Exponential Families 
4 Sum-Product, Bethe–Kikuchi, and Expectation-Propagation 
5 Mean Field Methods 
6 Variational Methods in Parameter Estimation 
7 Convex Relaxations and Upper Bounds 
8 Integer Programming, Max-product, and Linear Programming Relaxations 
9 Moment Matrices, Semidefinite Constraints, and Conic Programming Relaxation 
10 Discussion 
Acknowledgments 
A Background Material 
B Proofs and Auxiliary Results: Exponential Families and Duality 
C Variational Principles for Multivariate Gaussians 
D Clustering and Augmented Hypergraphs 
E Miscellaneous Results 
References 

Abstract

The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields, including bioinformatics, communication theory, statistical physics, combinatorial optimization, signal and image processing, information retrieval and statistical machine learning. Many problems that arise in specific instances — including the key problems of computing marginals and modes of probability distributions — are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing likelihoods, marginal probabilities and most probable configurations. We describe how a wide variety of algorithms — among them sum-product, cluster variational methods, expectation-propagation, mean field methods, max-product and linear programming relaxation, as well as conic programming relaxations — can all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
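
The central object referred to in the abstract is the variational representation of the cumulant (log partition) function. In standard exponential-family notation, with theta the canonical parameters, mu the mean parameters, M the set of realizable mean parameters (the marginal polytope in the discrete case), and A* the conjugate dual of the cumulant function A, the key identity is

    A(\theta) = \sup_{\mu \in \mathcal{M}} \left\{ \langle \theta, \mu \rangle - A^{*}(\mu) \right\},

where A*(mu) equals the negative entropy of the exponential-family member with mean parameters mu for mu in the interior of M. Mean field methods restrict the supremum to a tractable inner approximation of M; Bethe, Kikuchi, and expectation-propagation methods combine an outer approximation of M with an approximate entropy; max-product and LP relaxations address the analogous "zero-temperature" problem of computing modes.

As a concrete illustration of the simplest member of this algorithmic family, the sketch below runs the sum-product recursion on a chain-structured model, where the variational problem is solved exactly and the resulting beliefs are the true singleton marginals. This is illustrative code, not taken from the monograph; the function name chain_marginals, the NumPy representation of the potentials, and the example numbers are assumptions made here for the sketch.

import numpy as np

def chain_marginals(node_pot, edge_pot):
    """Exact singleton marginals on a chain via sum-product (belief propagation).

    node_pot: list of n nonnegative vectors, node_pot[i][xi] = psi_i(xi)
    edge_pot: list of n-1 nonnegative matrices, edge_pot[i][xi, xj] = psi_{i,i+1}(xi, xj)
    """
    n = len(node_pot)
    # Forward pass: fwd[i] is the message sent into node i from its left neighbor.
    fwd = [np.ones_like(node_pot[0])]
    for i in range(1, n):
        m = edge_pot[i - 1].T @ (node_pot[i - 1] * fwd[i - 1])
        fwd.append(m / m.sum())          # normalization only changes scale; kept for stability
    # Backward pass: bwd[i] is the message sent into node i from its right neighbor.
    bwd = [np.ones_like(node_pot[i]) for i in range(n)]
    for i in range(n - 2, -1, -1):
        m = edge_pot[i] @ (node_pot[i + 1] * bwd[i + 1])
        bwd[i] = m / m.sum()
    # Beliefs: local potential times incoming messages, normalized to sum to one.
    beliefs = []
    for i in range(n):
        b = node_pot[i] * fwd[i] * bwd[i]
        beliefs.append(b / b.sum())
    return beliefs

# Hypothetical example: three binary variables with an attractive pairwise coupling.
if __name__ == "__main__":
    psi = [np.array([1.0, 2.0]), np.array([1.0, 1.0]), np.array([3.0, 1.0])]
    coupling = np.array([[2.0, 1.0], [1.0, 2.0]])
    print(chain_marginals(psi, [coupling, coupling]))

On a tree the same two-pass scheme applies edge by edge; on graphs with cycles the messages are iterated to a fixed point, which corresponds to a stationary point of the Bethe approximation of the variational problem above rather than to exact marginals.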

DOI: 10.1561/2200000001
ISBN: 978-1-60198-184-4 (paperback), 312 pp., $125.00
ISBN: 978-1-60198-185-1 (e-book, PDF), 312 pp., $200.00


 