Foundations and Trends® in Machine Learning, Vol. 1, Issues 1–2

Graphical Models, Exponential Families, and Variational Inference

  • Martin J. Wainwright, Department of Statistics and Department of Electrical Engineering and Computer Science, University of California, Berkeley, USA (wainwrig@stat.berkeley.edu)
  • Michael I. Jordan, Department of Statistics and Department of Electrical Engineering and Computer Science, University of California, Berkeley, USA (jordan@stat.berkeley.edu)

Short description

The core of this paper is a general set of variational principles for the problems of computing marginal probabilities and modes, applicable to multivariate statistical models in the exponential family.
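The variational principle in question can be sketched in standard exponential-family notation (the symbols below are the conventional ones for this duality statement, not reproduced from the paper itself):

```latex
% Exponential family with sufficient statistics \phi(x) and
% log-partition (cumulant) function A(\theta):
%   p_\theta(x) = \exp\{\langle \theta, \phi(x) \rangle - A(\theta)\}.
% Conjugate (Fenchel) duality gives the variational representation
A(\theta) \;=\; \sup_{\mu \in \mathcal{M}} \bigl\{ \langle \theta, \mu \rangle - A^{*}(\mu) \bigr\},
```

where \mathcal{M} is the set of realizable mean parameters and A^{*} is the conjugate dual of A (the negative entropy on the interior of \mathcal{M}). The supremum is attained at the mean parameter \mu(\theta) = \mathbb{E}_\theta[\phi(X)], so solving this optimization recovers both the log-partition function and the marginals; approximate versions of the problem yield the algorithm families surveyed here.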


Table of contents

1 Introduction
2 Background
3 Graphical Models as Exponential Families
4 Sum-Product, Bethe–Kikuchi, and Expectation-Propagation
5 Mean Field Methods
6 Variational Methods in Parameter Estimation
7 Convex Relaxations and Upper Bounds
8 Integer Programming, Max-product, and Linear Programming Relaxations
9 Moment Matrices, Semidefinite Constraints, and Conic Programming Relaxation
10 Discussion
Acknowledgments
A Background Material
B Proofs and Auxiliary Results: Exponential Families and Duality
C Variational Principles for Multivariate Gaussians
D Clustering and Augmented Hypergraphs
E Miscellaneous Results
References

Foundations and Trends® in Machine Learning, Vol. 1, Issues 1–2 (2008), pp. 1–305

DOI: 10.1561/2200000001

Abstract

The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational, and mathematical fields, including bioinformatics, communication theory, statistical physics, combinatorial optimization, signal and image processing, information retrieval, and statistical machine learning. Many problems that arise in specific instances, including the key problems of computing marginals and modes of probability distributions, are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing likelihoods, marginal probabilities, and most probable configurations. We describe how a wide variety of algorithms (among them sum-product, cluster variational methods, expectation-propagation, mean field methods, max-product and linear programming relaxations, as well as conic programming relaxations) can all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
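As a concrete illustration of the mean field idea mentioned in the abstract, the sketch below compares naive mean field estimates of the mean parameters E[X_i] against exact enumeration on a tiny Ising model. The model, its parameter values, and the graph are invented for illustration; nothing here is taken from the paper itself.

```python
import itertools
import numpy as np

# Hypothetical 4-node Ising model on a cycle, x_i in {-1, +1}:
#   p(x) ∝ exp( sum_i theta_i x_i + sum_{(i,j)} theta_ij x_i x_j ).
n = 4
theta = np.array([0.2, -0.1, 0.3, 0.1])                       # node parameters
edges = {(0, 1): 0.3, (1, 2): 0.3, (2, 3): 0.3, (0, 3): 0.3}  # edge couplings

def energy(x):
    """Unnormalized log-probability of configuration x."""
    s = theta @ x
    for (i, j), w in edges.items():
        s += w * x[i] * x[j]
    return s

# Exact mean parameters E[X_i] by brute-force enumeration over {-1,+1}^n
# (feasible only because n is tiny; avoiding this sum is the whole point
# of variational methods).
states = np.array(list(itertools.product([-1, 1], repeat=n)), dtype=float)
weights = np.exp([energy(x) for x in states])
exact_mu = (states * weights[:, None]).sum(axis=0) / weights.sum()

# Naive mean field: coordinate ascent over the fully factorized family,
# with the standard fixed-point update mu_i <- tanh(theta_i + sum_j w_ij mu_j).
mu = np.zeros(n)
for _ in range(200):
    for i in range(n):
        field = theta[i]
        for (a, b), w in edges.items():
            if a == i:
                field += w * mu[b]
            elif b == i:
                field += w * mu[a]
        mu[i] = np.tanh(field)

print("exact mean parameters:", np.round(exact_mu, 3))
print("naive mean field     :", np.round(mu, 3))
```

A standard fact about naive mean field (not quoted from this text) is that it yields a lower bound on the log-partition function; on a ferromagnetic model like this one its magnetization estimates tend to be biased relative to the exact values, which is the kind of tradeoff the convex relaxations of later chapters address.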


Graphical Models, Exponential Families, and Variational Inference

312 pages

DOI: 10.1561/9781601981851

E-ISBN: 978-1-60198-185-1

ISBN: 978-1-60198-184-4
