
By
**Francis Bach**, INRIA - Ecole Normale Supérieure, France, francis.bach@ens.fr

Francis Bach (2013), "Learning with Submodular Functions: A Convex Optimization Perspective", Foundations and Trends® in Machine Learning: Vol. 6: No. 2-3, pp 145-373. http://dx.doi.org/10.1561/2200000039

© 2013 F. Bach

Classification and prediction, Clustering, Optimization, Computational learning, Operations research, Complexity in signal processing, Statistical/machine learning, Learning and statistical methods, Segmentation and grouping, Information theory and computer science, Information theory and statistics, Pattern recognition and learning

**In this article:**

Submodular functions are relevant to machine learning for at least two reasons: (1) some problems may be expressed directly as the optimization of submodular functions, and (2) the Lovász extension of submodular functions provides a useful set of regularization functions for supervised and unsupervised learning. In this monograph, we present the theory of submodular functions from a convex analysis perspective, showing tight links between certain polyhedra, combinatorial optimization, and convex optimization problems. In particular, we show how submodular function minimization is equivalent to solving a wide variety of convex optimization problems. This allows the derivation of new efficient algorithms for approximate and exact submodular function minimization, with theoretical guarantees and good practical performance. By listing many examples of submodular functions, we review various applications to machine learning, such as clustering, experimental design, sensor placement, graphical model structure learning, and subset selection, as well as a family of structured sparsity-inducing norms that can be derived from submodular functions and used as regularizers.
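To make the two notions in the abstract concrete, here is a minimal, hedged sketch (not code from the monograph): a toy graph-cut set function, a brute-force check of the submodularity inequality F(A) + F(B) ≥ F(A ∪ B) ∩ F(A ∩ B), and the Lovász extension computed by the classical greedy formula (sort the components of w in decreasing order and accumulate marginal gains). All names are illustrative.

```python
import itertools

# A small triangle graph; the cut function below is a classic submodular function.
EDGES = [(0, 1), (1, 2), (0, 2)]

def cut(S):
    """F(S) = number of edges with exactly one endpoint in S."""
    S = set(S)
    return sum(1 for u, v in EDGES if (u in S) != (v in S))

def is_submodular(F, n):
    """Brute-force check of F(A) + F(B) >= F(A | B) + F(A & B) over all subset pairs."""
    subsets = [set(c) for r in range(n + 1)
               for c in itertools.combinations(range(n), r)]
    return all(F(A) + F(B) >= F(A | B) + F(A & B)
               for A in subsets for B in subsets)

def lovasz_extension(F, w):
    """Greedy formula: with components of w sorted in decreasing order,
    sum w[i] times the marginal gain of adding i to the growing set."""
    order = sorted(range(len(w)), key=lambda i: -w[i])
    value, S = 0.0, set()
    for i in order:
        before = F(S)
        S.add(i)
        value += w[i] * (F(S) - before)
    return value

print(is_submodular(cut, 3))                  # the cut function is submodular
print(lovasz_extension(cut, [1.0, 1.0, 0.0]))  # on an indicator vector, equals F({0, 1})
```

On the indicator vector of a set, the Lovász extension recovers the set-function value itself, which is the sense in which it extends F from vertices of the hypercube to all of R^n.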

1. Introduction

2. Definitions

3. Lovász Extension

4. Properties of Associated Polyhedra

5. Convex Relaxation of Submodular Penalties

6. Examples and Applications of Submodularity

7. Non-smooth Convex Optimization

8. Separable Optimization Problems: Analysis

9. Separable Optimization Problems: Algorithms

10. Submodular Function Minimization

11. Other Submodular Optimization Problems

12. Experiments

13. Conclusion

Appendices

Acknowledgements

References


*Learning with Submodular Functions: A Convex Optimization Perspective* is an ideal reference for
researchers, scientists, or engineers with an interest in applying submodular functions to machine learning problems.