
Information Theory and Statistics: A Tutorial

By I. Csiszár, Rényi Institute of Mathematics, Hungarian Academy of Sciences, Hungary, csiszar@renyi.hu | P.C. Shields, University of Toledo, USA, paul.shields@utoledo.edu

 
Suggested Citation
I. Csiszár and P.C. Shields (2004), "Information Theory and Statistics: A Tutorial", Foundations and Trends® in Communications and Information Theory: Vol. 1: No. 4, pp 417-528. http://dx.doi.org/10.1561/0100000004

Publication Date: 15 Dec 2004
© 2004 I. Csiszár and P.C. Shields
 
Subjects
Information theory and statistics
 


Abstract

This tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The information measure known as information divergence or Kullback-Leibler distance or relative entropy plays a key role, often with a geometric flavor as an analogue of squared Euclidean distance, as in the concepts of I-projection, I-radius and I-centroid. The topics covered include large deviations, hypothesis testing, maximum likelihood estimation in exponential families, analysis of contingency tables, and iterative algorithms with an "information geometry" background. Also, an introduction is provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory.
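For concreteness, the central quantity of the abstract can be written out. The following is the standard definition of information divergence and of the I-projection onto a convex set; the notation (finite alphabet A, constraint set \Pi) is conventional rather than quoted from the text.

```latex
% Information divergence (Kullback-Leibler distance, relative entropy)
% of P from Q, both probability distributions on a finite alphabet A:
D(P\|Q) \;=\; \sum_{a \in A} P(a)\,\log\frac{P(a)}{Q(a)}
% I-projection of Q onto a closed convex set \Pi of distributions:
% the minimizer below, which plays the role of the Euclidean projection
% when D(\cdot\|\cdot) is viewed as an analogue of squared distance.
P^{*} \;=\; \operatorname*{arg\,min}_{P \in \Pi} D(P\|Q)
```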

DOI: 10.1561/0100000004
ISBN (paperback): 978-1-933019-05-5, 124 pp., $50.00
ISBN (e-book, PDF): 978-1-933019-54-3, 124 pp., $100.00
Table of contents:
Preface
1. Preliminaries
2. Large deviations, hypothesis testing
3. I-projections
4. f-Divergence and contingency tables
5. Iterative algorithms
6. Universal coding
7. Redundancy bounds
8. Redundancy and the MDL principle
Appendix A. Summary of process concepts
Historical Notes
References

Information Theory and Statistics

Information Theory and Statistics: A Tutorial is concerned with applications of information theory concepts in statistics, in the finite alphabet setting. The topics covered include large deviations, hypothesis testing, maximum likelihood estimation in exponential families, analysis of contingency tables, and iterative algorithms with an "information geometry" background. Also, an introduction is provided to the theory of universal coding, and to statistical inference via the minimum description length principle motivated by that theory.

The tutorial does not assume the reader has an in-depth knowledge of information theory or statistics. As such, Information Theory and Statistics: A Tutorial is an excellent introductory text to this highly important topic in mathematics, computer science, and electrical engineering. It provides both students and researchers with an invaluable resource for quickly getting up to speed in the field.
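To give a flavor of the iterative algorithms mentioned above (the monograph treats iterative scaling methods for fitting marginals of contingency tables), here is a minimal sketch of iterative proportional fitting in Python. It assumes NumPy; the function name ipf, its signature, and the toy numbers are illustrative, not taken from the monograph. Each sweep alternates I-projections onto the row-marginal and column-marginal constraint sets.

```python
import numpy as np

def ipf(table, row_marg, col_marg, n_sweeps=200, tol=1e-10):
    """Iterative proportional fitting (illustrative helper, not from the text).

    Rescales a strictly positive table so its row and column sums match the
    prescribed marginals. Each scaling step is the I-projection onto one
    linear (marginal) constraint set; alternating the two converges to the
    I-projection onto their intersection, i.e. the fitted contingency table.
    """
    q = np.asarray(table, dtype=float).copy()
    for _ in range(n_sweeps):
        q *= (row_marg / q.sum(axis=1))[:, None]   # match row sums
        q *= col_marg / q.sum(axis=0)              # match column sums
        if np.abs(q.sum(axis=1) - row_marg).max() < tol:
            break
    return q

# Toy example: fit a uniform 2x3 table to the given marginals.
fitted = ipf(np.ones((2, 3)), np.array([0.4, 0.6]), np.array([0.2, 0.3, 0.5]))
print(fitted.sum(axis=1))  # -> approximately [0.4, 0.6]
print(fitted.sum(axis=0))  # -> approximately [0.2, 0.3, 0.5]
```

The two marginals must have the same total mass, and strict positivity of the initial table keeps every scaling step well defined.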

 