Qing Wang, Sanjeev R. Kulkarni and Sergio Verdú (2009), "Universal Estimation of Information Measures for Analog Sources", Foundations and Trends® in Communications and Information Theory: Vol. 5: No. 3, pp 265-353. http://dx.doi.org/10.1561/0100000021

© 2009 Q. Wang, S. R. Kulkarni and S. Verdú

**In this article:**

1. Introduction

2. Plug-in Algorithms

3. Algorithms Based on Partitioning

4. Algorithms Based on k-Nearest-Neighbor Distances

5. Other Algorithms

6. Algorithm Summary and Experiments

7. Sources with Memory

References

This monograph presents an overview of universal estimation of information measures for continuous-alphabet sources. Special attention is given to the estimation of mutual information and divergence based on independent and identically distributed (i.i.d.) data. Plug-in methods, partitioning-based algorithms, nearest-neighbor algorithms as well as other approaches are reviewed, with particular focus on consistency, speed of convergence and experimental performance.
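The monograph itself is the authoritative treatment; as a flavor of the nearest-neighbor family it surveys, below is a minimal Python sketch of the classical Kozachenko–Leonenko k-nearest-neighbor estimator of differential entropy (the function name `knn_entropy` and the default `k=3` are illustrative choices, not taken from the monograph):

```python
# Minimal sketch of the Kozachenko-Leonenko k-NN differential entropy
# estimator, one of the nearest-neighbor methods in this literature.
# Names and defaults are illustrative, not from the monograph.
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples, k=3):
    """Estimate differential entropy (in nats) from i.i.d. samples.

    samples: (n, d) array of n i.i.d. observations in R^d,
             assumed to have a density (so no duplicate points).
    k: number of nearest neighbors, excluding the point itself.
    """
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # Query k+1 neighbors: the nearest neighbor of each point
    # is the point itself, at distance 0.
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, k]  # distance to the k-th nearest neighbor
    # Log-volume of the unit d-ball: log(pi^{d/2} / Gamma(d/2 + 1)).
    log_v_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # h_hat = psi(n) - psi(k) + log V_d + (d/n) * sum_i log eps_i
    return digamma(n) - digamma(k) + log_v_d + d * np.mean(np.log(eps))

# Example: standard Gaussian in one dimension; the true differential
# entropy is 0.5 * log(2*pi*e) ~= 1.4189 nats.
rng = np.random.default_rng(0)
print(knn_entropy(rng.standard_normal((5000, 1))))
```

Plug-in and partitioning estimators follow the same high-level pattern: substitute an empirical surrogate for the unknown density and evaluate the information measure on it. The survey's focus is on when such substitutions are consistent and how fast they converge.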

Entropy, mutual information, and divergence measure the randomness, dependence, and dissimilarity, respectively, of random objects. In addition to their prominent role in information theory, they have found numerous applications in, among other fields, probability theory, statistics, physics, chemistry, molecular biology, ecology, bioinformatics, neuroscience, machine learning, linguistics, and finance. Many of these applications require a universal estimate of information measures that does not assume knowledge of the statistical properties of the observed data. Over the past few decades, several nonparametric algorithms have been proposed to estimate information measures.
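For concreteness, these three measures have the following standard definitions for sources with densities (a reference sketch in LaTeX; f_X, f_Y, f_{XY} denote the marginal and joint densities, and p, q the densities of P and Q):

```latex
% Standard definitions for analog (continuous) sources.
\begin{align*}
  h(X)        &= -\int f_X(x)\,\log f_X(x)\,dx
                && \text{(differential entropy)} \\
  I(X;Y)      &= \iint f_{XY}(x,y)\,
                 \log\frac{f_{XY}(x,y)}{f_X(x)\,f_Y(y)}\,dx\,dy
                && \text{(mutual information)} \\
  D(P\,\|\,Q) &= \int p(x)\,\log\frac{p(x)}{q(x)}\,dx
                && \text{(divergence)}
\end{align*}
```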

*Universal Estimation of Information Measures for Analog Sources* presents a comprehensive survey of universal estimation of
information measures for memoryless analog (real-valued or real vector-valued) sources, with an emphasis on the estimation of mutual information
and divergence and on their applications. The book reviews the consistency of the universal algorithms, the sufficient conditions under which
consistency holds, and the algorithms' speed of convergence.

*Universal Estimation of Information Measures for Analog Sources* provides a comprehensive review of an
increasingly important topic in information theory. It will be of interest to students, practitioners, and researchers working in
information theory.