Foundations and Trends® in Communications and Information Theory > Vol 10 > Issue 1-2

Concentration of Measure Inequalities in Information Theory, Communications, and Coding

Maxim Raginsky, Department of Electrical and Computer Engineering, Coordinated Science Laboratory, University of Illinois at Urbana-Champaign, United States, maxim@illinois.edu
Igal Sason, Department of Electrical Engineering, Technion – Israel Institute of Technology, Israel, sason@ee.technion.ac.il
 
Suggested Citation
Maxim Raginsky and Igal Sason (2013), "Concentration of Measure Inequalities in Information Theory, Communications, and Coding", Foundations and Trends® in Communications and Information Theory: Vol. 10: No. 1-2, pp 1-246. http://dx.doi.org/10.1561/0100000064

Published: 23 Oct 2013
© 2013 M. Raginsky and I. Sason
 
Subjects
Coding theory and practice, Information theory and statistics, Multiuser information theory, Shannon theory
 

In this article:
1. Introduction
2. Concentration Inequalities via the Martingale Approach
3. The Entropy Method, Logarithmic Sobolev Inequalities, and Transportation-Cost Inequalities
Acknowledgments
References

Abstract

Concentration inequalities have been the subject of exciting developments during the last two decades, and have been intensively studied and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, mathematical statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation), information theory, theoretical computer science, learning theory, and dynamical systems.

This monograph focuses on some of the key modern mathematical tools used to derive concentration inequalities, on their links to information theory, and on their various applications to communications and coding. In addition to surveying the field, the monograph also includes recent results derived by the authors.

The first part of the monograph introduces classical concentration inequalities for martingales, as well as some recent refinements and extensions. The power and versatility of the martingale approach are exemplified in the context of codes defined on graphs and iterative decoding algorithms, as well as codes for wireless communication.
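As a small illustration of the martingale approach described above, the sketch below compares an empirical tail probability of a ±1 random walk (a martingale whose differences are bounded by 1) against the classical Azuma–Hoeffding bound, the standard starting point for such results. The simulation and all names in it are ours, not code from the monograph.

```python
import math
import random

def azuma_bound(t, n, c=1.0):
    """Azuma-Hoeffding: for a martingale M with |M_k - M_{k-1}| <= c,
    P(|M_n - M_0| >= t) <= 2 * exp(-t^2 / (2 * n * c^2))."""
    return 2.0 * math.exp(-t * t / (2.0 * n * c * c))

def empirical_tail(t, n, trials=20000, seed=0):
    """Empirical estimate of P(|S_n| >= t) for a +/-1 random walk S_n,
    which is a martingale with differences bounded by 1."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        s = sum(rng.choice((-1, 1)) for _ in range(n))
        if abs(s) >= t:
            hits += 1
    return hits / trials

n, t = 100, 25
print(empirical_tail(t, n), "<=", azuma_bound(t, n))
```

The empirical tail comes out well below the bound, as expected: Azuma–Hoeffding is not tight for this example, but it holds uniformly over all martingales with the same difference bounds.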

The second part of the monograph introduces the entropy method, an information-theoretic technique for deriving concentration inequalities for functions of many independent random variables. The basic ingredients of the entropy method are discussed first in conjunction with the closely related topic of logarithmic Sobolev inequalities, which are typical of the so-called functional approach to studying the concentration of measure phenomenon. The discussion of logarithmic Sobolev inequalities is complemented by a related viewpoint based on probability in metric spaces. This viewpoint centers on the so-called transportation-cost inequalities, whose roots are in information theory. Some representative results on concentration for dependent random variables are briefly summarized, with emphasis on their connections to the entropy method. Finally, we discuss several applications of the entropy method and related information-theoretic tools to problems in communications and coding. These include strong converses, empirical distributions of good channel codes with non-vanishing error probability, and an information-theoretic converse for concentration of measure.
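The core step of the entropy method mentioned above can be sketched via the standard Herbst argument (the notation below is ours and states the generic form of the argument, not a result quoted from the monograph): a logarithmic Sobolev-type bound on the entropy of $e^{\lambda f}$ integrates into a subgaussian tail bound.

```latex
\[
\mathrm{Ent}\!\left(e^{\lambda f}\right)
\;\le\; \frac{\lambda^2 \sigma^2}{2}\,\mathbb{E}\!\left[e^{\lambda f}\right]
\quad \forall\, \lambda > 0
\qquad \Longrightarrow \qquad
\mathbb{P}\big(f - \mathbb{E}[f] \ge t\big) \;\le\; e^{-t^2/(2\sigma^2)},
\]
where $\mathrm{Ent}(g) := \mathbb{E}[g \ln g] - \mathbb{E}[g]\,\ln \mathbb{E}[g]$
denotes the (unnormalized) entropy functional.
```

Dividing the hypothesis by $\lambda^2\,\mathbb{E}[e^{\lambda f}]$ and integrating over $\lambda$ bounds the moment-generating function of $f - \mathbb{E}[f]$, and the Chernoff bound then yields the stated tail.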

DOI:10.1561/0100000064
Print edition
ISBN: 978-1-60198-724-2
260 pp. $99.00

E-book edition
ISBN: 978-1-60198-725-9
260 pp. $230.00

Concentration of Measure Inequalities in Information Theory, Communications, and Coding is essential reading for all researchers and scientists in information theory and coding.

 

Concentration of Measure Inequalities in Information Theory, Communications, and Coding: Second Edition
This is the second edition of Concentration of Measure Inequalities in Information Theory, Communications, and Coding.
ISBN: 978-1-60198-906-2