Foundations and Trends® in Communications and Information Theory > Vol 19 > Issue 2

Common Information, Noise Stability, and Their Extensions

By Lei Yu, Nankai University, China, leiyu@nankai.edu.cn | Vincent Y. F. Tan, National University of Singapore, Singapore, vtan@nus.edu.sg

 
Suggested Citation
Lei Yu and Vincent Y. F. Tan (2022), "Common Information, Noise Stability, and Their Extensions", Foundations and Trends® in Communications and Information Theory: Vol. 19: No. 2, pp 107-389. http://dx.doi.org/10.1561/0100000122

Publication Date: 28 Apr 2022
© 2022 L. Yu and V. Y. F. Tan
 
Subjects
Communication complexity, Information theory and computer science, Shannon theory, Information theory and statistics, Multiuser information theory
 

In this article:
1. Introduction
2. Wyner’s Common Information
3. Gács–Körner–Witsenhausen’s Common Information
4. Rényi and Total Variation Common Information
5. Exact Common Information
6. Approximate and Exact Channel Synthesis
7. Common Information and Nonnegative Rank
8. Non-Interactive Correlation Distillation
9. q-Stability
10. Functional Inequalities
11. Open Problems
Acknowledgements
References

Abstract

Common information is ubiquitous in information theory and related areas such as theoretical computer science and discrete probability. However, because there are multiple notions of common information, a unified understanding of the deep interconnections between them is lacking. This monograph seeks to fill this gap by leveraging a small set of mathematical techniques that are applicable across seemingly disparate problems.

In Part I, we review the operational tasks and properties associated with Wyner’s and Gács–Körner–Witsenhausen’s (GKW’s) common information. In Part II, we discuss extensions of the former from the perspective of distributed source simulation. These include the Rényi common information, which forms a bridge between Wyner’s common information and the exact common information. Via a surprising equivalence between the Rényi common information of order ∞ and the exact common information, we demonstrate the existence of a joint source for which the exact common information strictly exceeds Wyner’s common information. Other closely related topics discussed in Part II include the channel synthesis problem and the connection of Wyner’s and exact common information to the nonnegative rank of matrices.

In Part III, recognizing that GKW’s common information is zero for most non-degenerate sources, we examine it with a more refined lens via the Non-Interactive Correlation Distillation (NICD) problem, in which we quantify the agreement probability of bits extracted from a bivariate source. We extend this to the noise stability problem, which includes as special cases the k-user NICD and q-stability problems. This allows us to seamlessly transition to discussing their connections to various conjectures in information theory and discrete probability, such as the Courtade–Kumar, Li–Médard and Mossel–O’Donnell conjectures. Finally, we consider functional inequalities (e.g., the hypercontractivity and Brascamp–Lieb inequalities), which constitute a further generalization of the noise stability problem in which the Boolean functions therein are replaced by nonnegative functions. We demonstrate that the key ideas behind the proofs in Part III can be presented in a pedagogically coherent manner and unified via information-theoretic and Fourier-analytic methods.
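As a small concrete illustration (not drawn from the monograph itself), the noise stability of a Boolean function f : {−1, +1}^n → {−1, +1} at correlation ρ is Stab_ρ[f] = E[f(x)f(y)], where y is a ρ-correlated copy of x; the related NICD agreement probability is (1 + Stab_ρ)/2. For small n it can be computed exactly by brute-force enumeration. The sketch below is illustrative; the function names are our own choices, and the closed forms in the comments follow from the standard Fourier expansions of the dictator and 3-bit majority functions.

```python
from itertools import product

def noise_stability(f, n, rho):
    """Exact Stab_rho[f] = E[f(x) f(y)], with y a rho-correlated copy of x,
    computed by enumerating all (x, y) pairs over the hypercube {-1, +1}^n."""
    p_same = (1 + rho) / 2  # probability a coordinate is copied unchanged
    stab = 0.0
    for x in product((-1, 1), repeat=n):
        for y in product((-1, 1), repeat=n):
            # probability of observing y given x under rho-correlated noise
            p = 1.0
            for xi, yi in zip(x, y):
                p *= p_same if xi == yi else 1 - p_same
            stab += f(x) * f(y) * p
    return stab / 2 ** n  # average over uniformly random x

dictator = lambda x: x[0]                    # Stab_rho = rho
maj3 = lambda x: 1 if sum(x) > 0 else -1     # Stab_rho = (3/4)rho + (1/4)rho^3

rho = 0.5
print(noise_stability(dictator, 3, rho))  # 0.5
print(noise_stability(maj3, 3, rho))      # 0.40625 = (3/4)(0.5) + (1/4)(0.125)
```

The brute-force sum matches the Fourier-analytic closed forms, which is a useful sanity check before moving to the asymptotic regimes studied in Part III.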

DOI: 10.1561/0100000122
ISBN (paperback): 978-1-63828-014-9, 304 pp., $99.00
ISBN (e-book, PDF): 978-1-63828-015-6, 304 pp., $145.00

Common Information, Noise Stability, and Their Extensions

Common information measures the amount of shared (matching) variables in two or more information sources. It is ubiquitous in information theory and related areas such as theoretical computer science and discrete probability. However, because there are multiple notions of common information, a unified understanding of the deep interconnections between them is lacking. In this monograph, the authors fill this gap by leveraging a small set of mathematical techniques that are applicable across seemingly disparate problems.

The reader is introduced in Part I to the operational tasks and properties associated with the two main measures of common information, namely Wyner’s and Gács–Körner–Witsenhausen’s (GKW’s). In the two subsequent parts, the authors take a deeper look at each of these. In Part II, they discuss extensions of Wyner’s common information from the perspective of distributed source simulation, including the Rényi common information. In Part III, GKW’s common information comes under the spotlight. Having laid the groundwork, the authors seamlessly transition to discussing its connections to various conjectures in information theory and discrete probability.

This monograph provides students and researchers in Information Theory with a comprehensive resource for understanding common information and points the way forward to creating a unified set of techniques applicable over a wide range of problems.

 