
Information Combining

By Ingmar Land, Aalborg University, Denmark | Johannes Huber, Erlangen University, Germany

 
Suggested Citation
Ingmar Land and Johannes Huber (2006), "Information Combining", Foundations and Trends® in Communications and Information Theory: Vol. 3: No. 3, pp. 227-330. http://dx.doi.org/10.1561/0100000013

Publication Date: 30 Nov 2006
© 2006 I. Land and J. Huber
 
Subjects
Coding theory and practice
 

In this article:
1. Introduction 
2. Binary-Input Symmetric Memoryless Channel 
3. Jensen's Inequality Revisited 
4. Information Combining for SPC Codes 
5. Information Combining for Repetition Codes 
6. Applications and Examples 
7. Conclusions 
Acknowledgments 
A. Binary Information Functions 
B. Convexity Lemma 
C. Acronyms 
References 

Abstract

Consider coded transmission over a binary-input symmetric memoryless channel. The channel decoder uses the noisy observations of the code symbols to reproduce the transmitted code symbols; in doing so, it combines the information about the individual code symbols to obtain overall information about each code symbol, which may be the reproduced code symbol or its a-posteriori probability. This tutorial addresses the problem of "information combining" from an information-theoretic point of view: the decoder combines the mutual information between channel input symbols and channel output symbols (observations) into the mutual information between one transmitted symbol and all channel output symbols. The actual value of the combined information depends on the statistical structure of the channels, but it can be bounded from above and below for the assumed class of channels. The tutorial first introduces the concept of mutual information profiles and revisits the well-known Jensen's inequality. Using these tools, the bounds on information combining are derived for single parity-check codes and for repetition codes. The application of the bounds is illustrated in four examples: information processing characteristics of coding schemes, including extrinsic information transfer (EXIT) functions; design of multiple turbo codes; bounds on the decoding threshold of low-density parity-check codes; and the EXIT function of the accumulator.
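The direction of these bounds can be made concrete with a small numerical sketch. The following Python snippet (an illustration only, not code from the monograph; the function names are ours) evaluates the two extremes for combining the mutual informations I1, I2 of two binary-input symmetric channels, using the standard closed-form expressions for the BEC and the BSC: for a repetition code the BEC yields the maximum and the BSC the minimum combined information, while for a single parity-check code the roles are reversed.

import math

# Minimal sketch (helper names are illustrative, not from the monograph).
# For two binary-input symmetric channels with mutual informations I1, I2,
# the combined information lies between the values attained when both
# channels are BSCs and when both are BECs:
#   repetition code:     BSC = minimum, BEC = maximum
#   single parity check: BEC = minimum, BSC = maximum

def h(p):
    """Binary entropy function (bits)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def h_inv(y):
    """Inverse of h on [0, 1/2], found by bisection."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if h(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def bsc_series(p1, p2):
    """Effective crossover probability of two BSCs in series."""
    return p1 * (1.0 - p2) + p2 * (1.0 - p1)

def repetition_bounds(I1, I2):
    """Bounds on I(X; Y1, Y2) when the same bit X is observed through
    two independent symmetric channels with informations I1 and I2."""
    p1, p2 = h_inv(1.0 - I1), h_inv(1.0 - I2)
    lower = 1.0 + h(bsc_series(p1, p2)) - h(p1) - h(p2)   # both BSC
    upper = 1.0 - (1.0 - I1) * (1.0 - I2)                 # both BEC
    return lower, upper

def spc_bounds(I1, I2):
    """Bounds on the extrinsic information about X1 when X1 = X2 XOR X3
    and X2, X3 are observed through channels with informations I1, I2."""
    p1, p2 = h_inv(1.0 - I1), h_inv(1.0 - I2)
    lower = I1 * I2                                       # both BEC
    upper = 1.0 - h(bsc_series(p1, p2))                   # both BSC
    return lower, upper

print(repetition_bounds(0.5, 0.5))   # approx (0.713, 0.750)
print(spc_bounds(0.5, 0.5))          # approx (0.250, 0.287)

For I1 = I2 = 0.5, for instance, the repetition-code information lies between about 0.713 (both channels BSC) and 0.75 (both channels BEC), while the single-parity-check extrinsic information lies between 0.25 (BEC) and about 0.287 (BSC).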

DOI: 10.1561/0100000013
ISBN: 978-1-933019-46-8 (print)
ISBN: 978-1-933019-98-7 (e-book)
122 pp.

Information Combining

Information Combining is an introduction to the principles of information combining. The concept is described, the bounds for repetition codes and for single parity-check codes are proved, and some applications are presented. Since the focus is on basic principles, the discussion is restricted to a binary symmetric source, binary linear channel codes, and binary-input symmetric memoryless channels.

Information Combining first introduces the concept of mutual information profiles and revisits the well-known Jensen's inequality. Using these tools, the bounds on information combining are derived for single parity-check codes and for repetition codes. The application of the bounds is illustrated in four examples.

Information Combining provides an excellent tutorial on this important subject for students, researchers, and professionals working in communications and information theory.

 