By Ramji Venkataramanan, University of Cambridge, UK, firstname.lastname@example.org | Sekhar Tatikonda, Yale University, USA, email@example.com | Andrew Barron, Yale University, USA, firstname.lastname@example.org
Developing computationally efficient codes that approach the Shannon-theoretic limits for communication and compression has long been one of the major goals of information and coding theory. There have been significant advances towards this goal in the last couple of decades, with the emergence of turbo codes, sparse-graph codes, and polar codes. These codes are designed primarily for discrete-alphabet channels and sources. For Gaussian channels and sources, where the alphabet is inherently continuous, Sparse Superposition Codes or Sparse Regression Codes (SPARCs) are a promising class of codes for achieving the Shannon limits. This monograph provides a unified and comprehensive overview of sparse regression codes, covering theory, algorithms, and practical implementation aspects. The first part of the monograph focuses on SPARCs for AWGN channel coding, and the second part on SPARCs for lossy compression (with a squared-error distortion criterion). In the third part, SPARCs are used to construct codes for Gaussian multi-terminal channel and source coding models such as broadcast channels, multiple-access channels, and source and channel coding with side information. The monograph concludes with a discussion of open problems and directions for future work.
Developing increasingly efficient codes for communication and compression has long been a major goal of information and coding theory. There have been significant advances towards this goal in the last couple of decades with the emergence of turbo codes, sparse-graph codes, and polar codes. Many of these developments have contributed to faster network speeds and greater storage capacity. Sparse Regression Codes (SPARCs) are a new and promising class of codes for achieving the Shannon limits of a communication channel.
This monograph presents a unified and comprehensive overview of sparse regression codes, covering theory, algorithms, and practical implementation aspects. Written by recognized experts in the field, it describes the use of SPARCs for efficient communication over AWGN channels, for lossy compression, and for multi-terminal communication.
Researchers and students in modern communication and network systems will find Sparse Regression Codes an essential resource for understanding these new techniques, which will have a significant impact on such systems in the years to come.