Infoscience — EPFL, École polytechnique fédérale de Lausanne
research article

Mutual Information and Optimality of Approximate Message-Passing in Random Linear Estimation

Barbier, Jean • Macris, Nicolas • Dia, Mohamad • Krzakala, Florent

July 1, 2020
IEEE Transactions on Information Theory

We consider the estimation of a signal from the knowledge of its noisy linear random Gaussian projections. A few examples where this problem is relevant are compressed sensing, sparse superposition codes, and code division multiple access. There have been a number of works considering the mutual information for this problem using the replica method from statistical physics. Here we put these considerations on a firm rigorous basis. First, we show, using a Guerra-Toninelli type interpolation, that the replica formula yields an upper bound on the exact mutual information. Second, for many relevant practical cases, we present a converse lower bound via a method that uses spatial coupling, state evolution analysis, and the I-MMSE theorem. This yields a single-letter formula for the mutual information and the minimum mean-square error for random Gaussian linear estimation of all discrete bounded signals. In addition, we prove that the low-complexity approximate message-passing algorithm is optimal outside of the so-called hard phase, in the sense that it asymptotically reaches the minimum mean-square error. In this work, spatial coupling is used primarily as a proof technique. However, our results also prove two important features of spatially coupled noisy linear random Gaussian estimation. First, there is no algorithmically hard phase. This means that for such systems approximate message-passing always reaches the minimum mean-square error. Second, in the limit of an infinitely long coupled chain, the mutual information associated with spatially coupled systems is the same as that of uncoupled linear random Gaussian estimation.
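The model studied in the abstract is noisy random linear estimation, y = Ax + w with an i.i.d. Gaussian measurement matrix A, and the algorithm whose optimality is analyzed is approximate message-passing (AMP). As an illustration only (not the authors' code), the sketch below runs a standard AMP iteration with a soft-thresholding denoiser on a synthetic sparse signal; the dimensions, noise level, and threshold multiplier `alpha` are arbitrary choices for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(v, t):
    """Soft-thresholding denoiser: eta(v; t) = sign(v) * max(|v| - t, 0)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Synthetic sparse signal and i.i.d. Gaussian random projections.
n, m, k = 500, 250, 25                    # signal length, measurements, sparsity
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # columns have ~unit norm
sigma = 0.01                              # noise standard deviation
y = A @ x0 + sigma * rng.standard_normal(m)

# AMP iteration: the extra (||x||_0 / m) * z term is the Onsager correction,
# which uses the previous residual z and distinguishes AMP from plain
# iterative thresholding.
x = np.zeros(n)
z = y.copy()
alpha = 1.1                               # threshold multiplier (tuning choice)
for _ in range(30):
    tau = np.sqrt(np.mean(z ** 2))        # effective noise-level estimate
    x_new = soft_threshold(x + A.T @ z, alpha * tau)
    z = y - A @ x_new + (np.count_nonzero(x_new) / m) * z
    x = x_new

mse = np.mean((x - x0) ** 2)
```

On instances like this one, well inside the easy phase, the iteration converges in a few tens of steps and `mse` falls far below the signal power, consistent with the abstract's claim that AMP reaches the minimum mean-square error outside the hard phase.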

Type
research article
DOI
10.1109/TIT.2020.2990880
Web of Science ID
WOS:000544995900018
Author(s)
Barbier, Jean
Macris, Nicolas
Dia, Mohamad
Krzakala, Florent
Date Issued
2020-07-01
Publisher
IEEE (Institute of Electrical and Electronics Engineers)
Published in
IEEE Transactions on Information Theory
Volume
66
Issue
7
Start page
4270
End page
4303

Subjects
Computer Science, Information Systems • Engineering, Electrical & Electronic • Computer Science • Engineering • estimation • multiaccess communication • mutual information • physics • random variables • noise measurement • couplings • compressed sensing (CS) • interpolation techniques • spatial coupling (SC) • approximate message-passing (AMP) • replica formula • statistical physics • sparse superposition codes • threshold saturation • tight bounds • sharp bounds • LDGM codes • CDMA • capacity • recovery • LDPC • error

Editorial or Peer reviewed
REVIEWED
Written at
EPFL
EPFL units
LTHC
Available on Infoscience
July 17, 2020
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/170206
Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.