
Infoscience

doctoral thesis

Sparsity-Driven Statistical Inference for Inverse Problems

Kamilov, Ulugbek  
2015

This thesis addresses statistical inference for the resolution of inverse problems. Our work is motivated by the recent trend whereby classical linear methods are being replaced by nonlinear alternatives that rely on the sparsity of naturally occurring signals. We adopt a statistical perspective and model the signal as a realization of a stochastic process whose central property is sparsity. Our general strategy for solving inverse problems then lies in the development of novel iterative solutions for performing the statistical estimation. The thesis is organized in five main parts. In the first part, we provide a general overview of statistical inference in the context of inverse problems. We discuss wavelet-based and gradient-based algorithms for linear and nonlinear forward models. In the second part, we present an in-depth discussion of cycle spinning, a technique used to improve the quality of signals recovered with wavelet-based methods. Our main contribution here is a proof of its convergence; we also introduce a novel consistent cycle-spinning algorithm for denoising statistical signals. In the third part, we introduce a stochastic signal model based on Lévy processes and investigate popular gradient-based algorithms such as those that deploy total-variation regularization. We develop a novel algorithm based on belief propagation for computing the minimum mean-square error (MMSE) estimator and use it to benchmark several popular methods that recover signals with sparse derivatives. In the fourth part, we propose and analyze a novel adaptive generalized approximate message passing (adaptive GAMP) algorithm that reconstructs signals with independent wavelet coefficients from generalized linear measurements. Our algorithm is an extension of the standard GAMP algorithm and allows for the joint learning of unknown statistical parameters. We prove that, when the measurement matrix is independent and identically distributed Gaussian, our algorithm is asymptotically consistent: it performs as well as the oracle algorithm that knows the parameters exactly. In the fifth and final part, we apply our methodology to an inverse problem in optical tomographic microscopy. In particular, we propose a novel nonlinear forward model and a corresponding algorithm for the quantitative estimation of the refractive index distribution of an object.
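To illustrate the cycle-spinning idea discussed in the second part, the sketch below averages a simple one-level Haar soft-thresholding denoiser over all circular shifts of the input. This is a minimal, generic translation-invariant denoiser under assumed choices (Haar wavelets, soft thresholding, full shift averaging), not the thesis's consistent cycle-spinning algorithm; the function names are illustrative only.

```python
import numpy as np

def haar_denoise(x, thr):
    """One-level Haar transform, soft-threshold the detail
    coefficients, then invert. Assumes len(x) is even."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft threshold
    y = np.empty_like(x, dtype=float)
    y[0::2] = (a + d) / np.sqrt(2.0)
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

def cycle_spin_denoise(x, thr, shifts=None):
    """Shift, denoise, unshift, and average over circular shifts,
    which removes the denoiser's dependence on the signal origin."""
    x = np.asarray(x, dtype=float)
    shifts = list(range(len(x))) if shifts is None else list(shifts)
    acc = np.zeros_like(x)
    for s in shifts:
        acc += np.roll(haar_denoise(np.roll(x, s), thr), -s)
    return acc / len(shifts)
```

Because the Haar transform is not shift-invariant, thresholding in a single wavelet basis produces artifacts that depend on where the signal samples fall relative to the dyadic grid; averaging over shifted copies suppresses this dependence, which is the quality gain that cycle spinning provides.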

Type
doctoral thesis
DOI
10.5075/epfl-thesis-6545
Author(s)
Kamilov, Ulugbek  
Advisors
Unser, Michaël  
Jury
Prof. J.-Ph. Thiran (president); Prof. M. Unser (thesis director); Prof. M. Figueiredo, Prof. R. Gribonval, Prof. M. Vetterli (examiners)
Date Issued
2015
Publisher
EPFL
Publisher place
Lausanne
Public defense date
2015-03-27
Thesis number
6545

Subjects
Approximate message passing • belief propagation • compressive sensing • cycle spinning • inverse problems • iterative shrinkage • phase microscopy • sparsity • statistical inference • tomographic microscopy

EPFL units
LIB  
Faculty
STI  
School
IMT  
Doctoral School
EDEE  
Available on Infoscience
March 11, 2015
Use this identifier to reference this record
https://infoscience.epfl.ch/handle/20.500.14299/112305

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.