# Rigorous optimization recipes for sparse and low rank inverse problems with applications in data sciences

Many natural and man-made signals have few degrees of freedom relative to their size, owing to natural parameterizations or constraints; examples include bandlimited signals, collections of signals observed from multiple viewpoints in a sensor network, and per-flow traffic measurements of the Internet. Low-dimensional models (LDMs) capture this inherent structure mathematically via combinatorial and geometric data models, such as sparsity, unions of subspaces, low-rankness, manifolds, and mixtures of factor analyzers, and are emerging to revolutionize the way we treat inverse problems (e.g., signal recovery, parameter estimation, or structure learning) from dimensionality-reduced or incomplete data. Assuming the problem at hand resides in an LDM space, this thesis investigates how to integrate such models into convex and non-convex optimization algorithms for significant gains in computational complexity. We focus mostly on two LDMs: $(i)$ sparsity and $(ii)$ low-rankness. We study trade-offs and their implications in order to develop efficient and provable optimization algorithms and, more importantly, to exploit the interplay between convex and combinatorial optimization, enabling cross-pollination of decades of research in both.
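As a minimal illustration of the sparsity LDM (a textbook sketch, not the algorithms developed in the thesis), the classical iterative soft-thresholding algorithm (ISTA) solves the $\ell_1$-regularized least-squares problem $\min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda\|x\|_1$, recovering a sparse vector from far fewer linear measurements than its ambient dimension. All names and parameters below (`ista`, `soft_threshold`, the problem sizes) are illustrative choices, not taken from the thesis:

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1 norm: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(A, b, lam, iters=2000):
    """ISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1 (proximal gradient)."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    for _ in range(iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth term
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Recover a 5-sparse vector in R^100 from 40 random Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true
x_hat = ista(A, b, lam=1e-3)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

With noiseless measurements and a small regularization weight, the relative recovery error is small despite the system being underdetermined, which is the basic phenomenon the thesis's optimization recipes address at scale.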

Thesis, École polytechnique fédérale de Lausanne (EPFL), no. 6350 (2014)
Doctoral program in Computer, Communication and Information Sciences
School of Engineering
Institute of Electrical Engineering
Laboratory for Information and Inference Systems
Jury: Prof. E. Telatar (president); Prof. V. Cevher (thesis director); Prof. M. Figueiredo, Prof. N. Sidiropoulos, Prof. P. Vandergheynst (examiners)

Public defense: 2014-10-13
