TY  - JOUR
DO - 10.1109/TSP.2014.2357776
AB - In this paper, we extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case, which enables its application to matrix completion, robust PCA, dictionary learning, and related matrix-factorization problems. Here, in Part I of a two-part paper, we derive our Bilinear G-AMP (BiG-AMP) algorithm as an approximation of the sum-product belief propagation algorithm in the high-dimensional limit, where central-limit theorem arguments and Taylor-series approximations apply, and under the assumption of statistically independent matrix entries with known priors. In addition, we propose an adaptive damping mechanism that aids convergence under finite problem sizes, an expectation-maximization (EM)-based method to automatically tune the parameters of the assumed priors, and two rank-selection strategies. In Part II of the paper, we will discuss the specializations of EM-BiG-AMP to the problems of matrix completion, robust PCA, and dictionary learning, and we will present the results of an extensive empirical study comparing EM-BiG-AMP to state-of-the-art algorithms on each problem.
T1  - Bilinear Generalized Approximate Message Passing—Part I: Derivation
IS - 22
DA - 2014
AU - Parker, J.T.
AU - Schniter, P.
AU - Cevher, Volkan
JF - IEEE Transactions on Signal Processing
SP  - 5839
VL  - 62
EP  - 5853
PB - Institute of Electrical and Electronics Engineers
PP - Piscataway
ID - 203543
KW - Approximate message passing
KW - belief propagation
KW - bilinear estimation
KW - matrix completion
KW - dictionary learning
KW - robust principal components analysis
KW - matrix factorization
SN - 1053-587X
UR - http://infoscience.epfl.ch/record/203543/files/06898015.pdf
ER -