A Non-Euclidean Gradient Descent Framework for Non-Convex Matrix Factorization
We study convex optimization problems that feature low-rank matrix solutions. In such scenarios, non-convex methods offer significant advantages over convex methods due to their lower space complexity and faster convergence. Moreover, many of these methods feature rigorous approximation guarantees. These non-convex algorithms are simple to analyze and implement, as they perform Euclidean gradient descent on the matrix factors. In contrast, this paper derives a non-Euclidean optimization framework in the non-convex setting that takes nonlinear gradient steps on the factors. We prove convergence rates to the global minimum under appropriate assumptions. Using real data from Fourier Ptychography and FastText applications, we provide numerical evidence that our approach can significantly enhance solution quality.
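To illustrate what a nonlinear (non-Euclidean) gradient step on matrix factors looks like, the sketch below runs a mirror-descent-style update on factors U and V for the least-squares objective 0.5 * ||X - U V^T||_F^2. The specific mirror map Phi(U) = ||U||_F^4 / 4 + ||U||_F^2 / 2, the step size, and the function names (mirror_factored_gd, grad_phi, inv_grad_phi) are illustrative assumptions and do not reproduce the Bregman geometry, objectives, or algorithmic details of the paper.

```python
import numpy as np

# Hypothetical mirror map Phi(U) = ||U||_F^4 / 4 + ||U||_F^2 / 2, chosen only to
# illustrate a non-Euclidean (mirror) step on a factor; the paper's geometry may differ.
def grad_phi(U):
    return (np.linalg.norm(U) ** 2 + 1.0) * U

def inv_grad_phi(Y):
    # Invert grad_phi: find U with (||U||^2 + 1) U = Y. Writing U = c * Y reduces this
    # to the scalar cubic ||Y||^2 c^3 + c - 1 = 0, which has a unique positive root.
    a = np.linalg.norm(Y) ** 2
    roots = np.roots([a, 0.0, 1.0, -1.0])
    c = roots[(np.abs(roots.imag) < 1e-9) & (roots.real > 0)].real[0]
    return c * Y

def mirror_factored_gd(X, rank, eta=0.05, iters=1500, seed=0):
    """Non-Euclidean gradient descent on factors (U, V) for 0.5 * ||X - U V^T||_F^2."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = 0.1 * rng.standard_normal((m, rank))
    V = 0.1 * rng.standard_normal((n, rank))
    for _ in range(iters):
        R = X - U @ V.T                               # residual
        gU, gV = -R @ V, -R.T @ U                     # Euclidean gradients w.r.t. U, V
        U = inv_grad_phi(grad_phi(U) - eta * gU)      # nonlinear (mirror) step on U
        V = inv_grad_phi(grad_phi(V) - eta * gV)      # nonlinear (mirror) step on V
    return U, V

# Usage: fit a synthetic rank-3 matrix and report the relative reconstruction error.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((40, 3)) @ rng.standard_normal((3, 30))
    U, V = mirror_factored_gd(X, rank=3)
    print("relative error:", np.linalg.norm(X - U @ V.T) / np.linalg.norm(X))
```

The only difference from plain factored gradient descent is that each update is taken in the dual coordinates grad_phi(U) and then mapped back through inv_grad_phi, which is what makes the step on the factors nonlinear.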