Sparse Inverse Problems over Measures: Equivalence of the Conditional Gradient and Exchange Methods
We study an optimization program over nonnegative Borel measures that encourages sparsity in its solution. Efficient solvers for this program are in increasing demand, as it arises when learning from data generated by a "continuum-of-subspaces" model, a recent trend with applications in signal processing, machine learning, and high-dimensional statistics. We prove that the conditional gradient method (CGM), also known as the Frank-Wolfe algorithm, applied to this infinite-dimensional program, as recently proposed in the literature, is equivalent to the exchange method (EM) applied to its Lagrangian dual, which is a semi-infinite program. In doing so, we formally connect such infinite-dimensional programs to the well-established field of semi-infinite programming. On the one hand, the equivalence established in this paper allows us to provide a rate of convergence for the EM that is more general than those existing in the literature. On the other hand, this connection and the resulting geometric insights may in the future lead to the design of improved variants of the CGM for infinite-dimensional programs, which has been an active research topic.
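As a minimal sketch (assuming a prototypical formulation; the symbols $\Theta$, $\phi$, $y$, and $q$ below are illustrative and not taken from the paper), such a program over nonnegative Borel measures on a compact parameter set $\Theta$ might read

    \min_{\mu \in \mathcal{M}_+(\Theta)} \ \mu(\Theta) \quad \text{subject to} \quad \int_\Theta \phi(\theta)\, \mathrm{d}\mu(\theta) = y,

where $\phi : \Theta \to \mathbb{R}^m$ collects the measurement functions and $y \in \mathbb{R}^m$ is the observed data. A standard Lagrangian calculation yields the dual

    \max_{q \in \mathbb{R}^m} \ \langle q, y \rangle \quad \text{subject to} \quad \langle q, \phi(\theta) \rangle \le 1 \ \text{ for all } \theta \in \Theta,

a linear semi-infinite program with one constraint for every point of the infinite index set $\Theta$. Heuristically, each atom $\delta_\theta$ that the CGM adds to the primal iterate corresponds to the dual constraint indexed by the same $\theta$ that the EM would exchange into its working set, which is the geometric picture behind the equivalence stated above.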
WOS:000473041200015
2019-01-01
Volume 29, Issue 2, Pages 1329–1349
REVIEWED