Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator

We propose a class of novel variance-reduced stochastic conditional gradient methods. By adapting the recent stochastic path-integrated differential estimator (SPIDER) technique of Fang et al. (2018) to the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as for the more general expectation minimization problem. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting, while matching the best known FW variants in the literature in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016) and propose SPIDER-CGS to further reduce the stochastic first-order oracle complexity.
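For concreteness, below is a minimal sketch of the SPIDER-FW idea for the finite-sum case: each epoch anchors the SPIDER gradient estimator with a full gradient, then corrects it along the iterate path using minibatch gradient differences, while the iterates take standard projection-free Frank-Wolfe steps through a linear minimization oracle. The least-squares objective, the l1-ball constraint, the epoch length `q`, the batch size `b`, and the 2/(t+2) step size are illustrative assumptions for this sketch, not the paper's exact parameters or rates.

```python
# Sketch of SPIDER-FW for min_{x in C} (1/n) sum_i f_i(x),
# illustrated on least squares over an l1-ball (illustrative setup).
import numpy as np

def lmo_l1_ball(grad, radius=1.0):
    """Linear minimization oracle over the l1-ball:
    argmin_{||s||_1 <= radius} <grad, s> is a signed vertex."""
    s = np.zeros_like(grad)
    i = np.argmax(np.abs(grad))
    s[i] = -radius * np.sign(grad[i])
    return s

def spider_fw(A, y, radius=1.0, epochs=20, q=50, b=8, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    # Minibatch gradient of the least-squares loss over rows `idx`.
    grad = lambda x, idx: A[idx].T @ (A[idx] @ x - y[idx]) / len(idx)
    x = np.zeros(d)
    t = 0
    for _ in range(epochs):
        # Epoch start: a full gradient anchors the SPIDER estimator.
        v = grad(x, np.arange(n))
        for _ in range(q):
            gamma = 2.0 / (t + 2)          # classical FW step size
            s = lmo_l1_ball(v, radius)     # one LMO call, no projection
            x_prev, x = x, x + gamma * (s - x)
            t += 1
            # Path-integrated correction: update v with a minibatch of
            # gradient differences between consecutive iterates.
            idx = rng.integers(0, n, size=b)
            v = v + grad(x, idx) - grad(x_prev, idx)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_true = np.zeros(50); x_true[:3] = [0.5, -0.3, 0.2]
    y = A @ x_true + 0.01 * rng.standard_normal(200)
    x_hat = spider_fw(A, y, radius=1.0)
    print("objective:", 0.5 * np.mean((A @ x_hat - y) ** 2))
```

The design point the sketch tries to convey is that the variance-reduced estimator `v` replaces the exact gradient in the FW step, so each inner iteration costs only one minibatch and one LMO call, with the full gradient amortized over the epoch.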


Published in:
Proceedings of the 36th International Conference on Machine Learning (ICML 2019)
Presented at:
36th International Conference on Machine Learning (ICML 2019), Long Beach, USA, June 9-15, 2019
Year:
2019
Note:
Proceedings scheduled as Volume 97 (ISSN: 2640-3498), assigned to ICML 2019

