Infoscience (EPFL, École polytechnique fédérale de Lausanne)
Conference paper

Conditional Gradient Methods via Stochastic Path-Integrated Differential Estimator

Yurtsever, Alp • Sra, Suvrit • Cevher, Volkan
2019
Proceedings of the International Conference on Machine Learning - ICML 2019
36th International Conference on Machine Learning (ICML 2019)

We propose a class of novel variance-reduced stochastic conditional gradient methods. By adopting the recent stochastic path-integrated differential estimator technique (SPIDER) of Fang et al. (2018) for the classical Frank-Wolfe (FW) method, we introduce SPIDER-FW for finite-sum minimization as well as the more general expectation minimization problems. SPIDER-FW enjoys superior complexity guarantees in the non-convex setting while matching the best FW variants in the literature in the convex case. We also extend our framework à la conditional gradient sliding (CGS) of Lan & Zhou (2016) and propose SPIDER-CGS to further reduce the stochastic first-order oracle complexity.
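The abstract describes the core recipe: maintain a recursive SPIDER gradient estimate and feed it to the Frank-Wolfe linear minimization oracle (LMO) instead of projecting. Below is a minimal sketch of that idea for the finite-sum case. It is not the authors' exact algorithm: the epoch length q, the batch sizes, and the classical 2/(t+2) step size are illustrative assumptions, and the paper derives specific schedules for its complexity guarantees.

    import numpy as np

    def spider_fw(x0, grad_batch, lmo, n, T=200, q=20,
                  big_batch=256, small_batch=16, seed=0):
        """Sketch of variance-reduced Frank-Wolfe with a SPIDER-style estimator.

        grad_batch(x, idx): average gradient of the components indexed by idx, at x.
        lmo(g): linear minimization oracle, argmin_{s in C} <g, s> over the set C.
        n: number of summands; T, q, and batch sizes are illustrative choices.
        """
        rng = np.random.default_rng(seed)
        x_prev, x = x0.copy(), x0.copy()
        v = np.zeros_like(x0)
        for t in range(T):
            if t % q == 0:
                # Epoch start: anchor the estimator with a large mini-batch gradient.
                idx = rng.choice(n, size=min(big_batch, n), replace=False)
                v = grad_batch(x, idx)
            else:
                # SPIDER step: recursively correct the previous estimate with a
                # gradient difference evaluated on a small shared mini-batch.
                idx = rng.choice(n, size=small_batch, replace=False)
                v = v + grad_batch(x, idx) - grad_batch(x_prev, idx)
            s = lmo(v)                    # Frank-Wolfe direction from the LMO
            gamma = 2.0 / (t + 2.0)       # classical FW step size, illustrative
            x_prev, x = x, x + gamma * (s - x)
        return x

    # Toy usage: least squares f(x) = (1/2n)||Ax - b||^2 over the l1 ball.
    rng = np.random.default_rng(1)
    n, d = 1000, 50
    A = rng.standard_normal((n, d))
    b = A @ (np.ones(d) / d)

    def grad_batch(x, idx):
        return A[idx].T @ (A[idx] @ x - b[idx]) / len(idx)

    def lmo(g):
        # Vertex of {x : ||x||_1 <= 1} minimizing <g, s>: a signed basis vector.
        s = np.zeros_like(g)
        i = int(np.argmax(np.abs(g)))
        s[i] = -np.sign(g[i])
        return s

    x_hat = spider_fw(np.zeros(d), grad_batch, lmo, n)

Because the small-batch correction reuses the same indices at x and x_prev, its variance shrinks as the iterates get closer, which is what yields the improved oracle complexity in the non-convex setting; SPIDER-CGS replaces the plain FW step with conditional gradient sliding to cut LMO/gradient calls further.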

Files

  • Name: YSC2019.pdf
  • Type: Publisher's Version
  • Version: http://purl.org/coar/version/c_970fb48d4fbd8a85
  • Access type: Open access
  • Size: 338.98 KB
  • Format: Adobe PDF
  • Checksum (MD5): 4b0124b5f465187ea3fe0aef8862bc2d
Contact: infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.