Infoscience

conference paper

A new regret analysis for Adam-type algorithms

Alacaoglu, Ahmet • Malitsky, Yura • Mertikopoulos, Panayotis
2020
Proceedings of the 37th International Conference on Machine Learning (ICML)
37th International Conference on Machine Learning (ICML 2020)

In this paper, we focus on a theory-practice gap for Adam and its variants (AMSGrad, AdamNC, etc.). In practice, these algorithms are used with a constant first-order moment parameter β₁ (typically between 0.9 and 0.99). In theory, regret guarantees for online convex optimization require a rapidly decaying β₁ → 0 schedule. We show that this is an artifact of the standard analysis and propose a novel framework that allows us to derive optimal, data-dependent regret bounds with a constant β₁, without further assumptions. We also demonstrate the flexibility of our analysis on a wide range of different algorithms and settings.
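To make the setting concrete: the abstract refers to the standard Adam update, in which β₁ controls the exponential moving average of past gradients. The sketch below shows that update with a constant β₁ = 0.9, as used in practice; it is a minimal illustration of the baseline algorithm, not the paper's analysis, and the function and variable names are ours.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update with a *constant* beta1 (the practical regime)."""
    m = beta1 * m + (1 - beta1) * grad       # first moment: EMA of gradients
    v = beta2 * v + (1 - beta2) * grad**2    # second moment: EMA of squared gradients
    m_hat = m / (1 - beta1**t)               # bias corrections for the EMAs
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimal usage: minimize f(w) = w^2 (gradient 2w) starting from w = 1.0
w, m, v = 1.0, 0.0, 0.0
for t in range(1, 501):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.05)
```

The theory-practice gap discussed in the abstract is that earlier regret analyses required replacing the fixed `beta1` above with a schedule decaying to zero, whereas this paper's analysis covers the constant-β₁ case directly.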

Files

Name: A new regret analysis.pdf
Type: Preprint
Version: http://purl.org/coar/version/c_71e4c1898caa6e32
Access type: openaccess
Size: 331.76 KB
Format: Adobe PDF
Checksum (MD5): 9ce67ceda9a9cb24125702d69c7ae46c

Contact: infoscience@epfl.ch


Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.