Infoscience — EPFL, École polytechnique fédérale de Lausanne
Doctoral thesis

Scalable constrained optimization

Vladarean, Maria-Luiza  
2024

Modern optimization is tasked with handling applications of increasingly large scale, chiefly due to the massive amounts of widely available data and the ever-growing reach of Machine Learning. Consequently, this area of research is under steady pressure to develop scalable and provably convergent methods capable of handling hefty, high-dimensional problems. The present dissertation contributes to recent efforts in this direction by proposing optimization algorithms with improved scalability.

Concretely:

(1) We develop three novel Frank-Wolfe-type methods for minimizing convex stochastic objectives subject to stochastic linear inclusion constraints. The key feature of our algorithms is that they process only a subset of the constraints per iteration, thus gaining an edge over methods that require full passes through the data for large-scale problems.

(2) We generalize Frank-Wolfe-type methods to a class of composite non-differentiable objectives --- a setting in which the classical Frank-Wolfe algorithm is known not to converge. We circumvent the difficulties related to non-differentiability by leveraging the problem structure and a modified linear minimization oracle of the constraint set, thus attaining convergence rates akin to the smooth case.

(3) We propose an adaptive primal-dual algorithm for solving structured convex-concave saddle point problems, whose empirical convergence is improved by tailoring the stepsizes to the local problem geometry. Importantly, our method achieves adaptivity "for free" by using readily available quantities such as past gradients, without relying on more expensive linesearch subroutines.
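As background for the Frank-Wolfe-type methods mentioned above: the classical Frank-Wolfe (conditional gradient) algorithm replaces projections with a linear minimization oracle (LMO) over the constraint set, which is what makes the method cheap on structured sets. The following minimal sketch runs the textbook algorithm on the probability simplex with a smooth quadratic objective; it illustrates the generic template only, not the thesis's stochastic or non-smooth variants, and all names in it are illustrative.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex:
    argmin_{s in simplex} <grad, s> is the vertex e_i with i = argmin_i grad_i."""
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe(grad_f, x0, n_iters):
    """Classical Frank-Wolfe with the standard 2/(k+2) step size.
    Each iterate is a convex combination of simplex vertices, so it
    stays feasible without any projection."""
    x = x0.copy()
    for k in range(n_iters):
        s = lmo_simplex(grad_f(x))       # call the LMO at the current gradient
        gamma = 2.0 / (k + 2.0)          # diminishing, parameter-free step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Toy problem: minimize ||x - b||^2 over the simplex; since b lies in the
# simplex, the minimizer is b itself.
b = np.array([0.2, 0.5, 0.3])
x = frank_wolfe(lambda x: 2.0 * (x - b), np.array([1.0, 0.0, 0.0]), n_iters=5000)
```

The O(1/k) convergence of this template is what the thesis's methods build on while relaxing its assumptions (stochastic constraints, non-differentiable composite terms).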

Our methods are theoretically sound and empirically grounded: each is accompanied by rigorous convergence guarantees and by experiments showcasing its performance against relevant baselines. In a nutshell, this dissertation provides new algorithmic approaches and trade-off points on the road toward solving large optimization problems at scale.

Files
Name: EPFL_TH9906.pdf
Type: N/A
Access type: openaccess
License condition: copyright
Size: 6.07 MB
Format: Adobe PDF
Checksum (MD5): 1ff3bcafb5a017099555482ab629899e

Contact: infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.