Infoscience (EPFL, École polytechnique fédérale de Lausanne)
doctoral thesis

Riemannian Optimization for Solving High-Dimensional Problems with Low-Rank Tensor Structure

Steinlechner, Michael Maximilian  
2016

In this thesis, we present a Riemannian framework for the solution of high-dimensional optimization problems with an underlying low-rank tensor structure. Here, high-dimensionality refers to the size of the search space, while the cost function is scalar-valued. Such problems arise, for example, in the reconstruction of high-dimensional data sets and in the solution of parameter-dependent partial differential equations. As the degrees of freedom grow exponentially with the number of dimensions (the so-called curse of dimensionality), directly solving the optimization problem is computationally infeasible even for moderately high-dimensional problems. We constrain the optimization problem by assuming a low-rank tensor structure of the solution, drastically reducing the degrees of freedom. We reformulate this constrained optimization as an optimization problem on a manifold, using the smooth embedded Riemannian manifold structure of the low-rank representations in the Tucker and tensor train formats. Exploiting this smooth structure, we derive efficient gradient-based optimization algorithms. In particular, we propose Riemannian conjugate gradient schemes for the solution of the tensor completion problem, where we aim to reconstruct a high-dimensional data set for which the vast majority of entries are unknown. For the solution of linear systems, we show how to precondition the Riemannian gradient and leverage second-order information in an approximate Newton scheme. Finally, we describe a preconditioned alternating optimization scheme with subspace correction for the solution of high-dimensional symmetric eigenvalue problems.
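To illustrate the tensor train (TT) format that the abstract refers to: a d-dimensional array can be compressed into a chain of small three-way "cores" by successive truncated SVDs (the standard TT-SVD construction; this sketch is an illustration in Python/NumPy, not code from the thesis, and the names `tt_svd` and `tt_reconstruct` are ours).

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-dimensional array into tensor-train cores
    via successive truncated SVDs (TT-SVD)."""
    dims = tensor.shape
    d = len(dims)
    cores = []
    rank_prev = 1
    unfold = tensor.reshape(rank_prev * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(unfold, full_matrices=False)
        r = min(max_rank, len(s))          # truncate to the target TT-rank
        cores.append(U[:, :r].reshape(rank_prev, dims[k], r))
        rank_prev = r
        # carry the remaining factor to the next unfolding
        unfold = (np.diag(s[:r]) @ Vt[:r]).reshape(r * dims[k + 1], -1)
    cores.append(unfold.reshape(rank_prev, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into the full array."""
    out = cores[0]                          # shape (1, n_1, r_1)
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape([c.shape[1] for c in cores])
```

With `max_rank` large enough the decomposition is exact; choosing a small `max_rank` is where the drastic reduction in degrees of freedom comes from, at the price of an approximation. The thesis works on the manifold of tensors with fixed TT-rank rather than re-truncating in this way.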

Type: doctoral thesis
DOI: 10.5075/epfl-thesis-6958
Author: Steinlechner, Michael Maximilian
Advisor: Kressner, Daniel
Jury: Prof. Kathryn Hess Bellwald (chair); Prof. Daniel Kressner (thesis director); Prof. Fabio Nobile, Prof. André Uschmajew, Prof. Bart Vandereycken (examiners)
Date issued: 2016
Publisher: EPFL, Lausanne
Public defense date: 2016-04-11
Thesis number: 6958
Number of pages: 165
Subjects: curse of dimensionality • Riemannian optimization • low-rank structure • Tucker format • tensor train • preconditioning
EPFL units: ANCHP
Faculty: SB
School: MATHICSE
Doctoral school: EDMA
Available on Infoscience: April 6, 2016
Record identifier: https://infoscience.epfl.ch/handle/20.500.14299/125573
Contact: infoscience@epfl.ch

Infoscience is a service managed and provided by the Library and IT Services of EPFL. © EPFL, all rights reserved.