Abstract

We consider the solution of large-scale symmetric eigenvalue problems for which it is known that the eigenvectors admit a low-rank tensor approximation. Such problems arise, for example, from the discretization of high-dimensional elliptic PDE eigenvalue problems or in strongly correlated spin systems. Our methods are built on imposing low-rank (block) tensor train (TT) structure on the trace minimization characterization of the eigenvalues. The common approach of alternating optimization is combined with an enrichment of the TT cores by (preconditioned) gradients, as recently proposed by Dolgov and Savostyanov for linear systems. This can equivalently be viewed as a subspace correction technique. Several numerical experiments demonstrate the performance gains from using this technique.
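To make the core idea concrete, here is a minimal sketch of alternating optimization with gradient enrichment in the simplest nontrivial case, d = 2, where the TT format reduces to a low-rank matrix factorization of the eigenvector. This is an illustration of the subspace-correction principle the abstract describes, not the paper's actual (block) TT implementation; the 2D Laplacian test problem and all names and parameters (lap1d, rank, kick, n_sweeps) are assumptions made for the example.

```python
import numpy as np
from scipy.linalg import eigh, qr, svd
from scipy.sparse import identity, kron, diags, csr_matrix

def lap1d(n):
    """1D Dirichlet Laplacian on n interior grid points."""
    h = 1.0 / (n + 1)
    return diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n)) / h**2

n, m, rank, kick, n_sweeps = 32, 32, 2, 1, 5
# 2D Laplacian as a Kronecker sum; x = vec(X) in column-major order
A = (kron(identity(m), lap1d(n)) + kron(lap1d(m), identity(n))).tocsr()

rng = np.random.default_rng(0)
V, _ = qr(rng.standard_normal((m, rank)), mode='economic')  # orthonormal right factor

for sweep in range(n_sweeps):
    # Optimize left factor U with V fixed: vec(U V^T) = (V kron I_n) vec(U),
    # so the reduced problem is a small dense symmetric eigenproblem.
    P = kron(csr_matrix(V), identity(n)).tocsr()
    lam, u = eigh((P.T @ A @ P).toarray(), subset_by_index=[0, 0])
    U = u[:, 0].reshape(n, V.shape[1], order='F')
    x = P @ u[:, 0]

    # Enrichment: append residual directions to U's basis. The residual
    # A x - lam x is the gradient of the Rayleigh quotient up to scaling;
    # a preconditioned variant would use M^{-1} z here instead.
    z = A @ x - lam[0] * x
    Uz, _, _ = svd(z.reshape(n, m, order='F'), full_matrices=False)
    U_aug, _ = qr(np.hstack([U, Uz[:, :kick]]), mode='economic')

    # Optimize right factor over the enlarged basis: vec(U_aug W) = (I_m kron U_aug) vec(W).
    Q = kron(identity(m), csr_matrix(U_aug)).tocsr()
    lam, w = eigh((Q.T @ A @ Q).toarray(), subset_by_index=[0, 0])
    W = w[:, 0].reshape(U_aug.shape[1], m, order='F')
    V, _ = qr(W.T, mode='economic')  # re-orthonormalize; the rank has grown by kick
    print(f"sweep {sweep + 1}: Ritz value = {lam[0]:.6f}")
```

In the full method the same pattern is swept over all d cores of a (block) TT decomposition, the enrichment vectors are preconditioned, and the ranks are truncated back after each enrichment; the block structure carries several eigenvector approximations at once. For this separable toy operator the lowest eigenvector is exactly rank one, so the sketch converges in very few sweeps.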
