MiS Preprint Repository

We have decided to discontinue the publication of preprints on our preprint server as of 1 March 2024. The publication culture within mathematics has changed so much due to the rise of repositories such as arXiv (www.arxiv.org) that we encourage all institute members to make their preprints available there. The institute's repository in its previous form is therefore unnecessary. The preprints published to date will remain available here, but no new preprints will be added.

MiS Preprint
29/2021

Local convergence of alternating low-rank optimization methods with overrelaxation

Ivan V. Oseledets, Maxim V. Rakhuba and André Uschmajew

Abstract

The local convergence of alternating optimization methods with overrelaxation for low-rank matrix and tensor problems is established. The analysis is based on the linearization of the method which takes the form of an SOR iteration for a positive semidefinite Hessian and can be studied in the corresponding quotient geometry of equivalent low-rank representations. In the matrix case, the optimal relaxation parameter for accelerating the local convergence can be determined from the convergence rate of the standard method. This result relies on a version of Young's SOR theorem for positive semidefinite $2 \times 2$ block systems.
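The abstract notes that the optimal relaxation parameter can be determined from the convergence rate of the standard method, via a version of Young's SOR theorem. As a rough illustration only (not the paper's semidefinite block result), the classical Young formula for consistently ordered systems relates the spectral radius $\rho$ of the underlying one-step iteration to the optimal overrelaxation parameter $\omega^* = 2/(1 + \sqrt{1 - \rho^2})$; the function names below are hypothetical:

```python
import math

def optimal_sor_parameter(rho):
    """Classical Young formula (illustrative): given the spectral radius
    rho in [0, 1) of the underlying one-step iteration, return the
    overrelaxation parameter that minimizes the SOR convergence factor."""
    if not 0.0 <= rho < 1.0:
        raise ValueError("rho must lie in [0, 1)")
    return 2.0 / (1.0 + math.sqrt(1.0 - rho * rho))

def optimal_sor_rate(rho):
    """Asymptotic convergence factor of optimally relaxed SOR,
    which in the classical setting equals omega* - 1."""
    return optimal_sor_parameter(rho) - 1.0
```

For example, with `rho = 0.9` one gets `omega* ≈ 1.393` and an asymptotic factor of about `0.393`, a marked acceleration over the unrelaxed rate; the paper establishes an analogous relation in the positive semidefinite $2 \times 2$ block setting relevant to low-rank representations.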

Received:
Nov 29, 2021
Published:
Nov 29, 2021

Related publications

In journal (Open Access), 2023
Ivan V. Oseledets, Maxim Rakhuba and André Uschmajew

Local convergence of alternating low-rank optimization methods with overrelaxation

In: Numerical Linear Algebra with Applications, 30 (2023) 3, e2459