Preprint 29/2021

Local convergence of alternating low-rank optimization methods with overrelaxation

Ivan V. Oseledets, Maxim V. Rakhuba, and André Uschmajew

Submission date: 29. Nov. 2021 (revised version: June 2022)
Pages: 18

The local convergence of alternating optimization methods with overrelaxation for low-rank matrix and tensor problems is established. The analysis is based on the linearization of the method, which takes the form of an SOR iteration for a positive semidefinite Hessian and can be studied in the corresponding quotient geometry of equivalent low-rank representations. In the matrix case, the optimal relaxation parameter for accelerating the local convergence can be determined from the convergence rate of the standard method. This result relies on a version of Young's SOR theorem for positive semidefinite 2 × 2 block systems.
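The link between the standard convergence rate and the optimal relaxation parameter can be illustrated in the classical setting of Young's SOR theorem. The sketch below uses a small consistently ordered positive definite system (an arbitrary example matrix, not one from the paper, and the point-SOR case rather than the semidefinite block variant treated there): the Gauss–Seidel rate equals the squared Jacobi rate μ², the optimal parameter is ω* = 2/(1 + √(1 − μ²)), and the resulting SOR rate is ω* − 1.

```python
import numpy as np

# Hypothetical example: a symmetric positive definite, consistently ordered
# matrix (red-black structure: diagonal blocks are themselves diagonal).
A = np.array([[4.0, 0.0, 1.0,  1.0],
              [0.0, 4.0, 1.0, -1.0],
              [1.0, 1.0, 4.0,  0.0],
              [1.0, -1.0, 0.0, 4.0]])

D = np.diag(np.diag(A))   # diagonal part
L = np.tril(A, -1)        # strictly lower part
U = np.triu(A, 1)         # strictly upper part

def spectral_radius(M):
    return max(abs(np.linalg.eigvals(M)))

def sor_iteration_matrix(omega):
    # SOR iteration matrix (D + omega L)^{-1} ((1 - omega) D - omega U)
    return np.linalg.solve(D + omega * L, (1.0 - omega) * D - omega * U)

# Jacobi iteration matrix and its spectral radius mu
mu = spectral_radius(np.linalg.solve(D, -(L + U)))

# Gauss-Seidel is SOR with omega = 1; for consistently ordered A its
# rate equals mu**2 (Young's theorem)
rho_gs = spectral_radius(sor_iteration_matrix(1.0))

# Optimal relaxation parameter, expressed via the standard (Jacobi) rate
omega_opt = 2.0 / (1.0 + np.sqrt(1.0 - mu**2))

# Young's theorem predicts rho(SOR with omega_opt) = omega_opt - 1
rho_sor = spectral_radius(sor_iteration_matrix(omega_opt))

print("rho(GS)  =", rho_gs, " (mu^2 =", mu**2, ")")
print("omega*   =", omega_opt)
print("rho(SOR) =", rho_sor, " (omega* - 1 =", omega_opt - 1.0, ")")
```

Running this shows the overrelaxed iteration contracting strictly faster than Gauss–Seidel, with its rate matching ω* − 1, which is the mechanism the paper extends to the semidefinite block systems arising from low-rank parametrizations.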
