Preprint 17/2021

A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions

André Uschmajew and Bart Vandereycken

Contact the author: please use this email for correspondence.
Submission date: 15 June 2021
Pages: 9

Abstract:
Based on a recent result by de Klerk, Glineur, and Taylor (SIAM J. Optim., 30(3):2053--2082, 2020) on the attainable convergence rate of gradient descent for smooth and strongly convex functions in terms of function values, a convergence analysis for general descent methods with fixed step sizes is presented. It covers variable metric methods as well as gradient-related search directions under angle and scaling conditions. An application to inexact gradient methods is also given.
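As a rough illustration of the setting studied in the note (not code from the preprint; the quadratic, step size, and tolerance below are illustrative assumptions), the following sketch runs gradient descent with a fixed step size on a mu-strongly convex, L-smooth quadratic and checks the per-iteration contraction of function values against the factor max(|1 - gamma*mu|, |1 - gamma*L|)^2, the worst-case rate known from the performance-estimation literature that the note builds on.

```python
import numpy as np

# Minimal sketch: fixed-step gradient descent on f(x) = 0.5 * x^T A x,
# whose Hessian eigenvalues lie in [mu, L], so f is L-smooth and
# mu-strongly convex with minimizer x* = 0 and f* = 0.
mu, L = 1.0, 10.0
eigs = np.linspace(mu, L, 20)
A = np.diag(eigs)

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

gamma = 2.0 / (mu + L)  # a standard fixed step size choice
# Per-step worst-case contraction factor for function values.
rho = max(abs(1 - gamma * mu), abs(1 - gamma * L)) ** 2

rng = np.random.default_rng(0)
x = rng.standard_normal(20)
for k in range(50):
    fx = f(x)
    x = x - gamma * grad(x)
    # f(x_{k+1}) - f* <= rho * (f(x_k) - f*); here f* = 0.
    assert f(x) <= rho * fx + 1e-12
```

For this diagonal quadratic the check can be verified by hand: each coordinate of the iterate is scaled by 1 - gamma * lambda_i per step, so every term of f contracts by at least max_i (1 - gamma * lambda_i)^2 = rho.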
