

Preprint 17/2021
A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions
André Uschmajew and Bart Vandereycken
Submission date: 15 June 2021 (revised version: March 2022)
Pages: 9
Download full preprint: PDF (336 kB)
Abstract:
Based on a result by Taylor, Hendrickx, and Glineur (J. Optim. Theory Appl., 178(2):455–476, 2018) on the attainable convergence rate of gradient descent for smooth and strongly convex functions in terms of function values, an elementary convergence analysis for general descent methods with fixed step sizes is presented. It covers general variable metric methods, gradient-related search directions under angle and scaling conditions, as well as inexact gradient methods. In all cases, optimal rates are obtained.
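
To give a concrete sense of the kind of guarantee involved: one common statement of the Taylor–Hendrickx–Glineur result is that for an L-smooth, μ-strongly convex f, gradient descent with a fixed step size h in (0, 2/L) satisfies f(x_k) − f* ≤ max{(1−μh)², (1−Lh)²}^k (f(x_0) − f*). The Python sketch below (an illustration under these assumptions, not code from the preprint) checks this rate numerically on a strongly convex quadratic, where the gradient step contracts each eigencomponent by |1 − hλ|.

```python
import numpy as np

# Strongly convex quadratic f(x) = 0.5 * x^T A x with eigenvalues in [mu, L];
# the minimizer is x* = 0 with optimal value f* = 0.
mu, L = 0.5, 4.0
A = np.diag(np.linspace(mu, L, 10))
f = lambda x: 0.5 * x @ A @ x

h = 2.0 / (mu + L)  # a fixed step size in (0, 2/L)
# Per-step worst-case rate in function values (assumed form of the THG bound).
rate = max((1 - mu * h) ** 2, (1 - L * h) ** 2)

x0 = np.ones(10)
x = x0.copy()
for k in range(50):
    x = x - h * A @ x  # gradient step: grad f(x) = A x

# The observed decrease in function values should not exceed the bound rate**50.
print(f"f(x_50)/f(x_0) = {f(x) / f(x0):.3e}  vs bound {rate**50:.3e}")
```

With h = 2/(μ+L), both factors (1−μh)² and (1−Lh)² coincide, which is the classical optimal fixed step size for this contraction factor; the observed ratio in the printout stays below the theoretical bound.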