MiS Preprint Repository

We have decided to discontinue the publication of preprints on our preprint server as of 1 March 2024. The publication culture within mathematics has changed so much due to the rise of repositories such as arXiv (www.arxiv.org) that we encourage all institute members to make their preprints available there. The institute's repository in its previous form is therefore unnecessary. The preprints published to date will remain available here, but no new preprints will be added.

MiS Preprint 17/2021

A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions

André Uschmajew and Bart Vandereycken

Abstract

Based on a result by Taylor, Hendrickx, and Glineur (J. Optim. Theory Appl., 178(2):455–476, 2018) on the attainable convergence rate of gradient descent for smooth and strongly convex functions in terms of function values, an elementary convergence analysis for general descent methods with fixed step sizes is presented. It covers general variable metric methods, gradient-related search directions under angle and scaling conditions, as well as inexact gradient methods. In all cases, optimal rates are obtained.
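
As an illustration of the rate discussed in the abstract (not part of the preprint itself), the following minimal Python sketch runs gradient descent with the fixed step size 2/(L + mu) on a strongly convex quadratic and checks the per-iteration contraction ((L − mu)/(L + mu))² of the function values; the constants mu, L and all variable names are assumptions chosen for this example.

import numpy as np

mu, L = 1.0, 10.0                      # assumed strong convexity / smoothness constants
A = np.diag(np.linspace(mu, L, 5))     # quadratic f(x) = 0.5 * x^T A x, minimum value f* = 0

def f(x):
    return 0.5 * x @ A @ x

def grad(x):
    return A @ x

h = 2.0 / (L + mu)                     # optimal fixed step size
rate = ((L - mu) / (L + mu)) ** 2      # optimal per-step contraction of f(x_k) - f*

x = np.ones(5)
f0 = f(x)
for k in range(20):
    x = x - h * grad(x)

# The bound f(x_N) - f* <= rate**N * (f(x_0) - f*) should hold (here f* = 0).
print(f(x) <= rate ** 20 * f0)         # expected: True

For this quadratic, each eigencomponent contracts by the factor 1 − h·lambda per step, and the extreme eigenvalues mu and L both give |1 − h·lambda| = (L − mu)/(L + mu), which is why this step size is optimal among fixed choices.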

Received: Jun 15, 2021
Published: Jun 15, 2021

Related publications

Journal article, 2022 (open access)
André Uschmajew and Bart Vandereycken

A note on the optimal convergence rate of descent methods with fixed step sizes for smooth strongly convex functions

In: Journal of Optimization Theory and Applications, 194 (2022) 1, pp. 364–373