Abstract for the talk on 26.01.2023 (17:00)

Math Machine Learning seminar MPI MIS + UCLA

Liu Ziyin (University of Tokyo)
The Probabilistic Stability and Low-Rank Bias of SGD
See also the video of this talk.

Conventionally, the stability of stochastic gradient descent (SGD) is understood through a linear stability analysis, in which the mean and variance of the parameters or gradients are examined to determine the stability of SGD close to a stationary point. In this seminar, we discuss the limitations of linear stability theories and motivate a new notion of stability, which we call probabilistic stability. We first explain why this notion of stability is especially suitable for understanding SGD at a large learning rate and a small batch size in toy problems. Then, with this new notion of stability, we study the implicit bias of SGD and show that SGD at a large learning rate converges to low-rank saddles in matrix factorization problems.
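The gap between stability in mean and probabilistic (almost-sure) stability can be seen in a minimal toy recursion that is not taken from the talk; the recursion z_{t+1} = a_t z_t and its parameters are illustrative assumptions chosen so that E[a_t] > 1 while E[log a_t] < 0, so the fixed point z = 0 is unstable in expectation yet attracts almost every trajectory:

```python
import math
import random

# Toy illustration (not from the talk): a scalar stochastic recursion
# z_{t+1} = a_t * z_t. With the multiplier below,
#   E[a]     = 0.5 * (2 + 0.1)          = 1.05  > 1   (mean blows up),
#   E[log a] = 0.5 * (log 2 + log 0.1) ~= -0.80 < 0   (z_t -> 0 a.s.),
# so z = 0 is unstable under a mean-based (linear) stability analysis
# but stable in the probabilistic sense.
random.seed(0)

def run(steps=1000, z0=1.0):
    """Iterate z <- a * z and return log|z_T|, tracked in log space
    to avoid floating-point underflow as z collapses toward 0."""
    log_z = math.log(abs(z0))
    for _ in range(steps):
        a = 2.0 if random.random() < 0.5 else 0.1  # fair coin flip
        log_z += math.log(a)
    return log_z

log_z_final = run()
# E[log a] ~= -0.80 per step, so after 1000 steps log|z| is near -800:
# the trajectory has collapsed to 0 even though E[z_t] = 1.05**t diverges.
print(log_z_final)
```

The same mechanism, multiplicative noise whose log is negative on average, is what a mean-and-variance analysis of SGD near a stationary point misses; the probabilistic notion of stability is designed to capture it.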

The talk is mainly based on the following two works:

[1] Liu Ziyin, Botao Li, James B. Simon, Masahito Ueda. SGD with a Constant Large Learning Rate Can Converge to Local Maxima. ICLR 2022.

[2] The Probabilistic Stability of SGD. (tentative title, in preparation)

