Talk

A Non-Asymptotic Analysis of Gradient Descent for Recurrent Neural Networks

  • Semih Cayci (RWTH Aachen)
Live Stream

Abstract

This talk will present a non-asymptotic analysis of recurrent neural networks trained with gradient descent for the regression problem in the kernel regime, without massive overparameterization. Our in-depth analysis (i) provides sharp bounds on the network size and iteration complexity in terms of the sequence length, sample size, and ambient dimension, and (ii) identifies the significant impact of long-term dependencies in the dynamical system on the performance bounds, characterized by a cutoff point that depends on the modulus of continuity of the activation function. Remarkably, this analysis reveals that an appropriately initialized recurrent neural network trained with $n$ samples can achieve near-optimality with a network size $m$ that scales only logarithmically with $n$. This sharply contrasts with prior works, which require a high-order polynomial dependence of $m$ on $n$ to establish strong regularity conditions. Our results are based on an explicit characterization of the class of dynamical systems that can be approximated and learned by recurrent neural networks via norm-constrained transportation mappings, and on establishing local smoothness properties of the hidden state with respect to the learnable parameters.
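
For concreteness, a minimal sketch of the kind of setup described above, assuming an Elman-style recurrence with activation $\sigma$, a linear read-out, the empirical squared loss, and plain gradient descent (the precise model and loss treated in the talk may differ):
\[
h_t = \sigma(W h_{t-1} + U x_t), \qquad \hat{y}(x; \theta) = v^\top h_L, \qquad \theta = (W, U, v),
\]
\[
\widehat{\mathcal{L}}_n(\theta) = \frac{1}{n} \sum_{i=1}^{n} \big( \hat{y}(x^{(i)}; \theta) - y^{(i)} \big)^2, \qquad \theta_{k+1} = \theta_k - \eta \, \nabla_\theta \widehat{\mathcal{L}}_n(\theta_k).
\]
Here $L$ is the sequence length, $n$ the sample size, and $m$ the hidden-state dimension (network size); the bounds above concern how $m$ and the number of gradient-descent iterations must scale with $L$, $n$, and the ambient dimension of the inputs $x_t$.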

Seminar: Math Machine Learning seminar MPI MIS + UCLA
Dates: 05.12.24, 19.12.24
Location: MPI for Mathematics in the Sciences (Live Stream)
Contact: Katharina Matschke, MPI for Mathematics in the Sciences (via mail)