Talk

Using iterated-integral signatures as memory in recurrent neural networks

  • Jeremy Reizenstein (University of Warwick)
A3 01 (Sophus-Lie room)

Abstract

The iterated-integral signature from rough path theory is a powerful representation of the shape of a path in space. Recurrent neural networks are among the machine learning algorithms of choice when learning a function whose inputs are sequences. Enabling them to notice patterns across a long input sequence requires modifications, the most popular of which is Long Short-Term Memory (Hochreiter & Schmidhuber, 1997), which can be slow to train but is very effective. I'll introduce these two tools, explain why the signature might be a good way to form an extended memory in a network, and show some results on how well this works.
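As a small illustration of the first object in the abstract (not part of the talk itself), here is a minimal NumPy sketch that computes the signature truncated at level 2 for a piecewise-linear path, building it up segment by segment via Chen's identity. The function name and example path are ours for illustration.

```python
import numpy as np

def signature_level2(path):
    """Level-2 truncated iterated-integral signature of a piecewise-linear path.

    path: array of shape (n_points, d).
    Returns (s1, s2): s1 has shape (d,) (first-level terms, the total
    increment) and s2 has shape (d, d) (second-level iterated integrals),
    accumulated segment by segment via Chen's identity.
    """
    d = path.shape[1]
    s1 = np.zeros(d)
    s2 = np.zeros((d, d))
    for k in range(1, len(path)):
        delta = path[k] - path[k - 1]  # increment of this linear segment
        # Chen's identity for concatenating a linear segment:
        # new second level = old second level + (old first level) x (segment)
        #                    + second level of the segment itself
        s2 += np.outer(s1, delta) + 0.5 * np.outer(delta, delta)
        s1 += delta
    return s1, s2

# Example: a short 2-D path
path = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 0.0]])
s1, s2 = signature_level2(path)
print(s1)                   # total displacement of the path
print(s2[0, 1] - s2[1, 0])  # twice the signed (Levy) area enclosed by the path
```

The antisymmetric part of the level-2 terms recovers the signed area swept out by the path, which is one concrete sense in which the signature summarises a path's shape rather than just its endpoints.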

