Talk
Using iterated-integral signatures as memory in recurrent neural networks
- Jeremy Reizenstein (University of Warwick)
Abstract
The iterated-integral signature from rough path theory is a powerful representation of the shape of a path in space. Recurrent neural networks are among the machine learning algorithms of choice when you want to learn a function whose inputs are sequences. Enabling them to notice patterns across a long sequence of inputs requires tweaks, the most popular of which is Long Short-Term Memory (Hochreiter & Schmidhuber, 1997), which can be slow to train but is very effective. I'll introduce these two tools, explain why it might be a good idea to use the signature as an extended way to form a memory in a network, and show some results on how well this works.
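To make the central object concrete, here is a minimal sketch (not part of the talk itself) computing the first two levels of the iterated-integral signature of a piecewise-linear path, built up one segment at a time with Chen's identity. The function name and the example path are illustrative choices, not anything specified in the abstract.

```python
def signature_level_1_2(points):
    """First two signature levels of the piecewise-linear path
    through `points` (a list of d-dimensional tuples).

    level1[i]    = total displacement in coordinate i
    level2[i][j] = iterated integral of dx^i dx^j over s < t
    """
    d = len(points[0])
    s1 = [0.0] * d
    s2 = [[0.0] * d for _ in range(d)]
    for p, q in zip(points, points[1:]):
        delta = [qi - pi for pi, qi in zip(p, q)]
        # Chen's identity for appending one linear segment:
        #   new S2 = S2 + S1 (x) delta + (delta (x) delta) / 2
        for i in range(d):
            for j in range(d):
                s2[i][j] += s1[i] * delta[j] + delta[i] * delta[j] / 2.0
        for i in range(d):
            s1[i] += delta[i]
    return s1, s2

# A path going right then up: (0,0) -> (1,0) -> (1,1).
s1, s2 = signature_level_1_2([(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)])
levy_area = (s2[0][1] - s2[1][0]) / 2.0  # signed area between path and chord
```

The symmetric part of level two is determined by level one (the shuffle identity), so the antisymmetric part, the Lévy area, is the genuinely new shape information captured at level two; for the right-then-up path above it is 0.5, while for the straight chord it would be 0. Higher levels follow the same Chen-identity pattern, and in practice packages such as iisignature compute them efficiently.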