Abstract for the talk on 20.10.2017 (11:00 h)
Arbeitsgemeinschaft ANGEWANDTE ANALYSIS
Jeremy Reizenstein (University of Warwick)
Using iterated-integral signatures as memory in recurrent neural networks
The iterated-integral signature from rough path theory is a powerful representation of the shape of a path in space. Recurrent neural networks are among the machine learning algorithms of choice for learning functions whose inputs are sequences. Enabling them to notice patterns in a long input sequence requires tweaks, the most popular of which is Long Short-Term Memory (Hochreiter & Schmidhuber, 1997), which can be slow to train but is very effective. I'll introduce these two tools, explain why it might be a good idea to use the signature as an extended way to form a memory in a network, and show some results on how well this works.
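To make the signature concrete: for a piecewise-linear path, the first two signature levels can be computed segment by segment using Chen's identity (the level-2 term of a single linear segment with increment Δ is Δ⊗Δ/2). The following is a minimal illustrative sketch, not code from the talk; the function name `signature_level2` is hypothetical.

```python
import numpy as np

def signature_level2(path):
    """Levels 1 and 2 of the iterated-integral signature of a
    piecewise-linear path.

    path: (n, d) array of points. Level 1 is the total increment;
    level 2 collects the iterated integrals
    S^{ij} = integral over s < t of dX^i_s dX^j_t,
    accumulated segment-by-segment via Chen's identity.
    """
    path = np.asarray(path, dtype=float)
    d = path.shape[1]
    s1 = np.zeros(d)        # level 1: increment so far
    s2 = np.zeros((d, d))   # level 2: iterated integrals so far
    for a, b in zip(path[:-1], path[1:]):
        inc = b - a
        # Chen's identity: cross term (old level 1 tensor new increment)
        # plus the segment's own level-2 part, inc ⊗ inc / 2.
        s2 += np.outer(s1, inc) + np.outer(inc, inc) / 2.0
        s1 += inc
    return s1, s2

# Example: an L-shaped path in the plane.
s1, s2 = signature_level2([[0, 0], [1, 0], [1, 1]])
```

The antisymmetric part of the level-2 term, (S^{12} - S^{21})/2, is the signed (Lévy) area between the path and its chord, one reason the signature captures shape beyond the endpoints.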