
Workshop

Information transfer in recurrent neural networks

  • Oliver Obst (CSIRO ICT Centre, Australia)
G3 10 (Lecture hall)

Abstract

Reservoir computing (RC) is a recent paradigm in the field of recurrent neural networks. RC approaches have been employed as mathematical models for generic neural microcircuits, to investigate and explain computations in neocortical columns. A key element of reservoir computing approaches is the randomly constructed, fixed hidden layer; typically, only connections to the output units are trained. In previous work, we have addressed performance issues of Echo State Networks, a particular reservoir computing approach, and investigated methods to optimize them for longer short-term memory capacity or for the prediction of highly non-linear mappings. A general method for improving network performance is the use of permutation matrices for the reservoir connectivity; problem-specific methods, such as unsupervised learning based on intrinsic plasticity (IP), also exist. IP aims to increase the entropy of the output of each internal unit, but unfortunately improves performance only slightly compared to setups based on random or permutation matrices. Comparing completely random networks with networks based on permutation matrices, we found the transfer entropy between the network input and the output of individual units to be a significant indicator of network performance. The higher transfer entropy in permutation-matrix networks seems to indicate a more homogeneous, coherent computation, a result of the lower in-degree of the nodes. A future extension of this work is to investigate methods for increasing these transfer entropies through local learning rules in individual nodes.
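To make the two ingredients of the abstract concrete, the following is a minimal Python sketch, not the authors' implementation: an Echo State Network whose reservoir matrix is a scaled permutation matrix and whose linear readout is the only trained part, followed by a naive binned estimate of the transfer entropy from the network input to each unit's output. The network size, the delayed-recall task, and the estimator details are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n_res, n_in, washout, T = 100, 1, 200, 2000

# Reservoir weights: a random permutation matrix scaled to spectral radius 0.95,
# so every node has in-degree 1 (versus O(n) for a dense random reservoir).
P = np.eye(n_res)[rng.permutation(n_res)]
W = 0.95 * P
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))

u = rng.uniform(-1.0, 1.0, size=(T, n_in))   # i.i.d. input signal (assumed task)
X = np.zeros((T, n_res))                     # collected reservoir states
x = np.zeros(n_res)
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])         # standard ESN state update
    X[t] = x

# Train only the linear readout (ridge regression) on a delayed-recall task:
# output the input from k steps ago, a simple probe of short-term memory.
k = 5
S = X[washout:]
target = u[washout - k:T - k, 0]
W_out = np.linalg.solve(S.T @ S + 1e-6 * np.eye(n_res), S.T @ target)
print("delayed-recall correlation:", np.corrcoef(S @ W_out, target)[0, 1])

def transfer_entropy(src, dst, bins=8):
    # Naive binned estimator of TE(src -> dst) with history length 1,
    # i.e. I(dst_{t+1}; src_t | dst_t) in bits. Illustrative only; the
    # original study may use a different estimator.
    s = np.digitize(src[:-1], np.histogram_bin_edges(src, bins))
    d = np.digitize(dst[:-1], np.histogram_bin_edges(dst, bins))
    d1 = np.digitize(dst[1:], np.histogram_bin_edges(dst, bins))
    joint = np.zeros((bins + 2,) * 3)
    for a, b, c in zip(d1, d, s):
        joint[a, b, c] += 1.0
    joint /= joint.sum()
    p_dd = joint.sum(axis=2)       # p(dst_{t+1}, dst_t)
    p_ds = joint.sum(axis=0)       # p(dst_t, src_t)
    p_d = joint.sum(axis=(0, 2))   # p(dst_t)
    te = 0.0
    for (a, b, c), p in np.ndenumerate(joint):
        if p > 0 and p_dd[a, b] > 0 and p_ds[b, c] > 0 and p_d[b] > 0:
            te += p * np.log2(p * p_d[b] / (p_dd[a, b] * p_ds[b, c]))
    return te

# Average transfer entropy from the input to each unit's output, the quantity
# the abstract reports as a performance indicator.
te_vals = [transfer_entropy(u[washout:, 0], X[washout:, i]) for i in range(n_res)]
print("mean TE(input -> unit output):", np.mean(te_vals))

Replacing the permutation matrix with a dense random matrix of the same spectral radius allows the comparison described in the abstract; in that setting, the per-unit transfer entropies can be inspected alongside task performance.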

Antje Vandenberg

Max Planck Institute for Mathematics in the Sciences, Leipzig

Nihat Ay

Max Planck Institute for Mathematics in the Sciences, Leipzig

Ralf Der

Max Planck Institute for Mathematics in the Sciences, Leipzig

Mikhail Prokopenko

CSIRO, Sydney