Abstract for the talk on 08.05.2002 (14:15), Arbeitsgemeinschaft NEURONALE NETZE UND KOGNITIVE SYSTEME
Herbert Jäger (Fraunhofer Institute for Autonomous Intelligent Systems AiS.INDY, Sankt Augustin)
The "echo state" approach to analyzing and training recurrent neural networks
The talk introduces a constructive learning algorithm for the supervised training of recurrent neural networks, characterized by two properties: (1) a large "echo state" recurrent neural network is used as a "reservoir" of complex dynamics; this network is not changed by learning; (2) only the weights of connections leading out of the echo state network are learnt. The basic mathematical idea is sketched, and a number of theoretical and application-oriented examples are given. The theoretical examples demonstrate several novel phenomena in recurrent networks: for instance, the training of short-term memories with large memory spans (recalls delayed by 100 time steps are easily obtained), the training of infinite-duration memories (input-switchable multistate attractors), and the training of arbitrary periodic sequences (n-point attractor learning). The application-oriented examples mostly come from robotics and include the training of motor-controller modules and of event detectors for robots.
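The two properties of the algorithm can be sketched in a few lines of code: a fixed random reservoir whose weights are never trained, and output weights obtained by a simple linear fit on the collected reservoir states. The sketch below is an illustrative minimal version, not the author's implementation; the spectral-radius scaling, reservoir size, regularization constant, and the delayed-recall toy task are all assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Property (1): a large fixed random reservoir. Scaling the spectral
# radius below 1 is the usual heuristic for the echo state property.
n_res, n_in = 100, 1
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))     # spectral radius 0.9 (assumed)
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))  # input weights, also fixed

def run_reservoir(u):
    """Drive the reservoir with input sequence u; collect the states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x)
    return np.array(states)

# Toy short-term-memory task: reproduce the input delayed by 10 steps.
T, delay = 2000, 10
u = rng.uniform(-0.5, 0.5, (T, n_in))
y = np.roll(u[:, 0], delay)      # target: y[t] = u[t - delay]

X = run_reservoir(u)
washout = 100                    # discard the initial transient
X, y = X[washout:], y[washout:]

# Property (2): only the output weights are learnt, here by ridge regression.
reg = 1e-6
W_out = np.linalg.solve(X.T @ X + reg * np.eye(n_res), X.T @ y)

pred = X @ W_out
nrmse = np.sqrt(np.mean((pred - y) ** 2)) / np.std(y)
print(f"delay-{delay} recall NRMSE: {nrmse:.3f}")
```

Because the reservoir stays fixed, training reduces to a single linear regression over the recorded states, which is what makes tasks like long delayed recalls cheap to set up.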