Information-theoretic grounding of finite automata in neural systems
Thomas Wennekers and Nihat Ay
Submission date: 26 June 2002
PACS-Numbers: 84.35.+i, 87.19.La, 02.50.Ga
Keywords and phrases: information theory, Markov chains, neural networks
We introduce a measure called ``stochastic interaction'' that captures spatial and temporal signal properties in recurrent systems. The measure quantifies the Kullback-Leibler divergence of a Markov chain from the product of split chains for the single units. Maximization of stochastic interaction, also called ``Temporal Infomax'', is shown to induce almost deterministic dynamical systems for unconstrained Markov chains. If some of the units are clamped to prescribed stochastic processes that provide external input, Temporal Infomax leads to finite automata that are either completely deterministic or at most weakly non-deterministic. In this way, computational capabilities may arise in neural systems.
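The divergence described above can be made concrete for a small system. The following sketch, under illustrative assumptions not taken from the preprint (a two-unit binary chain, a hand-picked toy transition matrix, and split kernels estimated from the stationary distribution), computes stochastic interaction as the stationary Kullback-Leibler divergence of the joint transition kernel from the product of the single-unit marginal kernels:

```python
import numpy as np

# Joint states (x1, x2) of two binary units, encoded as index = 2*x1 + x2.
STATES = [(0, 0), (0, 1), (1, 0), (1, 1)]

def stationary(P):
    """Stationary distribution of a transition matrix P (rows sum to 1)."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

def split_kernel(P, p, unit):
    """Marginal transition kernel P_i(y_i | x_i) of a single unit,
    averaged over the stationary distribution p of the joint chain."""
    K = np.zeros((2, 2))
    w = np.zeros(2)
    for xi, x in enumerate(STATES):
        for yi, y in enumerate(STATES):
            K[x[unit], y[unit]] += p[xi] * P[xi, yi]
        w[x[unit]] += p[xi]
    return K / w[:, None]

def stochastic_interaction(P):
    """KL divergence of the joint chain from the product of its split chains."""
    p = stationary(P)
    Ks = [split_kernel(P, p, i) for i in range(2)]
    I = 0.0
    for xi, x in enumerate(STATES):
        for yi, y in enumerate(STATES):
            if P[xi, yi] > 0:
                q = Ks[0][x[0], y[0]] * Ks[1][x[1], y[1]]
                I += p[xi] * P[xi, yi] * np.log(P[xi, yi] / q)
    return I

# Toy example (an assumption for illustration): a slightly noisy cyclic chain
# 00 -> 01 -> 10 -> 11 -> 00, i.e. an almost deterministic dynamical system
# of the kind Temporal Infomax is shown to induce.
eps = 0.01
cycle = (1 - eps) * np.roll(np.eye(4), 1, axis=1) + eps / 4
print(stochastic_interaction(cycle))  # positive: the units interact
```

A chain that factorizes exactly into independent unit chains has zero stochastic interaction, whereas the nearly deterministic cycle above scores close to log 2 nats, illustrating the abstract's claim that maximizing the measure favors almost deterministic joint dynamics.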