Talk

Progress and challenges on the way towards a quantification of neural information flow using transfer entropy

  • Michael Wibral (Brain Imaging Center Frankfurt, Universität Frankfurt)
A3 02 (Seminar room)

Abstract

To understand the brain means to reconstruct the mutual influences its parts exert on each other. These influences may rely on a large number of different neural interaction mechanisms, many of which are nonlinear in nature. Hence, estimators of neural interactions that are free of an explicit interaction model promise to give a more comprehensive overview of all interactions in a network. A suitable measure of this kind is transfer entropy, an information-theoretic implementation of Wiener's principle of causality.
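For orientation, transfer entropy in the form introduced by Schreiber (2000) is commonly written as a conditional mutual information between the present of the target Y and the past state of the source X, given the past state of the target, with k and l denoting the embedding dimensions of the source and target past states:

$$
TE_{X \to Y} \;=\; I\big(Y_t \,;\, X_{t-1}^{(k)} \,\big|\, Y_{t-1}^{(l)}\big)
\;=\; \sum p\big(y_t,\, y_{t-1}^{(l)},\, x_{t-1}^{(k)}\big)\,
\log\frac{p\big(y_t \mid y_{t-1}^{(l)},\, x_{t-1}^{(k)}\big)}{p\big(y_t \mid y_{t-1}^{(l)}\big)}
$$

The measure vanishes exactly when the past of the source adds no predictive information about the target beyond what the target's own past already provides.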

While this measure is conceptually a straightforward translation of Wiener's principle, several practical challenges have to be met when applying it to neural data, for example the handling of non-negligible interaction delays and difficulties in state-space reconstruction and in estimating embedding parameters from noisy data.
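As a point of reference for what state-space reconstruction involves, the sketch below shows a minimal time-delay (Takens) embedding in Python; the function name and interface are illustrative only, and choosing the embedding dimension and delay from noisy data is precisely the difficulty mentioned above.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Time-delay (Takens) embedding: row t is the reconstructed state
    [x_t, x_{t-tau}, ..., x_{t-(dim-1)*tau}]. The embedding dimension `dim`
    and delay `tau` are the parameters that must be estimated from the data."""
    n = len(x) - (dim - 1) * tau  # number of fully reconstructable states
    return np.column_stack([x[(dim - 1 - i) * tau:(dim - 1 - i) * tau + n]
                            for i in range(dim)])
```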

More fundamental problems relate to non-stationarities in the data. Simulation results show that, for data consisting of repeating segments ('trials'), the number of data points fed into the estimator per segment can be reduced (to approach stationary stretches of data) if the number of segments is increased and a statistical comparison against surrogate data is used. Whether these results pertain only to the specific simulations or reflect a general principle is unknown.
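A minimal sketch of this trial-plus-surrogate logic, assuming a simple plug-in (binned) transfer entropy estimator with scalar past states and trial-permutation surrogates; the names, binning, and permutation scheme are illustrative choices, not the estimator used in the talk:

```python
import numpy as np

def transfer_entropy_binned(source, target, bins=4, lag=1):
    """Plug-in (histogram) TE estimate from source to target with scalar past
    states (k = l = 1); purely illustrative."""
    def discretize(x):
        return np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))

    def joint_entropy(*vs):
        _, counts = np.unique(np.stack(vs, axis=1), axis=0, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    s, t = discretize(source), discretize(target)
    y_now, y_past, x_past = t[lag:], t[:-lag], s[:-lag]
    # TE = I(Y_t ; X_past | Y_past), expanded into joint entropies
    return (joint_entropy(y_now, y_past) + joint_entropy(x_past, y_past)
            - joint_entropy(y_now, y_past, x_past) - joint_entropy(y_past))

def trial_surrogate_test(src_trials, tgt_trials, n_surrogates=200, seed=0):
    """Average TE over many short (approximately stationary) trials and compare
    against surrogates in which the trial pairing of source and target is
    permuted, destroying any consistent source -> target influence."""
    rng = np.random.default_rng(seed)

    def mean_te(sources):
        return np.mean([transfer_entropy_binned(s, t)
                        for s, t in zip(sources, tgt_trials)])

    observed = mean_te(src_trials)
    surrogates = [mean_te([src_trials[i] for i in rng.permutation(len(src_trials))])
                  for _ in range(n_surrogates)]
    p_value = (1 + np.sum(np.array(surrogates) >= observed)) / (1 + n_surrogates)
    return observed, p_value
```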

Another fundamental problem is the estimation of the patterns of information flow for multivariate data. While multivariate transfer entropy can in principle be used to isolate direct influences, in practical settings the finite data size usually prevents this use of the estimator, and conservative selection strategies for the necessary parameters are unknown. We suggest an approximate solution to this problem based on the potential of modified transfer entropy estimators to reconstruct the time lag of the interaction.
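One way to make the lag-reconstruction idea concrete is to scan candidate source-target delays and keep the delay at which a bivariate transfer entropy estimate peaks; the sketch below assumes some estimator with the hypothetical interface te_estimate(source, target, lag=...) (for instance the binned estimator above) and is purely illustrative, not the modified estimators referred to in the abstract.

```python
def reconstruct_delay(source, target, te_estimate, max_lag=20):
    """Scan candidate source -> target delays and return the delay at which the
    bivariate TE estimate peaks, together with the full lag profile.
    te_estimate: any estimator with signature te_estimate(source, target, lag=...)."""
    te_by_lag = {u: te_estimate(source, target, lag=u) for u in range(1, max_lag + 1)}
    best_lag = max(te_by_lag, key=te_by_lag.get)
    return best_lag, te_by_lag
```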

All influence measures based on Wiener's principle also show a false-positive bias when crosstalk between source and target signals is combined with unequal noise profiles. A heuristic test for the presence of this bias, based on time-shifting the data to transform instantaneous influences into time-lagged influences, will be presented.
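A rough sketch of the shift idea, again assuming the hypothetical te_estimate interface from above: shifting the source places its simultaneous sample where the estimator expects the source past, so pure crosstalk shows up as an apparent time-lagged influence that can be compared against the original estimate (an actual test would use a statistical comparison rather than a single inequality).

```python
import numpy as np

def shift_test(source, target, te_estimate, lag):
    """Heuristic shift test (sketch): move instantaneous crosstalk into a
    time-lagged position and compare against the original estimate."""
    te_original = te_estimate(source, target, lag=lag)
    # Shifting the source back by `lag` puts its simultaneous sample where the
    # estimator reads the source past, so a purely instantaneous (crosstalk)
    # contribution now appears as a time-lagged influence (wrap-around at the
    # edges is ignored in this sketch).
    te_shifted = te_estimate(np.roll(source, -lag), target, lag=lag)
    # If the shifted estimate is not clearly smaller, the detected influence is
    # suspected to reflect crosstalk combined with unequal noise profiles.
    return te_original, te_shifted, te_original > te_shifted
```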

This talk aims to stimulate discussion about future improvements in the use of information-theoretic influence measures in neuroscience.

Contact: Katharina Matschke, MPI for Mathematics in the Sciences