On neural network based estimators of the mutual information

  • Pradeep Kumar Banerjee (MPI MiS, Leipzig)
A3 01 (Sophus-Lie room)


The mutual information is a fundamental quantity measuring the total statistical dependence between two random variables. Estimating it from a finite sample is challenging in practice. In this introductory talk, I will briefly review some trainable neural estimators of the mutual information and discuss an application to the analysis of learning in deep neural networks.
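As a concrete illustration (not part of the talk itself), many neural estimators, such as MINE, train a critic network to maximize the Donsker-Varadhan lower bound I(X;Y) >= E_p[T] - log E_{p x p}[exp T]. The sketch below skips the training step: for a bivariate Gaussian the optimal critic is the log density ratio in closed form, so plugging it in shows the bound recovering the known mutual information from samples. All variable names and the Gaussian setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8        # correlation of the toy bivariate Gaussian (assumption)
n = 200_000      # sample size

# Joint samples (x, y) from a standard bivariate Gaussian with correlation rho.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)
# Shuffling y breaks the pairing, approximating the product of marginals p(x)p(y).
y_marg = rng.permutation(y)

def critic(x, y, rho=rho):
    """Optimal Donsker-Varadhan critic for this Gaussian:
    the log density ratio log p(x,y) / (p(x) p(y)).
    In MINE this function would be a trained neural network."""
    return (-0.5 * np.log(1.0 - rho**2)
            - (x**2 - 2.0 * rho * x * y + y**2) / (2.0 * (1.0 - rho**2))
            + (x**2 + y**2) / 2.0)

# Donsker-Varadhan bound: E_p[T] - log E_{p x p}[exp T].
dv = critic(x, y).mean() - np.log(np.exp(critic(x, y_marg)).mean())

# Closed-form mutual information of a bivariate Gaussian for comparison.
true_mi = -0.5 * np.log(1.0 - rho**2)
print(f"DV estimate: {dv:.3f} nats, true MI: {true_mi:.3f} nats")
```

With the optimal critic the two terms match the true mutual information up to sampling noise; the practical difficulty the talk addresses is that a trained network only approximates this critic, and the exponential term can have high variance.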