On extractable shared information
Johannes Rauh, Pradeep Kumar Banerjee, Eckehard Olbrich, Jürgen Jost, and Nils Bertschinger
Submission date: 10. Jul. 2017
Published in: Entropy 19 (2017), no. 7, art. no. 328
DOI number (of the published article): 10.3390/e19070328
Keywords and phrases: information decomposition, multivariate mutual information, left monotonicity, Blackwell order
Download full preprint: PDF (291 kB)
Link to arXiv: See the arXiv entry of this preprint.
We consider the problem of quantifying the information shared by a pair of random variables X1, X2 about another variable S. We propose a new measure of shared information, called extractable shared information, that is left monotonic; that is, the information shared about S is bounded from below by the information shared about f(S) for any function f. We show that our measure leads to a new nonnegative decomposition of the mutual information I(S; X1X2) into shared, complementary, and unique components. We study properties of this decomposition and show that a left monotonic shared information is not compatible with a Blackwell interpretation of unique information. We also discuss whether it is possible to have a decomposition in which both shared and unique information are left monotonic.
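For orientation, the two properties named in the abstract can be written out. This is a sketch in the standard notation of the bivariate information decomposition literature; the symbols SI, UI, and CI (shared, unique, and complementary information) are generic shorthand and are not fixed by this page.

```latex
% Nonnegative decomposition of the mutual information that X1, X2
% carry about S into shared, unique, and complementary parts:
\[
  I(S; X_1 X_2) \;=\; SI(S; X_1, X_2)
      \;+\; UI(S; X_1 \setminus X_2)
      \;+\; UI(S; X_2 \setminus X_1)
      \;+\; CI(S; X_1, X_2).
\]

% Left monotonicity of the shared information: processing the target
% variable S by any function f cannot increase the shared information.
\[
  SI(S; X_1, X_2) \;\ge\; SI(f(S); X_1, X_2)
  \qquad \text{for every function } f.
\]
```

Left monotonicity formalizes the intuition that information shared about a coarse-grained target f(S) is a lower bound on the information shared about S itself.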