It is common knowledge that causal conclusions cannot be drawn from statistical data alone, e.g. from associations or correlations. In recent years, however, Pearl, Spirtes, Glymour, Scheines, and their collaborators have developed a causal theory framework that does allow causal conclusions, by imposing postulates on the data-generating process. The main postulate connects stochastic dependences among random variables to the existence of 'causal mechanisms' underlying the data. Causal hypotheses in this theory can be formalized as directed acyclic graphs (DAGs) or generalizations thereof. The goal of causal inference is then to reconstruct the class of compatible graphs from data by testing the constraints that they would impose on the observed distribution. However, no efficient way is known to describe all constraints resulting from a given causal hypothesis.
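The most basic of these constraints are the conditional independences a graph implies via d-separation. As an illustration, the following self-contained Python sketch (not code from this project; the helper names are our own) implements the standard moralization test for d-separation and checks the chain A → B → C, which implies that A and C are independent given B but not unconditionally:

```python
from itertools import combinations

def ancestors(dag, nodes):
    """Return `nodes` together with all their ancestors.
    `dag` maps each node to the set of its parents."""
    result, stack = set(), list(nodes)
    while stack:
        n = stack.pop()
        if n not in result:
            result.add(n)
            stack.extend(dag[n])
    return result

def d_separated(dag, xs, ys, zs):
    """Test whether xs and ys are d-separated given zs, via the
    moralization criterion: restrict to the ancestral subgraph,
    'marry' co-parents, drop edge directions, delete zs, and check
    that no undirected path connects xs to ys."""
    keep = ancestors(dag, set(xs) | set(ys) | set(zs))
    adj = {n: set() for n in keep}
    for child in keep:
        parents = dag[child] & keep
        for p in parents:                       # undirected parent-child edges
            adj[child].add(p)
            adj[p].add(child)
        for p, q in combinations(parents, 2):   # marry co-parents
            adj[p].add(q)
            adj[q].add(p)
    seen, stack = set(), [x for x in xs if x not in zs]
    while stack:                                # reachability avoiding zs
        n = stack.pop()
        if n not in seen and n not in zs:
            seen.add(n)
            stack.extend(adj[n])
    return not (seen & set(ys))

# The chain A -> B -> C, encoded as node -> set of parents.
chain = {"A": set(), "B": {"A"}, "C": {"B"}}
print(d_separated(chain, {"A"}, {"C"}, set()))   # False: A and C are dependent
print(d_separated(chain, {"A"}, {"C"}, {"B"}))   # True: A independent of C given B
```

The moralization criterion used here is equivalent to the usual path-blocking definition of d-separation; the difficulty alluded to above is that, once latent variables are allowed, a causal hypothesis also implies constraints beyond such conditional independences.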

In this project we explore to what extent information-theoretic quantities can be helpful in this reconstruction process. The underlying idea is that quantities based on the relative entropy (e.g. mutual information) provide a coarse, macroscopic view of probability distributions that leads to testable constraints on the data, yet is still finer than looking only at the qualitative property of stochastic independence. Quantifying stochastic dependences information-theoretically also turns out to be useful as a measure of the strength of causal connections (information flow).
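To illustrate this graded view, mutual information can be computed directly as the relative entropy between a joint distribution and the product of its marginals, I(X;Y) = D(p(x,y) ‖ p(x)p(y)). The NumPy sketch below (a minimal illustration with made-up numbers, not project code) shows that it is zero exactly under independence and grows with the strength of the dependence:

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) = D( p(x,y) || p(x)p(y) ), in bits, for a joint table pxy."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal distribution of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal distribution of Y
    prod = px @ py                        # product of the marginals
    mask = pxy > 0                        # 0 log 0 is taken to be 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / prod[mask])))

# Y is a noisy copy of a fair coin X (copied with probability 0.9).
noisy_copy = np.array([[0.45, 0.05],
                       [0.05, 0.45]])
print(mutual_information(noisy_copy))              # ~0.531 bits
print(mutual_information(np.full((2, 2), 0.25)))   # 0.0: independent
```

Unlike a bare independence test, the returned value varies continuously between 0 (independence) and 1 bit (a perfect copy of a fair coin), which is what makes it usable as a quantitative measure of connection strength.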

Finally, in generic cases there are many equivalent causal hypotheses that fit given experimental data. Consider, for example, just two dependent observed variables A and B: using dependence information alone, one cannot distinguish whether A causes B, B causes A, or a common cause drives both. To decide among these cases, further inference rules have been employed, and we used information-theoretic methods as a theoretical tool to justify one of them.
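This indistinguishability can be made concrete with a small simulation. In the NumPy sketch below (the noise levels are our own illustrative choices, not models from the project), the three generating structures are tuned so that the observed joint distribution of the binary variables A and B is identical; consequently any statistic computed from (A, B) samples alone, such as the correlation printed at the end, comes out the same in all three cases:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def flip(x, p):
    """Flip each bit of x independently with probability p."""
    return x ^ (rng.random(x.shape[0]) < p).astype(x.dtype)

# Structure 1, A -> B: A is a fair coin, B copies A through a noisy channel.
a1 = rng.integers(0, 2, n)
b1 = flip(a1, 0.1)

# Structure 2, B -> A: the same mechanism with the roles reversed.
b2 = rng.integers(0, 2, n)
a2 = flip(b2, 0.1)

# Structure 3, A <- C -> B: a hidden common cause C, with flip probability q
# chosen so that P(A = B) = (1-q)^2 + q^2 = 0.9 matches the channels above.
q = 0.5 * (1 - np.sqrt(0.8))
c = rng.integers(0, 2, n)
a3, b3 = flip(c, q), flip(c, q)

for a, b in [(a1, b1), (a2, b2), (a3, b3)]:
    print(np.corrcoef(a, b)[0, 1])   # all three are approximately 0.8
```

Since all three samples come from the same joint distribution of (A, B), no test based on that distribution alone can single out one structure; additional assumptions or inference rules are needed.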

Related Group Publications:
Steudel, B. and N. Ay: Information-theoretic inference of common ancestors. Entropy, 17 (2015) 4, pp. 2304-2327.

Moritz, P.; Reichardt, J. and N. Ay: Discriminating between causal structures in Bayesian networks given partial observations. Kybernetika, 50 (2014) 2, pp. 284-295.

Lohmann, G.; Stelzer, J.; Neumann, J.; Ay, N. and R. Turner: 'More is different' in functional magnetic resonance imaging: a review of recent data analysis techniques. Brain Connectivity, 3 (2013) 3, pp. 223-239.

Ay, N. and W. Wenzel: On solution sets of information inequalities. Kybernetika, 48 (2012) 5, pp. 845-864. MIS-Preprint 16/2011.

Janzing, D.; Mooij, J.; Zhang, K.; Lemeire, J.; Zscheischler, J.; Daniusis, P.; Steudel, B. and B. Schölkopf: Information-geometric approach to inferring causal directions. Artificial Intelligence, 182/183 (2012), pp. 1-31.

Moritz, P.; Reichardt, J. and N. Ay: A new common cause principle for Bayesian networks. In: 9th Workshop on Uncertainty Processing (WUPES'12), Marianske Lazne, Czech Republic, 12-15 September 2012. Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic, 2012, pp. 149-162.

Janzing, D. and B. Steudel: Justifying additive noise model-based causal discovery via algorithmic information theory. Open Systems and Information Dynamics, 17 (2010) 2, pp. 189-212.

Ay, N.: A refinement of the common cause principle. Discrete Applied Mathematics, 157 (2009) 10, pp. 2439-2457.

Ay, N. and D. Polani: Information flows in causal networks. Advances in Complex Systems, 11 (2008) 1, pp. 17-41. MIS-Preprint 47/2006.