It is common knowledge that causal conclusions cannot be drawn from
statistical data, e.g. from associations or correlations, alone. In
recent years, however, a causal theory framework has been developed by
Pearl, Spirtes, Glymour, Scheines and their collaborators that does
allow causal conclusions, on the basis of postulates imposed on the
data-generating process. The main postulate connects stochastic
dependences among random variables to the existence of 'causal
mechanisms' underlying the data. Causal hypotheses in this theory can
be formalized as directed acyclic graphs (DAGs) or generalizations
thereof. The goal of causal inference is then to reconstruct the class
of graphs from data by testing, on the observed data, the constraints
that these graphs would imply. However, no efficient way is known to
describe all constraints resulting from a causal hypothesis.
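As an illustration of how a causal hypothesis implies testable constraints, the conditional independences encoded in a DAG can be read off via d-separation. The following is a minimal sketch (the graph and variable names are made up for illustration) using the classical moralization criterion: X and Y are d-separated by Z iff they are disconnected in the moralized ancestral subgraph once Z is removed.

```python
from itertools import combinations

def ancestors(dag, nodes):
    """All nodes with a directed path into `nodes`, plus `nodes` itself."""
    result, stack = set(nodes), list(nodes)
    while stack:
        n = stack.pop()
        for parent in dag.get(n, ()):  # dag maps node -> set of parents
            if parent not in result:
                result.add(parent)
                stack.append(parent)
    return result

def d_separated(dag, xs, ys, zs):
    """Test whether xs and ys are d-separated given zs in the DAG.

    Moralization criterion: restrict to the ancestral subgraph of
    xs | ys | zs, moralize (marry co-parents, drop edge directions),
    delete zs, and check whether xs and ys are disconnected.
    """
    relevant = ancestors(dag, set(xs) | set(ys) | set(zs))
    # Build the undirected moral graph on the ancestral subgraph.
    adj = {n: set() for n in relevant}
    for child in relevant:
        parents = [p for p in dag.get(child, ()) if p in relevant]
        for p in parents:                       # parent-child edges
            adj[child].add(p); adj[p].add(child)
        for p, q in combinations(parents, 2):   # marry co-parents
            adj[p].add(q); adj[q].add(p)
    # Remove the conditioning set and search for a path from xs to ys.
    blocked = set(zs)
    frontier = [x for x in xs if x not in blocked]
    seen = set(frontier)
    while frontier:
        n = frontier.pop()
        if n in ys:
            return False
        for m in adj[n] - blocked - seen:
            seen.add(m); frontier.append(m)
    return True

# Collider example: A -> C <- B (each node is mapped to its parents).
dag = {"A": set(), "B": set(), "C": {"A", "B"}}
print(d_separated(dag, {"A"}, {"B"}, set()))   # True: marginally independent
print(d_separated(dag, {"A"}, {"B"}, {"C"}))   # False: conditioning on the collider couples them
```

The collider illustrates the point of the main postulate: the hypothesis A → C ← B implies that A and B are independent marginally but dependent given C, and both implications can be tested on observed data.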
In this project we explore to what extent information-theoretic
quantities can be helpful in this reconstruction process. The
underlying idea is that quantities based on the relative entropy
(e.g. mutual information) provide a coarse, macroscopic view of
probability distributions that leads to testable constraints on the
data, yet is still finer than the purely qualitative property of
stochastic independence. Quantifying stochastic dependences with
information theory also turns out to be useful as a measure of the
strength of causal connections (information flow).
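To make this concrete: mutual information is the relative entropy between a joint distribution and the product of its marginals, so it vanishes exactly when the variables are independent and otherwise quantifies the strength of the dependence. A minimal sketch (the example distributions are made up):

```python
import math

def mutual_information(joint):
    """Mutual information I(X;Y) in bits from a joint pmf.

    `joint[x][y]` is P(X=x, Y=y); I(X;Y) is the relative entropy
    D(P(X,Y) || P(X)P(Y)), zero iff X and Y are independent.
    """
    px = {x: sum(row.values()) for x, row in joint.items()}
    py = {}
    for row in joint.values():
        for y, p in row.items():
            py[y] = py.get(y, 0.0) + p
    mi = 0.0
    for x, row in joint.items():
        for y, p in row.items():
            if p > 0:
                mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Perfectly correlated bits carry 1 bit of mutual information,
# independent bits carry none.
correlated = {0: {0: 0.5, 1: 0.0}, 1: {0: 0.0, 1: 0.5}}
independent = {0: {0: 0.25, 1: 0.25}, 1: {0: 0.25, 1: 0.25}}
print(mutual_information(correlated))   # 1.0
print(mutual_information(independent))  # 0.0
```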
Finally, in generic cases there are many equivalent causal hypotheses
that fit given experimental data. To see this, consider the case of
only two dependent observed variables A and B: using dependency
information alone, it cannot be distinguished whether A causes B, B
causes A, or a common cause drives both. To decide among these cases,
further inference rules have been employed, and we have used
information-theoretic methods as a theoretical tool for the
justification of one of them.
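The observational equivalence of A → B and B → A can be checked directly: any joint distribution factorizes both as P(A)P(B|A) and as P(B)P(A|B), so either causal ordering reproduces the observed data exactly. A small sketch with a made-up joint distribution over binary A and B:

```python
def marginal_and_conditional(joint):
    """Factor a joint pmf P(a, b) (nested dict) into P(a) and P(b|a)."""
    marg = {a: sum(row.values()) for a, row in joint.items()}
    cond = {a: {b: p / marg[a] for b, p in row.items()}
            for a, row in joint.items()}
    return marg, cond

def transpose(joint):
    """Swap the roles of the two variables: P(b, a) from P(a, b)."""
    out = {}
    for a, row in joint.items():
        for b, p in row.items():
            out.setdefault(b, {})[a] = p
    return out

# A made-up joint distribution under which A and B are dependent.
joint = {0: {0: 0.4, 1: 0.1}, 1: {0: 0.2, 1: 0.3}}

# Hypothesis A -> B: factorize as P(a, b) = P(a) P(b|a).
pa, pb_given_a = marginal_and_conditional(joint)
model_a_to_b = {a: {b: pa[a] * pb_given_a[a][b] for b in (0, 1)}
                for a in (0, 1)}

# Hypothesis B -> A: factorize as P(a, b) = P(b) P(a|b).
pb, pa_given_b = marginal_and_conditional(transpose(joint))
model_b_to_a = {a: {b: pb[b] * pa_given_b[b][a] for b in (0, 1)}
                for a in (0, 1)}

# Both causal orderings reproduce the observed joint (up to rounding).
ok = all(abs(model_a_to_b[a][b] - joint[a][b]) < 1e-12
         and abs(model_b_to_a[a][b] - joint[a][b]) < 1e-12
         for a in (0, 1) for b in (0, 1))
print(ok)   # True
```

This is why dependency information alone cannot orient the edge between two variables, and why additional inference rules are needed.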
Related Group Publications:
Confounding ghost channels and causality: a new approach to causal information flows.
Bibtex MIS-Preprint: 75/2020 [ARXIV]

Ay, N.; Bertschinger, N.; Jost, J.; Olbrich, E. and J. Rauh:
Information and complexity, or: Where is the information?
Bibtex MIS-Preprint: 102/2020

Ay, N.; Polani, D. and N. Virgo:
Information decomposition based on cooperative game theory.
56 (2020) 5, p. 979-1014. Bibtex MIS-Preprint: 81/2020 [DOI] [ARXIV]

Carlini, L.; Ay, N. and C. Görgen:
A numerical efficiency analysis of a common ancestor condition.
Mathematical aspects of computer and information sciences: 8th international conference,
MACIS 2019, Gebze-Istanbul, Turkey, November 13-15, 2019; revised selected papers / D.
Slamanig... (eds.). Springer, 2020. P. 357-363.
(Lecture notes in computer science; 11989). Bibtex [DOI] [FREELINK]

Banerjee, P. K.; Rauh, J. and G. Montúfar:
Computing the unique information.
IEEE international symposium on information theory (ISIT), June 17-22, 2018,
Vail, Colorado, USA. IEEE, 2018. P. 141-145. Bibtex MIS-Preprint: 73/2017 [DOI] [ARXIV] [CODELINK]

Sato, Y. and N. Ay:
Information flow in learning a coin-tossing game.
Nonlinear theory and its applications, 7 (2016) 2, p. 118-125. Bibtex [DOI]

Steudel, B. and N. Ay:
Information-theoretic inference of common ancestors.
17 (2015) 4, p. 2304-2327. Bibtex [DOI] [ARXIV]

Moritz, P.; Reichardt, J. and N. Ay:
Discriminating between causal structures in Bayesian networks given partial observations.
50 (2014) 2, p. 284-295. Bibtex [DOI]

Lohmann, G.; Stelzer, J.; Neumann, J.; Ay, N. and R. Turner:
'More is different' in functional magnetic resonance imaging: a review of recent
data analysis techniques.
3 (2013) 3, p. 223-239. Bibtex [DOI]

Ay, N. and W. Wenzel:
On solution sets of information inequalities.
48 (2012) 5, p. 845-864. Bibtex MIS-Preprint: 16/2011 [FREELINK]

Janzing, D.; Mooij, J.; Zhang, K.; Lemeire, J.; Zscheischler, J.; Daniusis, P.; Steudel, B. and B. Schölkopf:
Information-geometric approach to inferring causal directions.
182/183 (2012), p. 1-31. Bibtex [DOI]

Moritz, P.; Reichardt, J. and N. Ay:
A new common cause principle for Bayesian networks.
Proceedings of the 9th workshop on uncertainty processing WUPES '12: Marianske Lazne,
Czech Republic, 12-15 September 2012. Academy of Sciences of the Czech Republic /
Institute of Information Theory and Automation, 2012. P. 149-162. Bibtex [FREELINK]

Janzing, D. and B. Steudel:
Justifying additive noise model-based causal discovery via algorithmic information theory.
Open systems and information dynamics, 17 (2010) 2, p. 189-212. Bibtex [DOI] [ARXIV]

Ay, N.:
A refinement of the common cause principle.
Discrete applied mathematics, 157 (2009) 10, p. 2439-2457. Bibtex [DOI]

Ay, N. and D. Polani:
Information flows in causal networks.
Advances in complex systems, 11 (2008) 1, p. 17-41. Bibtex MIS-Preprint: 47/2006 [DOI]