Research Topic

Information-Theoretic Reasoning in Causal Inference

It is common knowledge that causal conclusions cannot be drawn from statistical data alone, e.g. from associations or correlations. In recent years, however, a causal theory framework has been developed by Pearl, Spirtes, Glymour, Scheines and their collaborators that does allow causal conclusions, based on postulates imposed on the data-generating process. The main postulate connects stochastic dependences among random variables to the existence of 'causal mechanisms' underlying the data. Causal hypotheses in this theory can be formalized as directed acyclic graphs or generalizations thereof. The goal of causal inference is then to reconstruct the class of such graphs from data by testing the constraints on the observed data that each graph would imply. However, no efficient way is known to describe all constraints resulting from a causal hypothesis.
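As an illustration of such testable constraints, the chain A → B → C implies the conditional independence of A and C given B. The following sketch (hypothetical conditional probability tables, binary variables) builds the joint distribution from the graph's factorization and checks that the implied constraint holds exactly:

```python
from itertools import product

# Hypothetical CPTs for the chain A -> B -> C (binary variables).
pA = {0: 0.6, 1: 0.4}
pB_given_A = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.2, 1: 0.8}}  # pB_given_A[a][b]
pC_given_B = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.1, 1: 0.9}}  # pC_given_B[b][c]

# The joint factorizes along the graph: p(a,b,c) = p(a) p(b|a) p(c|b).
joint = {(a, b, c): pA[a] * pB_given_A[a][b] * pC_given_B[b][c]
         for a, b, c in product([0, 1], repeat=3)}

def marginal(dist, axes):
    """Marginalize the joint onto the given variable positions."""
    out = {}
    for key, p in dist.items():
        sub = tuple(key[i] for i in axes)
        out[sub] = out.get(sub, 0.0) + p
    return out

# Implied constraint: p(a,c|b) = p(a|b) p(c|b) for all a, b, c.
pAB = marginal(joint, (0, 1))
pBC = marginal(joint, (1, 2))
pB = marginal(joint, (1,))
max_violation = max(
    abs(joint[(a, b, c)] / pB[(b,)]
        - (pAB[(a, b)] / pB[(b,)]) * (pBC[(b, c)] / pB[(b,)]))
    for a, b, c in product([0, 1], repeat=3)
)
print(max_violation)  # numerically zero: the chain implies A ⊥ C | B
```

For graphs with hidden variables the implied constraints are no longer just conditional independences, which is where the difficulty of describing them all efficiently arises.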

In this project we explore to what extent information-theoretic quantities can aid this reconstruction process. The underlying idea is that quantities based on the relative entropy (e.g. mutual information) provide a coarse, macroscopic view on probability distributions that leads to testable constraints on the data, yet is still finer than looking only at the qualitative property of stochastic independence. Quantifying stochastic dependences with information theory also turns out to be useful as a measure of the strength of causal connections (information flow).
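A minimal sketch of this quantification, using hypothetical joint distributions: mutual information is the relative entropy between a joint distribution and the product of its marginals, so it is zero exactly under independence and otherwise grades the strength of the dependence.

```python
from math import log2
from itertools import product

def mutual_information(joint):
    """I(X;Y) = D(p_XY || p_X p_Y): the relative entropy between the
    joint and the product of its marginals, in bits."""
    pX, pY = {}, {}
    for (x, y), p in joint.items():
        pX[x] = pX.get(x, 0.0) + p
        pY[y] = pY.get(y, 0.0) + p
    return sum(p * log2(p / (pX[x] * pY[y]))
               for (x, y), p in joint.items() if p > 0)

# A hypothetical strongly dependent pair: I(X;Y) is clearly positive ...
dependent = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
# ... while an independent pair gives I(X;Y) = 0.
independent = {(x, y): 0.25 for x, y in product([0, 1], repeat=2)}

print(mutual_information(dependent))    # positive (about half a bit)
print(mutual_information(independent))  # 0.0
```

The same quantity, conditioned or evaluated under interventions, underlies the information-flow measures studied in the publications below.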

Finally, in generic cases many equivalent causal hypotheses fit given experimental data. To see this, consider the case of only two dependent observed variables A and B: using dependency information alone, it cannot be distinguished whether A causes B, B causes A, or a common cause acts on both of them. To decide among these cases, further inference rules have been employed, and we used information-theoretic methods as a theoretical tool to justify one of them.
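The two directed hypotheses are observationally equivalent because any joint distribution of A and B factorizes in both causal directions. A small sketch with a hypothetical joint makes this explicit:

```python
from itertools import product

# A hypothetical joint distribution of two dependent binary variables.
pAB = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

pA = {a: pAB[(a, 0)] + pAB[(a, 1)] for a in [0, 1]}
pB = {b: pAB[(0, b)] + pAB[(1, b)] for b in [0, 1]}

# "A causes B": p(a,b) = p(a) p(b|a); "B causes A": p(a,b) = p(b) p(a|b).
forward = {(a, b): pA[a] * (pAB[(a, b)] / pA[a])
           for a, b in product([0, 1], repeat=2)}
backward = {(a, b): pB[b] * (pAB[(a, b)] / pB[b])
            for a, b in product([0, 1], repeat=2)}

# Both parameterizations reproduce the observed joint exactly, so
# dependence information alone cannot orient the edge.
gap = max(max(abs(forward[k] - pAB[k]), abs(backward[k] - pAB[k])) for k in pAB)
print(gap)  # numerically zero
```

Breaking this symmetry requires additional assumptions about the mechanisms, which is what the further inference rules mentioned above provide.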


inBook
2022 Repository Open Access
Nihat Ay, Nils Bertschinger, Jürgen Jost, Eckehard Olbrich and Johannes Rauh

Information and complexity, or: Where is the information?

In: Complexity and emergence : Lake Como School of Advanced Studies, Italy, July 22-27, 2018 / Sergio Albeverio... (eds.)
Cham : Springer, 2022. - pp. 87-105
(Springer proceedings in mathematics and statistics ; 383)
inJournal
2021 Journal Open Access
Nihat Ay

Confounding ghost channels and causality : a new approach to causal information flows

In: Vietnam journal of mathematics, 49 (2021) 2, pp. 547-576
inBook
2020
Luca Carlini, Nihat Ay and Christiane Görgen

A numerical efficiency analysis of a common ancestor condition

In: Mathematical aspects of computer and information sciences : 8th international conference, MACIS 2019, Gebze-Istanbul, Turkey, November 13-15, 2019 ; revised selected papers / Daniel Slamanig... (eds.)
Cham : Springer, 2020. - pp. 357-363
(Lecture notes in computer science ; 11989)
inJournal
2020 Journal Open Access
Nihat Ay, Daniel Polani and Nathaniel Virgo

Information decomposition based on cooperative game theory

In: Kybernetika, 56 (2020) 5, pp. 979-1014
inBook
2018 Repository Open Access
Pradeep Kumar Banerjee, Johannes Rauh and Guido Montúfar

Computing the unique information

In: IEEE international symposium on information theory (ISIT) from June 17 to 22, 2018 at the Talisa Hotel in Vail, Colorado, USA
Piscataway, NJ : IEEE, 2018. - pp. 141-145
inJournal
2016 Journal Open Access
Yuzuru Sato and Nihat Ay

Information flow in learning a coin-tossing game

In: Nonlinear theory and its applications, 7 (2016) 2, pp. 118-125
inJournal
2015 Journal Open Access
Bastian Steudel and Nihat Ay

Information-theoretic inference of common ancestors

In: Entropy, 17 (2015) 4, pp. 2304-2327
inJournal
2014 Journal Open Access
Philipp Moritz, Jörg Reichardt and Nihat Ay

Discriminating between causal structures in Bayesian networks given partial observations

In: Kybernetika, 50 (2014) 2, pp. 284-295
inJournal
2013
Gabriele Lohmann, Johannes Stelzer, Jane Neumann, Nihat Ay and Robert Turner

'More is different' in functional magnetic resonance imaging : a review of recent data analysis techniques

In: Brain Connectivity, 3 (2013) 3, pp. 223-239
inBook
2012
Philipp Moritz, Jörg Reichardt and Nihat Ay

A new common cause principle for Bayesian networks

In: Proceedings of the 9th workshop on uncertainty processing WUPES '12 : Marianske Lazne, Czech Republic ; 12-15th September 2012
Praha : Academy of Sciences of the Czech Republic / Institute of Information Theory and Automation, 2012. - pp. 149-162
inJournal
2012
Dominik Janzing, Joris Mooij, Kun Zhang, Jan Lemeire, Jakob Zscheischler, Povilas Daniusis, Bastian Steudel and Bernhard Schölkopf

Information-geometric approach to inferring causal directions

In: Artificial intelligence, 182/183 (2012), pp. 1-31
inJournal
2012 Journal Open Access
Nihat Ay and Walter Wenzel

On solution sets of information inequalities

In: Kybernetika, 48 (2012) 5, pp. 845-864
inJournal
2010 Repository Open Access
Dominik Janzing and Bastian Steudel

Justifying additive noise model-based causal discovery via algorithmic information theory

In: Open systems and information dynamics, 17 (2010) 2, pp. 189-212
inJournal
2009
Nihat Ay

A refinement of the common cause principle

In: Discrete applied mathematics, 157 (2009) 10, pp. 2439-2457
inJournal
2008 Repository Open Access
Nihat Ay and Daniel Polani

Information flows in causal networks

In: Advances in complex systems, 11 (2008) 1, pp. 17-41