The Gaussian conditional independence inference problem

  • Tobias Boege (MPI MiS, Leipzig)
E1 05 (Leibniz-Saal)


Conditional independence is a ternary relation on subsets of a finite vector of random variables $X$. A \textbf{CI statement} $(ij|K)$ asserts that "whenever the outcome of all the variables $X_k$, $k \in K$, is known, learning the outcome of $X_i$ provides no further information about $X_j$". These relations are highly structured, in particular under assumptions about the joint distribution. The goal is to describe this structure by \textbf{CI inference rules}: given that certain CI statements hold, which other (disjunctions of) CI statements are implied under the distributional assumption?

This talk is about regular Gaussian distributions. In this case, conditional independence has an algebraic characterization in terms of subdeterminants of the covariance matrix, and inference, a discrete problem by nature, becomes a geometric question about the vanishing of very special polynomials on very special varieties inside the cone of positive-definite matrices.
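To make the algebraic characterization concrete: for a regular Gaussian with covariance matrix $\Sigma$, the statement $(ij|K)$ holds exactly when the almost-principal minor $\det \Sigma_{\{i\}\cup K,\,\{j\}\cup K}$ vanishes. The sketch below checks this with numpy; the helper name `ci_holds` and the example matrix are illustrative choices, not part of the talk.

```python
import numpy as np

def ci_holds(Sigma, i, j, K, tol=1e-9):
    """Check the Gaussian CI statement (ij|K) for a covariance matrix Sigma.

    For a regular Gaussian, (ij|K) holds iff the almost-principal minor
    det(Sigma[{i} ∪ K, {j} ∪ K]) vanishes (up to numerical tolerance).
    """
    rows = [i] + list(K)
    cols = [j] + list(K)
    return abs(np.linalg.det(Sigma[np.ix_(rows, cols)])) < tol

# Example: X0 is independent of (X1, X2); X1 and X2 are correlated.
Sigma = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.5],
                  [0.0, 0.5, 1.0]])
print(ci_holds(Sigma, 0, 1, []))    # (01|): marginal independence holds
print(ci_holds(Sigma, 0, 1, [2]))   # (01|2): conditional independence holds
print(ci_holds(Sigma, 1, 2, []))    # (12|): fails, X1 and X2 are correlated
```

For $K = \emptyset$ the minor is just the covariance $\Sigma_{ij}$, recovering the familiar fact that uncorrelated jointly Gaussian variables are independent.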

In the first part of the talk, I show that the space of counterexamples to a (wrong) inference formula can be "difficult" by multiple measures. In particular, proving inference formulas wrong is polynomial-time equivalent to the existential theory of the reals. In the second part, I report on practical approximations to the inference problem and computational results on the way toward classifying all Gaussian CI structures on five random variables.
