Discovery of causal models that contain latent variables through Bayesian scoring of independence constraints

Jabbari F, Ramsey J, Spirtes P, Cooper GF. Discovery of causal models that contain latent variables through Bayesian scoring of independence constraints. In: Proceedings of the European Conference on Machine Learning (2017).

Discovering causal structure from observational data in the presence of latent variables remains an active research area. Constraint-based causal discovery algorithms are relatively efficient at discovering such causal models from data using independence tests. Typically, however, they derive and output only one such model. In contrast, Bayesian methods can generate and probabilistically score multiple models, outputting the most probable one; however, they are often computationally infeasible to apply when modeling latent variables. We introduce a hybrid method that derives a Bayesian probability that the set of independence tests associated with a given causal model is jointly correct. Using this constraint-based scoring method, we are able to score multiple causal models, which possibly contain latent variables, and output the most probable one. The structure-discovery performance of the proposed method is compared to an existing constraint-based method (RFCI) using data generated from several previously published Bayesian networks. The structural Hamming distances of the output models improved when using the proposed method compared to RFCI, especially for small sample sizes.
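The abstract sketches the central idea: score a candidate causal model by the Bayesian probability that the independence constraints it implies hold jointly in the data, then output the highest-scoring model. The Python sketch below illustrates that idea under stated assumptions only; it is not the authors' implementation. It uses a Dirichlet-multinomial Bayes factor as a Bayesian independence test for discrete data, treats a model's constraints as independent pieces of evidence, and returns the best-scoring candidate. The function names and the particular test are illustrative choices.

    from __future__ import annotations
    from collections import Counter
    from math import exp, lgamma, log


    def _dm_logml(counts, alpha=1.0):
        """Log marginal likelihood of categorical counts under a symmetric Dirichlet prior."""
        k, n = len(counts), sum(counts)
        return (lgamma(k * alpha) - lgamma(k * alpha + n)
                + sum(lgamma(alpha + c) - lgamma(alpha) for c in counts))


    def prob_independent(data, x, y, cond=()):
        """Posterior probability that x is independent of y given cond, for discrete
        data given as a list of dict rows, with equal prior odds on the independent
        and dependent hypotheses."""
        x_vals = sorted({r[x] for r in data})
        y_vals = sorted({r[y] for r in data})
        strata = {}
        for r in data:
            strata.setdefault(tuple(r[c] for c in cond), []).append(r)

        log_ind = log_dep = 0.0
        for rows in strata.values():
            xs = Counter(r[x] for r in rows)
            ys = Counter(r[y] for r in rows)
            xys = Counter((r[x], r[y]) for r in rows)
            # Independent hypothesis: x and y factorize within each stratum of cond.
            log_ind += (_dm_logml([xs.get(v, 0) for v in x_vals])
                        + _dm_logml([ys.get(v, 0) for v in y_vals]))
            # Dependent hypothesis: one joint table over (x, y) within each stratum.
            log_dep += _dm_logml([xys.get((vx, vy), 0) for vx in x_vals for vy in y_vals])

        return 1.0 / (1.0 + exp(min(log_dep - log_ind, 700.0)))


    def score_model(data, constraints):
        """Log-score of one candidate model.  `constraints` is a list of
        (x, y, cond, holds) tuples: holds=True means the model implies
        x _||_ y | cond, holds=False means it implies dependence.  Constraints
        are combined as if they were independent pieces of evidence, a
        simplifying assumption of this sketch."""
        total = 0.0
        for x, y, cond, holds in constraints:
            p = min(max(prob_independent(data, x, y, cond), 1e-12), 1.0 - 1e-12)
            total += log(p if holds else 1.0 - p)
        return total


    def most_probable_model(data, candidate_models):
        """`candidate_models` maps a model name to the independence constraints it
        implies; return the name of the highest-scoring model."""
        return max(candidate_models, key=lambda m: score_model(data, candidate_models[m]))

In the paper, the scoring is applied to causal models that may contain latent variables (such as the equivalence-class graphs searched over by constraint-based algorithms like RFCI); the sketch abstracts that away by representing each candidate model simply as the list of constraints it implies.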


Keywords: Observational data · Latent (hidden) variable · Constraint-based and Bayesian causal discovery · Posterior probability
