Entropic Inequality Constraints from $e$-separation Relations in Directed Acyclic Graphs with Hidden Variables
Abstract
Directed acyclic graphs (DAGs) with hidden variables are often used to characterize causal relations between variables in a system. When some variables are unobserved, DAGs imply a notoriously complicated set of constraints on the distribution of observed variables. In this work, we present entropic inequality constraints that are implied by $e$-separation relations in hidden variable DAGs with discrete observed variables. The constraints can intuitively be understood to follow from the fact that the capacity of variables along a causal pathway to convey information is restricted by their entropy; e.g., in the extreme case, a variable with entropy $0$ can convey no information. We show how these constraints can be used to learn about the true causal model from an observed data distribution. In addition, we propose a measure of causal influence called the minimal mediary entropy, and demonstrate that it can augment traditional measures such as the average causal effect.
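The intuition stated in the abstract, that a mediator's entropy caps the information it can convey along a causal pathway, can be sketched with a toy calculation. The following is a minimal illustration (not the paper's derivation): for a chain $X \to M \to Y$, the data-processing inequality gives $I(X;Y) \le H(M)$. All variable names and the toy distribution below are assumptions chosen for illustration.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy causal chain X -> M -> Y: X uniform on {0,1,2,3},
# mediator M = X mod 2, and Y a perfect copy of M.
p_joint = {}  # p(x, m, y)
for x in range(4):
    m = y = x % 2
    p_joint[(x, m, y)] = p_joint.get((x, m, y), 0.0) + 0.25

def marginal(indices):
    """Marginal distribution over the given coordinate indices."""
    d = {}
    for outcome, prob in p_joint.items():
        key = tuple(outcome[i] for i in indices)
        d[key] = d.get(key, 0.0) + prob
    return list(d.values())

H_X = entropy(marginal([0]))    # 2 bits
H_M = entropy(marginal([1]))    # 1 bit
H_Y = entropy(marginal([2]))    # 1 bit
H_XY = entropy(marginal([0, 2]))
I_XY = H_X + H_Y - H_XY         # mutual information I(X;Y)

# The mediator's entropy caps the information flowing through it:
print(f"I(X;Y) = {I_XY:.3f} <= H(M) = {H_M:.3f}")
```

Here $X$ has 2 bits of entropy, but because the only pathway to $Y$ passes through a 1-bit mediator, $I(X;Y)$ cannot exceed 1 bit; the script prints `I(X;Y) = 1.000 <= H(M) = 1.000`, saturating the bound.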
 Publication:

arXiv e-prints
 Pub Date:
 July 2021
 arXiv:
 arXiv:2107.07087
 Bibcode:
 2021arXiv210707087F
 Keywords:

 Statistics - Machine Learning;
 Computer Science - Machine Learning
 E-Print:
 16 pages. Minor changes, mostly a revision to the definition of minimal mediary entropy. (The revised definition of MME in arXiv v2 takes the finite cardinality property of the mediary as a premise.)