Oral
Conditional Independence in Testing Bayesian Networks
Yujia Shen · Haiying Huang · Arthur Choi · Adnan Darwiche

Wed Jun 12th 03:00 -- 03:05 PM @ Grand Ballroom

Testing Bayesian Networks (TBNs) were introduced recently to represent a set of distributions, one of which is selected based on the given evidence and used for reasoning. TBNs are more expressive than classical Bayesian Networks (BNs): Marginal queries correspond to multi-linear functions in BNs and to piecewise multi-linear functions in TBNs. Moreover, marginal TBN queries are universal approximators, like neural networks. In this paper, we study conditional independence in TBNs, showing that it can be inferred from d-separation as in BNs. We also study the role of TBN expressiveness and independence in dealing with the problem of learning using incomplete models (i.e., ones that are missing nodes or edges from the data-generating model). Finally, we illustrate our results on a number of concrete examples, including a case study on (high order) Hidden Markov Models.
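Since the paper's central result is that conditional independence in TBNs can be read off from d-separation just as in classical BNs, a minimal sketch of a d-separation test may be useful context. The sketch below uses the standard ancestral moral-graph method for classical BNs; the function names and example DAGs are illustrative, not taken from the paper.

```python
# A minimal sketch of checking d-separation in a (classical) Bayesian
# network via the ancestral moral-graph method. The DAG is represented
# as {child: [parents]}; these examples are hypothetical.
from itertools import combinations

def ancestors(dag, nodes):
    """Return the given nodes plus all their ancestors in the DAG."""
    seen, stack = set(), list(nodes)
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(dag.get(n, []))
    return seen

def d_separated(dag, xs, ys, zs):
    """True iff xs and ys are d-separated by zs in dag ({child: parents})."""
    relevant = ancestors(dag, set(xs) | set(ys) | set(zs))
    # Moralize the ancestral subgraph: link co-parents, drop directions.
    adj = {n: set() for n in relevant}
    for child in relevant:
        parents = [p for p in dag.get(child, []) if p in relevant]
        for p in parents:
            adj[child].add(p)
            adj[p].add(child)
        for p, q in combinations(parents, 2):
            adj[p].add(q)
            adj[q].add(p)
    # Delete the conditioning set and test reachability from xs to ys.
    blocked = set(zs)
    stack = [x for x in xs if x not in blocked]
    seen = set(stack)
    while stack:
        n = stack.pop()
        if n in ys:
            return False
        for m in adj[n]:
            if m not in blocked and m not in seen:
                seen.add(m)
                stack.append(m)
    return True

# Chain A -> B -> C: A and C are connected, but separated given B.
chain = {"A": [], "B": ["A"], "C": ["B"]}
print(d_separated(chain, {"A"}, {"C"}, set()))    # False
print(d_separated(chain, {"A"}, {"C"}, {"B"}))    # True
# Collider A -> C <- B: conditioning on C *opens* the path.
collider = {"A": [], "B": [], "C": ["A", "B"]}
print(d_separated(collider, {"A"}, {"B"}, set()))  # True
print(d_separated(collider, {"A"}, {"B"}, {"C"}))  # False
```

The collider case illustrates why the ancestral step matters: C only enters the moral graph when it (or a descendant) is in the query, which is exactly the v-structure behavior d-separation requires.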

Author Information

Yujia Shen (UCLA)
Haiying Huang (UCLA)
Arthur Choi (UCLA)
Adnan Darwiche (UCLA)
