Poster
in
Workshop: Structured Probabilistic Inference and Generative Modeling

Learnability of Parameter-Bounded Bayes Nets

Arnab Bhattacharyya · Davin Choo · Sutanu Gayen · Dimitrios Myrisiotis

Keywords: [ TV distance ] [ sample complexity ] [ Bayes nets ] [ promise problems ] [ search problems ] [ NP-hardness ] [ Bayes net parameters ] [ Turing reductions ] [ total variation distance ] [ Learning ]


Abstract:

Bayes nets are extensively used in practice to efficiently represent joint probability distributions over a set of random variables and to capture their dependency relations. In a seminal paper, Chickering et al. (JMLR 2004) showed that, given a distribution P defined as the marginal distribution of a Bayes net, it is NP-hard to decide whether there is a parameter-bounded Bayes net that represents P. They called this problem LEARN. In this work, we extend the NP-hardness result for LEARN by proving the NP-hardness of a search variant of LEARN, in which the Bayes net in question is guaranteed to exist and one is asked to find such a Bayes net. We complement our hardness result with a positive result on the sample complexity that is sufficient to recover a parameter-bounded Bayes net that is close (in TV distance) to a given distribution P represented by some parameter-bounded Bayes net, generalizing a degree-bounded sample complexity result of Brustle et al. (EC 2020).
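To make the central notions concrete, here is a minimal sketch (not from the paper) of the two objects the abstract refers to: a distribution represented by a tiny two-node Bayes net, and the total variation (TV) distance used to measure closeness. All function names and parameter values below are illustrative assumptions.

```python
from itertools import product

def tv_distance(p, q):
    """TV distance between two discrete distributions given as dicts:
    half the L1 distance between their probability vectors."""
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

def joint_from_bayes_net(p_x1, p_y1_given_x):
    """Joint distribution of a hypothetical two-node Bayes net X -> Y
    over binary variables, parameterized by P(X=1) and P(Y=1 | X=x)."""
    joint = {}
    for x, y in product([0, 1], repeat=2):
        px = p_x1 if x == 1 else 1.0 - p_x1
        py = p_y1_given_x[x] if y == 1 else 1.0 - p_y1_given_x[x]
        joint[(x, y)] = px * py
    return joint

# Two Bayes nets with the same structure but slightly different parameters.
P = joint_from_bayes_net(0.5, {0: 0.2, 1: 0.9})
Q = joint_from_bayes_net(0.5, {0: 0.2, 1: 0.8})
print(tv_distance(P, Q))  # -> 0.05
```

A learning algorithm of the kind the sample-complexity result describes would, given samples from P, output a parameter-bounded Bayes net whose joint distribution has small TV distance to P in this sense.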
