Poster
Learning Gaussian DAG Models without Condition Number Bounds
Constantinos Daskalakis · Vardis Kandiros · Rui Yao
East Exhibition Hall A-B #E-1102
We study the problem of learning a specific type of Bayesian network, a statistical model used to capture cause-and-effect relationships in many application areas. Prior works tackle this problem, but their sample complexity depends on the condition number (the ratio of the largest to the smallest eigenvalue) of the covariance matrix. These condition-number bounds are not optimal, so we provide a new algorithm that recovers the Bayesian network using fewer samples. We prove that the number of samples our algorithm requires is independent of the condition number. Additionally, we provide examples showing that the sample-complexity upper bound of our algorithm is nearly tight. Moreover, under certain additional assumptions, we design a polynomial-time algorithm that recovers the underlying graph, whose sample complexity likewise does not depend on the condition number. Finally, we run simulations on synthetic datasets that confirm our theoretical predictions.
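A minimal sketch (not the paper's algorithm) of why condition-number dependence matters: even a simple linear Gaussian chain DAG X1 → X2 → ... → Xd can have a covariance matrix whose condition number grows rapidly with the edge weight, so guarantees that scale with the condition number degrade on such instances. The function name and construction below are hypothetical illustrations.

```python
import numpy as np

def chain_dag_covariance(d, weight):
    """Covariance of the chain X_i = weight * X_{i-1} + N(0, 1), X_1 ~ N(0, 1)."""
    # Structural equations in matrix form: X = B X + eps, B strictly lower-triangular.
    B = np.zeros((d, d))
    for i in range(1, d):
        B[i, i - 1] = weight
    A = np.linalg.inv(np.eye(d) - B)  # X = A eps
    return A @ A.T                    # Cov(X), unit noise variances

# The condition number of Cov(X) blows up as the edge weight grows,
# even though the underlying DAG (a chain) is trivially simple.
for w in [0.5, 1.0, 2.0]:
    eigs = np.linalg.eigvalsh(chain_dag_covariance(10, w))
    print(f"weight={w}: condition number ~ {eigs[-1] / eigs[0]:.2e}")
```

This is the kind of instance on which a condition-number-free sample-complexity guarantee, as claimed in the abstract, is strictly stronger than prior bounds.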