Abstract
Both learning and inference tasks on Bayesian networks are NP-hard in general. Bounded tree-width Bayesian networks have recently received a lot of attention as a way to circumvent this complexity issue; however, while inference on bounded tree-width networks is tractable, the learning problem remains NP-hard even for tree-width 2. In this paper, we propose bounded vertex cover number Bayesian networks as an alternative to bounded tree-width networks. In particular, we show that both inference and learning can be done in polynomial time for any fixed vertex cover number bound k, in contrast to the general and bounded tree-width cases; on the other hand, we also show that the learning problem is W[1]-hard when parameterized by k. Furthermore, we give an alternative way to learn bounded vertex cover number Bayesian networks using integer linear programming (ILP), and show that this approach is feasible in practice.
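As a rough illustration of the structural constraint discussed in the abstract (not the paper's algorithm), the minimal sketch below checks whether a candidate DAG satisfies a vertex cover bound k. It assumes the bound is placed on the moralized graph, whose vertex cover number upper-bounds its tree-width; the helper names `moral_graph` and `has_vertex_cover_at_most`, the use of the `networkx` library, and the brute-force subset check are all illustrative choices, not taken from the paper.

```python
import itertools
import networkx as nx


def moral_graph(dag: nx.DiGraph) -> nx.Graph:
    """Moralize a DAG: connect every pair of co-parents, then drop directions."""
    g = nx.Graph()
    g.add_nodes_from(dag.nodes)
    g.add_edges_from(dag.edges())
    for node in dag.nodes:
        for u, v in itertools.combinations(dag.predecessors(node), 2):
            g.add_edge(u, v)
    return g


def has_vertex_cover_at_most(g: nx.Graph, k: int) -> bool:
    """Check whether g has a vertex cover of size at most k.

    Tries all vertex subsets of size <= k, so it is only meant for
    tiny graphs and small k; it illustrates the constraint, nothing more.
    """
    nodes = list(g.nodes)
    edges = list(g.edges)
    for size in range(k + 1):
        for cover in itertools.combinations(nodes, size):
            chosen = set(cover)
            if all(u in chosen or v in chosen for u, v in edges):
                return True
    return False


# Example: a naive-Bayes-style DAG C -> X1, X2, X3.
# Its moral graph is a star centred at C, so {C} is a vertex cover of size 1.
dag = nx.DiGraph([("C", "X1"), ("C", "X2"), ("C", "X3")])
print(has_vertex_cover_at_most(moral_graph(dag), 1))  # True
```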
Original language | English |
---|---|
Title of host publication | The Twenty-ninth Annual Conference on Neural Information Processing Systems (NIPS), Montreal, 6.-12.12.2015 |
Editors | Corinna Cortes, Neil D. Lawrence, Daniel D. Lee, Masashi Sugiyama, Roman Garnett |
Pages | 622-630 |
Publication status | Published - 2015 |
MoE publication type | A4 Conference publication |
Event | Conference on Neural Information Processing Systems (NIPS) - Montreal, Canada. Duration: 7 Dec 2015 → 12 Dec 2015. Conference number: 28 |
Publication series
Name | Advances in Neural Information Processing Systems |
---|---|
ISSN (Print) | 1049-5258 |
Conference
Conference | Conference on Neural Information Processing Systems |
---|---|
Abbreviated title | NIPS |
Country/Territory | Canada |
City | Montreal |
Period | 07/12/2015 → 12/12/2015 |