Quotient normalized maximum likelihood criterion for learning Bayesian network structures

Tomi Silander, Janne Leppä-Aho, Elias Jääsaari, Teemu Roos

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

17 Citations (Scopus)
41 Downloads (Pure)

Abstract

We introduce an information-theoretic criterion for Bayesian network structure learning which we call quotient normalized maximum likelihood (qNML). In contrast to the closely related factorized normalized maximum likelihood criterion, qNML satisfies the property of score equivalence. It is also decomposable and completely free of adjustable hyperparameters. For practical computations, we identify a remarkably accurate approximation proposed earlier by Szpankowski and Weinberger. Experiments on both simulated and real data demonstrate that the new criterion leads to parsimonious models with good predictive accuracy.
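The Szpankowski–Weinberger approximation mentioned in the abstract has a simple closed form. As a rough illustration (not the authors' code; symbols and function names below are my own), the sketch computes the exact log-regret log C(n, K) of a K-category multinomial via the Kontkanen–Myllymäki linear-time recurrence and compares it with the approximation in the form used in the paper:

```python
import math

def exact_log_regret(n, K):
    """Exact log C(n, K), the log-regret (normalizing constant) of a
    K-category multinomial on n samples, computed with the
    Kontkanen-Myllymaki linear recurrence
        C(n, k+2) = C(n, k+1) + (n / k) * C(n, k).
    """
    # Base cases: C(n, 1) = 1, and C(n, 2) is a binomial sum over
    # maximum-likelihood terms (0**0 evaluates to 1 in Python).
    c2 = sum(math.comb(n, h) * (h / n) ** h * ((n - h) / n) ** (n - h)
             for h in range(n + 1))
    if K == 1:
        return 0.0
    prev, cur = 1.0, c2
    for k in range(1, K - 1):
        prev, cur = cur, cur + (n / k) * prev
    return math.log(cur)

def sw_log_regret(n, K):
    """Szpankowski-Weinberger approximation of log C(n, K), with
    alpha = K / n and C_alpha = (1 + sqrt(1 + 4/alpha)) / 2."""
    a = K / n
    ca = 0.5 + 0.5 * math.sqrt(1.0 + 4.0 / a)
    return (n * (math.log(a) + (a + 2.0) * math.log(ca) - 1.0 / ca)
            - 0.5 * math.log(ca + 2.0 / a))

# Even for small samples the approximation stays within roughly a
# tenth of a nat of the exact value, e.g. n = 20, K = 10:
print(exact_log_regret(20, 10), sw_log_regret(20, 10))
```

Because qNML is decomposable, a score for a candidate network only needs such regret terms for each family (child plus parents) and each parent set, so a fast, accurate approximation of log C(n, K) is what makes the criterion practical.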

Original language: English
Title of host publication: International Conference on Artificial Intelligence and Statistics, 9-11 April 2018, Playa Blanca, Lanzarote, Canary Islands
Editors: Amos Storkey, Fernando Perez-Cruz
Publisher: JMLR
Pages: 948-957
Number of pages: 10
Publication status: Published - 1 Jan 2018
MoE publication type: A4 Conference publication
Event: International Conference on Artificial Intelligence and Statistics - Playa Blanca, Spain
Duration: 9 Apr 2018 – 11 Apr 2018
Conference number: 21

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 84
ISSN (Electronic): 1938-7228

Conference

Conference: International Conference on Artificial Intelligence and Statistics
Abbreviated title: AISTATS
Country/Territory: Spain
City: Playa Blanca
Period: 09/04/2018 – 11/04/2018

