Distributed Bayesian matrix factorization with limited communication

Research output: Contribution to journal › Article › Scientific › peer-reviewed


Abstract

Bayesian matrix factorization (BMF) is a powerful tool for producing low-rank representations of matrices, predicting missing values, and providing confidence intervals. Scaling up posterior inference to massive matrices is challenging and requires distributing both data and computation over many workers, making communication the main computational bottleneck. Embarrassingly parallel inference would remove the need for communication by running completely independent computations on different data subsets, but it suffers from the inherent unidentifiability of BMF solutions. We introduce a hierarchical decomposition of the joint posterior distribution, which couples the subset inferences while still allowing embarrassingly parallel computation in a sequence of at most three stages. Using an efficient approximate implementation, we demonstrate empirical improvements on both real and simulated data. Our distributed approach achieves a speed-up of almost an order of magnitude over full posterior inference, with a negligible effect on predictive accuracy. The method outperforms state-of-the-art embarrassingly parallel MCMC methods in accuracy, and achieves results competitive with other available distributed and parallel implementations of BMF.
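To make the low-rank model concrete: BMF posits X ≈ UVᵀ with priors on the factor matrices, and predicts missing entries from the inferred factors. The sketch below is a deliberately simplified stand-in, not the paper's method: it uses alternating ridge regressions on the observed entries (a MAP-style point estimate rather than MCMC posterior inference, with no distribution over workers), and all sizes and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data (illustrative): rank-2 ground truth with ~30% of entries missing.
n, m, k = 50, 40, 2
U_true = rng.normal(size=(n, k))
V_true = rng.normal(size=(m, k))
X = U_true @ V_true.T
mask = rng.random((n, m)) < 0.7  # True where the entry is observed

def factorize(X, mask, k, lam=0.1, iters=50, seed=1):
    """Alternating ridge regressions on observed entries.

    A MAP approximation under Gaussian priors -- a simplified stand-in
    for the Bayesian posterior inference described in the abstract.
    """
    rng = np.random.default_rng(seed)
    U = rng.normal(scale=0.1, size=(X.shape[0], k))
    V = rng.normal(scale=0.1, size=(X.shape[1], k))
    reg = lam * np.eye(k)
    for _ in range(iters):
        # Update each row factor from its observed entries.
        for i in range(X.shape[0]):
            Vo = V[mask[i]]
            U[i] = np.linalg.solve(Vo.T @ Vo + reg, Vo.T @ X[i, mask[i]])
        # Update each column factor symmetrically.
        for j in range(X.shape[1]):
            Uo = U[mask[:, j]]
            V[j] = np.linalg.solve(Uo.T @ Uo + reg, Uo.T @ X[mask[:, j], j])
    return U, V

U, V = factorize(X, mask, k)
pred = U @ V.T
rmse_missing = np.sqrt(np.mean((pred[~mask] - X[~mask]) ** 2))
```

The unidentifiability the abstract refers to is visible even here: (UR)(VR)ᵀ = UVᵀ for any orthogonal R, so factors fitted independently on different data subsets need not be aligned, which is why naive embarrassingly parallel inference fails and a coupling mechanism such as the paper's hierarchical decomposition is needed.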

Original language: English
Pages (from-to): 1-26
Journal: Machine Learning
DOIs
Publication status: Published - 1 Jan 2019
MoE publication type: A1 Journal article, refereed

Keywords

  • Bayesian matrix factorization
  • Distributed inference
  • Embarrassingly parallel MCMC
  • Posterior propagation
