### Abstract

A common divide-and-conquer approach for Bayesian computation with big data is to partition the data, perform local inference for each piece separately, and combine the results to obtain a global posterior approximation. Although conceptually and computationally appealing, this approach requires splitting the prior across the local inferences; the resulting weakened priors may not provide enough regularization for each separate computation, eliminating one of the key advantages of Bayesian methods. To resolve this dilemma while still retaining the generalizability of the underlying local inference method, we apply the idea of expectation propagation (EP) as a framework for distributed Bayesian inference. The central idea is to iteratively update approximations to the local likelihoods given the state of the other approximations and the prior.
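The iterative update described above can be illustrated with a minimal sketch, assuming the simplest possible setting: a one-dimensional Gaussian mean with known noise precision, data split into K shards, and Gaussian site approximations in natural parameters. The shard count, prior, and variable names here are illustrative, not taken from the paper; in this conjugate case the moment match at each site is exact.

```python
import numpy as np

# Hedged sketch of EP-style data partitioning: estimate a 1-D Gaussian mean.
# Natural parameters: (precision, precision * mean).

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=300)
shards = np.array_split(data, 3)       # K = 3 data partitions (illustrative)
noise_prec = 1.0                       # assumed known likelihood precision

prior = np.array([1.0, 0.0])           # N(0, 1) prior in natural parameters
sites = [np.zeros(2) for _ in shards]  # site approximations, start flat

for _ in range(5):                     # EP sweeps over the sites
    global_ = prior + sum(sites)
    for k, shard in enumerate(shards):
        cavity = global_ - sites[k]    # remove site k's own contribution
        # Tilted distribution: cavity times the exact shard likelihood.
        # With a Gaussian likelihood the moment match is exact:
        tilted = cavity + np.array([noise_prec * len(shard),
                                    noise_prec * shard.sum()])
        sites[k] = tilted - cavity     # updated site, in natural parameters
        global_ = cavity + sites[k]

prec, lin = prior + sum(sites)
print(lin / prec)                      # approximate posterior mean
```

Each site update uses the cavity distribution, i.e. the prior combined with the current approximations of all the *other* sites, which is what lets every local computation borrow regularization from the rest of the data.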

The present paper has two roles: we review the steps needed to keep EP algorithms numerically stable, and we suggest a general approach, inspired by EP, to data-partitioning problems that achieves the computational benefits of parallelism while allowing each local update to make use of relevant information from the other sites. In addition, we demonstrate how the method can be applied in a hierarchical context to make use of partitioning of both data and parameters. The paper describes a general algorithmic framework, rather than a specific algorithm, and presents an example implementation of it.

| Original language | English |
| --- | --- |
| Pages (from-to) | 1-53 |
| Number of pages | 53 |
| Journal | Journal of Machine Learning Research |
| Volume | 21 |
| Publication status | Published - 2020 |
| MoE publication type | A1 Journal article-refereed |

### Keywords

- Bayesian computation
- data partitioning
- expectation propagation
- hierarchical models
- statistical computing
- precision matrix
- classification
- likelihood
- regression
- models

## Cite this

*Journal of Machine Learning Research*, *21*, 1-53.