Embarrassingly parallel MCMC using deep invertible transformations

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Standard

Embarrassingly parallel MCMC using deep invertible transformations. / Mesquita, Diego; Blomstedt, Paul; Kaski, Samuel.

35th Conference on Uncertainty in Artificial Intelligence (UAI 2019). AUAI Press, 2019.


Harvard

Mesquita, D, Blomstedt, P & Kaski, S 2019, Embarrassingly parallel MCMC using deep invertible transformations. in 35th Conference on Uncertainty in Artificial Intelligence (UAI 2019). AUAI Press, Conference on Uncertainty in Artificial Intelligence, Tel Aviv, Israel, 22/07/2019.

APA

Mesquita, D., Blomstedt, P., & Kaski, S. (2019). Embarrassingly parallel MCMC using deep invertible transformations. In 35th Conference on Uncertainty in Artificial Intelligence (UAI 2019). AUAI Press.

Vancouver

Mesquita D, Blomstedt P, Kaski S. Embarrassingly parallel MCMC using deep invertible transformations. In 35th Conference on Uncertainty in Artificial Intelligence (UAI 2019). AUAI Press. 2019.

Author

Mesquita, Diego ; Blomstedt, Paul ; Kaski, Samuel. / Embarrassingly parallel MCMC using deep invertible transformations. 35th Conference on Uncertainty in Artificial Intelligence (UAI 2019). AUAI Press, 2019.

BibTeX

@inproceedings{ec76f4e7bbdf4d39b788c7721d6e46e6,
title = "Embarrassingly parallel MCMC using deep invertible transformations",
abstract = "While MCMC methods have become a main workhorse for Bayesian inference, scaling them to large distributed datasets is still a challenge. Embarrassingly parallel MCMC strategies take a divide-and-conquer stance to achieve this by writing the target posterior as a product of subposteriors, running MCMC for each of them in parallel, and subsequently combining the results. The challenge then lies in devising efficient aggregation strategies. Current strategies trade off approximation quality against the costs of communication and computation. In this work, we introduce a novel method that addresses these issues simultaneously. Our key insight is to introduce a deep invertible transformation to approximate each of the subposteriors. These approximations can be made accurate even for complex distributions and serve as intermediate representations, keeping the total communication cost limited. Moreover, they enable us to sample from the product of the subposteriors using an efficient and stable importance sampling scheme. We demonstrate that the approach outperforms available state-of-the-art methods in a range of challenging scenarios, including high-dimensional and heterogeneous subposteriors.",
author = "Diego Mesquita and Paul Blomstedt and Samuel Kaski",
year = "2019",
language = "English",
isbn = "9781510891562",
booktitle = "35th Conference on Uncertainty in Artificial Intelligence (UAI 2019)",
publisher = "AUAI Press",

}

RIS

TY - GEN

T1 - Embarrassingly parallel MCMC using deep invertible transformations

AU - Mesquita, Diego

AU - Blomstedt, Paul

AU - Kaski, Samuel

PY - 2019

Y1 - 2019

N2 - While MCMC methods have become a main workhorse for Bayesian inference, scaling them to large distributed datasets is still a challenge. Embarrassingly parallel MCMC strategies take a divide-and-conquer stance to achieve this by writing the target posterior as a product of subposteriors, running MCMC for each of them in parallel, and subsequently combining the results. The challenge then lies in devising efficient aggregation strategies. Current strategies trade off approximation quality against the costs of communication and computation. In this work, we introduce a novel method that addresses these issues simultaneously. Our key insight is to introduce a deep invertible transformation to approximate each of the subposteriors. These approximations can be made accurate even for complex distributions and serve as intermediate representations, keeping the total communication cost limited. Moreover, they enable us to sample from the product of the subposteriors using an efficient and stable importance sampling scheme. We demonstrate that the approach outperforms available state-of-the-art methods in a range of challenging scenarios, including high-dimensional and heterogeneous subposteriors.

AB - While MCMC methods have become a main workhorse for Bayesian inference, scaling them to large distributed datasets is still a challenge. Embarrassingly parallel MCMC strategies take a divide-and-conquer stance to achieve this by writing the target posterior as a product of subposteriors, running MCMC for each of them in parallel, and subsequently combining the results. The challenge then lies in devising efficient aggregation strategies. Current strategies trade off approximation quality against the costs of communication and computation. In this work, we introduce a novel method that addresses these issues simultaneously. Our key insight is to introduce a deep invertible transformation to approximate each of the subposteriors. These approximations can be made accurate even for complex distributions and serve as intermediate representations, keeping the total communication cost limited. Moreover, they enable us to sample from the product of the subposteriors using an efficient and stable importance sampling scheme. We demonstrate that the approach outperforms available state-of-the-art methods in a range of challenging scenarios, including high-dimensional and heterogeneous subposteriors.

UR - http://www.scopus.com/inward/record.url?scp=85073241801&partnerID=8YFLogxK

M3 - Conference contribution

SN - 9781510891562

BT - 35th Conference on Uncertainty in Artificial Intelligence (UAI 2019)

PB - AUAI Press

ER -

ID: 38171814
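The aggregation scheme described in the abstract — approximate each subposterior with a learned density, then importance-sample from the product of the approximations — can be illustrated with a toy 1-D sketch. This is not the paper's method: here cheap Gaussian fits stand in for the deep invertible transformations (normalizing flows), and all shard means and sample sizes are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy subposterior samples: pretend MCMC output from 3 data shards,
# each a 1-D Gaussian with a different mean (illustrative values only).
sub_samples = [rng.normal(loc=m, scale=1.0, size=20_000) for m in (-0.5, 0.0, 0.5)]

# Stand-in for the paper's deep invertible transformations: fit a simple
# parametric density to each shard's samples (Gaussian moment matching).
approx = [(s.mean(), s.std()) for s in sub_samples]

def log_q(x, mu, sigma):
    """Log-density of the fitted approximation N(mu, sigma^2)."""
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma) - 0.5 * np.log(2 * np.pi)

# Importance sampling: propose from the first fitted approximation and
# weight each draw by the product of the remaining factors, so that the
# weighted draws target prod_k q_k(x), i.e. the combined posterior.
mu0, sd0 = approx[0]
x = rng.normal(mu0, sd0, size=20_000)
log_w = sum(log_q(x, mu, sd) for mu, sd in approx[1:])
w = np.exp(log_w - log_w.max())  # subtract max for numerical stability
w /= w.sum()                     # self-normalized weights

post_mean = np.sum(w * x)
post_std = np.sqrt(np.sum(w * (x - post_mean) ** 2))
print(post_mean, post_std)  # ≈ 0.0 and 1/sqrt(3) ≈ 0.577 for this toy setup
```

For a product of unit-variance Gaussians the combined posterior has precision 3 and mean equal to the average of the shard means, so the estimates above can be checked in closed form; the paper replaces the Gaussian fits with flexible invertible networks so the same recipe works for complex, non-Gaussian subposteriors.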