Abstract
Particle smoothers are SMC (Sequential Monte Carlo) algorithms designed to approximate the joint distribution of the states given observations from a state-space model. We propose dSMC (de-Sequentialized Monte Carlo), a new particle smoother that is able to process T observations in O(log T) time on parallel architectures. This compares favorably with standard particle smoothers, the complexity of which is linear in T. We derive Lp convergence results for dSMC, with an explicit upper bound that is polynomial in T. We then discuss how to reduce the variance of the smoothing estimates computed by dSMC by (i) designing good proposal distributions for sampling the particles at the initialization of the algorithm, as well as by (ii) using lazy resampling to increase the number of particles used in dSMC. Finally, we design a particle Gibbs sampler based on dSMC, which is able to perform parameter inference in a state-space model at an O(log T) cost on parallel hardware.
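The O(log T) claim refers to parallel depth rather than total work. As a minimal, hypothetical sketch (not the dSMC algorithm itself), the JAX snippet below shows the kind of computation pattern that achieves O(log T) depth on parallel hardware: an associative scan over T elements, here with affine maps (A_t, b_t) standing in for the per-time-step combination rule. The operator and variable names are illustrative assumptions only.

```python
# Illustrative sketch only: an associative scan runs in O(log T) parallel depth.
# This is NOT an implementation of dSMC; it merely demonstrates the
# parallel-in-time complexity regime discussed in the abstract.
import jax
import jax.numpy as jnp


def combine(elem_a, elem_b):
    # Hypothetical associative operator: composition of affine maps x -> A x + b.
    # associative_scan calls this on slices with a leading batch dimension,
    # so the operations below are written to broadcast over that dimension.
    A1, b1 = elem_a
    A2, b2 = elem_b
    A = A2 @ A1
    b = jnp.einsum("...ij,...j->...i", A2, b1) + b2
    return A, b


T, d = 1024, 2
key_A, key_b = jax.random.split(jax.random.PRNGKey(0))
As = jnp.eye(d) + 0.1 * jax.random.normal(key_A, (T, d, d))
bs = jax.random.normal(key_b, (T, d))

# All T prefix compositions are evaluated in O(log T) parallel depth.
prefix_A, prefix_b = jax.lax.associative_scan(combine, (As, bs))
print(prefix_A.shape, prefix_b.shape)  # (1024, 2, 2) (1024, 2)
```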
| Original language | English |
| --- | --- |
| Number of pages | 39 |
| Journal | Journal of Machine Learning Research |
| Volume | 23 |
| Publication status | Published - 1 Aug 2022 |
| MoE publication type | A1 Journal article-refereed |
Projects
- 1 Finished
- Parallel and distributed computing for Bayesian graphical models
Särkkä, S., Emzir, M., Corenflos, A., Hassan, S. S., Ma, X., Merkatas, C., Yaghoobi, F. & Yamin, A.
04/09/2019 → 31/12/2022
Project: Academy of Finland: Other research funding