Gaussian Approximations of SDEs in Metropolis-Adjusted Langevin Algorithms

Simo Särkkä, Christos Merkatas, Toni Karvonen

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Abstract

Markov chain Monte Carlo (MCMC) methods are a cornerstone of Bayesian inference and stochastic simulation. The Metropolis-adjusted Langevin algorithm (MALA) is an MCMC method that simulates, with the Euler-Maruyama scheme, a stochastic differential equation (SDE) whose stationary distribution is the desired target density, and accounts for simulation errors using a Metropolis step. In this paper we propose a modification of the MALA which uses Gaussian assumed density approximations for the integration of the SDE. The effectiveness of the algorithm is illustrated on simulated and real data sets.
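To make the baseline concrete, the following is a minimal sketch of standard MALA as described in the abstract: one Euler-Maruyama step of the Langevin SDE produces a Gaussian proposal, and a Metropolis-Hastings correction removes the discretisation error. This is the conventional algorithm, not the Gaussian assumed-density variant proposed in the paper; the function names and interface are illustrative, not from the source.

```python
import numpy as np


def mala(log_density, grad_log_density, x0, step_size, n_samples, rng=None):
    """Metropolis-adjusted Langevin algorithm (MALA).

    Proposals are one Euler-Maruyama step of the overdamped Langevin SDE
        dx = (1/2) * grad log pi(x) dt + dW,
    whose stationary distribution is pi; the Metropolis step corrects
    for the discretisation error of the scheme.
    """
    rng = np.random.default_rng(rng)
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    d = x.size
    samples = np.empty((n_samples, d))
    accepted = 0

    def log_q(x_to, x_from):
        # Log density (up to a constant) of the Gaussian Euler-Maruyama
        # proposal N(x_from + (h/2) grad log pi(x_from), h * I).
        mean = x_from + 0.5 * step_size * grad_log_density(x_from)
        diff = x_to - mean
        return -np.dot(diff, diff) / (2.0 * step_size)

    for i in range(n_samples):
        mean = x + 0.5 * step_size * grad_log_density(x)
        prop = mean + np.sqrt(step_size) * rng.standard_normal(d)
        # Metropolis-Hastings log acceptance ratio with the asymmetric proposal.
        log_alpha = (log_density(prop) - log_density(x)
                     + log_q(x, prop) - log_q(prop, x))
        if np.log(rng.uniform()) < log_alpha:
            x = prop
            accepted += 1
        samples[i] = x
    return samples, accepted / n_samples
```

For example, targeting a standard normal with `log_density = lambda x: -0.5 * float(x @ x)` and `grad_log_density = lambda x: -x` yields chains whose sample mean and standard deviation approach 0 and 1. The paper's modification would replace the explicit Euler-Maruyama proposal mean and covariance with a Gaussian assumed density approximation of the SDE transition.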
Original language: English
Title of host publication: 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing, MLSP 2021
Publisher: IEEE
Pages: 1-6
Number of pages: 6
ISBN (Electronic): 978-1-7281-6338-3
ISBN (Print): 978-1-6654-1184-4
DOIs
Publication status: Published - 15 Nov 2021
MoE publication type: A4 Conference publication
Event: IEEE International Workshop on Machine Learning for Signal Processing - Gold Coast, Australia
Duration: 25 Oct 2021 - 28 Oct 2021
Conference number: 31
https://2021.ieeemlsp.org/

Publication series

Name: Machine learning for signal processing
ISSN (Print): 1551-2541

Workshop

Workshop: IEEE International Workshop on Machine Learning for Signal Processing
Abbreviated title: MLSP
Country/Territory: Australia
City: Gold Coast
Period: 25/10/2021 - 28/10/2021
Internet address

Keywords

  • Monte Carlo methods
  • Machine learning algorithms
  • Heuristic algorithms
  • Signal processing algorithms
  • Machine learning
  • Signal processing
  • Markov processes
