Differentiable Particle Filtering via Entropy-Regularized Optimal Transport

Adrien Corenflos, James Thornton, George Deligiannidis, Arnaud Doucet

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

Particle Filtering (PF) methods are an established class of procedures for performing inference in non-linear state-space models. Resampling is a key ingredient of PF, necessary to obtain low-variance likelihood and state estimates. However, traditional resampling methods result in PF-based loss functions that are non-differentiable with respect to model and PF parameters. In a variational inference context, resampling also yields high-variance gradient estimates of the PF-based evidence lower bound. By leveraging optimal transport ideas, we introduce a principled differentiable particle filter and provide convergence results. We demonstrate this novel method on a variety of applications.
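The core idea described in the abstract is to replace the discontinuous resampling step with a deterministic, differentiable transport of the weighted particles onto an equally weighted set, computed via Sinkhorn iterations on an entropy-regularized optimal transport problem. The sketch below illustrates this idea only; it is not the authors' implementation, and the names sinkhorn, ot_resample, eps and n_iter, as well as the choice of a squared Euclidean ground cost, are illustrative assumptions.

# Minimal sketch (not the paper's code) of differentiable resampling via
# entropy-regularized optimal transport. Assumes n particles in R^d and a
# squared Euclidean ground cost; eps and n_iter are illustrative defaults.
import jax
import jax.numpy as jnp

def sinkhorn(cost, log_a, log_b, eps=0.1, n_iter=100):
    # Entropy-regularized OT plan between distributions with log-weights
    # log_a (rows) and log_b (columns), using log-domain Sinkhorn updates
    # for numerical stability.
    f = jnp.zeros_like(log_a)
    g = jnp.zeros_like(log_b)
    for _ in range(n_iter):
        f = -eps * jax.scipy.special.logsumexp(
            (g[None, :] - cost) / eps + log_b[None, :], axis=1)
        g = -eps * jax.scipy.special.logsumexp(
            (f[:, None] - cost) / eps + log_a[:, None], axis=0)
    # Plan P_ij = a_i * b_j * exp((f_i + g_j - C_ij) / eps).
    return jnp.exp((f[:, None] + g[None, :] - cost) / eps
                   + log_a[:, None] + log_b[None, :])

def ot_resample(particles, log_weights, eps=0.1, n_iter=100):
    # Differentiable stand-in for multinomial resampling: transport the
    # weighted particle cloud onto uniform weights.
    n = particles.shape[0]
    log_w = log_weights - jax.scipy.special.logsumexp(log_weights)
    log_uniform = -jnp.log(n) * jnp.ones(n)
    cost = jnp.sum((particles[:, None, :] - particles[None, :, :]) ** 2, axis=-1)
    plan = sinkhorn(cost, log_w, log_uniform, eps, n_iter)
    # Each resampled particle is a weighted average of the old particles,
    # so gradients flow through this step.
    return n * plan.T @ particles

Because every operation above is smooth in the particles and weights, gradients of a PF-based likelihood or evidence lower bound can propagate through the resampling step, which is the property the paper exploits.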

Original language: English
Title of host publication: Proceedings of Machine Learning Research
Editors: M. Meila, T. Zhang
Publisher: JMLR
Number of pages: 12
Publication status: Published - 2021
MoE publication type: A4 Conference publication
Event: International Conference on Machine Learning - Virtual, Online
Duration: 18 Jul 2021 – 24 Jul 2021
Conference number: 38

Publication series

Name: Proceedings of Machine Learning Research
Volume: 139
ISSN (Electronic): 2640-3498

Conference

Conference: International Conference on Machine Learning
Abbreviated title: ICML
City: Virtual, Online
Period: 18/07/2021 – 24/07/2021

Keywords

  • LIKELIHOOD EVALUATION
