State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes

William Wilkinson, Paul Chang, Michael Riis Andersen, Arno Solin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review


Abstract

We formulate approximate Bayesian inference in non-conjugate temporal and spatio-temporal Gaussian process models as a simple parameter update rule applied during Kalman smoothing. This viewpoint encompasses most inference schemes, including expectation propagation (EP), the classical (extended, unscented, etc.) Kalman smoothers, and variational inference. We provide a unifying perspective on these algorithms, showing how replacing the power EP moment-matching step with linearisation recovers the classical smoothers. EP offers benefits over the traditional methods via the introduction of the so-called cavity distribution, and we combine these benefits with the computational efficiency of linearisation. We provide extensive empirical analysis demonstrating the efficacy of various algorithms under this unifying framework, along with a fast implementation of all methods in JAX.
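The abstract casts inference as update rules applied during Kalman smoothing. The linear-Gaussian backbone this builds on is the forward Kalman filter followed by the backward Rauch-Tung-Striebel (RTS) smoother. Below is a minimal scalar numpy sketch of that backbone, assuming a 1-D random-walk state-space model; the function names and the model are illustrative and not taken from the paper's JAX implementation.

```python
import numpy as np

def kalman_filter(ys, A, Q, H, R, m0, P0):
    """Forward pass (predict-then-update) for a scalar linear-Gaussian SSM."""
    ms, Ps, mps, Pps = [], [], [], []
    m, P = m0, P0
    for y in ys:
        # Predict: propagate mean and variance through the transition model.
        mp = A * m
        Pp = A * P * A + Q
        # Update: condition on the observation y.
        S = H * Pp * H + R          # innovation variance
        K = Pp * H / S              # Kalman gain
        m = mp + K * (y - H * mp)
        P = Pp - K * S * K
        ms.append(m); Ps.append(P); mps.append(mp); Pps.append(Pp)
    return np.array(ms), np.array(Ps), np.array(mps), np.array(Pps)

def rts_smoother(ms, Ps, mps, Pps, A):
    """Backward (RTS) pass: refine filtered estimates using future data."""
    T = len(ms)
    sm, sP = ms.copy(), Ps.copy()
    for t in range(T - 2, -1, -1):
        G = Ps[t] * A / Pps[t + 1]  # smoother gain
        sm[t] = ms[t] + G * (sm[t + 1] - mps[t + 1])
        sP[t] = Ps[t] + G * (sP[t + 1] - Pps[t + 1]) * G
    return sm, sP

# Demo: noisy observations of a random walk (synthetic, for illustration only).
rng = np.random.default_rng(0)
T = 50
xs = np.cumsum(rng.normal(0.0, 1.0, T))          # latent random walk
ys = xs + rng.normal(0.0, 0.5, T)                # noisy observations
ms, Ps, mps, Pps = kalman_filter(ys, A=1.0, Q=1.0, H=1.0, R=0.25, m0=0.0, P0=10.0)
sm, sP = rts_smoother(ms, Ps, mps, Pps, A=1.0)
```

In the conjugate (Gaussian-likelihood) case the update step above is exact; the paper's contribution concerns what replaces it for non-conjugate likelihoods — power EP moment matching, or linearisation as in the classical extended/unscented smoothers.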
Original language: English
Title of host publication: Proceedings of the 37th International Conference on Machine Learning
Pages: 10270-10281
Publication status: Published - 13 Jul 2020
MoE publication type: A4 Article in a conference publication
Event: International Conference on Machine Learning - Vienna, Austria
Duration: 12 Jul 2020 - 18 Jul 2020
Conference number: 37

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 119
ISSN (Electronic): 2640-3498

Conference

Conference: International Conference on Machine Learning
Abbreviated title: ICML
Country: Austria
City: Vienna
Period: 12/07/2020 - 18/07/2020

