State Space Expectation Propagation: Efficient Inference Schemes for Temporal Gaussian Processes

William Wilkinson, Paul Chang, Michael Riis Andersen, Arno Solin

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-reviewed



We formulate approximate Bayesian inference in non-conjugate temporal and spatio-temporal Gaussian process models as a simple parameter update rule applied during Kalman smoothing. This viewpoint encompasses most inference schemes, including expectation propagation (EP), the classical (Extended, Unscented, etc.) Kalman smoothers, and variational inference. We provide a unifying perspective on these algorithms, showing how replacing the power EP moment matching step with linearisation recovers the classical smoothers. EP provides some benefits over the traditional methods via the introduction of the so-called cavity distribution, and we combine these benefits with the computational efficiency of linearisation. We provide extensive empirical analysis demonstrating the efficacy of various algorithms under this unifying framework, and a fast implementation of all methods in JAX.
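
To make the "parameter update rule applied during Kalman smoothing" concrete, the sketch below illustrates one power-EP site refinement for a single scalar smoothed marginal: form the cavity by removing (a fraction of) the current site, moment-match the tilted distribution, and recover the new site parameters. This is a minimal, hypothetical illustration only, not the paper's implementation: the function name `ep_site_update` and the closed-form moment matching (which assumes a Gaussian likelihood) are assumptions made here for readability; a non-conjugate likelihood would require quadrature at step 2.

```python
from jax import jit


@jit
def ep_site_update(post_mean, post_var, site_mean, site_prec, y, lik_var, power=1.0):
    """One power-EP site refinement given the smoothed marginal q(f_n).

    Hypothetical sketch assuming a Gaussian likelihood N(y | f_n, lik_var),
    so the tilted moments are available in closed form.
    """
    # 1. Cavity: remove a fraction `power` of the current site from the marginal
    #    (working in natural parameters: precision and precision-weighted mean).
    cav_prec = 1.0 / post_var - power * site_prec
    cav_var = 1.0 / cav_prec
    cav_mean = cav_var * (post_mean / post_var - power * site_prec * site_mean)

    # 2. Moment matching: moments of the tilted distribution
    #    cavity(f) * N(y | f, lik_var)^power.
    tilted_var = 1.0 / (1.0 / cav_var + power / lik_var)
    tilted_mean = tilted_var * (cav_mean / cav_var + power * y / lik_var)

    # 3. New site: divide the tilted marginal by the cavity (natural parameters)
    #    and rescale by 1 / power.
    new_site_prec = (1.0 / tilted_var - 1.0 / cav_var) / power
    new_site_mean = (tilted_mean / tilted_var - cav_mean / cav_var) / (power * new_site_prec)
    return new_site_mean, new_site_prec
```

In the unified view described in the abstract, step 2 is the only piece that changes between algorithms: moment matching of the tilted distribution gives (power) EP, while replacing it with a Taylor or sigma-point linearisation of the likelihood recovers the classical Extended/Unscented smoothers.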
Original language: English
Title of host publication: Proceedings of the 37th International Conference on Machine Learning
Publication status: Published - 13 Jul 2020
MoE publication type: A4 Conference publication
Event: International Conference on Machine Learning - Vienna, Austria
Duration: 12 Jul 2020 - 18 Jul 2020
Conference number: 37

Publication series

Name: Proceedings of Machine Learning Research
ISSN (Electronic): 2640-3498


Conference: International Conference on Machine Learning
Abbreviated title: ICML

