Abstract
Gaussian process (GP) regression with 1D inputs can often be performed in linear time via a stochastic differential equation formulation. However, for non-Gaussian likelihoods, this requires application of approximate inference methods which can make the implementation difficult, e.g., expectation propagation can be numerically unstable and variational inference can be computationally inefficient. In this paper, we propose a new method that removes such difficulties. Building upon an existing method called conjugate-computation variational inference, our approach enables linear-time inference via Kalman recursions while avoiding numerical instabilities and convergence issues. We provide an efficient JAX implementation which exploits just-in-time compilation and allows for fast automatic differentiation through large for-loops. Overall, our approach leads to fast and stable variational inference in state-space GP models that can be scaled to time series with millions of data points.
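The linear-time inference mentioned above comes from running Kalman recursions over the state-space (SDE) form of the GP. Below is a minimal, hypothetical sketch (not the authors' code) of that idea in JAX: a Kalman filter written with `jax.lax.scan`, which lets `jax.jit` compile the recursion without unrolling the loop, so it scales to very long series and remains differentiable. For simplicity it assumes a Gaussian likelihood and a toy local-level model (`A`, `Q`, `H`, `R` are illustrative); in the paper's setting, conjugate-computation variational inference would replace the likelihood terms with iteratively updated Gaussian sites, but the scan structure is the same.

```python
# Hypothetical sketch: linear-time Kalman filtering for a state-space GP
# using jax.lax.scan, so jax.jit compiles the recursion without unrolling
# the for-loop and gradients flow through the whole filter.
import jax
import jax.numpy as jnp


def kalman_filter(A, Q, H, R, m0, P0, ys):
    """Filter observations ys (shape (T, 1)); returns the marginal
    log-likelihood and the filtered means. Names are illustrative."""

    def step(carry, y):
        m, P, ll = carry
        # Predict step
        mp = A @ m
        Pp = A @ P @ A.T + Q
        # Update step (scalar observation, so S is 1x1)
        v = y - H @ mp
        S = H @ Pp @ H.T + R
        K = Pp @ H.T / S
        m_new = mp + K @ v
        P_new = Pp - K @ S @ K.T
        ll_new = ll - 0.5 * (jnp.log(2.0 * jnp.pi * S[0, 0]) + v[0] ** 2 / S[0, 0])
        return (m_new, P_new, ll_new), m_new

    (_, _, ll), means = jax.lax.scan(step, (m0, P0, jnp.zeros(())), ys)
    return ll, means


# Toy local-level state-space model (illustrative parameters only).
dt = 0.1
A = jnp.array([[1.0, dt], [0.0, 1.0]])
Q = 0.01 * jnp.eye(2)
H = jnp.array([[1.0, 0.0]])
R = jnp.array([[0.1]])
m0 = jnp.zeros(2)
P0 = jnp.eye(2)

# Synthetic observations; cost is O(T) in the series length.
ys = jax.random.normal(jax.random.PRNGKey(0), (100, 1))

kf = jax.jit(kalman_filter)  # JIT-compiles the whole scan once
ll, means = kf(A, Q, H, R, m0, P0, ys)
```

Because `lax.scan` represents the loop as a single compiled primitive, reverse-mode differentiation through all `T` steps is cheap, which is what makes gradient-based variational updates practical on series with millions of points.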
Original language | English |
---|---|
Title of host publication | Proceedings of the 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing, MLSP 2020 |
Publisher | IEEE |
ISBN (electronic) | 978-1-7281-6662-9 |
DOI (permanent link) | |
Status | Published - 23 Sep 2020 |
Publication type (Finnish Ministry of Education classification) | A4 Article in conference proceedings |
Event | IEEE International Workshop on Machine Learning for Signal Processing - Aalto University, Espoo, Finland. Duration: 21 Sep 2020 → 24 Sep 2020. Conference number: 30. https://ieeemlsp.cc |
Workshop
Workshop | IEEE International Workshop on Machine Learning for Signal Processing |
---|---|
Abbreviation | MLSP |
Country/Territory | Finland |
City | Espoo |
Period | 21/09/2020 → 24/09/2020 |
Web address |