Memory-Based Dual Gaussian Processes for Sequential Learning

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Abstract

Sequential learning with Gaussian processes (GPs) is challenging when access to past data is limited, for example, in continual and active learning. In such cases, errors can accumulate over time due to inaccuracies in the posterior, hyperparameters, and inducing points, making accurate learning difficult. Here, we present a method to keep all such errors in check using the recently proposed dual sparse variational GP. Our method enables accurate inference for generic likelihoods and improves learning by actively building and updating a memory of past data. We demonstrate its effectiveness in several applications involving Bayesian optimization, active learning, and continual learning.
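To give a flavor of the memory idea the abstract describes, here is a minimal, illustrative sketch of sequential GP regression with a fixed-capacity memory of past data. It is not the paper's dual sparse variational GP: the `MemoryGP` class, the kernel hyperparameters, and the leave-one-out eviction rule (drop the point the rest of the memory already explains best) are all assumptions made for this toy example.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel k(x, x') = s^2 exp(-||x - x'||^2 / (2 l^2))."""
    d2 = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

class MemoryGP:
    """Toy streaming GP regression with a fixed-size memory of past points.

    When the memory is full, the most redundant point is evicted using a
    closed-form leave-one-out score. Illustrative only; not the paper's
    dual-parameterization update.
    """

    def __init__(self, capacity=30, noise=0.1, lengthscale=1.0, variance=1.0):
        self.capacity, self.noise = capacity, noise
        self.lengthscale, self.variance = lengthscale, variance
        self.X = np.empty((0, 1))
        self.y = np.empty((0,))

    def posterior(self, Xs):
        """GP posterior mean and variance at test inputs Xs, using the memory."""
        if len(self.y) == 0:
            return np.zeros(len(Xs)), np.full(len(Xs), self.variance)
        K = rbf_kernel(self.X, self.X, self.lengthscale, self.variance)
        K += self.noise**2 * np.eye(len(self.y))
        Ks = rbf_kernel(Xs, self.X, self.lengthscale, self.variance)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, self.y))
        v = np.linalg.solve(L, Ks.T)
        return Ks @ alpha, self.variance - np.sum(v**2, axis=0)

    def update(self, x, y):
        """Add observation (x, y); evict the most redundant point if over capacity."""
        self.X = np.vstack([self.X, x[None, :]])
        self.y = np.append(self.y, y)
        if len(self.y) > self.capacity:
            K = rbf_kernel(self.X, self.X, self.lengthscale, self.variance)
            K += self.noise**2 * np.eye(len(self.y))
            A = np.linalg.inv(K)
            # Standardized leave-one-out residual: (A y)_i / sqrt(A_ii).
            # A small value means the rest of the memory already predicts
            # this point well, so it is safe to drop.
            score = np.abs(A @ self.y) / np.sqrt(np.diag(A))
            i = int(np.argmin(score))
            self.X = np.delete(self.X, i, axis=0)
            self.y = np.delete(self.y, i)

# Usage: stream 200 noisy observations of sin(2x), keeping only 30 in memory.
rng = np.random.default_rng(0)
model = MemoryGP(capacity=30)
for _ in range(200):
    x = rng.uniform(-3.0, 3.0, size=1)
    model.update(x, np.sin(2.0 * x[0]) + 0.1 * rng.standard_normal())
mean, var = model.posterior(np.linspace(-3.0, 3.0, 5)[:, None])
```

The eviction score here is a standard heuristic for picking representative points; the paper instead controls errors in the posterior, hyperparameters, and inducing points jointly through the dual parameterization.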
Original language: English
Title of host publication: Proceedings of the 40th International Conference on Machine Learning
Editors: Andreas Krause, Emma Brunskill, Kyunghyun Cho, Barbara Engelhardt, Sivan Sabato, Jonathan Scarlett
Publisher: JMLR
Pages: 4035-4054
Number of pages: 20
Publication status: Published - Jul 2023
MoE publication type: A4 Conference publication
Event: International Conference on Machine Learning - Honolulu, United States
Duration: 23 Jul 2023 - 29 Jul 2023
Conference number: 40

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 202
ISSN (Electronic): 2640-3498

Conference

Conference: International Conference on Machine Learning
Abbreviated title: ICML
Country/Territory: United States
City: Honolulu
Period: 23/07/2023 - 29/07/2023
