Dual parameterization of sparse variational Gaussian processes

Vincent Adam*, Paul Chang*, Mohammad Emtiyaz Khan, Arno Solin

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Abstract

Sparse variational Gaussian process (SVGP) methods are a common choice for non-conjugate Gaussian process inference because of their computational benefits. In this paper, we improve their computational efficiency by using a dual parameterization where each data example is assigned dual parameters, similar to the site parameters used in expectation propagation. Our dual parameterization speeds up inference using natural gradient descent and provides a tighter evidence lower bound for hyperparameter learning. The approach has the same memory cost as current SVGP methods, but it is faster and more accurate.
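
To make the abstract's idea concrete, below is a minimal NumPy sketch of the kind of dual-parameterized natural-gradient update the paper builds on, in the spirit of conjugate-computation VI (Khan & Lin, 2017). This is not the authors' implementation: for brevity it uses a full (non-sparse) GP with a Bernoulli-logit likelihood, and all function and variable names (rbf, gh_expectations, lam1, lam2, rho) are illustrative assumptions. Each data example carries a pair of dual (site-like) parameters, and the natural-gradient step is a convex combination of the old dual parameters and expected log-likelihood gradients.

```python
# Hypothetical sketch: natural-gradient VI in a dual/site parameterization for a
# GP with a non-conjugate likelihood. Not the paper's code; full GP for brevity.
import numpy as np

def rbf(X, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel on 1-D inputs.
    d2 = (X[:, None] - X[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def gh_expectations(y, mu, var, n_quad=20):
    # Gauss-Hermite estimates of E_q[d log p(y|f)/df] and E_q[d^2 log p(y|f)/df^2]
    # under q(f_n) = N(mu_n, var_n), for a Bernoulli-logit likelihood.
    x, w = np.polynomial.hermite_e.hermegauss(n_quad)
    w = w / w.sum()
    f = mu[:, None] + np.sqrt(var)[:, None] * x[None, :]
    p = 1.0 / (1.0 + np.exp(-f))
    g = ((y[:, None] - p) * w).sum(-1)      # expected gradient
    h = (-(p * (1.0 - p)) * w).sum(-1)      # expected Hessian (non-positive)
    return g, h

# Toy classification data.
rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 40)
y = (np.sin(X) + 0.3 * rng.standard_normal(40) > 0).astype(float)

K = rbf(X) + 1e-6 * np.eye(40)
Kinv = np.linalg.inv(K)

# Dual ("site") parameters: one pair per data example, as in the abstract.
lam1 = np.zeros(40)
lam2 = 1e-2 * np.ones(40)   # stays positive, keeping the posterior valid

rho = 0.5                    # natural-gradient step size
for _ in range(50):
    # Posterior induced by the dual parameters (Gaussian precision identity):
    # V = (K^{-1} + diag(lam2))^{-1},  m = V lam1.
    V = np.linalg.inv(Kinv + np.diag(lam2))
    m = V @ lam1
    g, h = gh_expectations(y, m, np.diag(V))
    # CVI-style natural-gradient step expressed on the dual parameters.
    lam2 = (1 - rho) * lam2 + rho * (-h)
    lam1 = (1 - rho) * lam1 + rho * (g - h * m)

print("posterior mean (first 5):", np.round(m[:5], 3))
```

Note how the per-example dual parameters keep the memory cost at two scalars per data point, which is the property the abstract exploits in the sparse setting; the paper's contribution is the corresponding parameterization and tighter bound for inducing-point (sparse) posteriors.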
Original language: English
Title of host publication: Advances in Neural Information Processing Systems 34 (NeurIPS 2021)
Publisher: Curran Associates Inc.
Number of pages: 12
Publication status: Published - 2021
MoE publication type: A4 Conference publication
Event: Conference on Neural Information Processing Systems - Virtual, Online
Duration: 6 Dec 2021 - 14 Dec 2021
Conference number: 35
https://neurips.cc

Publication series

Name: Advances in Neural Information Processing Systems
Publisher: Morgan Kaufmann Publishers
ISSN (Print): 1049-5258

Conference

Conference: Conference on Neural Information Processing Systems
Abbreviated title: NeurIPS
City: Virtual, Online
Period: 06/12/2021 - 14/12/2021
Internet address: https://neurips.cc
