Abstract
The expressive power of Gaussian processes depends heavily on the choice of kernel. In this work we propose the harmonizable mixture kernel (HMK), a novel family of expressive, interpretable, non-stationary kernels derived from mixture models on the generalized spectral representation. As a theoretically sound treatment of non-stationarity, HMK supports harmonizable covariances, a broad class of kernels that includes all stationary and many non-stationary covariances. We also propose variational Fourier features, an inter-domain sparse GP inference framework built on a representative set of 'inducing frequencies'. We show that harmonizable mixture kernels interpolate between local patterns, and that variational Fourier features provide a robust kernel-learning framework for the new kernel family.
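As background (not part of the publication record itself), the "generalized spectral representation" the abstract refers to is conventionally stated via Loève's theorem; a minimal sketch of that standard form, with µ denoting a generalized spectral measure on the product frequency space, is:

```latex
% Loève's generalized spectral representation of a harmonizable covariance
% (standard background, not a formula quoted from this paper):
% the kernel is an integral against a spectral measure \mu over pairs of
% frequencies; stationarity is the special case where \mu concentrates on
% the diagonal \omega = \omega', recovering Bochner's theorem.
k(x, x') = \int_{\mathbb{R}^D} \int_{\mathbb{R}^D}
           e^{\, i (\omega^\top x - \omega'^\top x')} \, \mu(\mathrm{d}\omega, \mathrm{d}\omega')
```

Placing a mixture model on µ then yields a kernel family spanning both stationary and non-stationary covariances, which is the construction the abstract describes.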
| Original language | English |
| --- | --- |
| Title of host publication | The 22nd International Conference on Artificial Intelligence and Statistics |
| Pages | 1812-1821 |
| Publication status | Published - May 2019 |
| MoE publication type | A4 Article in a conference publication |
| Event | International Conference on Artificial Intelligence and Statistics, Naha, Japan |
| Duration | 16 Apr 2019 → 18 Apr 2019 |
| Conference number | 22 |
Publication series
| Name | Proceedings of Machine Learning Research |
| --- | --- |
| Publisher | PMLR |
| Volume | 89 |
| ISSN (Electronic) | 2640-3498 |
Conference
| Conference | International Conference on Artificial Intelligence and Statistics |
| --- | --- |
| Abbreviated title | AISTATS |
| Country | Japan |
| City | Naha |
| Period | 16/04/2019 → 18/04/2019 |
Keywords
- Kernel methods
- Gaussian Processes