Random Fourier Features For Operator-Valued Kernels

Romain Brault, Markus Heinonen, Florence d'Alché-Buc

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-reviewed


Abstract

Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool to build vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to obtain an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction that relies on a generalization of Bochner's theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using an appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
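
For readers unfamiliar with the scalar technique being generalized, the sketch below illustrates classical Random Fourier Features for a Gaussian kernel, plus how a separable (decomposable) operator-valued kernel K(x, y) = k(x, y) A can be approximated from the same features. This is a minimal illustrative sketch under those assumptions, not the authors' construction, which covers a broader class of translation-invariant operator-valued Mercer kernels; the variable names and the choice A = I are hypothetical.

```python
import numpy as np

def rff_features(X, omegas, b):
    """Classical (scalar) Random Fourier Features for the Gaussian kernel:
    phi(x) = sqrt(2/D) * cos(omega^T x + b), so phi(x) . phi(y) ~ k(x, y)."""
    return np.sqrt(2.0 / omegas.shape[1]) * np.cos(X @ omegas + b)

rng = np.random.default_rng(0)
d, D, sigma = 5, 2000, 1.0                            # input dim, number of features, bandwidth
omegas = rng.normal(scale=1.0 / sigma, size=(d, D))   # spectral samples given by Bochner's theorem
b = rng.uniform(0.0, 2.0 * np.pi, size=D)             # random phases

x, y = rng.normal(size=(2, d))
approx = rff_features(x[None, :], omegas, b) @ rff_features(y[None, :], omegas, b).T
exact = np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))
print(approx.item(), exact)                           # close for large D

# Hypothetical extension to a separable (decomposable) operator-valued kernel
# K(x, y) = k(x, y) * A, with A a p x p positive semi-definite output matrix:
# use the Kronecker product of the scalar features with a factor B of A = B B^T.
p = 3
A = np.eye(p)                                         # assumed output-similarity matrix
B = np.linalg.cholesky(A)
Phi_x = np.kron(rff_features(x[None, :], omegas, b), B)   # shape (p, D * p)
Phi_y = np.kron(rff_features(y[None, :], omegas, b), B)
K_approx = Phi_x @ Phi_y.T                            # approximates k(x, y) * A
print(np.allclose(K_approx, approx * A))
```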
Original language: English
Title of host publication: Proceedings of the 8th Asian Conference on Machine Learning
Editors: Bob Durrant, Kee-Eung Kim
Publisher: JMLR
Pages: 110-125
Publication status: Published - 2016
MoE publication type: A4 Conference publication
Event: Asian Conference on Machine Learning - Hamilton, New Zealand
Duration: 16 Nov 2016 - 18 Nov 2016
Conference number: 8

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 63
ISSN (Electronic): 1938-7228

Conference

Conference: Asian Conference on Machine Learning
Abbreviated title: ACML
Country/Territory: New Zealand
City: Hamilton
Period: 16/11/2016 - 18/11/2016
