Abstract
Devoted to multi-task learning and structured output learning, operator-valued kernels provide a flexible tool for building vector-valued functions in the context of Reproducing Kernel Hilbert Spaces. To scale up these methods, we extend the celebrated Random Fourier Feature methodology to obtain an approximation of operator-valued kernels. We propose a general principle for Operator-valued Random Fourier Feature construction, relying on a generalization of Bochner's theorem for translation-invariant operator-valued Mercer kernels. We prove the uniform convergence of the kernel approximation for bounded and unbounded operator random Fourier features using an appropriate Bernstein matrix concentration inequality. An experimental proof-of-concept shows the quality of the approximation and the efficiency of the corresponding linear models on example datasets.
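The paper's construction generalizes the classical scalar Random Fourier Feature method, which by Bochner's theorem approximates a translation-invariant kernel via random samples from its spectral measure. The sketch below illustrates only that scalar baseline for the Gaussian kernel, not the operator-valued construction of the paper; the feature dimension `D` and bandwidth `sigma` are illustrative choices.

```python
import numpy as np

def rff_features(X, D=500, sigma=1.0, rng=None):
    """Map rows of X to D random Fourier features whose inner products
    approximate the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(rng)
    d = X.shape[1]
    # Bochner's theorem: the Gaussian kernel's spectral measure is
    # N(0, sigma^{-2} I); sample frequencies from it and random phases.
    W = rng.normal(scale=1.0 / sigma, size=(d, D))
    b = rng.uniform(0.0, 2.0 * np.pi, size=D)
    return np.sqrt(2.0 / D) * np.cos(X @ W + b)

# Compare the feature-map approximation with the exact kernel matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = rff_features(X, D=5000, sigma=1.0, rng=1)
K_approx = Z @ Z.T
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-sq_dists / 2.0)
err = np.abs(K_approx - K_exact).max()
```

A linear model trained on `Z` then approximates the corresponding kernel machine at a fraction of the cost; the paper extends this idea so that the features become matrix-valued and their products approximate an operator-valued kernel.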
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 8th Asian Conference on Machine Learning |
| Editors | Bob Durrant, Kee-Eung Kim |
| Publisher | JMLR |
| Pages | 110-125 |
| Publication status | Published - 2016 |
| MoE publication type | A4 Conference publication |
| Event | Asian Conference on Machine Learning, Hamilton, New Zealand, 16 Nov 2016 → 18 Nov 2016 (Conference number: 8) |
Publication series
| Name | Proceedings of Machine Learning Research |
|---|---|
| Publisher | PMLR |
| Volume | 63 |
| ISSN (Electronic) | 1938-7228 |
Conference
| Conference | Asian Conference on Machine Learning |
|---|---|
| Abbreviated title | ACML |
| Country/Territory | New Zealand |
| City | Hamilton |
| Period | 16/11/2016 → 18/11/2016 |
Fingerprint
Dive into the research topics of 'Random Fourier Features For Operator-Valued Kernels'. Together they form a unique fingerprint.