2 Citations (Scopus)
71 Downloads (Pure)

Abstract

We propose and analyse a reduced-rank method for solving least-squares regression problems with infinite-dimensional output. We derive learning bounds for our method and study the settings in which its statistical performance improves on that of the full-rank method. Our analysis extends the interest of reduced-rank regression beyond the standard low-rank setting to more general output regularity assumptions. We illustrate our theoretical insights on synthetic least-squares problems. We then propose a surrogate structured prediction method derived from this reduced-rank approach and assess its benefits on three different problems: image reconstruction, multi-label classification, and metabolite identification.
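As a point of reference for the reduced-rank idea sketched in the abstract, the snippet below shows a minimal finite-dimensional version of reduced-rank least-squares regression: fit a (ridge-regularised) full-rank solution, then project the fitted outputs onto their top-r principal directions. This is a standard construction used only to illustrate the idea; it is not the paper's infinite-dimensional estimator, and the function name, the `rank` and `reg` parameters, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def fit_rrr(X, Y, rank, reg=1e-6):
    """Reduced-rank ridge least squares (illustrative sketch):
    returns a coefficient matrix B of shape (d_in, d_out) with rank <= `rank`."""
    d = X.shape[1]
    # Full-rank (ridge-regularised) least-squares solution.
    B_full = np.linalg.solve(X.T @ X + reg * np.eye(d), X.T @ Y)
    # Top-r principal directions of the fitted outputs X @ B_full.
    _, _, Vt = np.linalg.svd(X @ B_full, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]      # rank-r projector in output space
    return B_full @ P                # reduced-rank coefficient matrix

# Usage on synthetic data whose output map has low rank (hypothetical example).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
B_true = rng.normal(size=(10, 3)) @ rng.normal(size=(3, 50))   # rank-3 map
Y = X @ B_true + 0.1 * rng.normal(size=(200, 50))
B_hat = fit_rrr(X, Y, rank=3)
print(np.linalg.matrix_rank(B_hat))  # <= 3
```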
Original language: English
Article number: 344
Pages (from-to): 1-50
Number of pages: 50
Journal: Journal of Machine Learning Research
Volume: 23
Publication status: Published - 2022
MoE publication type: A1 Journal article-refereed
