On the curse of dimensionality in supervised learning of smooth regression functions

Research output: Contribution to journal › Article

Abstract

In this paper, the effect of dimensionality on the supervised learning of infinitely differentiable regression functions is analyzed. By invoking the Van Trees lower bound, we prove lower bounds on the generalization error with respect to the number of samples and the dimensionality of the input space, in both linear and non-linear settings. It is shown that in non-linear problems without prior knowledge, the curse of dimensionality is a serious obstacle. At the same time, we speculate, counter-intuitively, that supervised learning can sometimes become plausible in the asymptotic limit of infinite dimensionality.
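The abstract refers to the Van Trees lower bound (the Bayesian Cramér-Rao bound). For context only, and not as a reproduction of the paper's specific result, a standard scalar form of the inequality is sketched below for an estimator \hat{\theta}(X) of a parameter \theta with prior density \lambda; here I(\theta) is the Fisher information of the observation model and I(\lambda) the information of the prior.

\[
\mathbb{E}\big[(\hat{\theta}(X) - \theta)^2\big] \;\ge\; \frac{1}{\mathbb{E}_{\lambda}[I(\theta)] + I(\lambda)},
\qquad
I(\lambda) = \int \frac{\lambda'(\theta)^2}{\lambda(\theta)}\, d\theta .
\]

Bounds of this type hold for any estimator, which is why they can be used, as in the paper, to derive minimax lower bounds on the generalization error; the paper's actual bounds concern smooth regression functions in high dimensions and are not reproduced here.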

Details

Original language: English
Pages (from-to): 133-154
Number of pages: 22
Journal: Neural Processing Letters
Volume: 34
Issue number: 2
Publication status: Published - Oct 2011
MoE publication type: A1 Journal article-refereed

Research areas

  • Analytic function, High dimensional, Minimax, Nonparametric regression, Supervised learning, Van Trees

ID: 12777777