On the curse of dimensionality in supervised learning of smooth regression functions

Research output: Contribution to journal › Article › Scientific › peer-reviewed

Standard

On the curse of dimensionality in supervised learning of smooth regression functions. / Liitiäinen, Elia; Corona, Francesco; Lendasse, Amaury.

In: Neural Processing Letters, Vol. 34, No. 2, 10.2011, p. 133-154.


BibTeX

@article{6285c9b36e004fe295bd7dcd7d45538b,
title = "On the curse of dimensionality in supervised learning of smooth regression functions",
abstract = "In this paper, the effect of dimensionality on the supervised learning of infinitely differentiable regression functions is analyzed. By invoking the Van Trees lower bound, we prove lower bounds on the generalization error with respect to the number of samples and the dimensionality of the input space both in a linear and non-linear context. It is shown that in non-linear problems without prior knowledge, the curse of dimensionality is a serious problem. At the same time, we speculate counter-intuitively that sometimes supervised learning becomes plausible in the asymptotic limit of infinite dimensionality.",
keywords = "Analytic function, High dimensional, Minimax, Nonparametric regression, Supervised learning, Van Trees",
author = "Elia Liiti{\"a}inen and Francesco Corona and Amaury Lendasse",
year = "2011",
month = oct,
doi = "10.1007/s11063-011-9188-7",
language = "English",
volume = "34",
pages = "133--154",
journal = "Neural Processing Letters",
issn = "1370-4621",
publisher = "Springer Netherlands",
number = "2",

}

RIS

TY - JOUR

T1 - On the curse of dimensionality in supervised learning of smooth regression functions

AU - Liitiäinen, Elia

AU - Corona, Francesco

AU - Lendasse, Amaury

PY - 2011/10

Y1 - 2011/10

N2 - In this paper, the effect of dimensionality on the supervised learning of infinitely differentiable regression functions is analyzed. By invoking the Van Trees lower bound, we prove lower bounds on the generalization error with respect to the number of samples and the dimensionality of the input space both in a linear and non-linear context. It is shown that in non-linear problems without prior knowledge, the curse of dimensionality is a serious problem. At the same time, we speculate counter-intuitively that sometimes supervised learning becomes plausible in the asymptotic limit of infinite dimensionality.

AB - In this paper, the effect of dimensionality on the supervised learning of infinitely differentiable regression functions is analyzed. By invoking the Van Trees lower bound, we prove lower bounds on the generalization error with respect to the number of samples and the dimensionality of the input space both in a linear and non-linear context. It is shown that in non-linear problems without prior knowledge, the curse of dimensionality is a serious problem. At the same time, we speculate counter-intuitively that sometimes supervised learning becomes plausible in the asymptotic limit of infinite dimensionality.

KW - Analytic function

KW - High dimensional

KW - Minimax

KW - Nonparametric regression

KW - Supervised learning

KW - Van Trees

UR - http://www.scopus.com/inward/record.url?scp=80053568910&partnerID=8YFLogxK

U2 - 10.1007/s11063-011-9188-7

DO - 10.1007/s11063-011-9188-7

M3 - Article

VL - 34

SP - 133

EP - 154

JO - Neural Processing Letters

JF - Neural Processing Letters

SN - 1370-4621

IS - 2

ER -

ID: 12777777