The curse of dimensionality causes problems in machine learning whenever high-dimensional data are processed. This is particularly pronounced for intrinsically infinite-dimensional data such as spectra or other functional data. Feature selection is one way to address the problem. It often relies on mutual information to evaluate feature importance; for functional data, however, this measure can be overlaid by intrinsic biases such as the high correlation of neighboring function values. In this paper we propose to assess feature correlations of spectral data by overlaying the prior dependencies induced by the functional nature of the data with the similarity measured by mutual information, enabling a quick overall assessment of the relationships between features. By integrating the Nyström approximation technique, the usually time-consuming step of computing all pairwise mutual informations is reduced to a complexity that is only linear in the number of features.
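The abstract does not detail the authors' implementation, but the core idea of the Nyström speed-up can be sketched as follows: instead of evaluating all O(n²) pairwise feature similarities (e.g. mutual informations), one evaluates only the n·m similarities to m randomly chosen landmark features and reconstructs the full matrix from them. The function and variable names below are illustrative assumptions, not the paper's code; a generic smooth kernel stands in for the actual mutual-information estimator.

```python
import numpy as np

def nystrom_approx(similarity, n, m, rng):
    """Nystrom approximation of a symmetric n x n similarity matrix.

    similarity(i, j) -> float evaluates one pairwise entry (in the paper's
    setting, the mutual information between features i and j). Only O(n*m)
    entries are evaluated instead of all O(n^2) pairs.
    """
    landmarks = rng.choice(n, size=m, replace=False)
    # C: similarities between all n features and the m landmark features
    C = np.array([[similarity(i, j) for j in landmarks] for i in range(n)])
    # W: similarities among the landmark features themselves
    W = C[landmarks, :]
    # Nystrom reconstruction: K ~= C W^+ C^T  (W^+ = pseudoinverse of W)
    return C @ np.linalg.pinv(W) @ C.T

# Illustrative stand-in similarity: a smooth kernel over feature indices,
# mimicking the high correlation of neighboring spectral channels.
rng = np.random.default_rng(0)
sim = lambda i, j: float(np.exp(-((i - j) / 10.0) ** 2))
K_approx = nystrom_approx(sim, n=60, m=15, rng=rng)
```

The quality of the reconstruction depends on the effective rank of the similarity matrix; for spectral data, the strong dependence between neighboring channels is exactly what makes a low-rank Nyström approximation effective.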
|Title||ESANN 2013 proceedings, 21st European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning|
|Status||Published - 2013|
|OKM publication type||A4 Article in conference proceedings|
|Event||European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning - Bruges, Belgium|
Duration: 24 April 2013 → 26 April 2013