Dimension reduction for regression with bottleneck neural networks

Elina Parviainen*

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

    4 Citations (Scopus)


    Dimension reduction for regression (DRR) deals with the problem of finding low-dimensional representations of high-dimensional data that preserve the ability to predict a target variable. We propose doing DRR using a neural network with a low-dimensional "bottleneck" layer. While the network is trained for regression, the bottleneck learns a low-dimensional representation for the data. We compare our method to Covariance Operator Inverse Regression (COIR), which has been reported to perform well compared to many other DRR methods. The bottleneck network compares favorably with COIR: it is applicable to larger data sets, it is less sensitive to tuning parameters, and it gives better results on several real data sets.
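    The bottleneck idea described in the abstract can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' implementation: the layer sizes, the tanh nonlinearity, the synthetic data, and plain full-batch gradient descent are all assumptions made for the sketch. The key point is that the network is trained only for regression, yet the activations of the narrow middle layer serve as the low-dimensional representation.

    ```python
    import numpy as np

    # Toy sketch of a bottleneck network for DRR (illustrative assumptions
    # throughout; not the paper's exact architecture or training setup).
    rng = np.random.default_rng(0)

    # Synthetic data: 5-D inputs whose target depends on one latent direction.
    X = rng.normal(size=(200, 5))
    y = np.sin(X @ np.array([1.0, -0.5, 0.3, 0.0, 0.0]))[:, None]

    d_in, d_hid, d_bottle = 5, 16, 1  # bottleneck width = target dimension
    W1 = rng.normal(scale=0.5, size=(d_in, d_hid))      # input -> hidden
    W2 = rng.normal(scale=0.5, size=(d_hid, d_bottle))  # hidden -> bottleneck
    W3 = rng.normal(scale=0.5, size=(d_bottle, 1))      # bottleneck -> output

    def forward(X):
        H = np.tanh(X @ W1)     # hidden layer
        Z = H @ W2              # bottleneck: the low-dim representation
        pred = np.tanh(Z) @ W3  # regression output
        return H, Z, pred

    _, _, pred0 = forward(X)
    mse_init = float(np.mean((pred0 - y) ** 2))

    # Full-batch gradient descent on 0.5 * MSE, with manual backpropagation.
    lr = 0.5
    for _ in range(3000):
        H, Z, pred = forward(X)
        err = (pred - y) / len(X)                  # dLoss/dpred
        gW3 = np.tanh(Z).T @ err
        dZ = (err @ W3.T) * (1 - np.tanh(Z) ** 2)  # back through output tanh
        gW2 = H.T @ dZ
        dH = (dZ @ W2.T) * (1 - H ** 2)            # back through hidden tanh
        gW1 = X.T @ dH
        W1 -= lr * gW1
        W2 -= lr * gW2
        W3 -= lr * gW3

    # After training for regression, the bottleneck activations are the
    # learned 1-D embedding of the 5-D data.
    _, embedding, pred = forward(X)
    mse_final = float(np.mean((pred - y) ** 2))
    ```

    In this sketch the embedding falls out as a by-product of regression training, which is the contrast with methods like COIR that estimate the subspace directly from covariance operators.
    
    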

    Original language: English
    Title of host publication: Intelligent Data Engineering and Automated Learning, IDEAL 2010 - 11th International Conference, Proceedings
    Number of pages: 8
    Volume: 6283 LNCS
    Publication status: Published - 2010
    MoE publication type: A4 Article in a conference publication
    Event: International Conference on Intelligent Data Engineering and Automated Learning - Paisley, United Kingdom
    Duration: 1 Sep 2010 – 3 Sep 2010
    Conference number: 11

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Volume: 6283 LNCS
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349


    Conference: International Conference on Intelligent Data Engineering and Automated Learning
    Abbreviated title: IDEAL
    Country: United Kingdom


    • COIR
    • Dimension reduction for regression
    • Neural networks
    • Supervised dimension reduction
