Large-Scale Sparse Kernel Canonical Correlation Analysis

Research output: Article in a book/conference proceedings, peer-reviewed

Standard

Large-Scale Sparse Kernel Canonical Correlation Analysis. / Uurtio, Viivi; Bhadra, Sahely; Rousu, Juho.

36th International Conference on Machine Learning, ICML 2019. 2019. pp. 6383-6391 (Proceedings of Machine Learning Research; Vol. 97).


Harvard

Uurtio, V, Bhadra, S & Rousu, J 2019, Large-Scale Sparse Kernel Canonical Correlation Analysis. in 36th International Conference on Machine Learning, ICML 2019. Proceedings of Machine Learning Research, vol. 97, pp. 6383-6391, Long Beach, United States, 09/06/2019.

APA

Uurtio, V., Bhadra, S., & Rousu, J. (2019). Large-Scale Sparse Kernel Canonical Correlation Analysis. In 36th International Conference on Machine Learning, ICML 2019 (pp. 6383-6391). (Proceedings of Machine Learning Research; Vol. 97).

Vancouver

Uurtio V, Bhadra S, Rousu J. Large-Scale Sparse Kernel Canonical Correlation Analysis. In: 36th International Conference on Machine Learning, ICML 2019. 2019. p. 6383-6391. (Proceedings of Machine Learning Research).

Author

Uurtio, Viivi ; Bhadra, Sahely ; Rousu, Juho. / Large-Scale Sparse Kernel Canonical Correlation Analysis. 36th International Conference on Machine Learning, ICML 2019. 2019. pp. 6383-6391 (Proceedings of Machine Learning Research).

BibTeX - Download

@inproceedings{2976ff2a04aa4591a3eb7aacc6c7221b,
title = "Large-Scale Sparse Kernel Canonical Correlation Analysis",
abstract = "This paper presents gradKCCA, a large-scale sparse non-linear canonical correlation method. Like Kernel Canonical Correlation Analysis (KCCA), our method finds non-linear relations through kernel functions, but it does not rely on a kernel matrix, a known bottleneck for scaling up kernel methods. gradKCCA corresponds to solving KCCA with the additional constraint that the canonical projection directions in the kernel-induced feature space have preimages in the original data space. Firstly, this modification allows us to very efficiently maximize kernel canonical correlation through an alternating projected gradient algorithm working in the original data space. Secondly, we can control the sparsity of the projection directions by constraining the ℓ1 norm of the preimages of the projection directions, facilitating the interpretation of the discovered patterns, which is not available through KCCA. Our empirical experiments demonstrate that gradKCCA outperforms state-of-the-art CCA methods in terms of speed and robustness to noise both in simulated and real-world datasets.",
author = "Viivi Uurtio and Sahely Bhadra and Juho Rousu",
year = "2019",
language = "English",
series = "Proceedings of Machine Learning Research",
publisher = "PMLR",
pages = "6383--6391",
booktitle = "36th International Conference on Machine Learning, ICML 2019",
}
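The abstract describes the core computational idea: alternate projected gradient steps on the preimages u and v directly in the original data space, avoiding the kernel matrix. A minimal NumPy sketch of such a scheme, using a homogeneous polynomial kernel and projecting only onto the unit sphere (the function name and all details are illustrative assumptions, not the authors' code; the paper's method additionally projects onto an ℓ1 ball to enforce sparsity):

```python
import numpy as np

def grad_kcca_sketch(X, Y, degree=1, n_iter=300, lr=0.1, seed=0):
    """Toy alternating projected gradient ascent for a gradKCCA-style
    objective: maximize the correlation between a = (X u)**degree and
    b = (Y v)**degree over unit-norm preimages u and v."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(X.shape[1]); u /= np.linalg.norm(u)
    v = rng.standard_normal(Y.shape[1]); v /= np.linalg.norm(v)

    def grad(M, w, other_score):
        # Gradient of corr((M w)**degree, other_score) with respect to w.
        s = M @ w
        a = s ** degree
        na = np.linalg.norm(a)
        nb = np.linalg.norm(other_score)
        rho = (a @ other_score) / (na * nb + 1e-12)
        g_a = other_score / (na * nb + 1e-12) - rho * a / (na ** 2 + 1e-12)
        return degree * (M.T @ (s ** (degree - 1) * g_a))

    for _ in range(n_iter):
        b = (Y @ v) ** degree
        u = u + lr * grad(X, u, b)   # gradient step on u ...
        u /= np.linalg.norm(u)       # ... then project back to the unit sphere
        a = (X @ u) ** degree
        v = v + lr * grad(Y, v, a)   # same alternating update for v
        v /= np.linalg.norm(v)

    a, b = (X @ u) ** degree, (Y @ v) ** degree
    rho = (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
    return u, v, rho
```

On synthetic two-view data sharing one latent signal, the recovered preimages concentrate on the correlated coordinates, mirroring the interpretability argument made in the abstract.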

RIS - Download

TY - GEN
T1 - Large-Scale Sparse Kernel Canonical Correlation Analysis
AU - Uurtio, Viivi
AU - Bhadra, Sahely
AU - Rousu, Juho
PY - 2019
Y1 - 2019
N2 - This paper presents gradKCCA, a large-scale sparse non-linear canonical correlation method. Like Kernel Canonical Correlation Analysis (KCCA), our method finds non-linear relations through kernel functions, but it does not rely on a kernel matrix, a known bottleneck for scaling up kernel methods. gradKCCA corresponds to solving KCCA with the additional constraint that the canonical projection directions in the kernel-induced feature space have preimages in the original data space. Firstly, this modification allows us to very efficiently maximize kernel canonical correlation through an alternating projected gradient algorithm working in the original data space. Secondly, we can control the sparsity of the projection directions by constraining the ℓ1 norm of the preimages of the projection directions, facilitating the interpretation of the discovered patterns, which is not available through KCCA. Our empirical experiments demonstrate that gradKCCA outperforms state-of-the-art CCA methods in terms of speed and robustness to noise both in simulated and real-world datasets.
AB - This paper presents gradKCCA, a large-scale sparse non-linear canonical correlation method. Like Kernel Canonical Correlation Analysis (KCCA), our method finds non-linear relations through kernel functions, but it does not rely on a kernel matrix, a known bottleneck for scaling up kernel methods. gradKCCA corresponds to solving KCCA with the additional constraint that the canonical projection directions in the kernel-induced feature space have preimages in the original data space. Firstly, this modification allows us to very efficiently maximize kernel canonical correlation through an alternating projected gradient algorithm working in the original data space. Secondly, we can control the sparsity of the projection directions by constraining the ℓ1 norm of the preimages of the projection directions, facilitating the interpretation of the discovered patterns, which is not available through KCCA. Our empirical experiments demonstrate that gradKCCA outperforms state-of-the-art CCA methods in terms of speed and robustness to noise both in simulated and real-world datasets.
M3 - Conference contribution
T3 - Proceedings of Machine Learning Research
SP - 6383
EP - 6391
BT - 36th International Conference on Machine Learning, ICML 2019
ER -
