Large-Scale Sparse Kernel Canonical Correlation Analysis

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Research units

  • Indian Institute of Technology Palakkad

Abstract

This paper presents gradKCCA, a large-scale sparse non-linear canonical correlation method. Like Kernel Canonical Correlation Analysis (KCCA), our method finds non-linear relations through kernel functions, but it does not rely on a kernel matrix, a known bottleneck for scaling up kernel methods. gradKCCA corresponds to solving KCCA with the additional constraint that the canonical projection directions in the kernel-induced feature space have preimages in the original data space. Firstly, this modification allows us to very efficiently maximize kernel canonical correlation through an alternating projected gradient algorithm working in the original data space. Secondly, we can control the sparsity of the projection directions by constraining the ℓ1 norm of the preimages of the projection directions, facilitating the interpretation of the discovered patterns, which is not available through KCCA. Our empirical experiments demonstrate that gradKCCA outperforms state-of-the-art CCA methods in terms of speed and robustness to noise on both simulated and real-world datasets.
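The optimization described in the abstract can be illustrated with a minimal sketch: maximize the correlation between kernel scores k(x_i, u) and k(y_i, v) by alternating projected gradient ascent over the preimages u and v in the original data space, with a sparsity-inducing projection on each preimage. The polynomial kernel, the finite-difference gradient, the step size, and the names poly_scores, project, and sparse_kcca below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def poly_scores(Z, w, degree=2):
    """Kernel scores k(z_i, w) = (z_i . w)^degree, one per row of Z (assumed polynomial kernel)."""
    return (Z @ w) ** degree

def correlation(a, b):
    """Pearson correlation between two score vectors."""
    a, b = a - a.mean(), b - b.mean()
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def project(w, l1_radius):
    """Crude sparsity projection: soft-threshold toward the l1 ball, then rescale to unit l2 norm."""
    if np.abs(w).sum() > l1_radius:
        lo, hi = 0.0, np.abs(w).max()
        for _ in range(50):  # bisection on the soft-threshold level
            mid = 0.5 * (lo + hi)
            if np.maximum(np.abs(w) - mid, 0.0).sum() > l1_radius:
                lo = mid
            else:
                hi = mid
        w = np.sign(w) * np.maximum(np.abs(w) - hi, 0.0)
    return w / (np.linalg.norm(w) + 1e-12)

def num_grad(Z, w, other, degree, eps=1e-6):
    """Finite-difference gradient of the correlation objective with respect to w."""
    base = correlation(poly_scores(Z, w, degree), other)
    g = np.zeros_like(w)
    for j in range(w.size):
        w_eps = w.copy()
        w_eps[j] += eps
        g[j] = (correlation(poly_scores(Z, w_eps, degree), other) - base) / eps
    return g

def sparse_kcca(X, Y, l1_x=3.0, l1_y=3.0, degree=2, n_iter=200, lr=0.5, seed=0):
    """Alternating projected gradient ascent on the kernel canonical correlation."""
    rng = np.random.default_rng(seed)
    u = project(rng.standard_normal(X.shape[1]), l1_x)
    v = project(rng.standard_normal(Y.shape[1]), l1_y)
    for _ in range(n_iter):
        u = project(u + lr * num_grad(X, u, poly_scores(Y, v, degree), degree), l1_x)
        v = project(v + lr * num_grad(Y, v, poly_scores(X, u, degree), degree), l1_y)
    return u, v, correlation(poly_scores(X, u, degree), poly_scores(Y, v, degree))

if __name__ == "__main__":
    # Synthetic views sharing one latent signal in their first feature.
    rng = np.random.default_rng(1)
    z = rng.standard_normal(500)
    X = np.column_stack([z] + [rng.standard_normal(500) for _ in range(4)])
    Y = np.column_stack([z + 0.1 * rng.standard_normal(500)] + [rng.standard_normal(500) for _ in range(4)])
    u, v, rho = sparse_kcca(X, Y)
    print("canonical correlation:", round(float(rho), 3))
    print("preimage u (expected to concentrate on the first feature):", np.round(u, 2))
```

The finite-difference gradient keeps the sketch short; a scalable implementation would use the analytic gradient of the correlation, as the abstract's emphasis on avoiding the kernel matrix implies.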

Details

Original language: English
Title of host publication: Proceedings of the 36th International Conference on Machine Learning
Publication status: Published - 2019
MoE publication type: A4 Article in a conference publication
Event: International Conference on Machine Learning - Long Beach, United States
Duration: 9 Jun 2019 - 15 Jun 2019
Conference number: 36

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 97
ISSN (Electronic): 2640-3498

Conference

Conference: International Conference on Machine Learning
Abbreviated title: ICML
Country: United States
City: Long Beach
Period: 09/06/2019 - 15/06/2019

ID: 36784375