Towards Domain-Agnostic Contrastive Learning

Vikas Verma, Minh-Thang Luong, Kenji Kawaguchi, Hieu Pham, Quoc V. Le

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Abstract

Despite recent successes, most contrastive self-supervised learning methods are domain-specific, relying heavily on data augmentation techniques that require knowledge about a particular domain, such as image cropping and rotation. To overcome this limitation, we propose a domain-agnostic approach to contrastive learning, named DACL, that is applicable to problems where domain-specific data augmentations are not readily available. Key to our approach is the use of Mixup noise to create similar and dissimilar examples by mixing data samples differently, either at the input or hidden-state levels. We theoretically analyze our method and show advantages over the Gaussian-noise-based contrastive learning approach. To demonstrate the effectiveness of DACL, we conduct experiments across various domains such as tabular data, images, and graphs. Our results show that DACL not only outperforms other domain-agnostic noising methods, such as Gaussian noise, but also combines well with domain-specific methods, such as SimCLR, to improve self-supervised visual representation learning.
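To make the mixing idea concrete, below is a minimal PyTorch sketch of input-level Mixup noise used to build positive pairs for a SimCLR-style contrastive loss. The helper names (mixup_views, nt_xent), the Uniform(alpha, 1) sampling of the mixing coefficient, and the hyperparameter values are illustrative assumptions, not the paper's exact recipe; the hidden-state mixing variant mentioned in the abstract is omitted.

    # Hypothetical sketch of DACL-style Mixup-noise contrastive learning.
    # Names, the Uniform(alpha, 1) sampling, and alpha=0.9 are assumptions.
    import torch
    import torch.nn.functional as F

    def mixup_views(x: torch.Tensor, alpha: float = 0.9) -> torch.Tensor:
        """Create one noisy view of each sample by mixing it with a randomly
        chosen other sample in the batch; lam stays close to 1 so the mixed
        sample remains similar to the original."""
        lam = torch.empty(x.size(0), 1, device=x.device).uniform_(alpha, 1.0)
        perm = torch.randperm(x.size(0), device=x.device)
        return lam * x + (1.0 - lam) * x[perm]

    def nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
        """Normalized-temperature cross-entropy over the 2N views: each view's
        positive is its counterpart mixed from the same original sample."""
        z = F.normalize(torch.cat([z1, z2]), dim=1)
        sim = z @ z.t() / tau
        sim.fill_diagonal_(float("-inf"))  # exclude self-similarity
        n = z1.size(0)
        targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
        return F.cross_entropy(sim, targets)

    # Usage: two independently mixed views of a batch of generic feature
    # vectors (e.g., tabular data) form positive pairs; other views in the
    # batch serve as negatives.
    encoder = torch.nn.Linear(32, 16)  # stand-in encoder
    x = torch.randn(8, 32)
    loss = nt_xent(encoder(mixup_views(x)), encoder(mixup_views(x)))
    loss.backward()

Because mixing requires no domain knowledge beyond vector arithmetic, the same augmentation applies unchanged to tabular, image, and graph representations, which is what makes the approach domain-agnostic.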
Original language: English
Title of host publication: Proceedings of the 38th International Conference on Machine Learning
Publisher: JMLR
Number of pages: 12
Publication status: Published - 2021
MoE publication type: A4 Conference publication
Event: International Conference on Machine Learning - Virtual, Online
Duration: 18 Jul 2021 - 24 Jul 2021
Conference number: 38

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
Volume: 139
ISSN (Electronic): 2640-3498

Conference

Conference: International Conference on Machine Learning
Abbreviated title: ICML
City: Virtual, Online
Period: 18/07/2021 - 24/07/2021
