GraphMix: Regularized Training of Graph Neural Networks for Semi-Supervised Learning

Vikas Verma, Meng Qu, Alex Lamb, Yoshua Bengio, Juho Kannala, Jian Tang

Research output: Working paper, Scientific

Abstract

We present GraphMix, a regularization technique for Graph Neural Network-based semi-supervised object classification, leveraging recent advances in the regularization of classical deep neural networks. Specifically, we propose a unified approach in which we train a fully-connected network jointly with the graph neural network via parameter sharing, interpolation-based regularization, and self-predicted targets. Our proposed method is architecture agnostic in the sense that it can be applied to any variant of graph neural networks that applies a parametric transformation to the features of the graph nodes. Despite its simplicity, GraphMix consistently improves results and achieves or closely matches state-of-the-art performance even with simpler architectures such as Graph Convolutional Networks, across three established graph benchmarks (the Cora, Citeseer, and Pubmed citation network datasets) as well as three newly proposed datasets (Cora-Full, Co-author-CS, and Co-author-Physics).
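The abstract describes the method only at a high level. As an illustration of the two main ideas it names, parameter sharing between a fully-connected branch and a GNN branch, and interpolation-based (Mixup-style) regularization on the fully-connected branch, a minimal PyTorch-style sketch might look like the following. All names here (SharedGraphMixNet, mixup_fcn_loss, adj_norm, idx) are hypothetical and this is not the authors' reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedGraphMixNet(nn.Module):
    """Sketch: an FCN branch and a GCN-style branch sharing the same linear weights."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        # Both branches reuse these two layers (parameter sharing).
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, n_classes)

    def fcn_forward(self, x):
        # Fully-connected branch: ignores the graph structure entirely.
        return self.lin2(F.relu(self.lin1(x)))

    def gcn_forward(self, x, adj_norm):
        # GCN-style branch: same weights, plus neighbourhood aggregation with a
        # dense, symmetrically normalized adjacency matrix adj_norm.
        h = F.relu(adj_norm @ self.lin1(x))
        return adj_norm @ self.lin2(h)

def mixup_fcn_loss(model, x, y_soft, idx, alpha=1.0):
    # Interpolation-based regularization: mix pairs of node features and their
    # (possibly self-predicted) soft targets, then train the FCN branch on the mixtures.
    lam = torch.distributions.Beta(alpha, alpha).sample()
    perm = torch.randperm(idx.size(0))
    xi, yi = x[idx], y_soft[idx]
    x_mix = lam * xi + (1.0 - lam) * xi[perm]
    y_mix = lam * yi + (1.0 - lam) * yi[perm]
    logits = model.fcn_forward(x_mix)
    return -(y_mix * F.log_softmax(logits, dim=-1)).sum(dim=-1).mean()
```

In a full training loop, this term would be combined with the GCN branch's ordinary supervised loss on the labelled nodes, and the soft targets for unlabelled nodes would be the "self-predicted targets" mentioned in the abstract, i.e. predictions produced by the model itself; the relative weighting of the terms is a hyperparameter not specified in the abstract.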
Original language: English
Publisher: AAAI Press
Publication status: Published - 2021
MoE publication type: D4 Published development or research report or study
