On the Generalization of Equivariant Graph Neural Networks

Rafał Karczewski*, Amauri H. Souza, Vikas Garg*

*Corresponding author for this work

Research output: Contribution to journal › Conference article › Scientific › peer-review


Abstract

E(n)-Equivariant Graph Neural Networks (EGNNs) are among the most widely used and successful models for representation learning on geometric graphs (e.g., 3D molecules). However, while the expressivity of EGNNs has been explored in terms of geometric variants of the Weisfeiler-Leman isomorphism test, characterizing their generalization capability remains open. In this work, we establish the first generalization bound for EGNNs. Our bound exhibits a dependence on the weighted sum of the logarithms of the spectral norms of the weight matrices (EGNN parameters). In addition, our main result reveals interesting novel insights: i) the spectral norms of the initial layers may impact generalization more than those of the final ones; ii) ε-normalization is beneficial to generalization, confirming prior empirical evidence. We leverage these insights to introduce a spectral norm regularizer tailored to EGNNs. Experiments on real-world datasets substantiate our analysis, demonstrating a high correlation between theoretical and empirical generalization gaps and the effectiveness of the proposed regularization scheme.
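The abstract does not spell out the regularizer; a minimal sketch, assuming it penalizes a weighted sum of log spectral norms with larger weights on earlier layers (per insight i), might look like the following. The function names, the geometric decay schedule, and the `decay` parameter are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def spectral_norm(W):
    # Largest singular value of the weight matrix.
    return np.linalg.norm(W, ord=2)

def egnn_spectral_regularizer(weights, decay=0.5):
    # Hypothetical regularizer: weighted sum of log spectral norms,
    # with coefficients decreasing with depth so that earlier layers
    # are penalized more strongly (insight i in the abstract).
    coeffs = [decay ** layer for layer in range(len(weights))]
    return sum(c * np.log(spectral_norm(W)) for c, W in zip(coeffs, weights))

# Example: random weight matrices standing in for EGNN layer parameters.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((16, 16)) for _ in range(4)]
reg = egnn_spectral_regularizer(layers)
```

In practice such a term would be added to the training loss, scaled by a regularization coefficient chosen on a validation set.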

Original language: English
Pages (from-to): 23159-23186
Number of pages: 28
Journal: Proceedings of Machine Learning Research
Volume: 235
Publication status: Published - 2024
MoE publication type: A4 Conference publication
Event: International Conference on Machine Learning - Vienna, Austria
Duration: 21 Jul 2024 - 27 Jul 2024
Conference number: 41
