Distill n' Explain: explaining graph neural networks using simple surrogates

Tamara Pereira, Erik Nascimento, Lucas E. Resck, Diego Mesquita, Amauri Souza

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

3 Citations (Scopus)
36 Downloads (Pure)


Explaining node predictions in graph neural networks (GNNs) often boils down to finding graph substructures that preserve predictions. Finding these structures usually implies back-propagating through the GNN, bonding the complexity (e.g., number of layers) of the GNN to the cost of explaining it. This naturally begs the question: Can we break this bond by explaining a simpler surrogate GNN? To answer the question, we propose Distill n' Explain (DnX). First, DnX learns a surrogate GNN via knowledge distillation. Then, DnX extracts node or edge-level explanations by solving a simple convex program. We also propose FastDnX, a faster version of DnX that leverages the linear decomposition of our surrogate model. Experiments show that DnX and FastDnX often outperform state-of-the-art GNN explainers while being orders of magnitude faster. Additionally, we support our empirical findings with theoretical results linking the quality of the surrogate model (i.e., distillation error) to the faithfulness of explanations.
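To make the idea concrete, here is an illustrative sketch (not the authors' code; all names, shapes, and the least-squares distillation objective are assumptions) of the two-stage pipeline the abstract describes: distill the GNN into a linear surrogate in the spirit of SGC, logits = S X W with S the K-step normalized propagation matrix, then exploit the surrogate's linear decomposition, as FastDnX does, to score each node's additive contribution to an explained prediction.

```python
import numpy as np

def normalized_adj(A):
    # Symmetrically normalized adjacency with self-loops: D^{-1/2} (A + I) D^{-1/2}
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return d_inv_sqrt @ A @ d_inv_sqrt

def distill_surrogate(S, X, teacher_logits):
    # Knowledge distillation as least squares: fit W so that S X W matches
    # the teacher GNN's logits (a stand-in for the paper's distillation step).
    Z = S @ X
    W, *_ = np.linalg.lstsq(Z, teacher_logits, rcond=None)
    return W

def node_contribution_scores(S, X, W, v, c):
    # Linear decomposition of the surrogate: the logit of node v for class c is
    # logit_c(v) = sum_u S[v, u] * (X[u] @ W[:, c]),
    # so each node u receives its additive contribution as an importance score.
    return S[v] * (X @ W[:, c])

# Toy usage on a 4-node path graph with random features and stand-in teacher logits.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
S = np.linalg.matrix_power(normalized_adj(A), 2)        # K = 2 propagation hops
X = np.random.default_rng(0).normal(size=(4, 3))        # node features
teacher = np.random.default_rng(1).normal(size=(4, 2))  # teacher logits (placeholder)

W = distill_surrogate(S, X, teacher)
scores = node_contribution_scores(S, X, W, v=0, c=1)
# By linearity, the scores sum exactly to the surrogate's explained logit:
assert np.isclose(scores.sum(), (S @ X @ W)[0, 1])
```

Because the surrogate is linear, no back-propagation through the original GNN is needed at explanation time, which is the bond the paper sets out to break; the full method additionally frames node/edge selection as a simple convex program, which this sketch omits.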

Original language: English
Title of host publication: Proceedings of The 26th International Conference on Artificial Intelligence and Statistics (AISTATS) 2023
Editors: Francisco Ruiz, Jennifer Dy, Jan-Willem van de Meent
Number of pages: 16
Publication status: Published - 2023
MoE publication type: A4 Conference publication
Event: International Conference on Artificial Intelligence and Statistics - Valencia, Spain
Duration: 25 Apr 2023 – 27 Apr 2023
Conference number: 26

Publication series

Name: Proceedings of Machine Learning Research
ISSN (Print): 2640-3498


Conference: International Conference on Artificial Intelligence and Statistics
Abbreviated title: AISTATS


