Advances in distributed Bayesian inference and graph neural networks

Research output: Thesis › Doctoral Thesis › Collection of Articles


Bayesian statistics and graph neural networks are two widely used sets of tools in machine learning and the applied sciences. The former rests on solid theoretical foundations, but its application depends on techniques that scale poorly as data grow. The latter is renowned for large-scale applications (e.g., in bioinformatics and natural language processing), but rests largely on empirical intuition. This thesis aims to i) broaden the scope of applications for Bayesian inference, and ii) deepen the understanding of core design principles of graph neural networks.

First, we focus on distributed Bayesian inference under limited communication. We advance the state of the art of embarrassingly parallel Markov chain Monte Carlo (MCMC) with a novel method that leverages normalizing flows as density estimators. On the same front, we also propose an extension of stochastic gradient Langevin dynamics for federated data, which are inherently distributed in a non-IID manner and cannot be centralized due to privacy constraints.

Second, we develop a methodology for meta-analysis that allows the combination of Bayesian posteriors from different studies. Our approach is agnostic to study-specific complexities, which are all encapsulated in their respective posteriors. This extends Bayesian meta-analysis to likelihood-free posteriors, which would otherwise be challenging. Our method also enables us to reuse posteriors from computationally costly analyses and update them post hoc, without rerunning the analyses.

Finally, we revisit two popular graph neural network components: spectral graph convolutions and pooling layers. Regarding convolutions, we propose a novel architecture and show that it is possible to achieve state-of-the-art performance by adding a minimal set of features to the most basic formulation of polynomial spectral convolutions.
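To make the embarrassingly parallel MCMC idea mentioned above concrete, here is a minimal sketch of the generic combination step: each worker samples its own sub-posterior, and the sub-posteriors are multiplied together to recover the full posterior. This sketch uses a simple Gaussian parametric approximation of each sub-posterior, not the normalizing-flow density estimator developed in the thesis; all names and the toy data are illustrative.

```python
import numpy as np

def combine_subposteriors(samples_per_worker):
    """Combine per-worker sub-posterior samples by approximating each
    sub-posterior as a Gaussian and forming their product: precisions add,
    and the combined mean is the precision-weighted average of the means."""
    precisions, weighted_means = [], []
    for samples in samples_per_worker:
        mu = samples.mean(axis=0)
        cov = np.atleast_2d(np.cov(samples, rowvar=False))
        prec = np.linalg.inv(cov)
        precisions.append(prec)
        weighted_means.append(prec @ np.atleast_1d(mu))
    combined_cov = np.linalg.inv(sum(precisions))
    combined_mean = combined_cov @ sum(weighted_means)
    return combined_mean, combined_cov

# Toy check: two workers whose "sub-posteriors" are unit-variance Gaussians.
rng = np.random.default_rng(0)
s1 = rng.normal(loc=-1.0, scale=1.0, size=(50_000, 1))
s2 = rng.normal(loc=+1.0, scale=1.0, size=(50_000, 1))
mean, cov = combine_subposteriors([s1, s2])
# Product of N(-1, 1) and N(+1, 1) densities: mean ~ 0, variance ~ 0.5.
```

The Gaussian approximation is exact only when each sub-posterior is itself Gaussian; a flexible density estimator (such as the normalizing flows the thesis proposes) removes that restriction.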
On the topic of pooling, we challenge the need for intricate pooling schemes and show that they play no significant role in the performance of graph neural networks on relevant benchmarks.
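For reference, the "most basic formulation of polynomial spectral convolutions" mentioned above applies a fixed polynomial filter of the normalized graph Laplacian, y = Σ_k θ_k L^k x. The NumPy sketch below shows only this baseline on a toy graph; the function name and toy data are illustrative, and the thesis architecture adds features on top of this formulation.

```python
import numpy as np

def poly_spectral_conv(adj, x, theta):
    """Apply y = sum_k theta[k] * L^k x, where L = I - D^{-1/2} A D^{-1/2}
    is the symmetric-normalized graph Laplacian of adjacency matrix A."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt
    out = np.zeros_like(x, dtype=float)
    lx = x.astype(float)           # holds L^k x, starting at k = 0
    for coeff in theta:
        out += coeff * lx
        lx = lap @ lx              # raise the Laplacian power by one
    return out

# Toy graph: a triangle. theta = [1, 0, 0] reduces to the identity filter.
adj = np.array([[0, 1, 1],
                [1, 0, 1],
                [1, 1, 0]], dtype=float)
x = np.array([[1.0], [2.0], [3.0]])
y = poly_spectral_conv(adj, x, theta=[1.0, 0.0, 0.0])
```

In a trainable layer the coefficients θ_k would be learned parameters (often with a Chebyshev basis for numerical stability); here they are fixed to keep the sketch self-contained.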
Translated title of the contribution: Advances in distributed Bayesian inference and graph neural networks
Original language: English
Qualification: Doctor's degree
Awarding Institution:
  • Aalto University
  • Kaski, Samuel, Supervising Professor
Print ISBNs: 978-952-64-0608-4
Electronic ISBNs: 978-952-64-0609-1
Publication status: Published - 2021
MoE publication type: G5 Doctoral dissertation (article)


Keywords:
  • Bayesian statistics
  • graph neural networks
  • machine learning


