Fast optimize-and-sample method for differentiable Galerkin approximations of multi-layered Gaussian process priors

Muhammad F. Emzir, Niki A. Loppi, Zheng Zhao, Syeda S. Hassan, Simo Särkkä

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › Peer-reviewed


Multi-layered Gaussian process (field) priors are non-Gaussian priors that make it possible to perform Bayesian inference on both smooth and discontinuous functions. Previously, Bayesian inference with these priors required the construction of a Markov chain Monte Carlo sampler, which converges slowly to the stationary distribution and is therefore computationally inefficient; hence, the utility of the approach has only been demonstrated on small canonical test problems. Furthermore, in numerous Bayesian inference applications, such as Bayesian inverse problems, uncertainty quantification of the hyper-prior layers is of less interest, since the main concern is to quantify the randomness of the process/field of interest. In this article, we propose an alternative approach in which we optimize the hyper-prior layers while performing inference only for the lowest layer. Specifically, we use a Galerkin approximation with automatic differentiation to accelerate the optimization. We validate the proposed approach against several existing non-stationary Gaussian process methods and demonstrate that it can significantly decrease the execution time while maintaining comparable accuracy. We also apply the method to an X-ray tomography inverse problem. Owing to its improved performance and robustness, the new approach opens up the possibility of applying multi-layered Gaussian field priors to more complex problems.
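As a rough sketch of the optimize-and-sample split described in the abstract, the toy example below collapses the hyper-prior layer to a single parameter (the log length-scale of a squared-exponential kernel) and uses a SciPy quasi-Newton optimizer with finite-difference gradients as a stand-in for the paper's Galerkin discretization and automatic differentiation; all variable names, the kernel choice, and the fixed noise variance are illustrative assumptions, not the paper's actual setup. The hyper layer is optimized by maximizing the marginal likelihood, and posterior samples are then drawn for the lowest layer only.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy data: noisy observations of a smooth function (illustrative only).
x = np.linspace(0.0, 1.0, 30)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)
noise_var = 0.01

def kernel(x1, x2, log_ell):
    """Squared-exponential kernel; log_ell plays the role of the hyper-prior layer."""
    ell = np.exp(log_ell)
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def neg_log_marginal(log_ell):
    """Negative log marginal likelihood of the lowest layer given the hyper layer."""
    K = kernel(x, x, log_ell) + noise_var * np.eye(x.size)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum()

# Step 1: optimize the hyper layer. (The paper uses automatic differentiation;
# L-BFGS-B with finite-difference gradients stands in here.)
res = minimize(neg_log_marginal, np.array([0.0]),
               method="L-BFGS-B", bounds=[(-3.0, 1.0)])
log_ell = float(res.x[0])

# Step 2: condition on the optimized hyper layer and sample only the lowest
# layer at test points -- this conditional is Gaussian, so sampling is cheap.
xs = np.linspace(0.0, 1.0, 50)
K = kernel(x, x, log_ell) + noise_var * np.eye(x.size)
Ks = kernel(xs, x, log_ell)
Kss = kernel(xs, xs, log_ell)
mean = Ks @ np.linalg.solve(K, y)
cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
samples = rng.multivariate_normal(mean, cov + 1e-9 * np.eye(xs.size), size=5)
```

Once the hyper layer is fixed at its optimum, the lowest layer is an ordinary Gaussian process regression, which is what makes this split cheaper than jointly sampling all layers with MCMC.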

Original language: English
Title of host publication: 2022 25th International Conference on Information Fusion, FUSION 2022
Number of pages: 7
ISBN (Electronic): 978-1-7377497-2-1
Publication status: Published - 2022
MoE publication type: A4 Conference publication
Event: International Conference on Information Fusion - Linköping, Sweden
Duration: 4 Jul 2022 - 7 Jul 2022
Conference number: 25


Conference: International Conference on Information Fusion
Abbreviated title: FUSION


  • Bayesian learning
  • Galerkin approximations
  • Gaussian Processes
  • inverse problems
  • Markov chain Monte Carlo


