Bayesian Hierarchical Stacking: Some Models Are (Somewhere) Useful

Yuling Yao*, Gregor Pirš, Aki Vehtari, Andrew Gelman

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-reviewed

18 Citations (Scopus)
188 Downloads (Pure)

Abstract

Stacking is a widely used model averaging technique that asymptotically yields optimal predictions among linear averages. We show that stacking is most effective when the models' predictive performance is heterogeneous in the inputs, and that the stacked mixture can be further improved with a hierarchical model. We generalize stacking to Bayesian hierarchical stacking: the model weights vary as a function of the data, are partially pooled, and are inferred using Bayesian inference. We further incorporate discrete and continuous inputs, other structured priors, and time series and longitudinal data. To verify the performance gain of the proposed method, we derive theoretical bounds and demonstrate the method on several applied problems.
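The core idea of input-varying stacking weights can be illustrated with a minimal sketch (not the paper's implementation): two hypothetical models whose accuracy depends on the input x, with mixture weights parameterized as a softmax of a linear function of x and fit by gradient ascent on the stacking log score. The synthetic data, the two models, and the omission of the partial-pooling prior are all simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup (illustrative, not from the paper): the true mean is
# sin(x) for x < 0 and x for x >= 0, so each model is good on one side.
n = 200
x = rng.uniform(-2, 2, n)
y = np.where(x < 0, np.sin(x), x) + rng.normal(0, 0.3, n)

def normal_pdf(y, mu, sigma=0.3):
    return np.exp(-0.5 * ((y - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Pointwise predictive densities of the two hypothetical models, shape (n, 2).
p = np.stack([normal_pdf(y, np.sin(x)), normal_pdf(y, x)], axis=1)

# Input-varying weights: w(x) = softmax(alpha + beta * x) over the models.
alpha = np.zeros(2)
beta = np.zeros(2)

def weights(x):
    logits = alpha[None, :] + beta[None, :] * x[:, None]
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)

# Gradient ascent on the stacking objective sum_i log(sum_k w_k(x_i) p_ik);
# the gradient w.r.t. the logits is w * (p / mix - 1).
lr = 0.05
for _ in range(500):
    w = weights(x)                      # (n, 2)
    mix = (w * p).sum(axis=1)           # mixture density per data point
    g = w * (p / mix[:, None] - 1.0)    # (n, 2)
    alpha += lr * g.mean(axis=0)
    beta += lr * (g * x[:, None]).mean(axis=0)

# After fitting, the weights should favor model 1 (sin) on negative x
# and model 2 (linear) on positive x.
w_fit = weights(x)
```

A hierarchical version, as the abstract describes, would additionally place a partial-pooling prior on the weight parameters and infer them with full Bayesian inference rather than point optimization.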

Original language: English
Pages (from-to): 1043-1071
Number of pages: 29
Journal: Bayesian Analysis
Volume: 17
Issue number: 4
DOIs
Publication status: Published - Dec 2022
MoE publication type: A1 Journal article-refereed

Keywords

  • Bayesian hierarchical modeling
  • Conditional prediction
  • Covariate shift
  • Model averaging
  • Prior construction
  • Stacking
