Abstract
Stacking is a widely used model averaging technique that asymptotically yields optimal predictions among linear averages. We show that stacking is most effective when model predictive performance is heterogeneous in inputs, and that the stacked mixture can be further improved with a hierarchical model. We generalize stacking to Bayesian hierarchical stacking, in which the model weights vary as a function of the data, are partially pooled, and are inferred using Bayesian inference. We further incorporate discrete and continuous inputs, other structured priors, and time series and longitudinal data. To verify the performance gain of the proposed method, we derive theoretical bounds and demonstrate the method on several applied problems.
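To make the idea concrete, below is a minimal Python sketch of input-varying stacking under simplifying assumptions; it is not the authors' implementation. Weights are a softmax of linear functions of the input features, a Gaussian penalty on the coefficients stands in for the partial-pooling prior, and full Bayesian inference over the weights is replaced here by MAP optimization. The function name `fit_hierarchical_stacking`, the `prior_scale` parameter, and the synthetic data are illustrative assumptions, not part of the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import log_softmax

def fit_hierarchical_stacking(lpd, X, prior_scale=1.0):
    """Sketch of input-varying stacking (MAP approximation).

    lpd : (n, K) pointwise log predictive densities of K candidate
          models, e.g. from leave-one-out cross-validation.
    X   : (n, d) feature matrix with a leading column of ones.
    The Gaussian penalty on the coefficients plays the role of the
    partial-pooling prior; full Bayesian inference would sample the
    coefficients instead of optimizing them.
    """
    n, K = lpd.shape
    d = X.shape[1]

    def neg_log_post(alpha_flat):
        # Fix the last model's coefficients at 0 for identifiability,
        # mirroring the usual constraint in multinomial logit models.
        alpha = np.concatenate([alpha_flat, np.zeros(d)]).reshape(K, d)
        logits = X @ alpha.T                       # (n, K)
        log_w = log_softmax(logits, axis=1)        # log weights per point
        # Stacked log score: log sum_k w_k(x_i) p_k(y_i | x_i)
        stacked = np.logaddexp.reduce(log_w + lpd, axis=1)
        penalty = 0.5 * np.sum(alpha_flat**2) / prior_scale**2
        return -(stacked.sum() - penalty)

    res = minimize(neg_log_post, np.zeros((K - 1) * d), method="BFGS")
    return np.concatenate([res.x, np.zeros(d)]).reshape(K, d)

# Example usage with synthetic log predictive densities:
rng = np.random.default_rng(0)
n, K = 200, 3
x = rng.normal(size=(n, 1))
X = np.hstack([np.ones((n, 1)), x])              # intercept + one covariate
lpd = rng.normal(size=(n, K)) - 0.5 * np.abs(x)  # fake pointwise log densities
alpha_hat = fit_hierarchical_stacking(lpd, X)
weights = np.exp(log_softmax(X @ alpha_hat.T, axis=1))  # input-varying weights
```

In this simplified form, heterogeneity in model performance across inputs shows up as non-constant rows of `weights`; constant weights recover ordinary stacking as a special case.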
| Original language | English |
| --- | --- |
| Pages (from-to) | 1043-1071 |
| Number of pages | 29 |
| Journal | Bayesian Analysis |
| Volume | 17 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - Dec 2022 |
| MoE publication type | A1 Journal article-refereed |
Keywords
- Bayesian hierarchical modeling
- Conditional prediction
- Covariate shift
- Model averaging
- Prior construction
- Stacking