Projective inference in high-dimensional problems: Prediction and feature selection

Juho Piironen, Markus Paasiniemi, Aki Vehtari

Research output: Contribution to journal › Article › Scientific › peer-review

47 Citations (Scopus)
119 Downloads (Pure)


This paper reviews predictive inference and feature selection for generalized linear models with scarce but high-dimensional data. We demonstrate that in many cases one can benefit from a decision-theoretically justified two-stage approach: first, construct a possibly non-sparse model that predicts well, and then find a minimal subset of features that characterizes the predictions. The model built in the first step is referred to as the reference model, and the operation in the second step as predictive projection. The key characteristic of this approach is that it achieves an excellent tradeoff between sparsity and predictive accuracy; the gain comes from utilizing all available information, including the prior and the information carried by the left-out features. We review several methods that follow this principle and provide novel methodological contributions. We present a new projection technique that unifies two existing techniques and is both accurate and fast to compute. We also propose a way of evaluating the feature selection process using fast leave-one-out cross-validation that allows for easy and intuitive model size selection. Furthermore, we prove a theorem that helps to understand the conditions under which the projective approach can be beneficial. The key ideas are illustrated via several experiments using simulated and real-world data.
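The two-stage approach described in the abstract can be sketched in a few lines for the simplest case. The following is a minimal illustration (not the paper's implementation): a ridge fit stands in for the reference model, and the projection onto a feature subset is computed by regressing the reference model's predictions on the chosen columns, which is the KL-minimizing projection under a Gaussian observation model. All variable names and the choice of ridge regression are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n = 50 observations, p = 10 features, only the first 3 relevant.
n, p = 50, 10
X = rng.normal(size=(n, p))
w_true = np.zeros(p)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.5 * rng.normal(size=n)

# Stage 1: a (possibly non-sparse) reference model that predicts well.
# Here plain ridge regression serves as a stand-in.
lam = 1.0
w_ref = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
mu_ref = X @ w_ref  # reference-model predictions

def project(subset):
    """Project the reference fit onto a feature subset: least squares of
    the reference predictions on the chosen columns (the KL projection
    for a Gaussian observation model)."""
    Xs = X[:, subset]
    w_s, *_ = np.linalg.lstsq(Xs, mu_ref, rcond=None)
    return w_s, Xs @ w_s

# Stage 2: projecting onto the three relevant features should lose
# little predictive accuracy relative to the full reference model.
w_sub, mu_sub = project([0, 1, 2])
loss = np.mean((mu_ref - mu_sub) ** 2)
```

Note that the subset fit targets the reference model's predictions `mu_ref`, not the raw data `y`; this is what lets the projected submodel inherit the regularization and information pooled into the reference model.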

Original language: English
Pages (from-to): 2155-2197
Number of pages: 43
Journal: Electronic Journal of Statistics
Issue number: 1
Publication status: Published - 1 Jan 2020
MoE publication type: A1 Journal article-refereed


Keywords

  • Feature selection
  • Post-selection inference
  • Prediction
  • Projection
  • Sparsity


