Self-Supervised Forecasting in Electronic Health Records with Attention-Free Models

Yogesh Kumar, Alexander Ilin, Henri Salo, Sangita Kulathinal, Maarit K. Leinonen, Pekka Marttinen

Research output: Contribution to journal › Article › Scientific › peer-review

2 Citations (Scopus)


Despite the proven effectiveness of Transformer neural networks across multiple domains, their performance on Electronic Health Records (EHR) can be nuanced. The unique, multidimensional sequential nature of EHR data can sometimes make even simple linear models with carefully engineered features more competitive. Thus, the advantages of Transformers, such as efficient transfer learning and improved scalability, are not always fully exploited in EHR applications.

In this work, we aim to forecast the demand for healthcare services by predicting the number of patient visits to healthcare facilities. The challenge is amplified when dealing with divergent patient subgroups, such as those with rare diseases, which are characterized by unique health trajectories and are typically smaller in size. To address this, we employ a self-supervised pretraining strategy, Generative Summary Pretraining (GSP), which predicts future summary statistics based on the past health records of a patient. Our models are pretrained on a health registry of nearly one million patients, then fine-tuned for specific subgroup prediction tasks, showcasing the potential to handle the multifaceted nature of EHR data.
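The GSP idea described above — using a summary statistic of a patient's future records, rather than the full future sequence, as the self-supervised label — can be sketched as follows. This is a minimal illustration based only on the abstract: the function name, the monthly granularity, and the choice of total visit count as the summary statistic are all assumptions, not the paper's actual implementation.

```python
import numpy as np

def gsp_example(visit_counts, split):
    """Hypothetical helper: split one patient's monthly visit-count
    series into a past input window and a GSP target.

    The pretraining label is a summary of the future window (here,
    the total number of visits), not the future sequence itself.
    """
    past = np.asarray(visit_counts[:split])
    future = np.asarray(visit_counts[split:])
    # Assumed summary statistic: total visits in the future window.
    target = float(future.sum())
    return past, target

# Toy example: 12 months of visit counts for a single patient.
series = [0, 1, 0, 2, 1, 0, 0, 3, 1, 0, 1, 1]
past, target = gsp_example(series, split=8)
# past covers the first 8 months; target summarizes the last 4.
```

A model pretrained to regress such targets over a large registry could then be fine-tuned on a small subgroup's visit-prediction task, which is the transfer setting the abstract describes.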

In our evaluation, the attention-free SANSformer model consistently surpasses strong EHR baselines, with GSP pretraining notably amplifying performance, particularly within smaller patient subgroups. Our results illuminate the promising potential of tailored attention-free models and self-supervised pretraining in refining healthcare utilization predictions across various patient demographics.

Original language: English
Pages (from-to): 1-17
Number of pages: 17
Journal: IEEE Transactions on Artificial Intelligence
Publication status: E-pub ahead of print - 2024
MoE publication type: A1 Journal article-refereed


  • Biological system modeling
  • Computational modeling
  • Data models
  • Deep Learning
  • Electronic Health Records
  • Healthcare
  • Healthcare Utilization
  • Medical services
  • Predictive models
  • Task analysis
  • Transfer Learning
  • Transformers

