VitroBert: modeling DILI by pretraining BERT on in vitro data

Abstract

Drug-induced liver injury (DILI) presents a significant modeling challenge due to its complexity, small datasets, and severe class imbalance. While unsupervised pretraining is a common approach to learning molecular representations for downstream tasks, it often lacks insight into how molecules interact with biological systems. We therefore introduce VitroBERT, a bidirectional encoder representations from transformers (BERT) model pretrained on large-scale in vitro assay profiles to generate biologically informed molecular embeddings. When leveraged to predict in vivo DILI endpoints, these embeddings delivered up to a 29% improvement on biochemistry-related tasks and a 16% gain on histopathology endpoints compared to unsupervised pretraining (MolBERT); no significant improvement was observed on clinical tasks, however. Furthermore, to address the critical issue of class imbalance, we evaluated multiple loss functions (BCE, weighted BCE, Focal loss, and weighted Focal loss) and identified weighted Focal loss as the most effective. Our findings demonstrate the potential of integrating biological context into molecular models and highlight the importance of selecting an appropriate loss function to improve performance on highly imbalanced DILI-related tasks.
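The paper's own implementation is not reproduced in this record. As a minimal illustrative sketch (not the authors' code), a binary weighted Focal loss of the kind the abstract identifies as most effective can be written for a single prediction as follows; the default `alpha` and `gamma` values here are hypothetical placeholders, not values taken from the paper:

```python
import math

def weighted_focal_loss(p, y, alpha=0.75, gamma=2.0):
    """Binary weighted Focal loss for one example (illustrative only).

    p: predicted probability of the positive class (0 < p < 1)
    y: true label, 0 or 1
    alpha: class weight applied to positives (1 - alpha to negatives),
           addressing class imbalance
    gamma: focusing parameter that down-weights easy, well-classified
           examples; gamma = 0 recovers weighted BCE
    """
    eps = 1e-7
    p = min(max(p, eps), 1.0 - eps)              # clamp for numerical stability
    p_t = p if y == 1 else 1.0 - p               # probability of the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha   # per-class weight
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

With `gamma=2`, a confidently correct prediction contributes almost nothing to the loss, while a misclassified minority-class example dominates; this is the mechanism by which Focal-style losses help on heavily imbalanced endpoints such as rare DILI labels.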

Original language: English
Article number: 119
Journal: Journal of Cheminformatics
Volume: 17
Issue number: 1
Publication status: Published - Dec 2025
MoE publication type: A1 Journal article-refereed

Keywords

  • BERT
  • DILI
  • Molecular embeddings
  • Toxicity
