Trajectory-Based Road Autolabeling With Lidar-Camera Fusion in Winter Conditions

Research output: Contribution to journal › Article › Scientific › peer-review

1 Citation (Scopus)
24 Downloads (Pure)

Abstract

Robust road segmentation in all road conditions is required for safe autonomous driving and advanced driver assistance systems. Supervised deep learning methods provide accurate road segmentation within the domain of their training data but cannot be trusted in out-of-distribution scenarios. Covering the whole distribution in the training set is challenging, as each sample must be labeled by hand. Trajectory-based self-supervised methods offer a potential solution, as they can learn from the traversed route without manual labels. However, existing trajectory-based methods use learning schemes that rely only on the camera or only on the lidar. In this paper, trajectory-based learning is implemented jointly with lidar and camera for increased performance. Our method outperforms recent standalone camera- and lidar-based methods when evaluated on a challenging winter driving dataset, including countryside and suburban driving scenes.
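The core idea of trajectory-based autolabeling is that pixels the vehicle later drives over can be marked as road without manual annotation. A minimal sketch of this step, assuming the future ego trajectory is already expressed as 3-D points in the camera frame (z forward) and that pinhole intrinsics `K` are known, could look like the following. The function name and interface are illustrative, not the paper's actual implementation:

```python
import numpy as np

def project_trajectory_to_image(traj_xyz, K, image_shape):
    """Project traversed-path points into the image to form a sparse road label mask.

    traj_xyz: (N, 3) future ego positions in the camera frame (z forward, in meters).
    K: (3, 3) pinhole camera intrinsics.
    image_shape: (height, width) of the label mask.
    Returns a uint8 mask with 1 at pixels the vehicle later traverses.
    """
    mask = np.zeros(image_shape, dtype=np.uint8)
    # Keep only points in front of the camera.
    pts = traj_xyz[traj_xyz[:, 2] > 0]
    if pts.size == 0:
        return mask
    # Perspective projection: homogeneous pixel coords, then divide by depth.
    uvw = (K @ pts.T).T
    uv = (uvw[:, :2] / uvw[:, 2:3]).astype(int)
    h, w = image_shape
    # Discard projections that fall outside the image.
    valid = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    mask[uv[valid, 1], uv[valid, 0]] = 1
    return mask
```

In practice such sparse point labels would be dilated to the vehicle's track width, and the paper fuses analogous supervision from both the camera image and the lidar point cloud; this sketch shows only the camera-side projection under the stated assumptions.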

Original language: English
Pages (from-to): 108873-108882
Number of pages: 10
Journal: IEEE Access
Volume: 13
DOIs
Publication status: Published - 2025
MoE publication type: A1 Journal article-refereed

Funding

This work was supported by the Henry Ford Foundation Finland and the Aalto Doctoral School Program.

Keywords

  • autonomous driving
  • road segmentation
  • trajectory-based learning
  • winter driving conditions

