Abstract
Detection of the drivable area in all conditions is crucial for autonomous driving and advanced driver assistance systems. However, the amount of labeled data in adverse driving conditions is limited, especially in winter, and supervised methods generalize poorly to conditions outside the training distribution. To enable easy adaptation to all conditions, the need for human annotation should be removed from the learning process. In this paper, Trajectory-Aided Drivable area Auto-labeling with Pretrained self-supervised features (TADAP) is presented for automated annotation of the drivable area in winter driving conditions. A sample of the drivable area is extracted based on the trajectory estimate from the global navigation satellite system (GNSS). Similarity with the sample area is determined based on pre-trained self-supervised visual features, and image areas similar to the sample area are considered drivable. The TADAP labels were evaluated with a novel winter driving dataset collected in varying driving scenes. A prediction model trained with the TADAP labels achieved a +9.6 improvement in intersection over union (IoU) compared to the previous state of the art in self-supervised drivable area detection.
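The abstract describes the labeling pipeline only at a high level. The snippet below is a minimal sketch of that idea, assuming per-pixel features from a pre-trained self-supervised backbone and a boolean mask of the trajectory-derived drivable sample; the function name, array shapes, and similarity threshold are illustrative assumptions, not the published implementation.

```python
import numpy as np

def tadap_pseudo_label(image_feats: np.ndarray,
                       trajectory_mask: np.ndarray,
                       sim_threshold: float = 0.7) -> np.ndarray:
    """Label pixels whose self-supervised features resemble the trajectory sample.

    image_feats: (H, W, D) per-pixel features from a pre-trained self-supervised
        backbone (e.g. upsampled ViT patch features); hypothetical input.
    trajectory_mask: (H, W) boolean mask of the drivable-area sample obtained by
        projecting the GNSS trajectory into the image.
    Returns a (H, W) boolean pseudo-label mask of the drivable area.
    """
    # Mean feature of the trajectory-derived sample region (a simple prototype).
    sample_feat = image_feats[trajectory_mask].mean(axis=0)
    sample_feat /= np.linalg.norm(sample_feat) + 1e-8

    # Cosine similarity of every pixel feature to the prototype.
    feats = image_feats / (np.linalg.norm(image_feats, axis=-1, keepdims=True) + 1e-8)
    similarity = feats @ sample_feat

    # Pixels sufficiently similar to the sample are auto-labeled as drivable.
    return similarity >= sim_threshold
```

The resulting masks would then serve as training targets for a drivable-area prediction model, in place of human annotation.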
Original language | English |
---|---|
Number of pages | 10 |
Journal | IEEE Transactions on Intelligent Vehicles |
Publication status | E-pub ahead of print - 7 May 2024 |
MoE publication type | A1 Journal article-refereed |
Keywords
- autonomous vehicles
- Feature extraction
- Global navigation satellite system
- Meteorology
- Roads
- self-supervised visual learning
- Semantic scene understanding
- Snow
- Training
- Trajectory
- winter driving conditions