Abstract
Autonomous driving is challenging in adverse road and weather conditions, where lane lines may be missing, the road may be covered in snow, and visibility may be poor. We extend previous work on end-to-end learning for autonomous steering to operate in these adverse real-life conditions with multimodal data. We collected 28 hours of driving data in several road and weather conditions and trained convolutional neural networks to predict the car steering wheel angle from front-facing color camera images and lidar range and reflectance data. We compared the performance of CNN models trained on the different modalities, and our results show that the lidar modality improves the performance of multimodal sensor-fusion models. We also performed on-road tests with different models, and the results support this observation.
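As a rough illustration of the kind of sensor-fusion model the abstract describes, the sketch below shows a two-branch CNN that extracts features from a camera image and from lidar range/reflectance maps, concatenates them, and regresses a single steering-wheel angle. This is a minimal sketch under assumed layer sizes, input resolutions, and a late-fusion-by-concatenation design; it is not the architecture reported in the paper.

```python
# Minimal sketch (assuming PyTorch) of a two-branch sensor-fusion CNN for
# steering-angle regression. All names, layer sizes, and input resolutions
# are illustrative assumptions, not the authors' reported architecture.
import torch
import torch.nn as nn


def conv_branch(in_channels: int) -> nn.Sequential:
    """Small convolutional feature extractor for one modality."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 24, kernel_size=5, stride=2), nn.ReLU(),
        nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
        nn.Conv2d(36, 48, kernel_size=3, stride=2), nn.ReLU(),
        nn.AdaptiveAvgPool2d((4, 4)),
        nn.Flatten(),  # -> 48 * 4 * 4 = 768 features per modality
    )


class FusionSteeringNet(nn.Module):
    """Predicts one steering-wheel angle from camera + lidar inputs."""

    def __init__(self):
        super().__init__()
        self.camera_branch = conv_branch(in_channels=3)  # RGB camera image
        self.lidar_branch = conv_branch(in_channels=2)   # range + reflectance maps
        self.head = nn.Sequential(
            nn.Linear(768 * 2, 100), nn.ReLU(),
            nn.Linear(100, 1),  # steering-wheel angle (regression output)
        )

    def forward(self, camera: torch.Tensor, lidar: torch.Tensor) -> torch.Tensor:
        # Late fusion: concatenate per-modality feature vectors, then regress.
        fused = torch.cat([self.camera_branch(camera), self.lidar_branch(lidar)], dim=1)
        return self.head(fused)


# Usage with dummy batches; the 66x200 crop size is a common choice in
# end-to-end steering work and is also an assumption here.
model = FusionSteeringNet()
angle = model(torch.randn(4, 3, 66, 200), torch.randn(4, 2, 66, 200))
print(angle.shape)  # torch.Size([4, 1])
```

Camera-only or lidar-only baselines can be obtained from the same sketch by feeding a single branch into the regression head, which is one way to compare modalities as the abstract describes.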
Original language | English |
---|---|
Title of host publication | Proceedings of ICPR 2020 - 25th International Conference on Pattern Recognition |
Publisher | IEEE |
Pages | 699-706 |
Number of pages | 8 |
ISBN (Electronic) | 9781728188089 |
DOIs | |
Publication status | Published - 2020 |
MoE publication type | A4 Conference publication |
Event | 25th International Conference on Pattern Recognition, Virtual/Online, Milan, Italy; Duration: 10 Jan 2021 → 15 Jan 2021 |
Publication series
Name | Proceedings - International Conference on Pattern Recognition |
---|---|
Publisher | IEEE |
ISSN (Print) | 1051-4651 |
Conference
Conference | International Conference on Pattern Recognition |
---|---|
Abbreviated title | ICPR |
Country/Territory | Italy |
City | Milan |
Period | 10/01/2021 → 15/01/2021 |