Abstract
Localization without Global Navigation Satellite Systems (GNSS) is a critical functionality for autonomous operation of unmanned aerial vehicles (UAVs). Vision-based localization on a known map can be an effective solution, but it is burdened by two main problems: places look different depending on weather and season, and the perspective discrepancy between the UAV camera image and the map makes matching hard. In this letter, we propose a localization solution that matches UAV camera images to georeferenced orthophotos with a trained convolutional neural network model that is invariant to significant seasonal appearance differences (winter-summer) between the camera image and the map. We compare the convergence speed and localization accuracy of our solution to six reference methods. The results show major improvements over the reference methods, especially under high seasonal variation. Finally, we demonstrate that the method successfully localizes a real UAV, showing that it is robust to perspective changes.
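To make the matching idea concrete, the sketch below embeds a UAV camera patch and a set of candidate orthophoto patches with a CNN and ranks map locations by embedding similarity. This is a minimal illustration assuming a PyTorch setup; the network architecture (`PatchEmbedder`), patch size, and cosine-similarity scoring are illustrative assumptions, not the authors' actual model or training procedure.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PatchEmbedder(nn.Module):
    """Toy CNN mapping an RGB patch to a unit-length descriptor (hypothetical)."""
    def __init__(self, dim: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # global pooling -> one vector per patch
        )
        self.fc = nn.Linear(128, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.fc(self.features(x).flatten(1))
        # Unit-normalized descriptors let a dot product act as cosine similarity.
        return F.normalize(z, dim=1)

# One (hypothetical) 64x64 camera-view patch vs. a batch of orthophoto patches.
embedder = PatchEmbedder()
cam = torch.rand(1, 3, 64, 64)       # UAV camera patch
ortho = torch.rand(100, 3, 64, 64)   # candidate georeferenced map patches
with torch.no_grad():
    sim = embedder(cam) @ embedder(ortho).T  # cosine similarities, shape (1, 100)
best = sim.argmax().item()           # index of the most similar map location
```

In practice such an embedder would be trained (e.g., with a contrastive objective on seasonally varying image pairs) so that summer camera views and winter orthophotos of the same place map to nearby descriptors; the similarity scores could then drive a probabilistic localization filter.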
| Original language | English |
|---|---|
| Article number | 9830867 |
| Pages (from-to) | 10232-10239 |
| Number of pages | 8 |
| Journal | IEEE Robotics and Automation Letters |
| Volume | 7 |
| Issue number | 4 |
| DOIs | |
| Publication status | Published - 1 Oct 2022 |
| MoE publication type | A1 Journal article-refereed |
Keywords
- Location awareness
- Autonomous aerial vehicles
- Semantics
- Cameras
- Visualization
- Global navigation satellite system
- Feature extraction