Can You Trust Your Pose? Confidence Estimation in Visual Localization

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review

2 Citations (Scopus)

Abstract

Camera pose estimation in large-scale environments remains an open problem and, despite recent promising results, can still fail in some situations. Research so far has focused on improving subcomponents of estimation pipelines in order to achieve more accurate poses. However, there is no guarantee that the result is correct, even though the correctness of pose estimation is critically important in several visual localization applications, such as autonomous navigation. In this paper we bring attention to a novel research question, pose confidence estimation, where we aim to quantify how reliable the visually estimated pose is. We develop a novel confidence measure for this task and show that it can be flexibly applied to different datasets, indoor or outdoor, and to various visual localization pipelines. We also show that the proposed techniques can be used to accomplish a secondary goal: improving the accuracy of existing pose estimation pipelines. Finally, the proposed approach is computationally lightweight and adds only negligible overhead to the computational cost of pose estimation.
Original language: English
Title of host publication: Proceedings of ICPR 2020 - 25th International Conference on Pattern Recognition
Publisher: IEEE
Pages: 5004-5011
ISBN (Electronic): 9781728188089
DOIs
Publication status: Published - 2020
MoE publication type: A4 Conference publication
Event: International Conference on Pattern Recognition - Virtual, Online, Milan, Italy
Duration: 10 Jan 2021 – 15 Jan 2021
Conference number: 25

Conference

Conference: International Conference on Pattern Recognition
Abbreviated title: ICPR
Country/Territory: Italy
City: Milan
Period: 10/01/2021 – 15/01/2021
