A Look at Improving Robustness in Visual-inertial SLAM by Moment Matching

Arno Solin, Rui Li, Andrea Pilzer

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Abstract

The fusion of camera sensor and inertial data is a leading method for ego-motion tracking in autonomous and smart devices. State estimation techniques that rely on nonlinear filtering are a strong paradigm for solving the associated information fusion task. The de facto inference method in this space is the celebrated extended Kalman filter (EKF), which relies on first-order linearizations of both the dynamical and the measurement model. This paper takes a critical look at the practical implications and limitations posed by the EKF, especially under faulty visual feature associations and in the presence of strong confounding noise. As an alternative, we revisit the assumed density formulation of Bayesian filtering and employ a moment matching (unscented Kalman filtering) approach to both visual-inertial odometry and visual SLAM. Our results highlight important aspects of robustness in both dynamics propagation and visual measurement updates, and we show state-of-the-art results on the EuRoC MAV drone data benchmark.
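The moment-matching alternative mentioned in the abstract replaces the EKF's first-order linearization with the unscented transform: the Gaussian belief is represented by deterministically chosen sigma points, which are pushed through the nonlinearity and re-summarized by their weighted mean and covariance. A minimal sketch of this transform (a textbook formulation with standard scaling parameters, not the authors' implementation) could look as follows:

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian N(mean, cov) through a nonlinearity f by
    moment matching with sigma points (the core step of the UKF,
    in contrast to the EKF's first-order linearization)."""
    n = mean.size
    lam = alpha**2 * (n + kappa) - n
    # Sigma points: the mean plus/minus columns of a scaled matrix square root
    S = np.linalg.cholesky((n + lam) * cov)
    sigmas = np.vstack([mean, mean + S.T, mean - S.T])  # shape (2n+1, n)
    # Weights for the mean and covariance estimates
    wm = np.full(2 * n + 1, 0.5 / (n + lam))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Propagate sigma points and match the first two moments
    Y = np.array([f(s) for s in sigmas])
    mean_y = wm @ Y
    d = Y - mean_y
    cov_y = d.T @ (wc[:, None] * d)
    return mean_y, cov_y
```

For a linear map the transform is exact, which gives a quick sanity check; the practical gain over the EKF appears for strongly nonlinear dynamics and camera measurement models, where no Jacobians need to be derived.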

Original language: English
Title of host publication: Proceedings of the 25th International Conference on Information Fusion, FUSION 2022
Publisher: International Society of Information Fusion
ISBN (Electronic): 978-1-7377497-2-1
DOIs
Publication status: Published - 2022
MoE publication type: A4 Conference publication
Event: International Conference on Information Fusion - Linköping, Sweden
Duration: 4 Jul 2022 – 7 Jul 2022
Conference number: 25

Conference

Conference: International Conference on Information Fusion
Abbreviated title: FUSION
Country/Territory: Sweden
City: Linköping
Period: 04/07/2022 – 07/07/2022
