A Look at Improving Robustness in Visual-inertial SLAM by Moment Matching

Research output: Conference article in proceedings (scientific, peer-reviewed)



The fusion of camera and inertial sensor data is a leading method for ego-motion tracking in autonomous and smart devices. State estimation techniques that rely on nonlinear filtering are a strong paradigm for solving the associated information fusion task. The de facto inference method in this space is the celebrated extended Kalman filter (EKF), which relies on first-order linearizations of both the dynamical and measurement models. This paper takes a critical look at the practical implications and limitations posed by the EKF, especially under faulty visual feature associations and in the presence of strong confounding noise. As an alternative, we revisit the assumed density formulation of Bayesian filtering and employ a moment matching (unscented Kalman filtering) approach to both visual-inertial odometry and visual SLAM. Our results highlight important aspects of robustness in both dynamics propagation and visual measurement updates, and we show state-of-the-art results on the EuRoC MAV drone data benchmark.
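The moment-matching alternative contrasted with the EKF in the abstract can be illustrated by the unscented transform at the core of an unscented Kalman filter: instead of linearizing the model, a Gaussian is propagated through the nonlinearity via deterministically chosen sigma points. The sketch below is a generic textbook version (Merwe-style scaled sigma points), not the paper's implementation; the function name and the `alpha`, `beta`, `kappa` parameters are standard conventions, assumed here for illustration.

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian N(mean, cov) through a nonlinearity f
    by moment matching over 2n+1 sigma points."""
    n = mean.shape[0]
    lam = alpha**2 * (n + kappa) - n
    # Sigma points: the mean, plus/minus columns of a scaled matrix square root.
    S = np.linalg.cholesky((n + lam) * cov)
    sigmas = np.vstack([mean, mean + S.T, mean - S.T])       # shape (2n+1, n)
    # Weights for the mean and covariance estimates.
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = lam / (n + lam) + (1.0 - alpha**2 + beta)
    # Push the sigma points through f and match the first two moments.
    ys = np.array([f(s) for s in sigmas])
    y_mean = wm @ ys
    d = ys - y_mean
    y_cov = (wc[:, None] * d).T @ d
    return y_mean, y_cov
```

For a linear map the transform recovers the exact propagated mean and covariance; its value over the EKF shows up for nonlinear dynamics and camera measurement models, where no Jacobians are needed.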

Title: Proceedings of the 25th International Conference on Information Fusion, FUSION 2022
Publisher: International Society of Information Fusion
ISBN (electronic): 978-1-7377497-2-1
Status: Published - 2022
OKM publication type: A4 Article in conference proceedings
Event: International Conference on Information Fusion - Linköping, Sweden
Duration: 4 July 2022 - 7 July 2022
Conference number: 25




