Movement tracking by optical flow assisted inertial navigation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

Abstract

Robust and accurate six-degree-of-freedom tracking remains a challenging problem on portable devices, especially on small hand-held devices such as smartphones. For improved robustness and accuracy, complementary movement information from an IMU and a camera is often fused. Conventional visual-inertial methods fuse IMU information with a sparse cloud of feature points tracked by the device camera. We consider a visually dense approach, where the IMU data is fused with the dense optical flow field estimated from the camera data. Learning-based methods applied to full image frames can leverage visual cues and the global consistency of the flow field to improve the flow estimates. We show how a learning-based optical flow model can be combined with conventional inertial navigation, and how ideas from probabilistic deep learning can aid the robustness of the measurement updates. The practical applicability is demonstrated on real-world data acquired by an iPad in a challenging low-texture environment.
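To make the fusion idea concrete, a generic Kalman-style measurement update can weight each flow measurement by a per-component variance, such as one predicted by a probabilistic network head — larger predicted uncertainty means a smaller influence on the state. This is a minimal sketch of that general pattern, not the paper's actual filter; the function name, state layout, and the assumption of a pre-linearised flow Jacobian `H` are illustrative.

```python
import numpy as np

def flow_measurement_update(x, P, H, flow_obs, flow_pred, flow_var):
    """Generic Kalman measurement update with per-component flow variances.

    Hypothetical interface (not from the paper):
    x, P      : state mean (n,) and covariance (n, n) from IMU propagation
    H         : (m, n) Jacobian mapping state perturbations to predicted flow
    flow_obs  : (m,) flow components observed by a learning-based model
    flow_pred : (m,) flow predicted from the current state estimate
    flow_var  : (m,) per-component variances, e.g. from a probabilistic
                network head; large variance -> small update weight
    """
    R = np.diag(flow_var)            # heteroscedastic measurement noise
    y = flow_obs - flow_pred         # innovation
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```

With this weighting, flow components the network marks as unreliable (e.g. in low-texture regions) barely move the state, which is one way robust measurement updates can be realised.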

Original language: English
Title of host publication: Proceedings of 2020 23rd International Conference on Information Fusion, FUSION 2020
Publisher: IEEE
ISBN (Electronic): 9780578647098
DOIs
Publication status: Published - Jul 2020
MoE publication type: A4 Article in a conference publication
Event: International Conference on Information Fusion - Virtual, Online, South Africa
Duration: 6 Jul 2020 – 9 Jul 2020
Conference number: 23

Conference

Conference: International Conference on Information Fusion
Abbreviated title: FUSION
Country: South Africa
City: Virtual, Online
Period: 06/07/2020 – 09/07/2020
