Learning State-Space Models for Mapping Spatial Motion Patterns

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceedings › Scientific › peer-review


Abstract

Mapping the surrounding environment is essential for the successful operation of autonomous robots. While extensive research has focused on mapping geometric structures and static objects, the environment is also influenced by the movement of dynamic objects. Incorporating information about spatial motion patterns can allow mobile robots to navigate and operate successfully in populated areas. In this paper, we propose a deep state-space model that learns map representations of spatial motion patterns and how they change over time at a given place. To evaluate our method, we use two datasets: a generated dataset with specified motion patterns and a real-world pedestrian dataset. We assess the model's learning ability, mapping quality, and applicability to downstream tasks. The results demonstrate that our model effectively learns the corresponding motion patterns and has the potential to be applied to downstream robotic tasks.
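The abstract does not detail the architecture. As a rough illustration of the idea of a deep state-space model that maintains a latent map state and decodes it into a per-cell motion-pattern map, below is a minimal, hypothetical PyTorch sketch; the class name, layer sizes, GRU-based transition, and grid/direction discretisation are all assumptions made for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class DeepStateSpaceMap(nn.Module):
    """Hypothetical sketch: latent state-space model emitting motion-pattern maps."""

    def __init__(self, grid_size=16, latent_dim=32, obs_dim=8):
        super().__init__()
        self.grid_size = grid_size
        self.obs_dim = obs_dim
        self.latent_dim = latent_dim
        # Transition: evolves the latent map state from one time step to the next.
        self.transition = nn.GRUCell(obs_dim, latent_dim)
        # Emission: decodes the latent state into a per-cell motion-pattern map,
        # e.g. a discretised direction histogram for every grid cell (assumption).
        self.emission = nn.Sequential(
            nn.Linear(latent_dim, 128),
            nn.ReLU(),
            nn.Linear(128, grid_size * grid_size * obs_dim),
        )

    def forward(self, obs_seq):
        # obs_seq: (T, B, obs_dim) summaries of observed motion per time step.
        T, B, _ = obs_seq.shape
        z = obs_seq.new_zeros(B, self.latent_dim)
        maps = []
        for t in range(T):
            z = self.transition(obs_seq[t], z)   # latent dynamics
            m = self.emission(z)                 # decode current motion-pattern map
            maps.append(m.view(B, self.grid_size, self.grid_size, self.obs_dim))
        return torch.stack(maps)                 # (T, B, H, W, obs_dim)

# Example: a random sequence of motion summaries yields a time-varying map.
model = DeepStateSpaceMap()
print(model(torch.randn(10, 4, 8)).shape)  # torch.Size([10, 4, 16, 16, 8])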
Original language: English
Title of host publication: Proceedings of the 11th European Conference on Mobile Robots, ECMR 2023
Editors: Lino Marques, Ivan Markovic
Publisher: IEEE
Number of pages: 6
ISBN (Electronic): 979-8-3503-0704-7
DOIs
Publication status: Published - 4 Sept 2023
MoE publication type: A4 Conference publication
Event: European Conference on Mobile Robots - Coimbra, Portugal
Duration: 4 Sept 2023 – 7 Sept 2023
Conference number: 11

Publication series

Name: European Conference on Mobile Robots Conference Proceedings
ISSN (Electronic): 2767-8733

Conference

Conference: European Conference on Mobile Robots
Abbreviated title: ECMR
Country/Territory: Portugal
City: Coimbra
Period: 04/09/2023 – 07/09/2023
