Motion pattern recognition in 4D point clouds

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

6 Citations (Scopus)
141 Downloads (Pure)

Abstract

We address an actively discussed problem in signal processing: recognizing patterns from spatial data in motion. In particular, we suggest a neural network architecture to recognize motion patterns from 4D point clouds. We demonstrate the feasibility of our approach with point cloud datasets of hand gestures. The architecture, PointGest, operates directly on unprocessed timelines of point cloud data, without any need for voxelization or projection. The model is resilient to noise in the input point cloud through abstraction to lower-density representations, especially for regions of high density. We evaluate the architecture on a benchmark dataset with ten gestures. PointGest achieves an accuracy of 98.8%, outperforming five state-of-the-art point cloud classification models.
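The abstract describes feeding raw (x, y, z, t) points to the network and abstracting them to lower-density representations before classification. As a rough illustration of that general idea (not the authors' PointGest code), the following minimal PointNet-style sketch consumes unprocessed 4D points directly and pools them to a global feature; the framework (PyTorch), layer widths, point count, and ten-class head are all illustrative assumptions.

```python
# Minimal sketch of a PointNet-style classifier over raw 4D points.
# NOT the PointGest architecture from the paper; layer sizes are placeholders.
import torch
import torch.nn as nn

class PointCloud4DClassifier(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Shared per-point MLP: lifts each (x, y, z, t) point to a richer
        # feature; the same weights are applied to every point, so no
        # voxelization or projection of the cloud is needed.
        self.point_mlp = nn.Sequential(
            nn.Conv1d(4, 64, kernel_size=1), nn.BatchNorm1d(64), nn.ReLU(),
            nn.Conv1d(64, 128, kernel_size=1), nn.BatchNorm1d(128), nn.ReLU(),
            nn.Conv1d(128, 256, kernel_size=1), nn.BatchNorm1d(256), nn.ReLU(),
        )
        # Classification head on the pooled, order-invariant feature.
        self.head = nn.Sequential(
            nn.Linear(256, 128), nn.ReLU(), nn.Dropout(0.3),
            nn.Linear(128, num_classes),
        )

    def forward(self, points: torch.Tensor) -> torch.Tensor:
        # points: (batch, num_points, 4) with columns x, y, z, t.
        x = self.point_mlp(points.transpose(1, 2))   # (batch, 256, num_points)
        # Max-pooling abstracts the whole cloud to one global feature,
        # which also dampens the influence of individual noisy points.
        x = torch.max(x, dim=2).values               # (batch, 256)
        return self.head(x)                          # (batch, num_classes) logits

if __name__ == "__main__":
    model = PointCloud4DClassifier(num_classes=10)
    clouds = torch.randn(2, 1024, 4)  # two gesture timelines of 1024 points each
    print(model(clouds).shape)        # torch.Size([2, 10])
```

The actual PointGest model reportedly uses a hierarchy of such abstractions rather than a single global pooling step; this sketch only shows the basic mechanism of classifying raw 4D point sets directly.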

Original language: English
Title of host publication: Proceedings of the 2020 IEEE 30th International Workshop on Machine Learning for Signal Processing, MLSP 2020
Publisher: IEEE
Number of pages: 6
ISBN (Electronic): 9781728166629
DOIs
Publication status: Published - Sep 2020
MoE publication type: A4 Article in a conference publication
Event: IEEE International Workshop on Machine Learning for Signal Processing - Aalto University, Espoo, Finland
Duration: 21 Sep 2020 – 24 Sep 2020
Conference number: 30
https://ieeemlsp.cc

Publication series

Name: IEEE International Workshop on Machine Learning for Signal Processing
ISSN (Print): 2161-0363
ISSN (Electronic): 2161-0371

Workshop

Workshop: IEEE International Workshop on Machine Learning for Signal Processing
Abbreviated title: MLSP
Country/Territory: Finland
City: Espoo
Period: 21/09/2020 – 24/09/2020
Internet address: https://ieeemlsp.cc

Keywords

  • 4D point clouds
  • Deep learning
  • Gesture recognition

