Comparison of Classifiers in Audio and Acceleration Based Context Classification in Mobile Phones

Okko Räsänen, Jussi Leppänen, Unto Laine, Jukka Saarinen

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-reviewed

7 Citations (Scopus)

Abstract

This work studies the combination of audio and acceleration sensor streams for automatic classification of user context. Instead of performing sensor fusion at the feature level, we study the combination of classifier output distributions using a number of different classifiers. The performance of the algorithms is evaluated on a data set collected with casually worn mobile phones in a variety of real-world environments and user activities. The experimental results show that combining audio and acceleration data enhances the classification accuracy of physical activities with all classifiers, whereas environment classification does not benefit notably from acceleration features.
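The decision-level fusion described above can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a simple weighted sum rule over the per-class posterior distributions produced by two independent classifiers (one per modality), which is one common way to combine classifier outputs:

```python
import numpy as np

def fuse_posteriors(p_audio, p_accel, w_audio=0.5):
    """Combine two classifiers' class-posterior distributions with a
    weighted sum rule (an illustrative fusion scheme, not necessarily
    the one used in the paper), then renormalize."""
    p_audio = np.asarray(p_audio, dtype=float)
    p_accel = np.asarray(p_accel, dtype=float)
    fused = w_audio * p_audio + (1.0 - w_audio) * p_accel
    return fused / fused.sum()

def classify(p_audio, p_accel, labels, w_audio=0.5):
    """Pick the most probable class from the fused distribution."""
    fused = fuse_posteriors(p_audio, p_accel, w_audio)
    return labels[int(np.argmax(fused))]

# Hypothetical activity labels and per-modality posteriors.
labels = ["walking", "running", "idle"]
p_audio = [0.5, 0.3, 0.2]   # audio classifier leans toward "walking"
p_accel = [0.2, 0.7, 0.1]   # acceleration classifier leans toward "running"
print(classify(p_audio, p_accel, labels))  # -> running
```

Because fusion happens after each per-modality classifier has produced a distribution, each modality can use a feature representation and classifier suited to it, in contrast to feature-level fusion, where a single classifier sees the concatenated features.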
Original language: English
Title of host publication: The 2011 European Signal Processing Conference (EUSIPCO-2011), Barcelona, Spain, August 29 - September 2, 2011
Publication status: Published - 2011
MoE publication type: A4 Article in a conference publication

Keywords

  • context classification
  • machine learning
  • multimodal signal processing

