Mobile multisensory augmentations with the CultAR platform

Antti Nurminen*

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Scientific › peer-review

    Abstract

    The human sensory system is a complex mechanism, providing us with a wealth of data from our environment. Our nervous system constantly updates our awareness of the environment based on this multisensory input. We are attuned to cues that may alert us to danger or invite closer inspection. We present the first integrated mobile platform with state-of-the-art visual, aural and haptic augmentation interfaces, supporting localization and directionality where applicable. With these interfaces, we convey cues to our users in the context of urban cultural experiences. We discuss the orchestration of such multimodal outputs and provide indicative guidelines based on our work.

    Original language: English
    Title of host publication: SIGGRAPH Asia 2015 Mobile Graphics and Interactive Applications, SA 2015
    Publisher: ACM
    ISBN (Electronic): 9781450339285
    DOIs
    Publication status: Published - 2 Nov 2015
    MoE publication type: A4 Article in a conference publication
    Event: ACM SIGGRAPH ASIA - Kobe, Japan
    Duration: 2 Nov 2015 – 6 Nov 2015

    Conference

    Conference: ACM SIGGRAPH ASIA
    Country/Territory: Japan
    City: Kobe
    Period: 02/11/2015 – 06/11/2015

    Keywords

    • Augmented reality
    • Haptics
    • Mobile mixed reality
    • Multimodal interfaces
