Augmenting Multi-Party Face-to-Face Interactions Amongst Strangers with User Generated Content

Research output: Contribution to journal › Article › Scientific › peer-review

Researchers

  • Mikko Kytö
  • David McGookin

Abstract

We present the results of an investigation into the role of curated representations of self, which we term Digital Selfs, in augmented multi-party face-to-face interactions. Advances in wearable technologies (such as head-mounted displays) have renewed interest in augmenting face-to-face interaction with digital content. However, existing work focuses on algorithmic matching between users, based on data-mining shared interests from individuals’ social media accounts, which can disclose information that is inappropriate or irrelevant to others. An alternative approach is to let users manually curate the digital augmentation they present to others, so that they can emphasise the aspects of self most important to them and avoid undesired disclosure. Through interviews, video analysis, questionnaires and device logging of 23 participants across 6 multi-party gatherings where individuals were free to mix, we identified how users created Digital Selfs from media largely outside their existing social media accounts, and how Digital Selfs presented through head-mounted displays (HMDs) were employed in multi-party interactions, playing a key role in helping strangers to interact with each other. We present guidance for the design of future multi-party digital augmentations in collaborative scenarios.

Details

Original language: English
Pages (from-to): 527–562
Number of pages: 36
Journal: Computer Supported Cooperative Work: The Journal of Collaborative Computing
Volume: 26
Issue number: 4-6
Publication status: Published - 2017
MoE publication type: A1 Journal article-refereed

Research areas

  • Digital self, Face-to-face interaction, Familiarisation, Head-mounted display, Strangers

