What you see is what you can change: Human-centered machine learning by interactive visualization

Research output: Contribution to journal › Article › Scientific › peer-review

Researchers

  • Dominik Sacha
  • Michael Sedlmair
  • Leishi Zhang
  • John A. Lee
  • Jaakko Peltonen
  • Daniel Weiskopf
  • Stephen C. North
  • Daniel A. Keim

Research units

  • Infovisible LLC
  • Universität Konstanz
  • Vienna University of Technology
  • Middlesex University
  • Université catholique de Louvain
  • Tampere University
  • University of Stuttgart

Abstract

Visual analytics (VA) systems help data analysts solve complex problems interactively by integrating automated data analysis and mining, such as machine learning (ML) based methods, with interactive visualizations. We propose a conceptual framework that models human interactions with ML components in the VA process, and that puts the central relationship between automated algorithms and interactive visualizations into sharp focus. The framework is illustrated with several examples, and we further elaborate on the interactive ML process by identifying key scenarios where ML methods are combined with human feedback through interactive visualization. We derive five open research challenges at the intersection of ML and visualization research, whose solution should lead to more effective data analysis.
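
A minimal sketch of such an interactive loop, assuming a toy weighted projection that the analyst steers by adjusting feature weights after inspecting the visualization (all names, data, and the feedback rule are hypothetical illustrations, not the paper's framework):

    import numpy as np

    # Hypothetical illustration of a human-in-the-loop cycle: the model produces
    # a 2-D embedding for visualization, the analyst's feedback adjusts per-feature
    # weights, and the embedding is recomputed.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))            # toy data: 200 samples, 4 features

    def project(X, weights):
        """Weighted projection: scale features by analyst-chosen weights, keep top-2 directions."""
        Xw = (X - X.mean(axis=0)) * weights
        _, _, vt = np.linalg.svd(Xw, full_matrices=False)
        return Xw @ vt[:2].T

    weights = np.ones(X.shape[1])            # start with all features weighted equally
    for step in range(3):                    # stand-in for interactive iterations
        embedding = project(X, weights)
        # In a real VA system the embedding would be rendered and the analyst's
        # interactions would drive the update; here a scripted rule simulates
        # feedback that down-weights feature 3.
        weights[3] *= 0.5
        print(f"step {step}: embedding shape {embedding.shape}, weights {weights}")
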

Details

Original language: English
Pages (from-to): 164-175
Number of pages: 12
Journal: Neurocomputing
Volume: 268
Publication status: Published - 13 Dec 2017
MoE publication type: A1 Journal article-refereed

Research areas

  • Machine learning, Information visualization, Interaction, Visual analytics, Dimensionality reduction, Uncertainty, Perspective, Directions, Framework, Selection, Models

ID: 16821393