TY - JOUR
T1 - Information Visualization Evaluation Using Crowdsourcing
AU - Borgo, Rita
AU - Micallef, Luana
AU - Bach, Benjamin
AU - McGee, Fintan
AU - Lee, Bongshin
N1 - First and second author contributed equally
PY - 2018
Y1 - 2018
N2 - Visualization researchers have been increasingly leveraging crowdsourcing approaches to overcome a number of limitations of controlled laboratory experiments, including small participant sample sizes and narrow demographic backgrounds of study participants. However, as a community, we have little understanding of when, where, and how researchers use crowdsourcing approaches for visualization research. In this paper, we review the use of crowdsourcing for evaluation in visualization research. We analyzed 190 crowdsourcing experiments, reported in 82 papers that were published in major visualization conferences and journals between 2006 and 2017. We tagged each experiment along 36 dimensions that we identified for crowdsourcing experiments. We grouped our dimensions into six important aspects: study design & procedure, task type, participants, measures & metrics, quality assurance, and reproducibility. We report on the main findings of our review and discuss challenges and opportunities for improvements in conducting crowdsourcing studies for visualization research.
KW - I.3.3 [Computer Graphics]: Picture/Image Generation—Line and curve generation
UR - https://www.microsoft.com/en-us/research/publication/information-visualization-evaluation-using-crowdsourcing/
DO - 10.1111/cgf.13444
M3 - Review Article
SN - 0167-7055
VL - 37
SP - 573
EP - 595
JO - Computer Graphics Forum
JF - Computer Graphics Forum
IS - 3
ER -