Crowdsourcing and Human-Centred Experiments (Dagstuhl Seminar 15481)

Authors Daniel Archambault, Tobias Hoßfeld, Helen C. Purchase and all authors of the abstracts in this report



File: DagRep.5.11.103.pdf (1.08 MB, 24 pages)

Cite As

Daniel Archambault, Tobias Hoßfeld, and Helen C. Purchase. Crowdsourcing and Human-Centred Experiments (Dagstuhl Seminar 15481). In Dagstuhl Reports, Volume 5, Issue 11, pp. 103-126, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2016) https://doi.org/10.4230/DagRep.5.11.103

Abstract

This report documents the program and the outcomes of Dagstuhl Seminar 15481 "Evaluation in the Crowd: Crowdsourcing and Human-Centred Experiments". Human-centred empirical evaluations play important roles in the fields of human-computer interaction, visualization, graphics, multimedia, and psychology. The advent of crowdsourcing platforms, such as Amazon Mechanical Turk or Microworkers, has provided a revolutionary methodology for conducting human-centred experiments. Through such platforms, experimenters can now collect data from hundreds, even thousands, of participants drawn from a diverse user community in a matter of weeks, greatly increasing the ease of data collection as well as the power and generalizability of experimental results. However, such an experimental platform does not come without problems: ensuring participant investment in the task, defining experimental controls, and understanding the ethics of deploying such experiments en masse.

The major interests of the seminar participants were addressed in six working groups: (W1) Crowdsourcing Technology, (W2) Crowdsourcing Community, (W3) Crowdsourcing vs. Lab, (W4) Crowdsourcing & Visualization, (W5) Crowdsourcing & Psychology, and (W6) Crowdsourcing & QoE Assessment.

Keywords
  • Crowdsourcing; Human Computation; Crowdsourcing Design Mechanisms; Engineering; Practical Experience; Computer Graphics; Applied Perception; HCI; Visualization
