Open Access. Powered by Scholars. Published by Universities.®

Engineering Commons

Artificial Intelligence and Robotics

Research Collection School Of Computing and Information Systems

Crowdsourcing

Articles 1 - 2 of 2

Full-Text Articles in Engineering

Worker Demographics And Earnings On Amazon Mechanical Turk: An Exploratory Analysis, Kotaro Hara, Kristy Milland, Benjamin V. Hanrahan, Chris Callison-Burch, Abigail Adams, Saiph Savage, Jeffrey P. Bigham May 2019

Research Collection School Of Computing and Information Systems

Prior research reported that workers on Amazon Mechanical Turk (AMT) are underpaid, earning about $2/h, but it did not investigate wage differences due to worker characteristics (e.g., country of residence). We present the first data-driven analysis of the wage gap on AMT. Using work log data and demographic data collected via an online survey, we analyse the wage gap attributable to different factors. We show that a wage gap does indeed exist; for example, workers in the U.S. earn $3.01/h on average, while those in India earn $1.41/h.


Traccs: Trajectory-Aware Coordinated Urban Crowd-Sourcing, Cen Chen, Shih-Fen Cheng, Aldy Gunawan, Archan Misra, Koustuv Dasgupta, Deepthi Chander Nov 2014

Research Collection School Of Computing and Information Systems

We investigate the problem of large-scale mobile crowd-tasking, where a large pool of citizen crowd-workers is used to perform a variety of location-specific urban logistics tasks. Current approaches to mobile crowd-tasking are highly decentralized: a crowd-tasking platform typically offers each worker a set of available tasks close to the worker's current location, and each worker then independently chooses which tasks she wants to accept and perform. In contrast, we propose TRACCS, a more coordinated task-assignment approach in which the crowd-tasking platform assigns a sequence of tasks to each worker, taking into account their expected location trajectory over a wider time …
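The contrast drawn above — workers independently grabbing nearby tasks versus the platform planning a task sequence along each worker's predicted trajectory — can be illustrated with a toy greedy sketch. This is not the TRACCS optimisation model from the paper; the data structures (`Task`, a per-worker location list) and the greedy earliest-deadline rule are assumptions made purely for illustration.

```python
# Toy sketch of platform-side, trajectory-aware task sequencing.
# NOT the actual TRACCS algorithm; all names and rules here are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Task:
    task_id: str
    location: str
    deadline: int  # latest time step at which the task can still be done

def assign_sequences(trajectories, tasks):
    """Assign each worker a *sequence* of tasks lying on their predicted
    trajectory, rather than letting workers pick tasks independently.

    trajectories: {worker_id: [location at t=0, location at t=1, ...]}
    Returns {worker_id: [task_id, ...]}; tasks off every trajectory stay unassigned.
    """
    unassigned = set(tasks)
    plan = {worker: [] for worker in trajectories}
    for worker, path in trajectories.items():
        for t, loc in enumerate(path):
            # Greedily take the most urgent open task at this location, if any.
            for task in sorted(unassigned, key=lambda x: x.deadline):
                if task.location == loc and t <= task.deadline:
                    plan[worker].append(task.task_id)
                    unassigned.discard(task)
                    break
    return plan

tasks = [Task("t1", "A", 2), Task("t2", "B", 3), Task("t3", "C", 1)]
trajectories = {"w1": ["A", "B"], "w2": ["C", "A"]}
plan = assign_sequences(trajectories, tasks)
# "w1" is routed through A then B, so it is handed t1 followed by t2;
# "w2" starts at C and picks up t3 before its deadline.
```

A real coordinated assigner would replace the greedy inner loop with a global optimisation over all workers' trajectories and task rewards, which is precisely the harder problem the abstract describes.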