Truthful Incentives in Crowdsourcing Tasks using Regret Minimization Mechanisms

Monetary incentives in Crowdsourcing platforms

Designing the right incentive structure and pricing policies for workers is a central component of online crowdsourcing platforms, e.g., Mechanical Turk.

  • The job requester’s goal is to maximize the utility derived from the task under a limited budget.
  • Workers’ goal is to maximize their individual profit by deciding which tasks to perform and at what price.

Yet, current crowdsourcing platforms offer the requester only limited control over pricing policies, and tasks are often priced by rules of thumb. This limitation can result in an inefficient use of the requester’s budget or in workers becoming disinterested in the task.

Price negotiation with workers

Previous work in this direction [Singer et al., HCOMP’11] has focused on designing online truthful mechanisms in the bidding model. This requires eliciting the true costs experienced by the workers, which can be challenging for such platforms. In this paper, we focus on the posted price model, where workers are offered a take-it-or-leave-it price; this model is more easily implemented in online crowdsourcing platforms. Figure 1 shows how price negotiation happens in our posted price model.


Figure 1: Negotiation between requester and worker in the posted-price model compared to that in the bidding model. b_i denotes the bid submitted by the i-th worker, and c_i the true cost that worker experiences. p_i denotes the payment offered to the worker.

Our approach

The main challenge in determining the payments is the unknown distribution of the workers’ costs (“cost curve”) F(p), illustrated in Figure 2. This leads to the challenge of trading off exploration and exploitation: the mechanism needs to “explore” by experimenting with potentially suboptimal prices, and has to “exploit” what it has learned by offering the price that appears best so far. We cast this problem as a multi-armed bandit (MAB) problem under a strict budget constraint B, and use the regret minimization approach from online learning to design our mechanism.
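To make the explore/exploit loop concrete, here is a minimal UCB-flavoured sketch of a budgeted posted-price loop. This is an illustration written for this post, not the exact BP-UCB mechanism from the paper: the candidate price grid, the confidence-bonus form, and the scoring rule (the utility bound min(N_remaining · F(p), budget / p), cf. Figure 2) are simplifying assumptions.

```python
import numpy as np

def run_bandit_posted_price(prices, B, worker_costs):
    """Simplified sketch of a UCB-style budgeted posted-price loop
    (illustrative only; not the paper's exact BP-UCB mechanism).
    For each candidate price we keep an optimistic estimate of the
    acceptance probability F(p), then offer the price whose estimated
    utility bound min(N_remaining * F(p), budget / p) is largest."""
    K = len(prices)
    accepts = np.zeros(K)   # accepted offers per candidate price
    offers = np.zeros(K)    # total offers per candidate price
    budget = float(B)
    utility = 0
    N = len(worker_costs)
    for t, cost in enumerate(worker_costs, start=1):
        # Optimistic (upper-confidence) estimate of F(p) per price
        f_hat = accepts / np.maximum(offers, 1)
        bonus = np.sqrt(2.0 * np.log(t) / np.maximum(offers, 1))
        ucb = np.minimum(f_hat + bonus, 1.0)
        remaining = N - t + 1
        affordable = prices <= budget
        if not affordable.any():
            break                     # budget exhausted
        score = np.where(affordable,
                         np.minimum(remaining * ucb, budget / prices),
                         -np.inf)
        i = int(np.argmax(score))
        offers[i] += 1
        if cost <= prices[i]:         # worker accepts the posted price
            accepts[i] += 1
            budget -= prices[i]
            utility += 1
    return utility
```

Note that the worker only reveals accept/reject, never the cost itself, which is what makes the posted-price model truthful by design.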


Figure 2: The upper bound on utility imposed by the budget constraint (red), the workers’ unknown price curve (blue), and the optimal price (green). B is the fixed budget and N is the fixed number of workers.
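The trade-off in Figure 2 can be computed directly once F is known. The tiny sketch below (an illustration for this post, assuming worker costs uniform on [0, 1] so that F(p) = p) finds the fixed price where the acceptance curve N·F(p) meets the budget cap B/p:

```python
def expected_utility(p, F, N, B):
    """Expected completed tasks at a fixed posted price p (cf. Figure 2):
    at most N * F(p) workers accept, and at most B / p offers can be paid."""
    return min(N * F(p), B / p)

# Illustrative assumption: costs uniform on [0, 1], so F(p) = p.
F = lambda p: min(max(p, 0.0), 1.0)
N, B = 1000, 100.0

prices = [i / 100 for i in range(1, 101)]   # candidate price grid
p_star = max(prices, key=lambda p: expected_utility(p, F, N, B))
```

In practice F is unknown, which is exactly why the bandit formulation is needed: the mechanism must discover p_star while spending the budget.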

Our mechanism BP-UCB

  • We present a novel posted-price mechanism, BP-UCB, for online budgeted procurement, which is guaranteed to be budget feasible, to achieve near-optimal utility for the requester, and to be incentive compatible (truthful) for workers.
  • We prove no-regret bounds for BP-UCB, and our analysis yields an explicit separation of the regret into budget wasted through overpayment and offers rejected through underpayment.
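The two regret sources in the second bullet can be illustrated with a small accounting sketch. The helper below is hypothetical (written for this post, not taken from the paper), and assumes the optimal fixed price p_star and its acceptance probability F_star are known in hindsight:

```python
def regret_decomposition(offered_prices, accepted, p_star, F_star):
    """Illustrative accounting of the two regret sources:
      - overpayment: budget wasted by paying more than p_star for a task
      - underpayment: expected tasks lost when an offer below p_star is
        rejected that the optimal price would have secured with prob. F_star."""
    overpaid = sum(p - p_star
                   for p, a in zip(offered_prices, accepted)
                   if a and p > p_star)
    underpaid = sum(F_star
                    for p, a in zip(offered_prices, accepted)
                    if not a and p < p_star)
    return overpaid, underpaid
```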

Experimental Results

We carried out extensive experiments to understand the practical performance of our mechanism on simulated cost distributions (Figure 3 below). Additionally, to demonstrate the effectiveness of our approach on real-world inputs, we carried out a Mechanical Turk study to collect real cost distributions from the workers (Figure 4 below).


Figure 3: Simulated cost distributions. Left: compared to the state-of-the-art posted-price mechanism (pp’12) [Badanidiyuru et al., EC’12], our mechanism (bp-ucb) shows up to a 150% increase in utility for a given budget. Right: the average regret of bp-ucb diminishes with increasing budget.


Figure 4: Cost distributions from Mechanical Turk. Left: utility achieved under random arrival of workers, showing a 180% increase in utility over the state-of-the-art posted-price mechanism (bp-ucb vs. pp’12). Right: robustness of the mechanism against extreme adversarial inputs, simulated by workers arriving in ascending order of their costs.

For more, see our full paper, Truthful Incentives in Crowdsourcing Tasks using Regret Minimization Mechanisms.
Adish Singla, ETH Zurich
Andreas Krause, ETH Zurich

One thought on “Truthful Incentives in Crowdsourcing Tasks using Regret Minimization Mechanisms”

  1. First, I would like to say that I enjoyed your post; I think that designing effective mechanisms for crowdsourcing platforms is an important and interesting problem. It seems that your approach estimates the ‘cost curve’ for the pool of workers but does not consider users’ demographic information. Does it make sense to learn a different ‘cost curve’ for different groups of users?
