BIRS Workshop Lecture Videos
Liu, Yang and CJ Ho: Bandits in Crowdsourcing
This talk will overview some of the successes in connecting Multi-armed Bandit (MAB) algorithms to crowdsourcing research. While MAB is a powerful tool for decision-making and machine learning tasks, adopting this framework for crowdsourcing studies raises several salient challenges, including a large exploration space, budget constraints, and the lack of ground truth. We will survey several recent results that help overcome these hurdles. Time permitting, we will go through a couple of examples that use a bandit framework as a reputation system to provide long-term incentives.
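To make the connection concrete, here is a minimal illustrative sketch (not from the talk) of how a standard bandit algorithm such as UCB1 can be applied to crowdsourcing: each arm is a crowd worker, each pull spends one unit of a fixed query budget, and the reward models whether the worker's label is correct. The worker accuracies and budget below are hypothetical.

```python
import math
import random

def ucb1_worker_selection(worker_accuracies, budget, seed=0):
    """Illustrative UCB1 loop for crowdsourcing: each arm is a worker,
    each pull spends one unit of budget and yields a Bernoulli reward
    (1 if the worker's label is correct). Returns per-worker pull counts."""
    rng = random.Random(seed)
    k = len(worker_accuracies)
    counts = [0] * k          # times each worker has been queried
    rewards = [0.0] * k       # cumulative observed reward per worker
    # Initialization: query every worker once.
    for i in range(k):
        rewards[i] += rng.random() < worker_accuracies[i]
        counts[i] += 1
    # Spend the remaining budget using the UCB index.
    for t in range(k, budget):
        # Index = empirical mean + exploration bonus; the bonus shrinks
        # as a worker is queried more often.
        ucb = [rewards[i] / counts[i]
               + math.sqrt(2 * math.log(t + 1) / counts[i])
               for i in range(k)]
        i = max(range(k), key=lambda j: ucb[j])
        rewards[i] += rng.random() < worker_accuracies[i]
        counts[i] += 1
    return counts

# With enough budget, the most accurate (hypothetical) worker is queried most.
counts = ucb1_worker_selection([0.55, 0.70, 0.90], budget=2000)
```

This sketch also hints at the challenges the talk lists: with many workers the exploration space is large (every worker must be tried), the loop terminates exactly when the budget runs out, and the Bernoulli reward stands in for ground truth that a real platform would have to estimate, e.g. from peer agreement.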
Attribution-NonCommercial-NoDerivatives 4.0 International