Abstract
Intervention Strategies for Increasing Engagement in Crowdsourcing: Platform, Predictions, and Experiments / 3861
Avi Segal, Ya’akov (Kobi) Gal, Ece Kamar, Eric Horvitz, Alex Bowyer, Grant Miller
Volunteer-based crowdsourcing depends critically on maintaining the engagement of participants. We explore a methodology for extending engagement in citizen science by combining machine learning with intervention design. We first present a platform that uses real-time predictions about forthcoming disengagement to guide interventions. We then discuss a set of experiments in which different messages were delivered to users based on their proximity to the predicted time of disengagement. The messages address motivational factors that were found in prior studies to influence users' engagement. We evaluate this approach on Galaxy Zoo, one of the largest citizen science applications on the web, where we traced the behavior and contributions of thousands of users who received intervention messages over a period of a few months. We found that the amount of user contributions is sensitive to both the timing and the nature of the message. Specifically, a message emphasizing the helpfulness of individual users significantly increased users' contributions when delivered according to predicted times of disengagement, but not when delivered at random times. The influence of the message on users' contributions became more pronounced as additional user data was collected and made available to the classifier.
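To make the prediction-guided timing idea concrete, the following is a minimal sketch, not the authors' implementation: the classifier, feature names, and threshold are hypothetical stand-ins for the paper's trained disengagement predictor.

```python
# Illustrative sketch of prediction-guided intervention timing.
# The scoring function, features, and threshold are hypothetical; the paper
# uses a trained classifier over logged user activity.
from dataclasses import dataclass


@dataclass
class SessionState:
    tasks_completed: int            # tasks finished in the current session
    seconds_since_last_task: float  # idle time since the last submission
    mean_task_time: float           # running average seconds per task


def disengagement_score(state: SessionState) -> float:
    """Toy stand-in for a classifier: a score in [0, 1] indicating how
    imminent the user's disengagement is predicted to be."""
    slowdown = state.seconds_since_last_task / max(state.mean_task_time, 1.0)
    fatigue = min(state.tasks_completed / 50.0, 1.0)
    return min(1.0, 0.5 * (slowdown / 3.0) + 0.5 * fatigue)


def should_intervene(state: SessionState, threshold: float = 0.7) -> bool:
    """Deliver a motivational message only near the predicted time of
    disengagement, rather than at a random moment."""
    return disengagement_score(state) >= threshold


if __name__ == "__main__":
    state = SessionState(tasks_completed=42,
                         seconds_since_last_task=95.0,
                         mean_task_time=30.0)
    if should_intervene(state):
        print("Send message emphasizing the helpfulness of this user's contributions.")
```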