
STFU NOOB!: Predicting Crowdsourced Decisions on Toxic Behavior in Online Games

One problem facing players of competitive games is negative, or toxic, behavior. League of Legends, the largest eSports game, uses a crowdsourcing platform called the Tribunal to judge whether a reported toxic player should be punished. The Tribunal is a two-stage system: players who directly observe toxic behavior file reports, and human reviewers then judge the aggregated reports. While majority rule over many votes lets this system cope with the inherently vague nature of toxic behavior, it demands substantial cost, time, and human effort.

In this paper, we propose a supervised learning approach to predicting crowdsourced decisions on toxic behavior using a large-scale labeled data collection: over 10 million user reports on 1.46 million toxic players, together with the corresponding crowdsourced decisions. Our results show good performance in detecting cases decided by an overwhelming majority and in predicting the crowdsourced decisions on them. We also demonstrate that our classifier ports well across regions. Finally, we estimate the practical implications of our approach, including potential cost savings and victim protection.
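The approach described above can be sketched as a binary classifier trained on per-case features to predict a punish/pardon outcome. The sketch below is a minimal, hypothetical illustration only: the feature names (report count, fraction of allied reporters, offensive-chat ratio) and the toy data are assumptions for demonstration, not the paper's actual feature set or dataset, and a plain-Python logistic regression stands in for whatever model the paper uses.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.1, epochs=500):
    """Logistic regression via stochastic gradient descent (stdlib only)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Return 1 (punish) or 0 (pardon) for one reported case."""
    return 1 if sigmoid(sum(wj * xj for wj, xj in zip(w, x)) + b) >= 0.5 else 0

# Toy cases (hypothetical features):
# [number of reports, fraction of allied reporters, offensive-chat ratio]
X = [[9, 0.9, 0.8], [8, 0.8, 0.7], [2, 0.1, 0.1],
     [1, 0.2, 0.0], [7, 0.7, 0.9], [3, 0.3, 0.2]]
y = [1, 1, 0, 0, 1, 0]  # 1 = punished, 0 = pardoned (toy labels)

w, b = train_logreg(X, y)
print([predict(w, b, x) for x in X])
```

At the paper's scale (millions of labeled cases), the same idea would use a proper learning library and a much richer feature set, but the training loop above captures the core supervised setup: labeled crowdsourced decisions in, a punish/pardon predictor out.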