ToxBuster: In-game Chat Toxicity Buster with BERT (2305.12542v1)

Published 21 May 2023 in cs.CL and cs.CY

Abstract: Detecting toxicity in online spaces is challenging and an ever more pressing problem given the increase in social media and gaming consumption. We introduce ToxBuster, a simple and scalable model trained on a relatively large dataset of 194k lines of game chat from Rainbow Six Siege and For Honor, carefully annotated for different kinds of toxicity. Compared to the existing state-of-the-art, ToxBuster achieves 82.95% (+7) in precision and 83.56% (+57) in recall. This improvement is obtained by leveraging past chat history and metadata. We also study the implications for real-time and post-game moderation, as well as the model's transferability from one game to another.
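The abstract's key idea is conditioning a BERT-style classifier on past chat history in addition to the line being moderated. Below is a minimal sketch of how such history-conditioned classification could be wired up with Hugging Face Transformers; the checkpoint name, label set, history window, and segment packing are all illustrative assumptions, not the paper's exact implementation.

```python
# Sketch: score a chat line for toxicity while conditioning on recent chat
# history, in the spirit of ToxBuster. Names and hyperparameters here are
# placeholders, not the authors' configuration.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"  # placeholder; the paper fine-tunes BERT on game chat
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()


def classify_with_history(history: list[str], current_line: str) -> int:
    """Classify the current chat line, using recent history as context.

    History lines go into BERT's first segment and the target line into the
    second, so segment embeddings separate context from the line being
    moderated (one plausible encoding; the paper's may differ).
    """
    context = " ".join(history[-5:])  # keep only the last few lines of chat
    inputs = tokenizer(
        context,
        current_line,
        truncation=True,
        max_length=256,
        return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1))  # e.g. 0 = non-toxic, 1 = toxic


# Toy usage:
history = ["gg everyone", "nice shot", "report him"]
print(classify_with_history(history, "you are trash, uninstall"))
```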

Authors (5)
  1. Zachary Yang (9 papers)
  2. Yasmine Maricar (1 paper)
  3. MohammadReza Davari (8 papers)
  4. Nicolas Grenon-Godbout (1 paper)
  5. Reihaneh Rabbany (48 papers)
Citations (3)