Content Moderation Justice and Fairness on Social Media: Comparisons Across Different Contexts and Platforms (2403.06034v1)

Published 9 Mar 2024 in cs.HC and cs.CY

Abstract: Social media users may perceive a platform's moderation decisions differently, which can lead to frustration and dropout. This study investigates users' perceived justice and fairness of online moderation decisions when they are exposed to illegal versus legal violation scenarios, retributive versus restorative moderation strategies, and user-moderated versus commercially moderated platforms. We conducted an online experiment with 200 American social media users of Reddit and Twitter. Results show that for illegal violations, retributive moderation delivers higher perceived justice and fairness on commercially moderated platforms than on user-moderated ones, and that restorative moderation delivers higher perceived fairness for legal violations than for illegal ones. We discuss opportunities for platform policymaking to improve moderation system design.

Authors (4)
  1. Jie Cai
  2. Aashka Patel
  3. Azadeh Naderi
  4. Donghee Yvette Wohn