Outcomes of NCIM Content Moderation on Online Platforms

Determine the outcomes of current content moderation processes for non-consensual intimate media (NCIM) on online platforms, including whether reported NCIM is removed, how promptly removal occurs, and what actions platforms take in response to such reports.

Background

The paper situates non-consensual intimate media (NCIM) within the broader context of online sexual abuse and platform governance, noting that although hashing-based solutions and legal frameworks exist, both remain partial or unevenly applied in practice. The authors observe that sociotechnical understanding of NCIM is still nascent and emphasize the need to empirically investigate how content moderation handles NCIM reports in real-world settings.

This open problem seeks to clarify the practical consequences of moderation workflows when NCIM is reported: whether content is removed, how quickly removal occurs, and what downstream actions platforms take against user accounts. Such clarity is foundational for assessing platform accountability and the sufficiency of existing policy and legal remedies.

References

However, critical questions still remain unanswered: What are the outcomes of current content moderation processes for NCIM? Do platforms overlook or ignore the problem of NCIM? How can we design systems and policies that effectively respond to the problem of NCIM?

Reporting Non-Consensual Intimate Media: An Audit Study of Deepfakes (2409.12138 - Qiwei et al., 18 Sep 2024) in Section 2.1 (Related Research: Addressing NCIM online)