
Platform Attention to NCIM

Determine whether major social media platforms overlook or ignore non-consensual intimate media (NCIM) in practice, particularly when abuse is reported under platforms' internal privacy policies or through other reporting mechanisms.


Background

The authors point out that despite increased attention to content moderation, empirical evidence on how platforms respond to NCIM remains limited. Given the tension between permissive adult content policies and privacy harms, it is crucial to establish whether platforms actually act on NCIM reports or allow such content to persist.

This problem investigates platforms' real-world responsiveness to NCIM, seeking evidence of either neglect or effective handling. It is central to evaluating the adequacy of current moderation practices and the case for stronger regulatory frameworks.

References

However, critical questions still remain unanswered: What are the outcomes of current content moderation processes for NCIM? Do platforms overlook or ignore the problem of NCIM? How can we design systems and policies that effectively respond to the problem of NCIM?

Qiwei et al., "Reporting Non-Consensual Intimate Media: An Audit Study of Deepfakes" (arXiv:2409.12138, 18 Sep 2024), Section 2.1 (Related Research: Addressing NCIM online).