
Enabling Developers, Protecting Users: Investigating Harassment and Safety in VR (2403.05499v1)

Published 8 Mar 2024 in cs.HC, cs.CY, cs.ET, and cs.CR

Abstract: Virtual Reality (VR) has witnessed a rising issue of harassment, prompting the integration of safety controls like muting and blocking in VR applications. However, the lack of standardized safety measures across VR applications hinders their universal effectiveness, especially across contexts like socializing, gaming, and streaming. While prior research has studied safety controls in social VR applications, our user study (n = 27) takes a multi-perspective approach, examining both users' perceptions of safety control usability and effectiveness as well as the challenges that developers face in designing and deploying VR safety controls. We identify challenges VR users face while employing safety controls, such as finding users in crowded virtual spaces to block them. VR users also find controls ineffective in addressing harassment; for instance, they fail to eliminate the harassers' presence from the environment. Further, VR users find the current methods of submitting evidence for reports time-consuming and cumbersome. Improvements desired by users include live moderation and behavior tracking across VR apps; however, developers cite technological, financial, and legal obstacles to implementing such solutions, often due to a lack of awareness and high development costs. We emphasize the importance of establishing technical and legal guidelines to enhance user safety in virtual environments.

Authors (3)
  1. Abhinaya S. B.
  2. Aafaq Sabir
  3. Anupam Das

Summary

Investigating the Efficacy of Safety Controls in VR: A Multi-perspective User and Developer Study

Introduction

Virtual Reality (VR) technology has increasingly become a medium for immersive communication and interaction. Despite its potential for novel social connection and entertainment, VR experiences are marred by harassment that threatens user safety, and addressing this harassment is crucial to fostering a safe and inclusive environment. This paper presents a comprehensive examination of harassment in VR, focusing on users' experiences with existing safety controls and on developers' perspectives on the challenges of, and potential solutions for, enhancing user safety.

User Experiences with Safety Controls

The paper reports interviews with 18 individuals who had faced harassment in VR. The analysis revealed that harassment in VR takes various forms, including verbal abuse, sexual harassment, and physical intimidation, mirroring the complexity of other online abuse while exploiting VR's immersive nature. Despite the availability of safety controls such as muting, blocking, and reporting mechanisms, participants encountered numerous challenges in using these tools effectively. Key findings include:

  • Awareness and Usage: While some users were familiar with the basic safety controls available in VR applications, limited awareness and cumbersome activation processes deterred consistent use of these controls.
  • Effectiveness and Limitations: Participants found safety controls only partially effective, citing limited real-time efficacy and the absence of any consequence conveyed to the harasser. For instance, muting or blocking does not remove the harasser from the environment or stop others from witnessing the harassment (see the sketch after this list).
  • Reporting Challenges: Reporting harassment incidents is perceived as time-consuming and complex, often requiring evidence that users do not have readily available. This, coupled with unclear feedback on the actions taken against reported misconduct, discourages users from using reporting mechanisms.
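To make the mute/block limitation concrete, the Python sketch below illustrates how client-side safety controls typically work. It is not drawn from the paper or from any specific VR platform, and all class and method names are hypothetical: the point is only that muting and blocking filter what the local client renders, so a blocked harasser remains present in the shared session and visible to bystanders.

```python
# Hypothetical sketch: client-side mute/block vs. server-side removal.
# Muting and blocking change only what the *local* client renders; the
# harasser stays in the shared instance, matching participants'
# complaint that these controls do not eliminate the harasser's presence.
from dataclasses import dataclass, field


@dataclass
class LocalSafetyState:
    muted: set = field(default_factory=set)    # user IDs whose audio we drop locally
    blocked: set = field(default_factory=set)  # user IDs whose avatar we hide locally

    def should_play_audio(self, speaker_id: str) -> bool:
        return speaker_id not in self.muted and speaker_id not in self.blocked

    def should_render_avatar(self, user_id: str) -> bool:
        return user_id not in self.blocked


class SessionServer:
    """Server-side state for one shared VR instance."""

    def __init__(self) -> None:
        self.participants: set = set()

    def kick(self, user_id: str) -> None:
        # Unlike mute/block, this affects every client in the instance --
        # the kind of removal participants said they actually wanted.
        self.participants.discard(user_id)
```

Because `LocalSafetyState` changes only the local view, a blocked user can keep targeting bystanders; only a server-side action such as `kick` alters the shared session, which is one reason participants found per-app, client-side controls ineffective.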

Developer Perspectives on VR Safety

Interviews with nine VR developers provided insights into the challenges and considerations in designing and deploying safety controls. Developers acknowledged the significance of user safety but highlighted several obstacles, including:

  • Resource Limitations and Prioritization: Small development teams commonly prioritize other features over safety controls, and developers agreed that safety is typically addressed reactively rather than proactively.
  • Technical and Legal Hurdles: The implementation of sophisticated safety features, such as live moderation and behavior tracking, faces technical, financial, and legal constraints. Developers also expressed concerns regarding privacy issues associated with pervasive user monitoring.
  • Lack of Industry-wide Standards: The absence of universal guidelines or standards for VR safety controls complicates the development process and makes it difficult to ensure a consistent user experience across different VR platforms.

Towards Improved Safety in VR

The paper identifies several recommendations for improving safety in VR environments based on user and developer perspectives. These include:

  • Enhancing Awareness and Usability: Simplifying the process of activating and using safety controls can encourage more users to take advantage of these features. Additionally, developers and platform owners should invest in user education and awareness campaigns about safety controls.
  • Developing Standardized Safety Protocols: Establishing industry-wide safety standards and guidelines can help unify the approach to user safety across VR platforms and applications.
  • Innovative Use of Technology for Moderation: Leveraging AI and machine learning for real-time abuse detection and moderation can offer scalable solutions while minimizing the privacy intrusion of pervasive monitoring (a hypothetical sketch follows this list). Incorporating community-driven moderation approaches can also enhance the effectiveness of safety controls.
  • Balancing Privacy with Safety: Addressing safety concerns must not come at the expense of user privacy. Developing privacy-preserving measures for identity verification and behavior monitoring is crucial.
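As a thought experiment for the moderation recommendation above, the following Python sketch outlines one privacy-conscious shape a real-time voice-moderation pipeline could take: utterances are held only in a short rolling buffer, scored for toxicity, and retained as report evidence only when flagged. The paper proposes no such implementation; `transcribe` and `toxicity_score` are placeholder stand-ins for whatever speech-to-text and classification models a platform would actually deploy, and the threshold and window size are illustrative assumptions.

```python
# Hypothetical sketch of real-time, privacy-conscious voice moderation.
# Nothing here comes from the paper; all names, thresholds, and window
# sizes are illustrative assumptions.
from collections import deque
from dataclasses import dataclass

TOXICITY_THRESHOLD = 0.9  # would need tuning plus legal/policy review
BUFFER_SECONDS = 30.0     # rolling evidence window; older audio is discarded


@dataclass
class Utterance:
    speaker_id: str
    audio_chunk: bytes
    timestamp: float


def transcribe(audio_chunk: bytes) -> str:
    # Placeholder: a real system would call a speech-to-text model here.
    return audio_chunk.decode("utf-8", errors="ignore")


def toxicity_score(text: str) -> float:
    # Placeholder keyword check; a real system would use a trained classifier.
    return 1.0 if "insult" in text.lower() else 0.0


class LiveModerator:
    def __init__(self) -> None:
        # Keep only a short window so ordinary conversation is never
        # retained -- the privacy concern developers raised about
        # pervasive user monitoring.
        self.recent = deque()

    def on_utterance(self, utt: Utterance) -> None:
        self.recent.append(utt)
        while self.recent and utt.timestamp - self.recent[0].timestamp > BUFFER_SECONDS:
            self.recent.popleft()
        if toxicity_score(transcribe(utt.audio_chunk)) >= TOXICITY_THRESHOLD:
            self.escalate(utt)

    def escalate(self, utt: Utterance) -> None:
        # Attach the buffered window as evidence, addressing users'
        # complaint that assembling evidence for reports is cumbersome.
        evidence = list(self.recent)
        print(f"flag {utt.speaker_id}: {len(evidence)} buffered utterances as evidence")
```

A design along these lines trades coverage for privacy: only flagged windows ever leave the rolling buffer, which speaks both to users' desire for live moderation with easier evidence submission and to developers' privacy and cost concerns.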

Conclusion

The juxtaposition of user experiences with developer insights in this paper underscores the complexity of addressing harassment in VR. Ensuring a safe VR environment necessitates a multi-faceted approach that combines technological innovation, community involvement, and industry collaboration. By prioritizing user safety and implementing effective and user-friendly safety controls, VR developers and platform owners can create more inclusive and welcoming virtual spaces for all users.