Sexual harassment exists in all walks of life and the 3D virtual reality simulation that is the metaverse is no exception. However, experts warn the immersive, all-consuming nature of virtual reality means sexual violence has even worse repercussions than harassment in other digital landscapes.
Nina Jane Patel, a psychotherapist who conducts research on the metaverse, has first-hand experience of sexual violence in the virtual environment. The 43-year-old mother-of-four recently revealed her “surreal nightmare” of being “gang raped” in virtual reality.
“You are literally stepping into a 360-degree digital environment,” Patel tells The Independent. “Because virtual reality has been designed to be as real as possible, it is similar to inviting someone into your living room, so the violation feels more acute than it would feel on a social media platform.”
Writing in a Medium post at the end of December, Patel said the incident occurred in Facebook’s Horizon Venues metaverse and that, because of the technological advances of simulation, it felt as though it had happened in real life. “Within 60 seconds of joining I was verbally and sexually harassed. Three or four male avatars, with male voices, essentially, virtually gang-raped my avatar and took photos — as I tried to get away they yelled: ‘Don’t pretend you didn’t love it,’ and ‘go rub yourself off to the photo’.”
“Sexual harassment and violence is a big problem in the metaverse in its current state,” she reflects. “I’ve had several women get in touch to say they have experienced sexual harassment there.”
Ms Patel, who researches the psychological and physiological impact of immersive experiences, said online reviews of Horizon Worlds include many people sharing stories of harassment. As the internet has evolved, she argues, anonymity has been prioritised over accountability. But it is now time to “learn from mistakes”, she adds, warning of the damaging impact if problems in the metaverse are not addressed.
“I have talked to many women who, since the dawn of the internet, have been shrugging off sexual advances, harassment and verbal assaults on a daily basis,” she says, “and accepting it as weirdos on the internet. But we are coming to a time when we can no longer accept this, because of the embodied, all-immersive nature of the metaverse, which is a multi-sensory experience.”
In her view, violent acts in virtual reality could lead to a rise in incidents in the physical realm. “We must have a zero-tolerance approach, as we develop a safer metaverse, for the sake of our children,” she says. “These impressionable and still maturing minds are at risk of mental and emotional degradation if we don’t take safety in the metaverse seriously and make sure the correct safeguards are in place for our children to truly benefit from the tech.”
Last October, Mark Zuckerberg changed Facebook’s parent company’s name to Meta, saying: “Over time, I hope that we are seen as a metaverse company, and I want to anchor our work and identity on what we’re building toward.” A spokesperson for Meta says the company has recently introduced a “new personal boundary” that eases the process of dodging “unwanted interactions” in the virtual space.
“We are sorry this happened,” the representative added. “We want everyone in Horizon Venues to have a positive experience, to easily find the safety tools that can help them – and help us investigate and take action.”
Patel’s comments come after research into the metaverse carried out by the Centre for Countering Digital Hate discovered a hundred potential violations of Meta’s policies for virtual reality in the space of 11 hours and 30 minutes of recordings of people’s conduct in the app – translating to one violation every seven minutes.
The study, conducted in December last year, saw researchers closely analyse VRChat, which is the most popular social app that can be purchased in Meta’s VR app store. The report revealed abusive behaviour included “minors being exposed to graphic sexual content, bullying, harassment and abuse of other users, including minors, minors being groomed to repeat racist and extremist messages, threats of violence, and content mocking the 9/11 terror attacks.”
Researchers at the centre said: “The reporting system wasn’t fit for purpose: we could only lodge reports in the moderation system for 51 of these 100 incidents. None of these 51 reports of abusive behaviour, including sexual harassment of minors, were acknowledged by Meta in any way.”
Callum Hood, who is the organisation’s head of research, and was closely involved in the study, said: “We saw people sexually harassing people who said they were minors. Users in one group chat were sending hardcore pornography to the group. Users were bragging about imposing that on other users. They were deliberately moving from one world to another world or one group chat to another to target users with this porn.”
Having spent a great deal of time exploring the metaverse and observing user behaviour, Hood’s experiences have not filled him “with a compulsion” to return. He describes an incident where two male users followed female users around, “crowding them, stalking them, looking at them closely, breathing on them” – adding that they appeared to have deliberately logged on to do so.
“There were a number of examples where users threatened to rape other users,” he says. “In some cases, they would threaten repeatedly. We also had a couple of cases in which users would quite aggressively demand social media contact details from another user – sometimes getting other users protesting, saying: ‘You know I’m a minor?’. Typically, it was men to women.”
He cites another worrying example of an older teenager racially abusing others as well as coercing a younger teen to use racist language. Some of the behaviours Hood saw highlighted the potential mental health issues of the perpetrators, he says, compounded by the fact the platform is used by so many young teenagers.
“Meta has deliberately set the minimum age at 13-plus,” he explains. “Young people are a key audience, but in some cases their voice or tone appeared to be substantially younger than 13. There was a user who appeared to be very young, who was abused by lots of other users. There are no effective age controls.”
If Meta refuses to take these issues of harassment and abuse seriously, he adds, the problem is “going to get worse and worse. There are already people who are grasping the new opportunities posed by the metaverse and VR to abuse and harass people and spread hatred.”
Elena Martellozzo, associate professor in Criminology at Middlesex University’s Centre for Child Abuse and Trauma Studies, argues that while the benefits of the metaverse include a sense of anonymity and freedom to play, there are downsides due to its potentially heightening the lack of inhibition that people sometimes display in digital spheres.
“The disinhibition process refers to the fact you become less inhibited online when you don’t have that face-to-face interaction,” says Martellozzo. “Even in emails, people can be a little bit more frank and borderline aggressive. If the conversation was happening face-to-face, people would be way more cautious. The metaverse enhances this disinhibition process even more greatly. And it makes it feel even more real for whoever is interacting.”