It is 1.45pm, and in a virtual online chatroom a man using a cartoonish avatar is chatting with another user who, judging by her childish giggling, is clearly a young girl.
Dressed as a shark, he follows her around, before convincing her to join him in a private world where others cannot see or hear them. He creates a portal and the two disappear inside.
Another girl, who appears to be a young teen, is standing on the roof of a tall building, looking down. A male user, using a monkey avatar, taunts her from below, shouting: “Jump! Jump!” He later gathers other young users together to teach them ‘lessons of life’, and writes the chilling instruction: “Kill yourself”.
In another room, a robot avatar with a deep voice is putting his hands on a little fairy, who also has a child’s voice. She tries to move away from him, asking: “Who is this paedophile?”
“Me,” the robot replies. “I just want to put you in my pocket and bring you home, little fairy girl. I’d put you in my sink and give you a bubble bath.”
Such alarming online scenes might seem like something you would find in the deepest parts of the dark web. But they are actually from the most popular chatroom app on one of this year’s must-have gaming devices, Oculus.
Millions of families now have the VR headsets, since rebranded Meta Quest, while the Oculus app needed to play on them was the most popular on Apple’s App Store over Christmas, with 1.3 million downloads worldwide. While most parents might reasonably expect the device – owned by Meta, formerly known as Facebook – to be safe for their children to use, the reality is chillingly different.
Within minutes of entering one of the virtual reality worlds on VRChat, where adults freely mix with children clearly far younger than the Oculus’s minimum age of 13, we witnessed shocking scenes.
They included racist, homophobic and misogynistic language, bullying and threats, as well as explicitly sexual conversation and abusive behaviour, including simulated sex, virtual sexual harassment and masturbation.
In almost all the incidents, minors were present. Children were also regularly exposed to graphic sexual content and pornography, including digital avatars showing naked bodies and users drawing explicit images.
The Oculus VR devices are central to Meta’s pursuit of the “metaverse” – an interconnected 3D world where people interact via digital avatars. Announcing his company’s change of name to reflect their goal in October, Facebook founder Mark Zuckerberg promised that metaverse products would be designed “for safety, privacy and inclusion”.
But that is not what we found in virtual rooms in VRChat, which is listed fourth among the Oculus store’s most popular apps. Conversations in the chat rooms are littered with explicit and offensive language. In one room, users – including children – crowded around a SpongeBob SquarePants avatar as he pretended to commit sex acts.
Although Meta this week announced it would introduce a mandatory four-foot personal boundary between people’s digital avatars in its own apps, campaigners point out this does not apply to third-party apps on its platform, such as VRChat.
While it is difficult to accurately identify those behind the avatars, it is clear that the rooms attract vulnerable teens.
One, dressed as a skeleton, told a male monkey avatar that his dad “left me after 15 years” and that “I haven’t seen my mother in a while”. When the monkey, who had an older voice, wrote ‘kill yourself’ in the air, the skeleton replied, “I have a knife on me right now actually. I have five knives in my room.”
Our findings are backed up by user reviews; one said the game is “full of genuinely worrying people talking explicitly and sexually to young kids”.
Meta’s VR policies prohibit bullying, abuse and harassment, including “sexualising minors in any way”.
But while Oculus has a form where users can report abuse, the Center for Countering Digital Hate claims Meta rarely takes such reports seriously: the group reported 100 policy violations on Oculus and received no response.
Imran Ahmed, the charity’s chief executive, branded it “a cesspit of hate, pornography and child grooming.”
Hannah Ruschen, Child Safety Online Policy Officer at the NSPCC, said: “Immersive environments are the new frontier in the Wild West Web and children are at huge risk of abuse because tech bosses haven’t learned that growth before safety has consequences.”
A spokesman for Meta said it provides tools to report and block others, adding: “Quest devices are 13+ and were not designed for younger children.
“We encourage parents to monitor use, limit the time using the headset, and ensure they take breaks.”
Gaming risks: what parents need to know
Do you really know what your children do when they are playing video games, and which types of gaming put them most at risk? Here the NSPCC outlines six aspects of gaming that all parents need to be aware of.
Gaming hierarchies
In many online games some players have more power than others.
Those who have gained more achievements are looked up to by those who are newer to the game, and passing on their knowledge to others is often used as currency.
Children who want to earn more badges or points, and be looked up to themselves, are often vulnerable to grooming by adults who offer to help them.
Playing across multiple platforms
You might think your kids are just playing on one gaming device, but often they are using a number of platforms simultaneously, putting themselves at greater risk.
A child may be playing a game while also talking to someone in a separate community, or on a chat service such as Discord. It’s also common for gamers to meet a player in a game and then connect with them on one of several social networks.
Offenders often use multiple accounts to target a child, knowing there is no system in place for different platforms and social networks to alert each other to suspicious or abusive accounts.
Livestreaming
If your child is a gaming fan they’ll probably watch others livestreaming themselves playing and commentating on games.
However, while watching livestreams is increasingly popular among kids, the technology that can flag harmful content and behaviour as it happens has not kept pace with this development. It means there is a greater risk that a child will be exposed to material that would be removed on a more traditional platform.
Real time text and voice chat
Many games allow players to talk to and text others while playing.
Companies find it difficult to keep on top of abusive chat and to step in to remove suspicious users.
In voice chat, girls are particularly at risk as their voices are easily distinguishable, making them vulnerable to misogynistic and sexist abuse. So-called ‘pile-ons’, where people conspire to harass a single user, are common.
Immersive gaming and VR
In many immersive worlds, where we can embody avatars, interact with others and explore 3D landscapes, children can mix with adults with little to no moderation. Even on kid-friendly Roblox there have been reports of virtual strip clubs with children seen virtually lap-dancing to earn money to spend in the game.
The NSPCC is calling on the Government to toughen up the Online Safety Bill to hold tech firms and bosses to account for failing to protect children in these immersive environments.
Bullying and harassment
Research shows one in six children have experienced bullying or abuse when playing games connected with other users online. Some 72 per cent of women have experienced sexist abuse when gaming, while LGBTQ gamers are at particular risk.
A type of cyber-attack called doxxing – where personally identifying information is shared as a way to abuse members of marginalised groups – is common.