The NSPCC has issued a stern warning to technology companies after reports that British police are investigating a case of sexual abuse within the metaverse. The incident, in which a gang of adult men attacked a young girl's digital persona in an immersive video game, highlights a concerning trend in virtual reality offenses. The NSPCC asserts that online abuse in such environments can have devastating real-world consequences and urges tech firms to prioritize child safety. The charity calls for immediate action and stresses that law enforcement needs access to evidence and resources to protect children effectively.
About 21% of children aged five to ten reportedly owned a virtual reality headset in 2022, and 6% used virtual reality regularly. Richard Collard, associate head of child safety online policy at the NSPCC, criticizes tech companies for rolling out products without prioritizing children's safety. He urges companies to step up efforts to protect children from abuse in virtual reality spaces and to understand the harm taking place on their platforms.
The NSPCC previously urged the government to provide guidance and funding for virtual reality-related offenses. The charity also called for regular reviews of the Online Safety Act to ensure emerging harms are covered by the law. Ian Critchley from the National Police Chiefs’ Council emphasizes the evolving nature of grooming tactics used by offenders, highlighting the essential role of collective efforts to protect young people online.
The Online Safety Act, passed last year, grants regulators the power to sanction social media companies over content on their platforms, though enforcement has yet to begin. Meta, which owns Facebook and Instagram and operates a metaverse platform, asserts that such behavior has no place on its platform. The company points to an automatic protection feature called “personal boundary” and says it is ready to investigate the incident as details become available.