Facebook-owned applications are the most commonly used platforms of online groomers in England and Wales, according to data released by the National Society for the Prevention of Cruelty to Children (NSPCC). The organization obtained the figures via freedom of information (FOI) requests.
In England and Wales, the NSPCC found that more than 10,000 online grooming offenses were recorded in the two and a half years since a law came into force making it illegal for adults to send sexual messages to children.
Of those offenses, 5,784 -- more than half -- were committed using Facebook or apps owned by the social media company, including WhatsApp, Instagram, and Messenger.
The NSPCC called on the British government to deliver the Online Harms Bill within 18 months.
"In February, Digital Minister Matt Warman promised to publish an Online Harms Bill following proposals set out in a white paper," the charity noted. "However, frustration is growing at delays to the legislation not now expected until the end of the year and concerns we might not see a regulator until 2023."
The release of these alarming child grooming figures follows Facebook's plans to integrate end-to-end encryption across all of its platforms, a move the company says will strengthen user privacy. Currently, WhatsApp is the only Facebook-owned app that uses such encryption.
However, Facebook's plans have been opposed by child-safety activists, who argue that end-to-end encryption could make child exploitation cases undetectable.
Shareholder and advocacy group Proxy Impact voiced its concerns ahead of Facebook's annual shareholders' meeting last week, saying that encryption "will provide child predators cover that will exponentially expand their outreach and the number of victims."
The governments of the U.S., U.K., and Australia have also pressed Facebook to either call off its encryption plans or give authorities access to content that could protect children from sexual predators.
Facebook, so far, has been unwilling to build back doors into its encryption, fearing they could be exploited by malicious attackers. The company argues that other technology, including PhotoDNA, makes it possible to find child exploitation content.
It should be noted, however, that the data covers early 2017 through October 2019, so it may not account for the most recent trends in the platforms young people use. But the NSPCC maintains that its findings demonstrate the need for new regulation, and it warned that the coronavirus lockdown had created "a perfect storm for abusers."