'Loophole' in law on messaging apps leaves children vulnerable to sexual abuse, says NSPCC
17 February 2025, 15:23 | Updated: 18 February 2025, 11:16

Nearly 39,000 child sex abuse image crimes were recorded last year - with an "unacceptable loophole" in the law leaving children vulnerable on messaging services, says the NSPCC.
Snapchat was the app that came up most often in the 7,300 cases where a platform was recorded, according to the children's charity.
The NSPCC says it believes the secrecy offered by one-to-one messaging services is being used "to harm children and go undetected".
It says Home Office data shows 38,685 such crimes were logged in England and Wales in 2023/24 - equivalent to more than 100 a day.
Police made a note of the service used in just over 7,300 of those cases. Of those, 50% took place on Snapchat, 11% on Instagram, 7% on Facebook and 6% on WhatsApp.
The NSPCC is among several charities, including Barnardo's, that have written to the home secretary and technology secretary urging them to strengthen the implementation of the Online Safety Act.
Ofcom is in charge of enforcing the new law, but charities say its recent code of practice contains a loophole as it only requires direct messaging services to remove content if it's "technically feasible".
The NSPCC also wants the platforms themselves to ensure they aren't a "safe haven" for abusers.
It says those that use end-to-end encryption - where the company is unable to view the messages - can be "blinded to child sexual abuse material being shared".
The NSPCC cited the experience of one 13-year-old victim as an illustration of the type of crime taking place.
She said: "I sent nude pics and videos to a stranger I met on Snapchat. I think he's in his thirties. I don't know what to do next.
"I told him I didn't want to send him any more pictures and he started threatening me, telling me that he'll post the pictures online."
NSPCC chief executive Chris Sherwood described the situation as "deeply alarming" and called on the government to take urgent action.
"Having separate rules for private messaging services lets tech bosses off the hook from putting robust protections for children in place. This enables crimes to continue to flourish on their platforms even though we now have the Online Safety Act," he said.
The act was passed in 2023 and requires social media firms to reduce illegal and harmful content, but its protections are only just taking effect through Ofcom codes of practice.
Last month, the Internet Watch Foundation (IWF), a charity that helps remove child abuse material, also said the codes gave platforms a "blatant get-out clause".
However, an Ofcom spokesperson said it anticipated most services would be able to remove harmful content.
"The law says that measures in our codes of practice must be technically feasible," said a statement.
"However, we expect the vast majority of platforms will be able to take content down and we will hold them to account if they don't.
"There'll be measures all platforms will need to take to protect children, such as reviewing child sexual abuse material when they become aware of it and reporting it to law enforcement."
A government spokesperson said: "Child sexual exploitation and abuse is despicable, and has a devastating impact on victims. UK law is clear: child sexual abuse is illegal and social media is no exception, so companies must ensure criminal activity cannot proliferate on their sites.
"The government is committed to the robust implementation of the Online Safety Act to ensure it delivers on its aim to make the UK the safest place online for children.
"We have already introduced four new laws to crack down on child sexual abuse online, but tech company design choices cannot be used as an excuse not to root out these heinous crimes - and we will not hesitate to go further to protect children from vile online predators."
Snapchat told Sky News the report does not reflect the seriousness of its efforts to counter this kind of abuse and that when it is made aware of sexually exploitative content on its platforms, "we remove it, lock the violating account, and report to authorities".
"Snapchat is designed to make it difficult for predators to find and interact with young people and has extra safeguards in place to help prevent strangers from connecting with teens," a Snapchat spokesperson said.
"Our Family Centre also allows parents to see who their teens are friends with and talking to on Snapchat.
"We work with expert NGOs, and industry peers to jointly attack these problems and don't believe the methodology used in this report reflects the seriousness of our collective commitment and efforts," they said.
(c) Sky News 2025