WhatsApp is failing to stop paedophiles sharing child abuse images, say police

WhatsApp logo Credit: UK Tech

WhatsApp is failing to stop paedophiles sharing child sex abuse images and grooming children and should face new laws unless it takes urgent action, says the UK’s top police officer for child protection.

Simon Bailey, the National Police Chiefs’ Council’s lead on child protection, singled out the Facebook-owned messaging app after evidence that paedophiles have set up groups on the site with titles such as ‘Only Child Pornography’, ‘CP’ and ‘Gay Kids Sex Only’.

"I particularly have concerns - as does the National Crime Agency (NCA) - about WhatsApp and their response to the way their platform is being used. It's being used by offenders to facilitate the sharing of indecent images and to groom children online,” said Mr Bailey.

"I would like them to ensure people can't use their system to share indecent imagery and ensure they are able to monitor and identify inappropriate approaches to and grooming of young people.

"They need to tell the police as soon as that happens. They are not as proactive as I would like them to be. A lot more needs to be done.”

It follows two investigations which uncovered apps, previously on sale on the Google Play Store, that directed paedophiles to child abuse image and video sharing groups on WhatsApp, Britain’s most popular messaging platform.

One investigation by cyber specialists in India uncovered groups operating on WhatsApp hosted using US, UK, Indian and Pakistani numbers weeks after the apps had been removed from Google. The findings have been passed to Indian government investigators.

Last week Israeli specialists AntiToxin Technologies also uncovered groups, at least one of which was full and not accepting any more participants.

Mr Bailey said: "What we ultimately need to see is companies developing AI and algorithms whereby images can't be uploaded, can't be shared and children can go online and not fear being groomed.”

"We should be encouraging these companies to take their social responsibility seriously. If that doesn't work then legislation would probably be the last resort."

His concerns have been endorsed by Sajid Javid, the Home Secretary, who promised new legislation to be outlined this spring in a White Paper setting “clear responsibilities for tech companies including making sure children are protected online”.

“Tech firms must go faster and further to tackle child sexual exploitation on their platforms,” said a spokesman for Mr Javid.

The apps that directed paedophiles to illegal child porn sharing sites were available on Google’s Play Store until the company removed them over Christmas.

Some of the apps had been downloaded more than 100,000 times and could still function on the phones of people who purchased them, according to AntiToxin Technologies, an Israel-based child protection organisation that uncovered the trade.

They direct people to groups on WhatsApp where hundreds of paedophiles share illegal child porn, most of it recently produced and largely consisting of videos of children, says AntiToxin. Two of the groups tested by AntiToxin last week were still active.

An investigation by Nitish Chandan, a cyber security and cyber law specialist, also uncovered active apps with links to groups on WhatsApp where paedophiles were sharing child abuse images.

“It ranged from videos and pictures of child sexual abuse to messages offering services like video chat with children,” said Mr Chandan, who added that he had passed his dossier of evidence to the Indian ministry of home affairs.

Most of the groups were in Hindi but used numbers from the US, UK, Pakistan and the Middle East. There was also evidence that some of the groups were gravitating to Telegram, “possibly because WhatsApp has come onto their radar and has been helping the enforcement authorities”.

A spokesman for WhatsApp said: “WhatsApp has a zero-tolerance policy around child sexual abuse. We deploy our most advanced technology, including artificial intelligence to scan profile photos and actively ban accounts suspected of sharing this vile content.

“These efforts led us to ban approximately 130,000 accounts over a ten-day period. We also respond to law enforcement requests around the world and immediately report abuse to the National Center for Missing and Exploited Children.

“Recent reports have shown that both app stores and communications services are being misused to spread abusive content, which is why technology companies must work together to stop it.”

The spokesman said that, after being alerted to the apps, WhatsApp had asked Google to remove all WhatsApp group-sharing apps from the Google Play Store.

Andy Burrows, NSPCC Associate Head of Child Safety Online, said: “It is appalling that WhatsApp appears to be doing nothing while child sexual offenders are freely sharing this abhorrent content on the messaging app.

“It is abundantly clear that platforms time and again fail to protect children on their sites. Self-regulation has not, and will not, protect children from abuse.

“It is imperative that the Government introduces an independent regulator that forces social networks to make their sites safe for children to use, with tough consequences if they fail.”
