
Media caption: Minister Matt Hancock tells BBC Breakfast both government and social media companies can do more.

Children at risk of online grooming should be sent automatic alerts as part of the government’s internet safety strategy, the NSPCC has said.

The children’s charity said existing algorithms could be used to flag suspected groomers to moderators.

A “staggering” 1,316 offences were recorded in England and Wales in the first six months after a new child grooming law was introduced last year.

Minister Matt Hancock said he would be robust with social media companies.

The minister for Digital, Culture, Media and Sport said the government was working to make the UK the safest place in the world to go online, and that this can and “must” include grooming alerts.

He told BBC Breakfast that as a father of three young children it was something that “really mattered” to him.

Before the new offence of sexual communication with a child was introduced in April, police could not intervene until groomers attempted to meet their targets face-to-face.

Of the 1,316 cases recorded, the youngest victim was a seven-year-old girl, although girls aged between 12 and 15 were the most likely to be targeted by predators.

Facebook, Instagram and Snapchat were the most common sites used by offenders, making up 63% of all incidents.

The NSPCC, which campaigned to bring in the new legislation, has criticised social media companies for not making the most of the technology they already use to enforce the law.

‘He had all the power’

One victim, who started chatting to a man online as a young child, told the BBC about the “traumatic” physical sexual abuse she experienced.

“I have flashbacks and have to have medication to control those,” she said.

“When I was about 12 he wanted to meet up with me in person. He had all the power. He had totally manipulated me to believe that I was doing something wrong and it would be me who would be punished for this.”

Image caption: Of the 1,316 cases recorded, the youngest victim was a seven-year-old girl (image copyright: Getty Images)

Algorithms – the calculations that tell computers what to do – are currently used by social media companies to flag up images of child abuse, hate speech and extremist material.

The charity said the same techniques should be used to pick up “grooming language” and then send an automatic alert to both the child and moderators.
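The approach the charity describes can be illustrated with a minimal sketch. This is purely hypothetical: real platforms use trained classifiers rather than a fixed phrase list, and the phrase set, function names, and alert mechanism below are all invented for illustration.

```python
# Hypothetical sketch of the NSPCC's proposal: flag "grooming language"
# in a message and alert both the child and moderators.
# A fixed phrase list stands in for the ML classifiers platforms actually use.

GROOMING_PHRASES = {"our secret", "don't tell anyone", "send a photo"}  # illustrative only

def contains_grooming_language(text: str) -> bool:
    """Return True if the message matches any flagged phrase."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in GROOMING_PHRASES)

def handle_message(text: str, alerts: list) -> None:
    """Queue an alert for the child and a review task for moderators when flagged."""
    if contains_grooming_language(text):
        alerts.append(("child_warning", text))      # automatic alert to the child
        alerts.append(("moderator_review", text))   # flag for human moderators

alerts = []
handle_message("This is our secret, don't tell anyone", alerts)
```

The key design point in the proposal is that both parties are notified at once, so a warning reaches the child before any harm occurs rather than relying solely on later moderation.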

Tony Stower, head of child safety online at the NSPCC, said: “Despite the staggering number of grooming offences in just six months, government and social networks are not properly working together and using all the tools available to stop this crime from happening.

“Government’s Internet Safety Strategy must require social networks to build in technology to keep their young users safe, rather than relying on police to step in once harm has already been done.”

Facebook said it was already using technology to identify grooming behaviour.

The NSPCC said an existing voluntary code of practice does not go far enough and has called for a mandatory code to be put in place.

The Home Office said £20m was spent pursuing offenders in 2017.

But it added that social media sites should “take on the challenge” of online grooming and take all possible steps to prevent children being exploited on their platforms.

Vera Baird, victim affairs lead at the Association of Police and Crime Commissioners, said she expected the number of cases to be higher given the “endemic” scale of online grooming.

She said alerts are “imperative” to prevention, but should be accompanied by sex and relationships education so that children know how to respond to such a warning.

