We Already Have the Technology to Tackle Online Child Grooming, Claims Charity

One of Britain’s largest child welfare charities is demanding that social media giants use existing technology to stem the growing tide of grooming of children, some as young as seven.

The NSPCC insists algorithms are already used by social networks such as Facebook, Instagram and Snapchat to target adverts at specific audiences and to detect illegal content online. Now the charity wants the same techniques used to alert young children to potential grooming behaviour and to notify moderators and, where necessary, police forces.

It is also urging the UK government to crack down and act swiftly before more children fall victim to online grooming, after a Freedom of Information request to police forces in England and Wales revealed 1,316 offences in the six months since the Sexual Communication with a Child law was introduced.

Facebook, Instagram and Snapchat were found to be the platforms most commonly used by groomers, with girls aged between 12 and 15 the most likely to be targeted by predators. The youngest victims were aged seven.

According to the NSPCC, however, not enough is being done to tackle the problem.


Tony Stower, NSPCC head of child safety online, said: “Despite the staggering number of grooming offences in just six months, government and social networks are not properly working together and using all the tools available to stop this crime from happening.”

He warned: “Government’s Internet Safety Strategy must require social networks to build in technology to keep their young users safe, rather than relying on police to step in once harm has already been done. If government makes a code for social networks that is entirely optional and includes no requirement for platforms to tackle grooming, this is a massive missed opportunity and children will continue to be put at risk.”


Until the new anti-grooming law came into force in April 2017, police could not intervene until groomers met their victims. In 2015, former England international footballer Adam Johnson sent sexual messages to a 15-year-old girl before meeting her and engaging in sexual activity.

Now police forces in England and Wales can step in sooner to prevent similar grooming cases from arising.

The charity insists the technology already exists to detect grooming language used by adults speaking to children online, allowing an alert to be sent to youngsters warning them about the chat they are having and offering them support if needed.
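The article does not say how such detection works under the hood, so the following is only a minimal sketch of the general idea, assuming a simple pattern-matching approach rather than whatever the networks or the NSPCC actually use; the patterns, thresholds and function names are invented for illustration.

```python
import re

# Hypothetical example patterns; a real system would learn signals from data
# rather than rely on a small hand-written list like this one.
SUSPICIOUS_PATTERNS = [
    r"\bdon'?t tell (your|ur) (mum|dad|parents)\b",
    r"\bare you (home )?alone\b",
    r"\bsend (me )?a (photo|pic|picture)\b",
    r"\bour (little )?secret\b",
]

def flags_grooming_language(message: str) -> bool:
    """Return True if the message matches any of the example patterns."""
    text = message.lower()
    return any(re.search(pattern, text) for pattern in SUSPICIOUS_PATTERNS)

def handle_incoming_message(message: str) -> None:
    """Warn the young user; a real platform would also alert human moderators here."""
    if flags_grooming_language(message):
        print("Warning: this chat may be unsafe. Support is available if you need it.")

# Example usage
handle_incoming_message("this is our little secret, ok?")
```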

The charity believes the Department for Digital, Culture, Media and Sport could make this happen. Yet the department has said it will only produce a voluntary code for social networks, and that code will not include measures to prevent grooming.

The NSPCC argues that this doesn’t go far enough and wants social networks to automatically flag up potential grooming situations to moderators.

At present, algorithms automatically flag child abuse images, hate speech and extremist content to moderators for removal.


Now the charity is urging the Home Office to work with industry to use existing technology to flag unusual account patterns associated with grooming behaviour: for example, friending and following many young people with no mutual friends and no geographic links, receiving a high number of rejected friend requests from children, or spikes in views of posts made by under-18 accounts.
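As a rough illustration of how those signals could be combined, here is a minimal sketch with invented thresholds and field names; it does not reflect any real platform’s data model or anything specified by the Home Office or the NSPCC.

```python
from dataclasses import dataclass

@dataclass
class AccountStats:
    """Hypothetical per-account signals; field names are invented for this sketch."""
    friend_requests_to_minors: int       # requests sent to under-18 accounts
    requests_with_mutual_friends: int    # of those, how many shared any mutual friend
    requests_rejected_by_minors: int     # requests rejected by under-18 accounts
    weekly_views_of_minor_posts: int     # views this week of posts by under-18 accounts
    baseline_weekly_views: int           # the account's usual weekly figure

def matches_grooming_pattern(stats: AccountStats) -> bool:
    """Flag accounts showing the behaviours the NSPCC describes (thresholds invented)."""
    many_unconnected_requests = (
        stats.friend_requests_to_minors >= 20
        and stats.requests_with_mutual_friends == 0
    )
    many_rejections = stats.requests_rejected_by_minors >= 10
    view_spike = stats.weekly_views_of_minor_posts > 5 * max(stats.baseline_weekly_views, 1)
    # Geographic-link checks are omitted here; a fuller system would consider those too.
    return many_unconnected_requests or many_rejections or view_spike

# Example usage: an account that sent 30 requests to children, none with mutual friends.
print(matches_grooming_pattern(AccountStats(30, 0, 4, 120, 100)))  # True
```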

Where moderators believe criminal activity is taking place, they can then notify police.

Speaking on January 29, Matt Hancock, the minister for digital, culture, media and sport, insisted the UK government was determined to make Britain the safest place in the world to go online. He agreed that grooming alerts were a “must” as part of this strategy.

Source: sputniknews.com
