Instagram grooming of children as young as five triples

The number of children targeted for grooming and abuse on Instagram has more than tripled with some as young as five falling victim, figures show.

The NSPCC urged a “Wild West Web” clampdown after it found 5,161 reports of sexual communication with a child were recorded in just 18 months.

Facebook, Snapchat and Instagram were used in 70 per cent of incidents, according to figures obtained by the NSPCC.


The charity’s chief executive, Peter Wanless, accused social media firms of “10 years of failed self-regulation”.

The NSPCC is calling for more measures to protect children online. Picture: John Devlin

“These figures are overwhelming evidence that keeping children safe cannot be left to social networks,” he said.

In incidents where police also recorded the contact method, Instagram was used by groomers 126 times between April and September 2017, but that increased to 428 for the same period last year.

Girls aged 12 to 15 were the most likely targets, but roughly one in five victims were 11 or under, according to records that also included the victim’s age.

Freedom of Information data from 39 of the 43 police forces in England and Wales was obtained by the charity.

The figures, from April 2017 to September 2018, are revealed amid growing criticism of how social media firms protect children on their platforms.

They coincide with a forthcoming Government white paper about online harms which the NSPCC hopes will include new laws to tackle grooming.

Mr Wanless added: “We cannot wait for the next tragedy before tech companies are made to act.


“It is hugely concerning to see the sharp spike in grooming offences on Instagram, and it is vital that the platform designs basic protection more carefully into the service it offers young people.”

In one harrowing account of abuse given by the charity, a victim told how she was groomed by a 24-year-old man when she was just 13.

She met him in person through a friend and he initially said he was 16, then 18, before he added her on Facebook and Snapchat the same evening.

The girl said it “escalated very quickly” before he encouraged her to share photos of herself and meet for sex after school.

“He drove me somewhere quiet ... and took me into the woods and had sex with me,” she said.

“He drove me in the direction of home straight afterwards, refusing to even talk, and then kicked me out of the car at the traffic lights. I was bleeding and crying.”

The girl’s mother added: “Somebody has got to take responsibility for what happens to children on their platforms. Simply put, if social media didn’t exist, this would never have happened.”


Police disclosed which channels groomers used in 1,317 of the 1,944 offences between April and September 2018.

Instagram was used in 32 per cent of cases, Facebook in 23 per cent and Snapchat in 14 per cent.

In the same period, 41.2 per cent of reports were in the north of England, 20.6 per cent in the south east and 12.7 per cent in the Midlands, with the remainder elsewhere in the country.

Surrey, Sussex, Northampton and City of London police did not provide data for the period, the NSPCC said.

A National Crime Agency spokesman said: “It is vital that online platforms used by children and young people have in place robust mechanisms and processes to prevent, identify and report sexual exploitation and abuse, including online grooming.

“Children and young people also need easy access to mechanisms allowing them to alert platforms to potential offending.

“The National Crime Agency helps industry to enhance their reporting tools and where possible, shares knowledge and expertise to support industry to improve standards and security online.”

A spokesman for Facebook and Instagram said: “Keeping young people safe on our platforms is our top priority and child exploitation of any kind is not allowed.

“We use advanced technology and work closely with the police and Child Exploitation Online Protection Command to aggressively fight this type of content and protect young people.”