Snapchat ‘wholly irresponsible’ for allowing explicit teen images

Snapchat has been accused of being “wholly irresponsible” for allowing accounts allegedly promoting explicit images of teenagers to be searchable on its app.

Snapchat has come in for criticism over 'wishy-washy' responses. Picture: Yui Mok/PA Wire

A number of accounts have been highlighted by British school staff after they became concerned about how easily children were being exposed to and traumatised by such material on the platform.

Holly Tea, a school social, emotional and mental health interventions officer in Nottinghamshire, said she had seen children “really badly affected” by the issue.

She claims she and colleagues reported a number of accounts to the social media giant, but had not yet been made aware of any action taken against them.

Ms Tea said she was also aware of the app being used to circulate images with captions claiming to show particular students who were not in fact pictured, which were then being used as a tool for bullying.

“The concern is it’s very easy to search for these type of accounts and it poses a real problem,” she said.

“I don’t think they [Snapchat] realise how detrimental this can be to young people.

“Children come into school in the morning and everyone is saying ‘oh did you see that picture of so and so’ and it’s really upsetting. They can isolate themselves, even self-harm as a result of it.

“I didn’t realise how bad it was until I really started looking into it.

“It greatly concerns me because of how fast social media is and I think they’re just not responding fast enough. You shouldn’t have such a site if you can’t respond to it quickly.”

Ms Tea said she had flagged the issue and accounts she and colleagues had discovered with the official Snapchat Support account via Twitter, telling the firm one student’s life “is being made a misery” because of the issue.

Responding to her tweet, the Snapchat Support account said its “safety team will investigate and take appropriate action on these accounts”, but Ms Tea called the reply “wishy-washy”.

“They say they don’t tolerate this kind of stuff, but it looks like they do,” she said.

The social media firm has not yet responded to a request for comment, but some of the accounts highlighted in search results appear to have since been removed.

The incident comes in the wake of the Children’s Commissioner for England urging tech companies to take more responsibility in protecting children on their platforms.

Anne Longfield questioned whether firms had lost control of the content that was appearing on their sites.

“With great power comes great responsibility, and it is your responsibility to support measures that give children the information and tools they need growing up in this digital world - or to admit that you cannot control what anyone sees on your platforms,” she said.

There have also been numerous calls from within government for more regulation of social media and of how platforms handle both personal data and malicious content.