Police uncover teenage girl ‘suicide’ Instagram group
Social media groups involving teenage girls that led to "suicidal crises" and "serious self-harm" have been uncovered by police, BBC News has learned.
Twelve girls, aged between 12 and 16 and from across southern England, were part of an Instagram chat group whose name refers to suicide.
The group was discovered when three of the girls went missing and were found seriously unwell in London.
Instagram says it found no suicide or self-harm related content in the group.
'Serious self-harm'
BBC News has obtained a police briefing about the investigation which discovered the online groups.
It says "peer-to-peer influence increased suicidal ideation amongst the children involved to the extent that several escalated to suicidal crises and serious self-harm."
The group came to the notice of police when three of the girls, who had been reported missing, travelled by train to meet in London.
They were found seriously unwell in a street and taken by ambulance to hospital for emergency treatment.
One of the girls said they had first met online and had discussed suicide, according to the briefing, which was published on 25 March.
Police officers then examined digital devices to identify the name of the online group and its other members.
Seven of the 12 girls had self-harmed prior to being traced by the police. Children's social care services from seven different local authorities have been involved in safeguarding children identified as members of the group.
BBC News understands that some of the children had met on other social media platforms but were part of a closed Instagram group - a direct message thread - whose title explicitly mentions the words "suicide" and "missing".
In November 2020, Instagram introduced new technology to recognise self-harm and suicide content on its app.
Fears about the impact of this content on young and vulnerable people have been raised since 14-year-old schoolgirl Molly Russell killed herself after seeing graphic images on the platform.
Molly's father, Ian, told the BBC at the time that the "pushy algorithms" of social media "helped kill my daughter".
Facebook, which owns Instagram, does not deny that the name of the closed group referenced "suicide", but says the group has not been removed from the platform because the content of its messages does not break its rules.
In a statement, a company spokesperson said it was co-operating with the police.
"[We] reviewed reports but found no content that broke our rules, nor in fact any suicide or self-harm related content," they said.
"We don't allow graphic content, or content that promotes or encourages suicide or self-harm, and will remove it when we find it.
"We'll continue to support the police and will respond to any valid legal request for information."
British Transport Police, which led the investigation, declined to comment.
If you've been affected by issues raised in this story, sources of support are available via the BBC Action Line.
You can call Samaritans free on 116 123, email them at jo@samaritans.org, or visit www.samaritans.org to find your nearest branch.