Instagram: Girl tells how she was 'hooked' on self-harm images
At the age of 12, Libby became "hooked" on posting and viewing self-harm images on Instagram - including pictures of cutting, burning and overdosing.
Her father, Ian, says his family reported such images to Instagram, but the social media company did nothing.
Speaking to the BBC, Libby, now 16, recalls sharing pictures of her fresh cuts with 8,000 followers.
She described how she was drawn into an online community centred around self-harm photos.
"You start becoming a part of it - you get almost stuck to it," she says.
"I was very hooked on it.
"It was almost like you had to keep up with it otherwise people would turn away and stop caring."
She says the three main images were cutting, burning and overdosing.
'It made it safe to do it worse'
She says while Instagram didn't make her self-harm, the images she saw on the site "accelerated the severity" of the cuts.
"I'd see people and then my brain would go: 'That's OK. It doesn't matter how bad it gets because they're not dead, it hasn't killed them yet,'" she says.
"It made it safe to do it worse."
Libby's dad Ian was shocked by some of the images he saw: "It ranged from scratching, right through to Stanley knives and scalpels.
"I'm an ex-military man. I didn't see stuff like that when I was in the army."
If you’ve been affected by self-harm, eating disorders or emotional distress, help and support is available via the BBC Action Line.
It wasn't just the pictures that were shocking but also the comments underneath giving advice on how to self-harm.
Ian remembers posters saying: "You shouldn't have done it this way, you should have done it like that. Don't do it here, do it there because there's more blood."
"That is not someone trying to help you - that is someone getting off on it," he says.
'A dazed world'
For Ian and his family, the pressure of trying to keep his daughter safe was unimaginable.
"I honestly don't know how we did get through it," he says.
"You will never understand the stress.
"We were living in a dazed world while all this was happening.
"You couldn't leave her on her own. So we were just working round each other: 'You've got to go to work. I've got to go to work. Who's going to look after Libby?'"
The family say they attempted to report the images to Instagram but received a response that the pictures did not breach its community standards.
"They don't like to be contacted. They make it very difficult, or they did back then," Ian says.
"If you've got an issue and you want to speak to someone, there's nothing.
"Parents can do everything they want to try to prevent kids going on Instagram but where there's a will there's a way.
"Until one of their close family members fall down that rabbit hole they won't do anything about it.
"Until it affects them or their wallet, they are not interested.
"Instagram needs to put its hand up and say we've created a monster we cannot control."
Libby has now stopped self-harming and is getting good professional help. She is hoping to become a paramedic or mental health nurse.
However, her father says that unless Instagram acts "there are going to be more Libbys and more Mollys out there".
Molly Russell was 14 when she took her own life in 2017 after viewing disturbing content about suicide on social media.
Molly's father, Ian, told the BBC he believed Instagram helped kill his daughter.
In the days after the BBC reported on Molly's death, youth suicide prevention charity Papyrus said it saw a "spike" in calls to its UK helpline from families reporting similar stories.
What has Instagram said?
Instagram said its thoughts were with Molly's family and those affected by suicide or self-harm.
It said it had deployed engineers to begin making it harder for people to search for and find self-harm content.
The company, which is owned by Facebook, acknowledged it had a "deep responsibility" to ensure the safety of young people on the platform and had started a review of its policies on suicide and self-injury content.
The company also said it would:
- Start to make it harder for people to search for and find self-harm content
- Restrict the ability of users to find the content through hashtags
- Introduce sensitivity screens over self-harm content
- Stop recommending accounts that post self-harm content
It says anybody can report content or accounts that they believe to be against the community guidelines.
Families can ask the company to remove accounts if the user is physically or mentally incapacitated. They can also report accounts belonging to a child under the age of 13.
The company says it does not usually close accounts because a parent has requested it, arguing that parents are in the best position to monitor and advise teenagers on responsible social media use.
It says it has a responsibility to users and believes young people should be able to express themselves and find communities of support, such as LGBT groups.
Facebook's new vice-president, Sir Nick Clegg, said the company would do "whatever it takes" to make the platform safer for young people.
He added that experts had said not all related content should be banned, as it provided a way for people to get help.
"I know this sounds counter-intuitive, but they do say that in some instances it's better to keep some of the distressing images up if that helps people make a cry for help and then get the support they need," he said.
Analysis
By BBC correspondent Angus Crawford
At the heart of the problem is an algorithm. Or, really, a series of algorithms - complex instructions written into code.
They underpin the mechanics of social media, analysing everything you do on a platform and pinging you more of the content you like, plus adverts for things you never knew you wanted.
Interest transforms into clicks, clicks translate into engagement and finally "sales" - with data being scraped all the time. That's the business model.
But therein lies the problem. If you like pictures of puppies, you'll get more of them. If you seek out material on self-harm and suicide - the algorithm may push you further and further down that pathway.
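To make that feedback loop concrete, here is a deliberately simplified sketch in Python of engagement-driven ranking. It is not Instagram's actual system, which is far more complex and not public; every name in it is invented for illustration. The point is only that whatever a user keeps interacting with rises to the top of their feed.

```python
# Toy engagement-based ranker - illustrative only, not Instagram's algorithm.
from collections import Counter
from typing import Dict, List


def score_post(post_topics: List[str], history: Counter) -> float:
    """Score a post by how often the user has engaged with its topics before."""
    return sum(history[topic] for topic in post_topics)


def rank_feed(posts: List[Dict], history: Counter) -> List[Dict]:
    """Order the feed so the most-engaged-with topics come first."""
    return sorted(posts, key=lambda p: score_post(p["topics"], history), reverse=True)


# A user who has clicked on "puppies" ten times is shown more puppies;
# the same loop applies to any topic the user keeps returning to.
history = Counter({"puppies": 10, "cooking": 2})
posts = [
    {"id": 1, "topics": ["puppies"]},
    {"id": 2, "topics": ["cooking"]},
    {"id": 3, "topics": ["news"]},
]
print([p["id"] for p in rank_feed(posts, history)])  # -> [1, 2, 3]
```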
Add to that the scale of the operation - Instagram says it has one billion users.
How do you effectively police that without driving your users away? Consumers, especially teenagers, are picky, impatient and averse to anything that puts "friction" into their enjoyment. Annoy your users and they'll leave for good.
Finally there's verification - anyone who has a phone and an email can sign up for a social media account. And you can be totally anonymous - bad behaviour loves dark places.
To be fair to Instagram, it has started making changes - restricting hashtags, no more "recommending" of self-harm accounts. Soon it will be blurring images of self-harm.
But here's the dilemma for the tech companies - how do you tinker with an algorithm at the heart of your platform to make people safer, if those changes could undermine the very business model you are trying to protect?
What are politicians doing?
Health Secretary Matt Hancock said he was "horrified" by Molly's death and "desperately concerned to ensure young people are protected".
Speaking on the BBC's Andrew Marr Show, Mr Hancock called on social media sites to "purge" material promoting self-harm and suicide.
When asked if social media could be banned, Mr Hancock said: "Ultimately Parliament does have that sanction, yes," but added it was "not where I'd like to end up".
"If we think they need to do things they are refusing to do, then we can and we must legislate," he said.
Culture Secretary Jeremy Wright told MPs the government was "considering very carefully" calls to impose a legal duty of care on social media companies.
He said there had been some activity by social media companies but not enough, adding that it would be "wrong to assume that this House or this Government can sit back and allow the social media companies to do this voluntarily".
Labour's deputy leader and culture spokesman Tom Watson accused Facebook of being more focused on "profiting from children" than on protecting them.