What to do if you see an Instagram post about suicide
Malaysian police say a 16-year-old girl killed herself earlier this week, after she asked her Instagram followers whether she should live or die.
The Malaysian teenager had hosted a poll on her Instagram story asking: "Really Important, Help Me Choose D / L", where D stood for death and L for life, according to police.
At one point, the poll showed that 69% of respondents chose D, police say.
Malaysian politicians are demanding an investigation, with the youth minister saying: "I am genuinely worried about the state of our youth's mental health."
The case raises serious questions about the impact of online bullying, and whether technology companies are doing enough to protect vulnerable users.
And it's a global issue - in the UK, the government recently warned it could ban social media firms if they fail to remove harmful content, after 14-year-old Molly Russell took her own life having viewed content about suicide on social media.
What should I do if I see someone posting about taking their life?
If you think someone you know is in immediate danger, suicide prevention organisations recommend you call the emergency services for help.
If they aren't in immediate danger, things you could do include reporting the post, contacting the poster if you know them, and putting them in touch with helplines.
"A lot of social media platforms will reach out to users if they post things that suggest they could be in danger," says Dr Lucy Biddle, a medical sociology lecturer at the University of Bristol who has researched suicide-related internet use.
She recommends reporting posts you're concerned about so the platform can contact the poster and let them know where they can go for help.
Instagram says: "We have a deep responsibility to make sure people using Instagram feel safe and supported. As part of our own efforts, we urge everyone to use our reporting tools and to contact emergency services if they see any behaviour that puts people's safety at risk."
On the app, you can tap the three-dot icon and then select "report", followed by "it's inappropriate", and "self-injury".
Facebook, which owns Instagram, has previously said that its teams review reports and try to support people with suicidal thoughts, for example by encouraging people to reach out to their friend, and providing information on suicide or self-harm helplines.
Meanwhile, Dr Biddle says that if the poster is "someone you know", you could consider talking to them.
"Lots of people have this fear about asking someone if they're feeling suicidal, because they think it may make it worse. In fact, that's not the case. It's really important to reach out to someone you're concerned about. It's important not to ignore it or assume that these things are attention-seeking."
She adds that, for younger internet users, "if you feel you can't do that, get an adult to do it with you, or for you".
Another thing you could do is encourage them to contact a helpline for professional support.
Papyrus, a youth suicide prevention charity in the UK, says: "If you are worried about someone online, perhaps because of posts they have shared or comments they have made, you could let them know about our helpline, or other avenues of support. You can lead by example by being supportive, informed and aware."
Is there a link between internet use and suicide?
The Samaritans, a suicide prevention charity, says in its online guidance that "suicide is complex and is rarely caused by one thing".
It says that internet use can lead to "both positive and negative outcomes for young people" - with some people finding help and support online, but others consuming content that glamorises or encourages suicide.
"There is quite a complex link - internet use can helpful for some people, but very detrimental for others," says Dr Biddle.
Online communities, for example, can offer support to people who are thinking about self-harm, but they can also "unhelpfully normalise" ideas about self-harm, meaning people "can't see that they need to seek help", or become isolated from the family and friends who could offer real-life support.
From her research, which involved interviewing people who used the internet for suicide-related purposes, or were hospitalised after suicide attempts, she also found that "some people saw the internet and social media as a space where they could try out their thoughts with no repercussions, unlike asking someone they knew directly or going to a parent".
"I imagine if you get people reinforcing your thoughts [about suicide] in that way, that could be quite harmful," she says, adding "in all these spaces, you can get trolls that can change the whole nature of a group".
Dr Biddle recommends people who want to talk about these issues use "a well-moderated forum, run by an official charity" as a safer space, and to be aware of each platform's community guidelines and how they can flag and block content.
Australian mental health charity Orygen has also published guidelines about how young people can communicate safely about suicide online, with advice about blocking unsafe content and taking control of what content you see.
If you are feeling emotionally distressed and would like details of organisations which offer advice and support, click here. In the UK you can call for free, at any time, to hear recorded information on 0800 066 066. In Malaysia you can get help here.