Social media data needed for 'harm' research, say doctors
Leading UK psychiatrists say they will never understand the risks and benefits of social media use on children's mental health unless companies hand over their data to researchers.
Tech companies must be made to share data and pay a tax to fund important research, they say in a report.
There is growing evidence internet use can harm mental health but research is still lacking, it adds.
But a civil rights group said children should not be treated "like lab rats".
An independent regulator for online safety is planned by the government.
The report, by the Royal College of Psychiatrists, calls on the regulator to require social-media companies to share data on how children and young people are using the likes of Instagram, Facebook and Twitter - not just how much time they spend online.
The data collected would be anonymous, the report says.
Ian Russell, who believes Instagram was partly responsible for his daughter Molly taking her life aged 14, is backing the calls.
The types of images his daughter looked at on social media before she died included ones suggesting depression "will inevitably lead to suicide" and ones that "normalise a dark, bleak world" which children "get sucked into", he told BBC Radio 4's Today programme.
"When you start looking at those images on the internet, the algorithms that are there start pushing more and more of those images and such content to you, and they connect you to other people who are in a similar desperate place," he said in a separate interview on BBC Breakfast.
Mr Russell said without research using data from social media companies, "we will never know how content can lead our children and young people to self-harm or, in the most tragic cases, take their own lives".
There can be many positives to children using technology, such as online support, instant communication with friends and access to information, but screen time can also harm health, psychiatrists say.
For example, online content can be distressing and children can become addicted to screens at the expense of sleep, exercise and family time.
Research is still fragmented, but initial evidence of "negative physical, mental health and cognitive associations" requires further interdisciplinary, nationally funded research, the report states.
Most social media platforms say users must be at least 13 years old to have an account, but research has suggested many younger children bypass these restrictions.
In 2019 the BBC launched an app and website, Own It, to help young people to "be the boss of your online life".
'Toxic content'
Sarah Lechmere, who has struggled with eating disorders, told the BBC's Victoria Derbyshire programme social media posts led her to pro-anorexia sites which gave her "tips" on how to self-harm.
The sites gave "lots of different methods that can give people ideas," she said.
She said Instagram and other social media sites made it "very difficult" to avoid "toxic content".
Report co-author Dr Bernadka Dubicka, who chairs the child and adolescent faculty at the Royal College of Psychiatrists, said she was seeing a growing number of children self-harming and attempting suicide as a result of their use of social media and online discussions.
She told the Today programme research was "hampered" by the lack of access to social media data, leading to "double standards" in how young people are protected online compared with offline.
Dr Dubicka also called for a tax on social media companies "that is proportional to their global turnover" to pay for research.
Privacy 'should be priority'
Civil rights group Big Brother Watch said it recognised the importance of research into the impact of social media, but that users must be "empowered to choose what data they give away, who to and for what purposes".
The campaign group's director, Silkie Carlo, said young people should have "autonomy" on social media "without being made to feel like lab rats".
Citing the Cambridge Analytica data scandal, she added: "At a time when data and privacy rights face significant threats online and trust is low, user control should be recognised as a priority."
The government has already said it will create an independent regulator for online safety from April, as part of a package of measures to keep UK users safe online, known as the Online Harms White Paper.
The regulator will be able to demand companies write reports outlining how they protect people online, a spokesman said.
The reports will then be made public so that parents and children "can make informed decisions about their internet use", the spokesman added.
There are also plans to introduce a 2% tax on the sales large digital companies make in the UK.
Tech UK, which represents technology companies, said data, in isolation, "will rarely provide the full picture".
If you’ve been affected by self-harm, eating disorders or emotional distress, help and support is available via the BBC Action Line.
What is the advice to parents?
For children under the age of one:
avoid screen time
For two- to five-year-olds:
ensure any screen time is part of a varied and balanced day, including physical activity and face-to-face conversation
ensure at least three hours a day of physical activity
limit sitting watching or playing with screens to no more than one hour a day
For five- to 11-year-olds:
develop a plan with your child for screen time and try to stick to it
ensure children have a balance of activities in the day, with physical activity, face-to-face conversation and tech-free times
encourage mealtimes to be tech free
ensure you have spoken to your children about how to keep safe online, check they are keeping safe and make it clear you will support them if they feel unsafe or upset online
try to ensure there are no screens in the bedroom at night
For 11- to 16-year-olds:
develop a plan or check your existing one is still appropriate
encourage a balance of physical activity, face-to-face social time, schoolwork and family time
encourage mealtimes to be tech free
keep having conversations about keeping safe online and offer space to talk about upsetting things teenagers might see online
make it clear you will support them if they feel unsafe or upset online
try to ensure there are no screens in the bedroom at night
Marjorie Wallace, chief executive of the mental health charity Sane, said it was "shocking" so little had been done to protect vulnerable young people.
"There has been an exponential rise in the numbers of people contacting our helpline in recent years who have self-harmed, now comprising 70% of our callers, some directly linked to the 24-7 pressures of social media," she said.