Instagram: US states investigate how platform targets children
A group of US states is investigating how Instagram targets children, despite the potential risks the platform poses to them.
The group - made up of both Democratic- and Republican-led states - is investigating Meta, the parent company of Instagram and Facebook, to determine whether consumer protection laws were broken.
It comes after a company whistleblower testified in the US that the company knew its products could harm children.
A Meta spokesman on Thursday denied that its platforms are unhealthy.
Massachusetts Attorney General Maura Healey, a Democrat who first announced the inquiry, tweeted: "Facebook, or Meta, has known Instagram is linked to depression, eating disorders & suicide among young people."
"We will identify if any laws were broken and end the abuse for good."
Nebraska Attorney General Doug Peterson, a Republican, said that the companies "treat our children as mere commodities to manipulate for longer screen time engagement and data extraction".
"These social media platforms are extremely dangerous and have been proven to cause both physical and mental harm in young people," added New York Attorney General Letitia James.
Facebook, which owns Instagram and WhatsApp, changed its name to Meta last month after a series of scandals.
A Meta spokesman pushed back against the consortium's allegations.
"These accusations are false and demonstrate a deep misunderstanding of the facts," a spokesman said in a statement.
"While challenges in protecting young people online impact the entire industry, we've led the industry in combating bullying and supporting people struggling with suicidal thoughts, self-injury, and eating disorders," he added.
The announcement comes after a series of explosive reports based on documents leaked by former Facebook employee Frances Haugen.
In testimony to lawmakers in the US, she said the company pushed its platforms to young children despite knowing that they could cause health issues.
In September, the platform abandoned plans for a child-focused app after a group of more than 40 state attorneys general wrote to the company urging it to cancel the project.
Instagram, like other platforms, requires users to be over 13, but the company has admitted that it knows some users are younger.
Has Meta been singled out fairly?
When Meta's chief executive Mark Zuckerberg responded to Frances Haugen's leaks, he said something intriguing.
"If we attack organisations making an effort to study their impact on the world, we're effectively sending the message that it's safer not to look at all, in case you find something that could be held against you."
What Zuckerberg was essentially saying is that if companies get whacked for conducting research into the effects of their products, they won't conduct the research at all.
It's an important point, because Instagram is by no means the only social media platform that teens use. TikTok, for example, has an enormous teen following, as does Snapchat. How do they affect teen mental health? We don't know, because if they have conducted this research, they haven't released it.
This is Meta's great grievance - that it is facing the consequences of bothering to understand the impact of its products.
But that isn't quite right either. The nature of Instagram - an algorithmic push that promotes beauty, style and success - is different from that of other apps.
It would be far better if Facebook gave its data to independent analysts, who could then research harmful effects.
But it would be far worse if the company simply stopped looking under the bonnet, in case it found something it didn't like.