Twitter insiders: We can't protect users from trolling under Musk
Twitter insiders have told the BBC that the company is no longer able to protect users from trolling, state-co-ordinated disinformation and child sexual exploitation, following lay-offs and changes under owner Elon Musk.
Exclusive academic data plus testimony from Twitter users backs up their allegations, suggesting hate is thriving under Mr Musk's leadership, with trolls emboldened, harassment intensifying and a spike in accounts following misogynistic and abusive profiles.
Current and former employees of the company tell BBC Panorama that features intended to protect Twitter users from trolling and harassment are proving difficult to maintain, amid what they describe as a chaotic working environment in which Mr Musk is shadowed by bodyguards at all times. I've spoken to dozens, with several going on the record for the first time.
The former head of content design says everyone on her team - which created safety measures such as nudge buttons - has been sacked. She later resigned. Internal research by Twitter suggests those safety measures reduced trolling by 60%. An engineer working for Twitter told me "nobody's taking care" of this type of work now, likening the platform to a building that seems fine from the outside, but inside is "on fire".
Twitter has not replied to the BBC's request for comment.
My investigation also reveals:
- Concerns that child sexual exploitation is on the rise on Twitter and not being sufficiently raised with law enforcement
- Targeted harassment campaigns aimed at curbing freedom of expression, and foreign influence operations - once removed daily from Twitter - are going "undetected", according to a recent employee
- Exclusive data showing that misogynistic online hate targeting me has risen since the takeover, and that there has been a 69% increase in new accounts following misogynistic and abusive profiles
- Rape survivors have been targeted by accounts that have become more active since the takeover, with indications they've been reinstated or newly created
Abuse on Twitter is nothing new for me - I'm a reporter who shares my coverage of disinformation, conspiracies and hate there. But throughout most of last year I noticed it steadily lessening across all of the social media sites. And then in November I realised it had got worse on Twitter again.
It turns out, I was right. A team from the International Center for Journalists and the University of Sheffield have been tracking the hate I receive, and their data revealed the abuse targeted at me on Twitter had more than tripled since Mr Musk took over, compared with the same period in the year before.
All of the social media sites have been under pressure to tackle online hate and harmful content, and they say they're taking measures to deal with it - measures that no longer seem to be top of the agenda at Twitter.
In San Francisco, home of Twitter's headquarters, I set out to look for answers. Who better to provide them than an engineer responsible for the computer code that makes Twitter work? Because he's still working there, he's asked us to conceal his identity, so we're calling him Sam.
"For someone on the inside, it's like a building where all the pieces are on fire," he revealed.
"When you look at it from the outside the façade looks fine, but I can see that nothing is working. All the plumbing is broken, all the faucets, everything."
He says the chaos has been created by the huge disruption in staffing. At least half of Twitter's workforce has been sacked or has chosen to leave since Mr Musk bought the company. Now people from other teams are having to shift their focus, he says.
"A totally new person, without the expertise, is doing what used to be done by more than 20 people," says Sam. "That leaves room for much more risk, many more possibilities of things that can go wrong."
He says previous safety features still exist, but the people who designed and maintained them have left - he believes those features are now effectively unmanned.
"There are so many things broken and there's nobody taking care of it, that you see this inconsistent behaviour," he tells me.
The level of disarray, in his view, stems from Mr Musk not trusting Twitter employees. He describes Mr Musk bringing in engineers from his other company - electric car manufacturer Tesla - and asking them to evaluate Twitter engineers' code over just a few days before deciding who to sack. Code like that would take "months" to understand, he tells me.
He believes this lack of trust is betrayed by the level of security Mr Musk surrounds himself with.
"Wherever he goes in the office, there are at least two bodyguards - very bulky, tall, Hollywood movie-[style] bodyguards. Even when [he goes] to the restroom," he tells me.
He thinks for Mr Musk it's about money. He says cleaning and catering staff were all sacked - and that Mr Musk even tried to sell the office plants to employees.
Lisa Jennings Young, Twitter's former head of content design, was one of the people who specialised in introducing features designed to protect users from hate. Twitter was a hotbed for trolling long before Mr Musk took over, but she says her team had made good headway at limiting this. Internal Twitter research, seen by the BBC, appears to back this up.
"It was not at all perfect. But we were trying, and we were making things better all the time," she says. It is the first time she's publicly spoken of her experience since she left after Mr Musk's takeover.
Ms Jennings Young's team worked on several new features including safety mode, which can automatically block abusive accounts. They also designed labels applied to misleading tweets, and something called the "harmful reply nudge". The "nudge" alerts users before they send a tweet in which AI technology has detected trigger words or harmful language.
Twitter's own research, seen by the BBC, appears to show the "nudge" and other safety tools being effective.
"Overall 60% of users deleted or edited their reply when given a chance through the nudge," she says. "But what was more interesting, is that after we nudged people once, they composed 11% fewer harmful replies in the future."
These safety features were being implemented around the time the abuse I received on Twitter seemed to lessen, according to data collated by the University of Sheffield and the International Center for Journalists. It's impossible to prove the two are directly connected, but given what the evidence tells us about the efficacy of these measures, it is reasonable to draw a link.
But after Mr Musk took over the social media company in late October 2022, Lisa's entire team was laid off, and she herself chose to leave in late November. I asked Ms Jennings Young what happened to features like the harmful reply nudge.
"There's no-one there to work on that at this time," she told me. She has no idea what has happened to the projects she was doing.
So we tried an experiment.
She suggested a tweet she would have expected to trigger a nudge: "Twitter employees are lazy losers, jump off the Golden Gate bridge and die." I shared it on a private profile in response to one of her tweets, but to Ms Jennings Young's surprise, no nudge appeared. Another tweet we shared containing offensive language was picked up - but she says the nudge should also have caught a message wishing death on a user, not just swear words. As Sam had predicted, the feature didn't seem to be working as designed.
During this investigation, I've had messages from many people who've told me how the hate they receive on Twitter has been increasing since Mr Musk took over - sharing examples of racism, antisemitism and misogyny.
Ellie Wilson, who lives in Glasgow, was raped while at university and began posting about that experience on social media last summer. At the time, she received a supportive response on Twitter.
But when she tweeted about her attacker in January after he was sentenced, she was subject to a wave of hateful messages. She received abusive and misogynistic replies - with some even telling her she deserved to be raped.
"[What] I find most difficult [is] the people that say that I wasn't raped or that this didn't happen and that I'm lying. It's sort of like a secondary trauma," Ms Wilson told me.
Her Twitter following was smaller before the takeover, but when I looked into the accounts targeting her with hate this time around, I noticed the trolls' profiles had become more active since Mr Musk took over - suggesting they may previously have been suspended and recently reinstated.
Some of the accounts had even been set up around the time of Mr Musk's takeover. They appeared to be dedicated to sending out hate, without profile pictures or identifying features. Several follow and interact with content from popular accounts accused of promoting misogyny and hate, which were reinstated after Mr Musk decided to restore thousands of suspended accounts - including that of controversial influencer Andrew Tate.
"By allowing those people a platform, you're empowering them. And you're saying, 'This is OK, you can do that.'"
Several of the accounts also targeted other rape survivors she's in contact with.
Andrew Tate did not respond to the BBC's request for comment.
New research from the Institute for Strategic Dialogue - a UK think tank that investigates disinformation and hate - echoes what I've uncovered about the troll accounts targeting Ellie.
It shows that tens of thousands of new accounts created since Mr Musk took over immediately followed known abusive and misogynistic profiles - a rate 69% higher than before he was in charge.
The research suggests these abusive networks are now growing - and that Mr Musk's takeover has created a "permissive environment" for the creation and use of these kinds of accounts.
Elon Musk's Twitter Storm
Panorama investigates how Elon Musk's ownership is transforming one of the world's most influential social media platforms.
Watch on BBC One at 20:00 GMT, Monday 6 March
Mr Musk's key priorities since the takeover - according to his tweets - are to make the social media company profitable and to champion freedom of expression.
In December 2022, he released internal documents called the "Twitter Files" to explain why he believed the company hadn't been fairly applying its moderation and suspension policies under the old leadership.
But those who have been on the inside feel Mr Musk has used this to de-prioritise protecting users from harm altogether. Even the dangerous content he has lobbied against, including child sexual abuse material and networks of so-called bot accounts deliberately designed to mislead, isn't being tackled as it was before, they say.
It is not just individual trolls that Twitter has previously tried to guard against, but also so-called "influence operations" - state-sanctioned campaigns seeking to undermine democracy and target dissidents and journalists.
Ray Serrato worked in a team that specialised in tackling these operations. He left in November because he felt there wasn't a clear vision to protect users under the new leadership. He says his team would identify suspicious activity like this "daily". Now his team has been "decimated" and exists in a "minimised capacity".
"Twitter might have been the refuge where journalists would go out and have their voice be heard and be critical of the government. But I'm not sure that's going to be the case anymore."
"There are a number of key experts that are no longer in that team that would have covered special regions, or threat actors, from Russia to China," he tells me.
Another insider, who we're calling Rory, is also very concerned about that drain of expertise - and how it appears to be undermining one of Mr Musk's stated priorities: preventing paedophiles from using Twitter to groom victims and share links to abuse. Rory was employed until very recently as part of a team tackling child sexual exploitation (CSE).
His team would identify accounts sharing abusive content about children, escalating the worst to law enforcement. Before the takeover such content was a huge problem, he says - and he already feared they were understaffed.
"Every day you would be able to identify that sort of material," he says.
But his team was cut soon after the acquisition, he says, from 20 people to around six or seven. In his view that's too few to keep on top of the workload.
Rory says that before he left, neither Mr Musk nor any other member of the new management made contact with him or his old team, who between them had years of experience in this area.
"You can't take over a company and suddenly believe you have knowledge… to deal with [Child Sexual Exploitation] without having the experts in place," he says.
Twitter says it removed 400,000 accounts in one month alone to help "make Twitter safer". But Rory is worried there are now fewer people with the knowledge to effectively escalate concerns about this content with law enforcement.
"You can by all means suspend hundreds of thousands of accounts in a month. But if the reporting of that content [to law enforcement] has dropped, then it doesn't really mean anything, and most of the users who had their accounts suspended would just set up a new account anyway."
He adds that this comes at a time when previously suspended profiles are being welcomed back to Twitter.
I wanted to ask Elon Musk about the takeover, his vision for Twitter and how he thinks it is playing out in reality. I tried to contact him via email, tweets and even a Twitter "poll". This wasn't a scientific poll, but Mr Musk has used such votes to make decisions about Twitter's future, and I was hoping it might catch his attention. More than 40,000 users voted and 89% said Mr Musk should do an interview with me. I had no response.
Twitter and Mr Musk are yet to formally respond to BBC Panorama's investigation.
I'm told all of Twitter's communications team have either resigned or been sacked. Twitter's policies, publicly available online, say that "defending and respecting the user's voice" remains one of its "core values".
Mr Musk did, however, tweet about our piece after its publication, saying: "Sorry for turning Twitter from nurturing paradise into place that has… trolls."