TED 2019: How to kill a zombie rumour and fix Facebook

Image caption: Zombie rumours - false stories that refuse to die - are proliferating online (Getty Images)

From so-called zombie rumours - viral online stories that refuse to die - to the difference between misinformation and fake news, much of the talk at TED 2019 has been about the need to improve online conversation.

And this is not just because it would be nice to counter the vast amounts of online lies and propaganda with truthful and respectful debate, something the TED audience always prefers.

But because, a series of speakers said, misinformation is having alarming real-world consequences - from influencing elections to causing deaths.

Image caption: Claire Wardle called on everyone to donate the weird stuff they see online to an open database (Ryan Lash/TED)

Claire Wardle is the founder of First Draft News, a charity that fights misinformation. She recently set up the Coalition to Integrate Values into the Information Commons (Civic). It aims to build new infrastructure for quality information, something she described as a "Wikipedia of trust".

At TED, she asked "citizens of the internet" - whether everyday users, journalists, educators or software developers - to take part in the project, which will build a repository of the rumours, memes and propaganda circulating online. It will attempt to throw light on where they came from and suggest ways to filter such content in future.

She began her talk with a typical online zombie rumour: a photo of a banana with a red mark on it. The post suggested that the fruit had been injected with HIV.

"Every day we see new memes like this. Rumours that tap into people's deepest fears and their fears for their families. Lies and facts sit side by side," she said.

It was not good enough for Facebook and Google to have their own fact checkers, or even for governments to regulate the web. Such viral content needed to be gathered, stored and analysed in an open database, she said.
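Civic has not published a design for this database, but the gather-store-analyse idea can be sketched in miniature. The record below is purely illustrative - every field name is a hypothetical guess at what an open misinformation repository might hold, not the project's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ViralContentRecord:
    """One donated piece of viral content.

    Illustrative only - Civic's real schema has not been published,
    and every field here is a hypothetical example.
    """
    content_hash: str                  # fingerprint of the image or text, for de-duplication
    claim: str                         # the claim being made, e.g. "bananas injected with HIV"
    first_seen: datetime               # earliest sighting donated by a contributor
    platforms: list[str] = field(default_factory=list)        # networks it circulated on
    fact_check_urls: list[str] = field(default_factory=list)  # links to published debunks
```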

It was also time, she said, to stop using the term "fake news", which has itself become a false narrative.

"Fake news covers lies, rumours, conspiracy theories but it is also used as a term by politicians around the world to attack a free and independent press," she said.

Zuckerberg's epiphany

Image caption: Mark Zuckerberg has said he wants Facebook to be "privacy-focused" (Getty Images)

Roger McNamee is a venture capitalist and early investor in Facebook. He became so disillusioned with the direction the firm was taking that he wrote a book called 'Zucked: Waking Up to the Facebook Catastrophe'.

He spoke at TED about how Facebook's business model, which relies on keeping users engaged, had the unintended consequence of making the sharing of questionable content all too easy.

But, he said, it can be fixed.

"Mark Zuckerberg is one good night's sleep away from the epiphany where he wakes up and realises he can do more good by fixing the business model of Facebook than he can with a thousand Chan-Zuckerberg initiatives," he said, referring to the philanthropic organisation Mr Zuckerberg runs with his wife Priscilla Chan.

In conversation with TED curator Chris Anderson, Mr McNamee talked about how he had spoken to Mr Zuckerberg and Facebook's chief operating officer Sheryl Sandberg nine days before the 2016 US presidential election. He told them the company had a problem: he had seen a Facebook group, claiming to be part of the Bernie Sanders campaign, distributing misogynistic viral memes that looked like someone was paying for them to spread.

Mr McNamee was also concerned about a firm that had recently been expelled from the platform for harvesting data on people who had expressed an interest in Black Lives Matter and selling it to police departments.

Mr Zuckerberg, he told the TED audience, dismissed these as "isolated incidents".

Shortly after the presidential election, it became obvious that Facebook had a much bigger problem, when it emerged that 126 million of its users had been targeted by Russian trolls spreading misinformation.

Image caption: Many of the speakers on the TED stage expressed concerns about the current state of the internet (Ryan Lash/TED)

Facebook has since acted on Russian interference and introduced new rules around elections, with tools that make political ads more transparent: they list who is placing each ad and require advertisers to have an address in the country where the election is taking place.

But, said Mr McNamee, much more needed to be done, because the effects of online bad actors had now spread offline too.

"You did not need to be on Facebook in Myanmar to be dead. You just needed to be a Rohingya," he said.

"You did not need to be on Facebook or YouTube in Christchurch, New Zealand, to be dead. You just needed to be in one of those mosques."

He accused the tech giants of using "behavioural manipulation" to learn more about their users and improve services.

He gave an example from Google Maps and Google-owned navigation app Waze.

"Do you know how they get route timings for all the different routes? Some percentage of the people have to drive inferior routes in order to them to know what the timing is."

This was helpful for building better routes, but it was also "creepy", he said.

"The actual thing that's going on inside these companies is not that we're giving them a little bit of personal data and they're getting better ad targeting. There is way more going on here than that. And the stuff that's going beyond that is having an impact on people's lives."

Zombie rumour-mongers

Andrew Marantz is a writer for the New Yorker who has spent the last three years tracking down the people who make manipulative viral content.

At TED he said that he discovered a complex picture, with the content makers ranging from disillusioned teenagers to white supremacists living in California.

"Some saw it as a way to make money online," he said. "Some wanted to be as outrageous as possible - but I also talked to true ideologues.

"Many start something a sick joke and then they get so many likes and shares that they start believing their own jokes," he added.

He described one young woman he had spoken to who went from being an Obama supporter to attending white supremacist rallies after spending months viewing misleading political propaganda online.

The social networks need to take responsibility, he said, and it could start with recoding their algorithms.

"Social media algorithms were never built to distinguish between good and bad or true and false."

"If the algorithms were build for emotional engagement, and that is having bad real world consequences, then they have to be optimised for something else," he said.