Nancy Pelosi clip shows misinformation still has a home on Facebook
Facebook has said it won’t remove a doctored video that makes Democratic leader Nancy Pelosi appear incoherent. One upload of the clip has been viewed more than 2.5 million times - and remains visible.
"There’s a tension here,” Facebook said, between allowing free speech, and preventing the spread of fake news.
“We work hard to find the right balance between encouraging free expression and promoting a safe and authentic community, and we believe that reducing the distribution of inauthentic content strikes that balance.
"But just because something is allowed to be on Facebook doesn’t mean it should get distribution. In other words, we allow people to post it as a form of expression, but we’re not going to show it at the top of News Feed.”
As with so many of Facebook’s “evolving” policies, I predict it will eventually change this stance and delete this clip, and others like it. YouTube, incidentally, has already taken that step.
But it doesn’t matter. It’s too late. A pattern we have seen time and time again is now in full motion - and it shows little has been learned, or at least improved, when it comes to the manipulation of online platforms.
Social media smoke machine
Misinformation hides in the shadows until something in the real world offers the tiniest crack through which the conspiracy can pour out.
In various online spaces and groups, clips of Speaker Pelosi that had been slowed down by 25% had been circulating, picking up pace after the Democrat accused President Trump of engaging in a “cover up” over his business dealings.
On Thursday evening, the conspiracy crack emerged. President Trump posted a clip from a Fox Business show which, while not doctored, had been selectively edited to portray Congresswoman Pelosi as garbling her way through a press conference. The full footage demonstrated that was not the case.
Misrepresented occurrences such as this are used to energise lies that have bubbled around in the far reaches of the internet for a while, waiting for their moment.
The tactic relies on the powerful human instinct of believing there cannot be smoke without fire. If so many people are talking about it, there must be something to the rumours. Social media is a smoke machine, no fire necessary.
Even with the videos’ spread retrospectively limited by Facebook and YouTube, amplification can come from influential figures - who share the clips with a careful calculation that offers plausible deniability.
“What is wrong with Nancy Pelosi?” remarked Rudy Giuliani, the president’s personal lawyer, when sharing a (now deleted) tweet containing a doctored video.
Sooner or later he’ll be asked, likely on cable TV news, about that tweet. He has his tried-and-tested escape hatch: he didn’t make a claim, he merely “asked a question”. Which, as anyone who follows online political discourse knows, is often all it takes.
While this takes place, less prominent players engage in whataboutism: If the Pelosi video isn’t allowed, what about x involving someone else?
Simple matters of context make those arguments fall away, but the row itself helps fuel the effectiveness of misinformation by adding a layer of perceived victimisation. Vast swathes of people will share the falsified clips, not necessarily because they think they are real, but because doing so is now part of a fight.
The threat of the deep fake
Where this episode breaks somewhat new ground is in its use of manipulated video, rather than, as we saw in 2016, bogus text or “Photoshopped” images. The doctored clips of Speaker Pelosi were slowed down, with the pitch of her voice adjusted so she still sounded authentic, but what comes in the future might be more damaging, and less easy to spot.
There is a fear that so-called “deep fakes” could magnify the effects of misinformation further as we head towards the 2020 US presidential election. Deep fakes are videos that are edited, using readily available technology, to realistically portray something very different. It’s a technique that has been used to place celebrities into pornographic videos they were not involved in. For politicians, it means making them say things they never actually said.
"Because they are so realistic, deepfakes can scramble our understanding of truth in multiple ways,” explained John Villasenor, senior fellow at the Brookings Institution, in a paper published earlier this year.
"By exploiting our inclination to trust the reliability of evidence that we see with our own eyes, they can turn fiction into apparent fact. And, as we become more attuned to the existence of deepfakes, there is also a subsequent, corollary effect: they undermine our trust in all videos, including those that are genuine.
"Truth itself becomes elusive, because we can no longer be sure of what is real and what is not.”
On this issue, Facebook has drawn its line and decided that a manipulated video, one that undermines a democratically elected leader, is OK on its platform.
As America dives into the campaign season, you have to wonder: what else will be allowed to spread?
Tension, indeed.
_______
Follow Dave Lee on Twitter @DaveLeeBBC
Do you have more information about this or any other technology story? You can reach Dave directly and securely through encrypted messaging app Signal on: +1 (628) 400-7370