Coronavirus: YouTube bans misleading Covid-19 vaccine videos
YouTube has pledged to delete misleading claims about coronavirus vaccines as part of a fresh effort to tackle Covid-19 misinformation.
It said any videos that contradict expert consensus from local health authorities, such as the NHS or World Health Organization, will be removed.
It follows an announcement by Facebook that it would ban ads that discourage people from getting vaccinated.
However, that restriction will not apply to unpaid posts or comments.
YouTube had already banned "medically unsubstantiated" claims relating to coronavirus on its platform.
But it is now explicitly expanding the policy to include content relating to vaccines.
'Imminent vaccine'
"A Covid-19 vaccine may be imminent, therefore we're ensuring we have the right policies in place to be able to remove [related] misinformation," the Google-owned service said in a statement.
It said it would remove any suggestions that the vaccine would:
- kill people
- cause infertility
- involve microchips being implanted in people who receive the treatment
YouTube said it had already removed 200,000 dangerous or misleading videos about the virus since February.
False claims
Facebook's new policy is designed to stop it facing accusations of profiting from the spread of anti-vaccination messages.
The social network had previously allowed ads to express opposition to vaccines if they did not contain false claims.
It said the new rules would be enforced "over the next few days", but some ads would still run in the meantime.
It added that it was launching a campaign to provide users with information about the flu vaccine, including where to get flu shots in the US.
"Our goal is to help messages about the safety and efficacy of vaccines reach a broad group of people, while prohibiting ads with misinformation that could harm public health efforts," the company blogged, external.
Anti-vaccination groups will still be allowed on its platform.
Unpaid posts or comments that discourage people from getting a vaccination are also still permitted.
Earlier in the year, Facebook's public policy manager Jason Hirsch told Reuters the company believed users should be able to express personal anti-vaccine views. He said that more aggressive censorship could push people hesitant about vaccines towards the anti-vaccine camp.
The change of policy is one of several recent revisions Facebook has made to its free speech principles.
On Monday, Facebook banned posts denying the Holocaust, following years of pressure.
And last week it also banned content related to the QAnon conspiracy theory ahead of the US election.
Political pressure
The moves come as the UK government faces renewed criticism over the amount of time it is taking to pass a new law tackling online misinformation and other issues involving the social media giants.
The chair of the Digital, Culture, Media and Sport Committee, Julian Knight, said delays to passing the Online Harms Bill were "unjustifiable", and attacked the government for failing to empower a regulator to handle related complaints.
Ministers have previously suggested Ofcom take on the role, but have yet to confirm the appointment.
The culture secretary, Oliver Dowden, was questioned by the committee about the bill.
He said draft legislation would be published in 2021, and added it should include "tough penalties" for those who break the rules.
But one expert raised concerns that the tech companies would be left to regulate themselves in the meantime.
"The volume of content defined as misinformation overrides the number of employees to oversee such things, or [the automated] functionalities the platforms have," said Unsah Malik, a social media advisor.
"We should probably have stronger consequences for those who publish misinformation - make it unlawful and fine people."
As the possibility of a coronavirus vaccine edges closer, disinformation and conspiracy theories surrounding it increasingly circulate on social media.
That includes everything from false claims that a vaccine is a tool for mass genocide to baseless conspiracy theories about Bill Gates micro-chipping the world population.
For that reason, the moves by YouTube and Facebook will be welcomed. Social media sites were widely criticised for not acting quickly enough to tackle health misinformation early in the pandemic - and many will be reassured that the platforms are now looking ahead.
But an entirely different question is how well these new measures will be enforced, how effective they will prove to be - and whether they could be too late.
Vaccine disinformation has thrived in large Facebook groups and YouTube videos from notorious pseudoscientists for months now. And it has spilled over into parent chats and community forums.
It's that gradual exposure to conspiracy theory content that could sow seeds of doubt in the minds of many about a coronavirus vaccine.
This disinformation also threatens to drown out legitimate questions about whether any vaccine that becomes available is safe and properly approved.