YouTube bans coronavirus vaccine misinformation
YouTube has been banning videos that claim vaccines can cause death or infertility. Image: REUTERS/Toby Melville
- Alphabet Inc’s YouTube is working to remove videos from its site that spread misinformation about COVID-19 vaccines.
- This will include claims that the vaccine will kill people or cause infertility, or that microchips will be implanted in people who receive it.
- YouTube already removes videos that dispute the transmission of COVID-19 and promote medically unsubstantiated methods of treatment.
Alphabet Inc’s YouTube said it would remove videos containing misinformation about COVID-19 vaccines from its platform, expanding its current rules against falsehoods and conspiracy theories about the pandemic.
The video platform said it would now ban any content with claims about COVID-19 vaccines that contradict consensus from local health authorities or the World Health Organization.
YouTube said in an email that this would include removing claims that the vaccine will kill people or cause infertility, or that microchips will be implanted in people who receive the vaccine.
A YouTube spokesman told Reuters that general discussions in videos about “broad concerns” over the vaccine would remain on the platform.
YouTube says it already removes content that disputes the existence or transmission of COVID-19, promotes medically unsubstantiated methods of treatment, discourages people from seeking medical care or explicitly disputes health authorities’ guidance on self-isolation or social distancing.
Conspiracy theories and misinformation about the new coronavirus vaccines have proliferated on social media during the pandemic, including through anti-vaccine personalities on YouTube and through viral videos shared across multiple platforms.
Although drugmakers and researchers are working on various treatments, vaccines are at the heart of the long-term fight to stop the new coronavirus, which has killed more than a million people, infected more than 38 million and crippled the global economy.
In its email, YouTube said it had removed over 200,000 videos related to dangerous or misleading COVID-19 information since early February.
Andy Pattison, manager of digital solutions at the World Health Organization, told Reuters that the WHO meets weekly with the policy team at YouTube to discuss content trends and potentially problematic videos. Pattison said the WHO was encouraged by YouTube’s announcement on coronavirus vaccine misinformation.
The company also said it was limiting the spread of COVID-19 related misinformation on the site, including certain borderline videos about COVID-19 vaccines. A spokesman declined to provide examples of such borderline content.
YouTube said it would be announcing more steps in the coming weeks to emphasize authoritative information about COVID-19 vaccines on the site.