YouTube to remove videos with COVID vaccine misinformation
Published: 04:27 PM, 15 October 2020
YouTube, the Google-owned video-sharing service, announced on Wednesday that it will remove videos containing COVID-19 vaccine misinformation.
Updating its policies on misinformation about the pandemic, YouTube said it was expanding its medical misinformation policy “to remove claims about COVID-19 vaccinations that contradict expert consensus from local health authorities or the World Health Organization.”
The move is the latest by online platforms struggling to contain the spread of hoaxes and false information about the coronavirus and treatments.
YouTube said it was acting in anticipation of the release of one or more vaccines, and of scepticism among many people about their usefulness. Content to be removed would include claims that a vaccine could kill people or cause infertility, or that microchips will be implanted in people who receive a vaccination, it added.
YouTube has removed more than 200,000 videos with “dangerous or misleading” COVID-19 information since February, including unverified claims about transmission or unsubstantiated treatments.
In a related action, Facebook on Tuesday announced a ban on ads that discourage people from getting vaccinated, as part of its efforts to contain misinformation.
The tech giants have regularly been accused of allowing anti-vaccine movements to flourish.