YouTube is moving to block and remove all content that spreads misinformation about vaccines against Covid-19 and other illnesses, such as measles, hepatitis and chickenpox, the Google-owned online video company said in a blog post Wednesday.
“This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them,” the company explained.
Since 2020, Google has taken down 130,000 videos for violating the company’s Covid-19 vaccine policies and says it is now stepping up those efforts.
“We’re expanding our medical misinformation policies on YouTube with new guidelines on currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the WHO,” the company said in the blog post.
Google will go as far as taking down the accounts of serial spreaders of misinformation. However, claims about vaccines that are still being tested will continue to be allowed.
Personal stories about reactions to a vaccine will also be permitted, provided they do not come from an account with a history of promoting vaccine misinformation.