YouTube announced on Wednesday a tightening of its policy against anti-vaccine content, indicating that the measures will not be limited to disinformation videos about Covid-19 vaccines.
"Videos that falsely claim approved vaccines are dangerous and cause chronic health effects, that claim vaccines do not reduce the transmission of disease or that they cause people to contract disease, or that contain erroneous information about the substances they contain, will be deleted," the platform said in a statement.
YouTube said the measure covers content claiming that approved vaccines cause autism, cancer or infertility, or that they can be used to track those who receive them.
Misleading or false videos about older vaccines, such as those against rubella or hepatitis B, may also be removed from the site.
Content about vaccination policies, new vaccine trials, and the historical successes or failures of vaccines remains allowed on the platform, provided it complies with YouTube's rules.
The platform, which has been battling misinformation about Covid vaccines, said it had deleted more than 130,000 videos that violated its rules over the past year.
Big tech companies have been under pressure to suppress or mitigate anti-vaccine content on their platforms, especially since the start of the pandemic.