YouTube vows to BLOCK all anti-vaxx videos that contain misinformation
YouTube BANS prominent anti-vaxxers, including Robert F. Kennedy and Joseph Mercola, and will remove videos peddling misinformation about Covid, MMR and chickenpox vaccines and ALL other jabs
- The Google-owned platform expanded the scope of its misinformation policy
- This will now include false or misleading information on all approved vaccines
- As a result of its Covid-19 misinformation policy, 130,000 videos were removed
- The new scope will allow it to clamp down on wider false claims about vaccines
YouTube has vowed to remove videos that contain misinformation about all vaccines, including jabs for Covid-19, chicken pox and MMR.
The move expands the platform's health misinformation policies, which were strengthened during the coronavirus pandemic.
As well as removing specific videos that violate the new policy, YouTube will terminate the channels of high-profile users who spread misinformation about vaccines, and has already acted to remove Robert F. Kennedy Jr and Joseph Mercola.
Kennedy was one of the most high-profile proponents of the debunked theory that vaccines cause autism, while Mercola, an alternative medicine entrepreneur, has been broadly critical of vaccines while promoting alternative therapies.
The Google-owned video platform said its ban on Covid-19 vaccine misinformation, introduced last year, had so far led to the removal of 130,000 videos.
However, the firm said a broader scope was needed to clamp down on false claims about other vaccines appearing online.
Anti-vaccine content creator Joseph Mercola (pictured left), an alternative medicine entrepreneur, and Robert F Kennedy Jr (right) are among those set to be banned by YouTube
Under the new rules, any content which falsely alleges that any approved vaccine is dangerous and causes chronic health problems will be removed, as will videos that include misinformation about the content of vaccines.
Social media and internet platforms have been repeatedly urged to do more to tackle the spread of online misinformation.
Millions of posts have been blocked or taken down and a number of new rules and prompts to official health information have been introduced across most platforms.
However, critics have suggested not enough has been done to slow the spread of harmful content since the start of the pandemic.
YouTube said it was taking its latest action in response to seeing vaccine misinformation begin to branch out into other false claims.
‘We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general,’ the firm said in a blog post.
‘We’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,’ it explained.
The platform will also remove videos claiming that vaccines do not reduce transmission or contraction of disease, along with any content spreading misinformation about the substances vaccines contain.
‘This would include content that falsely says that approved vaccines cause autism, cancer or infertility,’ YouTube wrote in the blog post shared by Google.
Videos suggesting that substances in vaccines can track those who receive them will also be removed.
‘Our policies not only cover specific routine immunisations like for measles or Hepatitis B, but also apply to general statements about vaccines,’ the firm said.
YouTube added that there would be ‘important exceptions’ to the new guidelines, including content about ‘vaccine policies, new vaccine trials and historical vaccine successes or failures’.
Personal testimonies relating to vaccines, which the company said were important parts of public discussion around the scientific process, are also allowed.
‘Today’s policy update is an important step to address vaccine and health misinformation on our platform,’ the company said, adding: ‘We’ll continue to invest across the board in the policies and products that bring high-quality information to our viewers and the entire YouTube community.’
If your content is flagged as violating the policy, it will be removed and you will receive an email explaining why.
A first violation brings a warning, and YouTube is unlikely to take further action beyond removing the video from public viewing.
Users who continue to upload banned content will have their channel terminated – usually after three strikes within 90 days.
Conspiracy theories and misinformation about the coronavirus vaccines proliferated on social media during the pandemic.
It spread through anti-vaccine personalities on YouTube and through viral videos shared across multiple platforms, including TikTok and Twitter.
Research by the Center for Countering Digital Hate (CCDH) earlier this year found 65 per cent of misinformation content on social media was attributed to a dozen people, including Kennedy and Mercola.
Some posts perpetuated conspiracy theories relating to Microsoft co-founder Bill Gates, unfounded claims that vaccines cause harm to specific groups, or that vaccines cause autism.
Other posts from the 12 claimed that the coronavirus pandemic was not real or that masks had no effect on transmission, promoted unproven coronavirus treatments, and repeated the unfounded claim that Covid-19 vaccines pose a threat to pregnant women.
Although drugmakers and researchers are working on various treatments, vaccines are at the heart of the long-term fight against the coronavirus, so posts suggesting they are ineffective or dangerous can reduce uptake.