YouTube has vowed to remove videos that contain misinformation about all vaccines, including jabs for Covid-19, chicken pox and MMR.
The move expands the platform’s health misinformation policies, which were strengthened during the coronavirus pandemic.
As well as removing specific videos that violate the new policy, YouTube will terminate the channels of high-profile users who spread misinformation about vaccines, and has already acted to remove Robert F. Kennedy Jr. and Joseph Mercola.
Kennedy was one of the most high-profile proponents of the debunked theory that vaccines cause autism, while Mercola, an alternative medicine entrepreneur, has been generally critical of vaccines and promotes alternative therapies.
The Google-owned video platform said its ban on Covid-19 vaccine misinformation, introduced last year, had led to the removal of 130,000 videos so far.
However, the firm said broader action was needed to clamp down on false claims about other vaccines appearing online.
Anti-vaccine content creator Joseph Mercola (pictured left), an alternative medicine entrepreneur, and Robert F Kennedy Jr (right) are among those set to be banned by YouTube
Covid vaccine content that is NOT allowed
Here are some examples of content that’s not allowed on YouTube:
– Claims that vaccines cause cancer, diabetes and other chronic side effects
– Claims that vaccines do not reduce risk of contracting illness
– Claims that vaccines contain substances that are not on the vaccine ingredient list, such as biological matter from foetuses (e.g. foetal tissue, foetal cell lines) or animal byproducts
– Claims that vaccines contain substances or devices meant to track or identify those who’ve received them
– Claims that vaccines alter a person’s genetic makeup
– Claims that the MMR vaccine causes autism
– Claims that vaccines are part of a depopulation agenda
– Claims that the flu vaccine causes chronic side effects such as infertility
– Claims that the HPV vaccine causes chronic side effects such as paralysis
Under the new rules, any content which falsely alleges that any approved vaccine is dangerous and causes chronic health problems will be removed, as will videos that include misinformation about the content of vaccines.
Social media and internet platforms have been repeatedly urged to do more to tackle the spread of online misinformation.
Millions of posts have been blocked or taken down and a number of new rules and prompts to official health information have been introduced across most platforms.
However, critics have suggested not enough has been done to slow the spread of harmful content since the start of the pandemic.
YouTube said it was taking its latest action in response to seeing vaccine misinformation begin to branch out into other false claims.
‘We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general,’ the firm said in a blog post.
‘We’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines.’
The changes bar content that falsely alleges approved vaccines are dangerous and cause chronic health effects, along with videos claiming that vaccines do not reduce transmission or contraction of disease, or spreading misinformation about the substances vaccines contain.
‘This would include content that falsely says that approved vaccines cause autism, cancer or infertility,’ YouTube wrote in the blog post shared by Google.
YouTube is playing catch-up with other social media firms: Facebook banned misinformation on all vaccines seven months ago.
However, despite the misinformation bans, the pages of Mercola and Kennedy are still active on both Twitter and Facebook.
As well as removing misinformation, YouTube is working with official sources, including the National Academy of Medicine and the Cleveland Clinic, to bring more authoritative videos to the platform.
YouTube’s head of health care, Garth Graham, told the Washington Post that the goal was to get scientific information front and centre, in the hope of reaching viewers before they get caught in a web of misinformation and anti-vaxx videos.
‘There is information, not from us, but information from other researchers on health misinformation that has shown the earlier you can get information in front of someone before they form opinions, the better,’ Graham said.
What happens if your video violates YouTube’s policy?
If your content violates YouTube’s policy, it will remove the content and send you an email to let you know.
If this is the first time you’ve violated the Community Guidelines, you’ll probably get a warning with no penalty to your channel.
If it’s not, YouTube may issue a strike against your channel.
If you get 3 strikes within 90 days, your channel will be terminated.
However, YouTube may also terminate your channel or account after a single case of severe abuse, or when the channel is dedicated to a policy violation.
As of Wednesday, popular anti-vaccine accounts, including those run by Mercola, had been kicked off YouTube.
A statement sent from a press email for Mercola’s website said: ‘We are united across the world, we will not live in fear, we will stand together and restore our freedoms.’
Kennedy did not immediately respond to requests for comment.
Videos suggesting that substances in vaccines can track those who receive them will also be removed.
‘Our policies not only cover specific routine immunisations like for measles or Hepatitis B, but also apply to general statements about vaccines,’ the firm says.
YouTube added that there would be ‘important exceptions’ to the new guidelines, including content about ‘vaccine policies, new vaccine trials and historical vaccine successes or failures’.
Personal testimonies relating to vaccines, which the company said were important parts of public discussion around the scientific process, are also allowed.
‘Today’s policy update is an important step to address vaccine and health misinformation on our platform,’ the company said.
It added: ‘We’ll continue to invest across the board in the policies and products that bring high-quality information to our viewers and the entire YouTube community.’
Conspiracy theories and misinformation about the coronavirus vaccines proliferated on social media during the pandemic.
It spread through anti-vaccine personalities on YouTube and through viral videos shared across multiple platforms, including TikTok and Twitter.
Research by the Center for Countering Digital Hate (CCDH) earlier this year found that 65 per cent of anti-vaccine misinformation on social media could be attributed to just a dozen people, including Kennedy and Mercola.
Some posts perpetuated conspiracy theories relating to Microsoft co-founder Bill Gates, unfounded claims that vaccines cause harm to specific groups, or the debunked claim that vaccines cause autism.
Who are Robert F. Kennedy and Joseph Mercola and why were they banned from YouTube?
ROBERT F. KENNEDY JR
Robert F. Kennedy Jr is an American environmental lawyer and anti-vaccine advocate.
He’s the son of senator Robert F. Kennedy, who was assassinated in 1968, and the nephew of former president John F. Kennedy, who was assassinated in 1963.
Robert F. Kennedy Jr set up his own advocacy group called Children’s Health Defense in 2016, known for anti-vaccine activities.
Kennedy, a lifelong Democrat, has previously lobbied Congress to allow parents who do not want to vaccinate their children exemptions from state mandates to do so.
The 67-year-old has insisted that he is not against vaccines, saying he is in favour of ‘safe’ vaccines.
He was banned from Instagram earlier in 2021 for ‘repeatedly sharing debunked claims about the coronavirus or vaccines’, according to Facebook, which owns the photo-sharing app.
Kennedy was one of the most high-profile proponents of the debunked theory that vaccines cause autism.
JOSEPH MERCOLA
Joseph Mercola, an alternative medicine entrepreneur, has been generally critical of vaccines.
The 67-year-old Chicagoan markets dietary supplements and promotes alternative therapies on his website.
These include homeopathy – a widely panned ‘treatment’ based on the use of highly diluted substances.
Mercola has previously been warned by the US Food and Drug Administration to stop selling supplements that claimed to treat coronavirus.
Mercola recently earned the top spot in the Center for Countering Digital Hate’s report on the ‘Disinformation Dozen’, alongside Robert F Kennedy Jr.
YouTube has banned their accounts as part of efforts to combat misinformation over vaccines, which it says are ‘approved and confirmed to be safe and effective by local health authorities and the WHO’.
Other posts from the 12 people claimed that the coronavirus pandemic was not real or that masks had no effect on transmission, promoted unproven coronavirus treatments and made the unfounded claim that Covid-19 vaccines pose a threat to pregnant women.
Although drugmakers and researchers are working on various treatments, vaccines are at the heart of the long-term fight against the coronavirus, so posts suggesting they are ineffective or dangerous can reduce uptake.
This is a shift for YouTube, which streams over a billion hours of content every day, as it has long resisted policing content too heavily.
It has argued that maintaining an open platform is ‘critical to free speech’, but social media firms are increasingly coming under fire from lawmakers and activists over users who spread false information on their platforms.
YouTube’s vice president of global trust and safety, Matt Halprin, told the Washington Post the firm had not acted sooner because it had been focused on Covid-19 misinformation.
‘Developing robust policies takes time,’ Halprin told the Post, adding: ‘We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.’
But even as YouTube takes a tougher stance on misinformation, it faces backlash around the world.
On Tuesday, Russian state-backed broadcaster RT’s German-language channels were deleted from YouTube, as the company said the channels had breached its COVID-19 misinformation policy.
Russia on Wednesday called the move ‘unprecedented information aggression,’ and threatened to block YouTube.
WHAT ARE TECH COMPANIES DOING ABOUT COVID-19?
FACEBOOK
The social network is giving the World Health Organisation as many free ads as it needs in a bid to get accurate health information to users of the platform as clearly as possible.
It also launched the ‘Coronavirus Information Centre’ – a dedicated webpage with COVID-19 resources and advice at the start of the pandemic in 2020.
This is being promoted at the top of users’ News Feeds, directing them to the latest updates and guidance from the NHS and WHO.
Facebook is also making its Workplace platform available to governments and emergency services for free in a bid to help those dealing with the coronavirus.
All government organisations globally, at a national or local level, are eligible to claim 12 months of free access to the premium tier of Workplace.
Facebook has already been sending myth-busting messages to users’ news feeds if they have interacted with posts containing ‘harmful misinformation’ about coronavirus such as conspiracy theories.
However, in May 2021, Facebook lifted its ban on users claiming that Covid-19 may have originated in a lab in Wuhan.
TWITTER
Twitter also resolved to delete tweets from its site that promote conspiracy theories, misleading or dangerous advice and other harmful ideas relating to coronavirus.
Tweets that deny ‘established scientific facts’ and expert guidance regarding the virus will be marked as harmful and removed, the site said in a blog post at the start of the pandemic in March 2020.
It gave examples of inaccurate tweets that would be deleted swiftly, including ‘people with dark skin are immune to COVID-19 due to melanin production’, ‘use aromatherapy and essential oils to prevent COVID-19’ and ‘the news about washing your hands is propaganda for soap companies, stop washing your hands!’.
In April 2021, Twitter introduced the option to report misleading tweets for some users in the US, Australia and South Korea, in a bid to combat the spread of fake news in the Covid era.
It picked a test group of users in those countries to trial the feature before making it available to its almost 200 million users worldwide.
In January 2021, Twitter launched Birdwatch, another approach to fighting misinformation.
Through Birdwatch, users can add notes to tweets they deem misleading, but the notes are not shown directly on the tweet itself.
GOOGLE
Google also teamed up with the WHO to launch an SOS Alert dedicated to the coronavirus, which appears at the top of search results when users type ‘coronavirus’.
The search engine is prioritising information on the virus from the WHO, including official WHO updates on the spread of the virus and how to stay safe.
The company started showing fact-checking labels to US viewers on its video platform YouTube in April 2020 in a bid to curb coronavirus misinformation, which exploded on social media as the pandemic intensified.
Google later brought fact-checking labels to Google Images search results globally in June 2020.
The tech giant has claimed to have kept on top of misinformation during the pandemic – in March 2021, it said it removed more than 99 million malicious ads for fake Covid-19 cures and fake vaccine doses the year prior.
Source link: https://www.dailymail.co.uk/sciencetech/article-10041771/YouTube-remove-misinformation-videos-vaccines.html