

YouTube just banned vaccine misinformation


YouTube on Wednesday banned several high-profile anti-vaccine influencers and announced that, moving forward, it would remove all content that falsely claims approved vaccines are dangerous.


YouTube cracks down on vaccine misinformation

YouTube said it would take down any content that makes false claims about approved vaccines, including claims that vaccines cause autism, cancer, or infertility, or that they don't reduce the transmission or contraction of disease.

The policy applies to all vaccines approved by health authorities, not just those for Covid-19.

"We've steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general," the company said in a blog post. "We're now at a point where it's more important than ever to expand the work we started with Covid-19 to other vaccines."

Content critical of vaccines that are still undergoing clinical trials will be allowed, as will personal stories about reactions to vaccines, as long as such stories aren't posted by an account with a history of sharing vaccine misinformation.

YouTube said that since 2020, it has removed more than 130,000 videos for violating its policies regarding Covid-19 vaccines. It has now permanently removed several channels belonging to high-profile spreaders of vaccine misinformation, including those associated with Joseph Mercola and Robert F. Kennedy, Jr.

A person familiar with YouTube's policymaking process told the New York Times that, after the company developed rules surrounding Covid-19 vaccine misinformation last year, it began to consider broadening the policy.

The company found that many videos containing Covid-19 vaccine misinformation also incorporated broader anti-vaccine misinformation, making it difficult to narrowly tamp down on Covid-19 vaccine misinformation without a broader policy.

However, developing the rules took months due to the difficulty of addressing content across many languages and determining the boundaries of what users would be allowed to post, the person told the Times.

Reaction

YouTube's new policy broadly aligns it with other major social media sites, such as Facebook and Twitter.

Facebook in February announced it would remove posts with false claims about vaccines, including those for diseases other than Covid-19. In March, Twitter implemented a new policy explaining penalties for sharing vaccine misinformation—although that policy focused primarily on Covid-19 and provided a "five strikes" approach of escalating penalties for violations.

Experts on misinformation have said that anti-vaccine content on social networks is a factor in vaccine hesitancy. YouTube videos are often the source of that content, which then goes viral on platforms like Facebook and Twitter, the Times reports.

"One platform's policies affect enforcement across all the others because of the way networks work across services," Evelyn Douek, a lecturer at Harvard Law School who focuses on online speech and misinformation, said. "YouTube is one of the most highly linked domains on Facebook, for example."

"It's not possible to think of these issues platform by platform," Douek added. "That's not how anti-vaccination groups think of them. We have to think of the internet ecosystem as a whole." (Sebastian, Wall Street Journal, 9/29; Alba, New York Times, 9/29; Seitz, Associated Press, 9/29)


Aaron Carroll on how clinicians can combat medical misinformation

Listen to the Radio Advisory episode

Radio Advisory, a podcast for busy health care leaders.

Medical misinformation has been a significant problem for a long time, but amid the Covid-19 pandemic, the problem has become even more widespread. In this episode, host Rachel Woods sits down with Dr. Aaron Carroll, author, professor, and Indiana University chief health officer, to discuss what all clinicians should do to combat medical misinformation.

Plus, Advisory Board experts Solomon Banjo and Pam Divack offer their take on clinicians' role in online spaces (with patients and with each other) and translate those same principles for the rest of the industry.

