YouTube on Tuesday announced a new policy for combating medical misinformation on the video-sharing platform, according to a blog post.
The company said it will streamline its existing guidelines to fall under three categories: prevention, treatment and denial. In doing so, YouTube will remove content that contradicts well-established guidance from health officials about topics including Covid-19, reproductive health, cancer and harmful substances.
“While specific medical guidance can change over time as we learn more, our goal is to ensure that when it comes to areas of well-studied scientific consensus, YouTube is not a platform for distributing information that could harm people,” the company said.
YouTube, which is owned by Google, has historically struggled to moderate the content that is uploaded on its platform. A former YouTube moderator sued the company in 2020, alleging that many content moderators remain in their positions for less than a year and that the company is “chronically understaffed.”
As a result, the company is often playing catch-up, racing to remove posts that violate its established guidelines.
YouTube said it will determine whether a condition falls under its new medical policy by assessing whether it poses a high public health risk and is prone to misinformation. The company pointed to cancer as an example, since people often turn to platforms like YouTube for guidance after learning of a diagnosis.
This means content that discourages effective treatment or promotes unproven treatment will be removed, according to the blog post.
But YouTube said content that is of public interest may remain available, even if it violates the new policy. For instance, if a political candidate disputes official health guidance or a public hearing takes place that includes inaccurate information, YouTube may not remove the footage.
The company said it will work to provide additional context on such videos for viewers in these instances.