In a significant move, YouTube has unveiled new measures to address medical misinformation, with a particular focus on removing “cancer misinformation.”
The prominent video-sharing and social media platform has announced that it will remove videos promoting cancer treatments deemed “harmful or ineffective,” as well as content that discourages viewers from seeking professional medical care.
The official blog post highlights examples, such as videos asserting that items like garlic possess cancer-curing properties or advocating for vitamin C as a substitute for radiation therapy—both of which lack scientific credibility.
This step builds on the platform’s existing strategies and insights gained in addressing erroneous medical content related to subjects like COVID-19, vaccines, and reproductive health.
YouTube’s new framework aims to safeguard viewers, creators, and partners by organizing its medical misinformation guidelines into three categories: Prevention, Treatment, and Denial.
Content that contradicts or disputes information from authoritative health bodies such as the World Health Organization (WHO) will be taken down. Under the guidelines, a first-time violation warrants a warning, followed by removal of the offending content.
Repeated breaches within a 90-day span will lead to termination of the channel. It’s worth noting that the policy is still awaiting implementation and may undergo adjustments.
Given the vast repository of information YouTube hosts, it’s worth remembering that contributors may lack formal medical qualifications or expertise, so their content may not be supported by evidence.
A study from 2021 evaluated 40 videos concerning pediatric cancer clinical trials, revealing that more than half were “misleading with serious shortcomings.” A similar 2022 study focused on prostate cancer-related content and discovered that 98 percent of the videos contained varying degrees of misinformation regarding screening recommendations.
Nevertheless, the platform and other social media entities have faced critiques regarding their labeling of content as “misinformation.” A lawsuit by Democratic presidential candidate Robert F. Kennedy Jr. against YouTube and its parent company, Google, contends that his First Amendment rights were violated due to an alleged “censorship campaign” designed to suppress his views on vaccines.
Meanwhile, Meta (formerly Facebook) has faced substantial criticism for a lack of transparency and consistency in its COVID-19 content removal decisions, which were often made without clear explanations.
Reflecting on the future, the authors of the YouTube blog state, “We aim to establish a robust framework that can be extended as the need for new medical misinformation policies arises. Our vigilance will continue, guided by both local and global health authority recommendations.
“We aspire to maintain clarity and transparency in our approach, ensuring that content creators comprehend the boundaries set by our policies, and viewers can trust the health information they encounter on YouTube.”