The American video giant goes to war against medical misinformation. During the Covid-19 pandemic, online platforms found themselves at the heart of the fight against false medical information. Public authorities and digital giants had to cooperate to stem the spread of this content, a fight against the health-related “infodemic” that continues today.

This Tuesday, the online video-sharing platform YouTube announced in a blog post new measures against this scourge. These provisions are designed to ensure that videos do not contain “disturbing and potentially dangerous information” on medical subjects that “have been the subject of numerous scientific studies and reached a consensus”.

Specifically, YouTube intends to remove any content that “contradicts” guidelines “from local health authorities or the World Health Organization (WHO)”. The American company is particularly targeting false information about the prevention and transmission of certain diseases, or about “the safety and effectiveness of authorized vaccines”. Medical treatments are also covered: videos promoting remedies whose medical properties have not been scientifically proven, or discouraging users from consulting a health professional, will be removed from YouTube. Finally, any content “that denies the existence of certain pathologies”, including content “denying that people have died as a result of COVID-19”, will also be deleted.

These rules apply to videos, their descriptions, comments, live streams, external links, and “any other product or feature of YouTube”, according to the platform’s rules. It is a broad approach that pays particular attention to false information about cancers and their treatments, given the “high public health risk”: the disease is among the leading causes of death worldwide, according to the WHO. Starting Tuesday and over the coming weeks, YouTube will remove “any content promoting cancer treatments deemed dangerous or ineffective, or discouraging viewers from consulting a healthcare professional”. Videos claiming to offer an alternative to proven cancer treatments, such as “garlic cures cancer” or advice to “take vitamin C instead of undergoing radiotherapy”, will thus be deleted, the blog post illustrates.

The platform nevertheless specifies that the context of a video will be taken into account. “We always pay close attention to context and allow content that provides sufficient context for educational, documentary, scientific or artistic purposes.” If a video is sufficiently useful to the “general interest”, it will not be deleted, even if it breaches the rules on false medical information. This exception can cover, for example, videos showing “statements made by a national political candidate on the campaign trail challenging health authority guidance”. Users testifying about their personal experience or discussing a medical study may also be exempt from these rules in certain cases. Finally, adding context when covering medical information is possible, but “does not guarantee that the video will remain on YouTube”. For its part, the platform may “show an information panel under these videos to provide context to viewers”.

YouTube had already toughened its fight against false medical information in 2021, mainly targeting “antivax” campaigns, and said at the time that it had removed more than a million videos spreading “dangerous misinformation about the coronavirus” since the start of the pandemic. The platform noted, however, that behind this impressive number of removals, many of the deleted videos had in fact drawn very few views.