How YouTube Makes Money from Fake Cancer Cure Videos

Prof Justin Stebbing from Imperial College London looks at some of the cancer claims the BBC found in YouTube videos

YouTube’s algorithm promotes fake cancer cures in a number of languages and the site runs adverts for major brands and universities next to misleading videos, a BBC investigation has found.

Searching YouTube across 10 languages, the BBC found more than 80 videos containing health misinformation – mainly bogus cancer cures. Ten of the videos found had more than a million views. Many were accompanied by adverts.

The unproven “cures” often involved consuming specific substances, such as turmeric or baking soda. Juice diets or extreme fasting were also common themes. Some YouTubers advocated drinking donkey’s milk or boiling water. None of the so-called cures offered are clinically proven to treat cancer.

Appearing before the fake cancer cure videos were adverts for well-known brands including Samsung, Heinz and Clinique.

YouTube’s advertising system means that both the Google-owned company and the video makers are making money from the misleading clips.

Khawla Aissane, speaking in Arabic on her YouTube channel, suggested donkey’s milk could stop the activity of cancer cells. YouTube

Shut down in English – but not other languages

In January, YouTube announced it would be “reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness.”

But the company said the change would initially only affect recommendations of a very small set of videos in the United States, and did not apply to languages other than English.

The BBC search covered English, Portuguese, Russian, Arabic, Persian, Hindi, German, Ukrainian, French and Italian.

We found, for example, that in Russian, a simple search for “cancer treatment” leads to videos advocating drinking baking soda. Watching these videos in turn led to recommendations for other unproven “treatments” such as carrot juice or extreme fasting.

Erin McAweeney, a research analyst at the Data & Society institute, explained that because YouTube’s algorithm recommends similar videos to the ones you have just watched, it is continuously “carving a path” from one video to the next, regardless of the credibility of the advice offered within.

“Someone can start out on a credible video and be suggested to watch a juice cure video next. A recommendation system doesn’t know credible from non-credible content,” McAweeney says.

YouTube has stated that its recommendation system – which has been accused of leading users down rabbit holes of conspiracy theories and radicalisation – would change to surface credible, trustworthy videos to people watching ones that might not be.

YouTube’s Community Guidelines ban harmful content including: “Promoting dangerous remedies or cures: content which claims that harmful substances or treatments can have health benefits.”

Many of the fake cancer cures the BBC found, such as juicing, were not in themselves harmful, but could indirectly damage a cancer sufferer’s health – for instance, if they neglect conventional medical approaches in favour of the so-called cures.

Many Brazilian YouTubers, including Elizeu Correia, advocated eating exotic plants to treat cancer, such as bitter gourd. YouTube

Making money with misinformation

Researchers from BBC Monitoring and BBC News Brasil were served a range of adverts before the fake cure videos.

In addition to Samsung, Heinz and Clinique, the BBC saw adverts for a travel website, the writing app Grammarly, Hollywood films, and British universities including the University of East Anglia and the University of Gloucestershire. All of the ads appeared alongside potentially harmful misinformation.

The companies and universities distanced themselves from the misleading content.

Samsung said the campaign they were running had “no connection or correlation” with the fake cancer cure video that ran after it. “Samsung follows and insists on the highest brand safety guidelines on all advertising platforms it uses,” the company said in a statement.

Kraft Heinz said that it “has a number of both automated and human controls continuously in place to ensure we avoid our advertising running with inappropriate content.

“This particular instance is concerning to us and we have taken steps to block this channel.”

Grammarly, a company whose adverts appeared 20 times alongside fake cancer cure videos viewed by BBC researchers, said: “Upon learning of this, we immediately contacted YouTube to pull our ads from any such channels and to ensure the ads will not appear alongside content promulgating misinformation.”

Clinique owner Estée Lauder did not respond to requests for comment.

The two universities said that their adverts appeared next to misleading videos just once each, and that the channels were blocked from their advertising campaigns after being contacted by the BBC.

The University of East Anglia, which has its own cancer research programme, said: “No payment was made by the university [specifically for the advert which ran next to the fake cure video] and we have contacted Google to ensure that placement does not happen again.”

The University of Gloucestershire said: “When advertising on YouTube, content changes quickly and even the most attentive human and technological effort can require constant diligence. As such we are continuously working with Google to ensure this type of placement doesn’t occur again.”

How does YouTube decide what adverts you see?

Adverts on YouTube can be targeted to particular regions or audiences. The systems that determine which ad to show to which person at which time are complicated, explains Tim Schmoyer, founder of the YouTube consultancy Video Creators.

“YouTube optimizes the experience to show the right ads to the right people at the right time in order to minimize abandonment from the platform and provide most value to the advertiser, creator, and to themselves, of course,” he says.

YouTube also has the power to “demonetise” certain channels – in other words, to prevent video makers from making any revenue from advertising.

The site has made moves to demonetise channels which spread anti-vaccine misinformation, for example.

Demonetising may prevent video makers from making money, but it does not necessarily prevent their videos from going viral, according to McAweeney from Data & Society, who says that “no evidence shows that demonetising solves the issue of audience size and reach”.

“There are many motivations behind spreading health misinformation and disinformation; money is only one of them,” she says. “In most cases, getting attention and views on a video is more valuable for these actors than the money it generates.”

Efimova Tatyana, who advocated a baking soda treatment, removed her video after the BBC contacted her. YouTube

The BBC passed on details of the fake cure videos to YouTube, and it has so far demonetised more than 70 of the videos for breaching its monetisation policy.

The BBC also contacted the creators of five of the videos.

One Russian YouTuber, Tatyana Efimova, who advocated the baking soda “cure”, made clear in her video that she is not a doctor. She said that she was telling a personal story of someone she knew and that it is up to viewers to decide whether to take baking soda or not. After being contacted by the BBC she removed the video and said: “It is not that important for me.”

Elizeu Correia, a Brazilian YouTuber, said his video claiming that bitter gourd tea can fight tumours “is not about a dangerous or poisonous tea”. He then made the video private, so it is no longer available to the general public.

Shunyakal, a Hindi-language media organisation, didn’t respond directly to the BBC’s request for comment, but their video about a non-medical cancer treatment centre was removed from their public channel after we contacted them. Before its removal, it had been viewed more than 1.4 million times.

The BBC also contacted Khawla Aissane, who promoted donkey’s milk as a cure, but she didn’t respond.

YouTube declined a request for an interview. In a statement the company said: “Misinformation is a difficult challenge, and we have taken a number of steps to address this including showing more authoritative content on medical issues, showing information panels with credible sources, and removing ads from videos that promote harmful health claims.

“Our systems are not perfect but we’re constantly making improvements, and we remain committed to progress in this space.”

Health community

Some YouTube videos found in the BBC’s research included caveats about the need to seek professional medical advice, but many promoted their cures as an alternative to conventional cancer treatments.

“Some of the things on YouTube and the internet are really, positively dangerous, and it’s unfiltered,” commented Prof Justin Stebbing, a leading cancer specialist at Imperial College London.

Experts also pointed to the perils of a user-generated site like YouTube, where video producers and the people within the company making decisions about content in many cases do not have a medical background.

“We are asking corporations with people who are not experts in healthcare and public health to make those judgements on behalf of all citizens,” says Isaac Chun-Hai Fung, an associate professor of epidemiology at Georgia Southern University.

Dr Fung and his students researched health information in English on YouTube. They found that regardless of the topic, the majority of the 100 most popular YouTube videos were uploaded by amateurs – people who are not healthcare or science professionals.

Part of the solution, he says, is for professionals to create more content.

“There should be high-quality education videos in multiple languages for non-professionals. Healthcare professionals should work with media professionals. I don’t think there’s enough investment in that.”

Authors: Flora Carmichael & Juliana Gragnani
