YouTube is making money from placing advertisements on anti-vaccination and conspiracy theory videos.
Despite a pledge by the streaming giant more than a year ago to stop adverts appearing next to videos pushing misinformation about vaccines, The Times found dozens of advertisements placed alongside conspiracy videos.
Adverts for the writing software Grammarly and the storage company Big Yellow Box appeared alongside videos claiming that doctors and nurses were guilty of genocide and that vaccines could kill. Big Yellow Box said that it was suspending advertising on YouTube as a result.
The meal-kit service HelloFresh was promoted beside videos claiming that no vaccine had ever worked. There is overwhelming scientific evidence that vaccines are safe and effective.
Tim Loughton, the Conservative MP for East Worthing and Shoreham and a member of the home affairs select committee, said that people were seeing the “misinformed ramblings of conspiracy theorists” ahead of reputable health advice.
“We know how slow YouTube and other social media platforms have been to take down content that thrives on hate speech or promotes extremism and violence, let alone preventing them from posting it in the first place,” he said.
“While the whole country is being asked to pull together in a national effort to defeat Covid some social media companies seem to think that increasing their revenue should be their priority. They urgently need to revisit their business ethics.”
YouTube demonetised antivax channels in 2019, denying their creators any share of advertising revenue, after companies complained about their products being promoted next to misinformation.
The Times found examples of content slipping through the net, however, including videos by Charlie Ward, a British believer in the QAnon conspiracy theory who lives in Marbella.
It is unclear whether Ward receives a share of the advertising revenue from his conspiracy theory videos.
In one video, which carried adverts for Grammarly and Big Yellow Box storage, Ward claims that the vaccine is causing mass deaths and was created by Bill Gates, the Microsoft co-founder, to reduce the world’s population.
“I knew they were going to shut down the world’s economy for a global financial reset,” he says. “I knew they were going to use the virus, the vaccine, the 5G, the riots and the alien invasion as a smokescreen. I knew that.”
He says that people should resist the vaccine, even if it becomes necessary for travel. “I go to Africa a lot,” he says. “I’ve never had an injection going to Africa . . . never. Because I pay $100 not to, because the Africans are good boys. I say I want the certificate . . . straight through.”
He goes on to claim that doctors and nurses who have treated coronavirus patients or administered vaccines will one day be “held responsible” for genocide.
In the first 24 hours that the video was live on YouTube, it gained more than 13,000 views. He has over 40,000 subscribers on the site with more than 1 million total views.
Companies have no control over which videos their advertisements appear next to, although YouTube has a policy of not placing advertising next to dangerous or derogatory content, including antivax and QAnon material.
A spokesman for Big Yellow Box said: “We are very disappointed to learn about this. We were not aware of it and it should not have happened given the safety measures in place for this form of advertising. We have suspended our advertising on YouTube with immediate effect until this is resolved.”
As well as finding adverts running alongside antivax content, The Times found misinformation hosted on other sites being promoted through paid advertisements on YouTube.
In one case, searching for the name of Del Bigtree, an antivaxer who was banned from YouTube last summer, brought up videos from health services debunking his claims — but also paid adverts to watch his film Vaxxed on Amazon.
The work has been widely discredited. It repeats the hoax claims made by Andrew Wakefield that vaccines cause autism in children and was dropped by film festivals and cinemas.
Advertisements for the film and its sequel appeared in other search results, as did content from a QAnon activist and another well-known antivaxer.
In response to The Times’s findings, YouTube disabled adverts on the antivax and conspiracy videos and removed ads for Vaxxed and other conspiracy content. The site said the revenue earned from the ads had been minimal.
A YouTube spokeswoman added: “We quickly remove flagged content that violates our community guidelines, including Covid-19 content that explicitly contradicts expert consensus from the NHS or the World Health Organisation.
“We also have strict policies that govern what kind of videos we allow ads to appear on, and we enforce these advertising policies vigorously. Videos that promote harmful or dangerous acts or theories are not allowed to monetise.”