Is YouTube’s Algorithm Prioritizing Right-Wing and Christian Content?


A new study found that YouTube’s algorithm is more likely to recommend right-wing and Christian videos to users, regardless of whether viewers have shown an interest in that kind of content.

The month-long experiment, published on June 18, was conducted by the Institute for Strategic Dialogue, a U.K.-based anti-extremism not-for-profit.

Over five days, researchers built up personas to explore video recommendations for users interested in four main topic areas: mommy vlogging, Spanish-language news, gaming, and male lifestyle vlogging.

They built each persona by watching videos, subscribing to channels, and searching for terms related to its topic area. Then, for a month, the researchers recorded the kinds of videos recommended to each account.

“We wanted to, for the most part, look at topics that don’t generally direct people into extremist worlds or anything along those lines,” Aoife Gallagher, the project’s lead analyst, told NBC.

What were the study’s findings?

Accounts that registered interest in “male lifestyle guru” content were frequently recommended Fox News, despite not interacting with Fox News during the five-day persona-building stage.

The researchers also created two mommy-vlogging personas with different political leanings: one persona watching Fox News and the other watching MSNBC. Despite the two accounts spending the same amount of time with their respective news sources, Fox News was recommended more often than MSNBC. In addition, both accounts were recommended content from the same anti-vaxx influencer.

Across the board, every persona was recommended religious content, which was “primarily” Christian, despite never searching for it.

Concerningly, personas designed to mimic a child’s search activity were recommended both sexually explicit videos and videos related to Andrew Tate, a far-right “alpha male” figurehead currently awaiting trial on sex-trafficking charges.

The study notes that the recommendation algorithm drives 70% of all YouTube views. In that context, these findings are cause for concern, particularly for creators whose content may be deprioritized in favor of more controversial accounts.

What did YouTube say about the study?

In a statement to NBC, YouTube spokesperson Elena Hernandez said: “We welcome research on our recommendation system, but it’s difficult to draw conclusions based on the test accounts created by the researchers, which may not be consistent with the behavior of real people.” 

She continued, “YouTube’s recommendation system is trained to raise high-quality content on the home page, in search results, and the Watch Next panel for viewers of all ages across the platform. We continue to invest significantly in the policies, products, and practices to protect people from harmful content, especially younger viewers.”
