
Research Record: YouTube’s Algorithm and its Effect on Political Polarization

March 18, 2025
By David Pavlak
Source: Princeton School of Public and International Affairs

Princeton SPIA’s Research Record series highlights the vast scholarly achievements of our faculty members, whose expertise extends beyond the classroom and into everyday life.

If you’d like your work considered for future editions of Research Record, click here and select “research project.”

The Details

Authors: Andrew M. Guess (Princeton University), Brandon M. Stewart (Princeton University), Yasemin Savas (Princeton University), Naijia Liu (Harvard University), Xinlan Emily Hu (University of Pennsylvania), Rei Mariman (University of Pennsylvania), Dean Knox (University of Pennsylvania), Matthew A. Baum (Harvard University), Justin de Benedictis-Kessner (Harvard University), Adam J. Berinsky (MIT), Allison J.B. Chaney (Duke University), Christopher Lucas (Washington University)

Title: Short-term exposure to filter-bubble recommendation systems has limited polarization effects: Naturalistic experiments on YouTube

Journal: Proceedings of the National Academy of Sciences

The Big Picture

With billions of monthly users, YouTube is one of the world’s largest and most popular media platforms. However, some have accused the video-sharing platform’s recommendation algorithm of serving increasingly politically polarizing videos to its users.

To determine whether what users watched affected their political attitudes, researchers led by Princeton SPIA’s Andy Guess, associate professor of politics and public affairs and director of the Survey Research Center, and Brandon Stewart, associate professor of sociology and SPIA-associated faculty, manipulated how videos were recommended to users. Their goal was to learn the effects of users entering “filter bubbles” (recommendations similar to previously consumed media) or “rabbit holes” (recommendations that become increasingly extreme over time).

The paper’s first authors included Naijia Liu Ph.D. ’21, Xinlan Emily Hu, and Yasemin Savas.

“Previous research has tried to understand patterns in the kinds of videos served to users by existing recommendation systems,” said Stewart regarding the multi-institution collaboration. “Building on this, we wanted to come up with two well-defined variations and see whether these alternative recommendation systems had a discernible impact on real users’ opinions.”

The Findings

The team conducted experiments on two political issues, gun control and the minimum wage, with just under 9,000 participants. In the design, participants watched videos on a custom-built, YouTube-like platform and chose what to watch next from experimentally manipulated recommendations derived from YouTube’s own internal algorithm.
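To make the two experimental conditions concrete, the toy sketch below shows one way a recommendation list could be re-ranked to mimic a “filter bubble” (favoring videos similar to the last one watched) or a “rabbit hole” (favoring videos slightly more extreme in the same direction). This is purely illustrative: the Video class, slant scores, and re-ranking rule are invented here and are not the researchers’ actual platform or algorithm.

```python
# Illustrative sketch only; not the study's implementation.
from dataclasses import dataclass
import random

@dataclass
class Video:
    title: str
    slant: float  # hypothetical ideological slant score, from -1.0 to +1.0

def rerank(candidates, last_slant, condition, k=4):
    """Return k recommendations under a toy 'filter bubble' or 'rabbit hole' rule."""
    if condition == "filter_bubble":
        # Prefer videos whose slant is closest to the video just watched.
        ranked = sorted(candidates, key=lambda v: abs(v.slant - last_slant))
    elif condition == "rabbit_hole":
        # Prefer videos slightly more extreme than the last one, in the same direction.
        direction = 1 if last_slant >= 0 else -1
        target = last_slant + 0.2 * direction
        ranked = sorted(candidates, key=lambda v: abs(v.slant - target))
    else:
        # Control: no manipulation, just a random ordering.
        ranked = random.sample(candidates, len(candidates))
    return ranked[:k]

if __name__ == "__main__":
    pool = [Video(f"clip_{i}", s) for i, s in
            enumerate([-0.9, -0.5, -0.1, 0.0, 0.2, 0.6, 0.9])]
    picks = rerank(pool, last_slant=0.3, condition="filter_bubble")
    print([(v.title, v.slant) for v in picks])
```

In this toy version, the user still picks freely from the four videos returned; only the ordering and composition of the options are manipulated, which mirrors the general idea of varying recommendations while leaving the final choice to the user.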

The research team found that presenting people with more slanted video recommendations had no detectable polarizing effect on their attitudes in the short term. While the team cannot rule out effects of long-term exposure, or effects on small, vulnerable subsets of users, the evidence is not consistent with prevailing popular narratives about YouTube’s recommendation system radicalizing users en masse.

“We can rule out modest-sized effects on opinion over the short term,” said Guess. “Tracing out to the longer term is a big challenge and might require additional assumptions about the decay of persuasive effects. Still, we hope that our evidence is informative for efforts to bound plausible impacts of these systems on users.”

The Implications

“We highlight that while algorithms can powerfully shape the content that people potentially interact with, the role of users’ own preferences — in our experiments, they chose sequences of videos from a set of four options — is important and underappreciated,” said Guess.