TikTok’s recommendation algorithm systematically pushed more Republican-aligned political content during the 2024 US presidential election cycle, according to a new peer-reviewed study published in Nature.
Researchers from New York University Abu Dhabi conducted one of the largest independent audits of TikTok’s “For You” recommendation system, using 323 automated accounts designed to simulate users with different political preferences. The accounts, based in New York, Texas, and Georgia, collectively gathered more than 280,000 recommended videos for analysis over a 27-week period during the election campaign.
The study found that accounts conditioned to engage with Republican content received approximately 11.5% more recommendations aligned with their partisanship than Democratic-conditioned accounts did. Meanwhile, Democratic-conditioned accounts were shown roughly 7.5% more opposite-party (Republican) content than Republican-conditioned accounts were shown Democratic content.
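To make the reported asymmetry concrete, here is a minimal sketch of how such a relative exposure gap could be computed from labeled recommendation counts. The counts below are invented for illustration only; they are not the study’s data, and the study’s actual methodology is more involved.

```python
def aligned_share(aligned: int, total: int) -> float:
    """Fraction of recommended videos matching the account's conditioned partisanship."""
    return aligned / total

# Hypothetical counts for two bot cohorts (invented, not from the paper):
rep_share = aligned_share(620, 1000)  # Republican-conditioned accounts
dem_share = aligned_share(556, 1000)  # Democratic-conditioned accounts

# Relative asymmetry: how much more same-party content the Republican
# cohort saw compared with the Democratic cohort.
asymmetry = (rep_share - dem_share) / dem_share
print(f"{asymmetry:.1%}")  # → 11.5%
```

With these invented counts, the relative gap works out to roughly the 11.5% figure reported in the study, which is the sense in which one cohort “received more” aligned recommendations than the other.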
Researchers said the imbalance persisted across all three states examined and remained visible even after accounting for engagement metrics such as likes, comments, views, and shares.
The paper also found asymmetries in how political topics were distributed. Democratic-leaning accounts were more frequently shown cross-partisan content focused on immigration and crime, while Republican-oriented accounts saw more abortion-related political content. Researchers suggested the algorithm may have amplified content targeting perceived weaknesses of opposing political groups.
Unlike platforms where users primarily build feeds through following accounts manually, TikTok’s “For You” page relies heavily on algorithmic recommendations driven by watch behavior and engagement signals. Researchers argued this makes TikTok particularly useful for studying algorithmic influence because users have less direct control over what content appears in their feeds.
TikTok disputed the findings, saying that an experiment built on automated accounts does not accurately represent how real users experience the platform. The company said users actively shape their recommendations through numerous controls and interactions not fully reflected in the study design.
The researchers said the study does not prove intentional political manipulation or direct election interference. Instead, they argue the findings demonstrate how recommendation systems can unintentionally create systematic political imbalances on a large scale.
The paper arrives amid growing scrutiny over the political influence of social media algorithms, particularly on platforms heavily used by younger voters. TikTok now serves as a major news and political information source for millions of Americans under 30.
Researchers warned that even relatively small recommendation skews could influence public discourse when applied across massive audiences during tightly contested elections.