X’s feed algorithm shifts users’ political opinions to the right, new study finds

Social media algorithms are not politically neutral and can actively shape a person’s political opinions. A recent study published in the journal Nature provides evidence that switching on the algorithmic feed on the platform X shifted users’ political views toward the right. Turning the algorithm off did not reverse this effect, which suggests that algorithms can leave a lasting footprint on a person’s information environment.

X, formerly known as Twitter, is a major platform for political news and public conversation. The platform offers two primary ways to view content. The chronological feed simply displays posts from accounts a user actively follows in the exact order they were posted, with the newest at the top.

The algorithmic feed relies on complex mathematical rules to select and order content. It mixes in posts from accounts the user does not follow and prioritizes items designed to keep the user engaged, such as posts with many likes and comments. The scientists behind the new study wanted to understand whether these customized feeds actually change how people view the world.
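
To make the contrast concrete, here is a minimal Python sketch of the two feed types. The engagement weighting is purely illustrative; X's actual ranking system is proprietary and far more complex.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    likes: int
    comments: int
    followed: bool  # does the viewer follow this author?

def chronological_feed(posts: list[Post]) -> list[Post]:
    """Posts from followed accounts only, newest first."""
    return sorted(
        (p for p in posts if p.followed),
        key=lambda p: p.created_at,
        reverse=True,
    )

def algorithmic_feed(posts: list[Post]) -> list[Post]:
    """All posts, ranked by a toy engagement score (weights are made up)."""
    return sorted(posts, key=lambda p: p.likes + 2 * p.comments, reverse=True)
```

The key structural difference: the chronological feed can only show a user what they already chose to follow, while the algorithmic feed draws from the whole platform and decides the ordering itself.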

Past studies on other social networks found that temporarily turning off the feed algorithm did not change political attitudes. The researchers suspected this happened because initial exposure to an algorithm leaves a lasting mark on user behavior. They also wanted to conduct an independent study, without the direct involvement of a tech company, so they could examine how the algorithm influences real users on X free of platform oversight.

“Feed algorithms decide what billions of people see on social media every day. Whether they also shape what people think is one of the most important open questions in the social sciences,” said study author Philine Widmer, an assistant professor at the Paris School of Economics.

“A major prior study, conducted in collaboration with Meta during the 2020 US election, found that turning off the algorithm had no measurable effect on political attitudes. So, perhaps algorithms do not matter for politics? Our study suggests that the picture is more nuanced.”

“Earlier work had only tested one direction: switching the algorithm off for users who had been exposed to it, potentially for years. But what happens when the algorithm is switched on? And could exposure to algorithmic content leave a lasting imprint that persists even after the algorithm is removed? These questions had not been tested quantitatively.”

Widmer and her colleagues specifically focused on concrete policy preferences and views on current events rather than just looking at general political party loyalty. To test this, they conducted a field experiment over seven weeks in the summer of 2023.

They recruited 4,965 active X users based in the United States. All participants completed a survey at the beginning of the study to gather baseline data on their political affiliations, social media habits, and overall well-being. The study sample was diverse, though it skewed slightly toward more highly educated users.

About 46 percent of the participants identified as Democrats, while 21 percent identified as Republicans. Participants were randomly assigned to use either the algorithmic feed or the chronological feed for the entire seven weeks. They received financial compensation for adhering to their assigned feed setting.

At the end of the study, participants completed a final survey to measure any shifts in their views on specific policies and current news events, including the criminal investigations into Donald Trump and the ongoing war in Ukraine. The researchers also measured affective polarization, a term describing how much people dislike or distrust those who belong to an opposing political party.
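
The article does not detail the exact survey instrument, but a common operationalization of affective polarization, sketched below under that assumption, is the gap between "feeling thermometer" ratings of one's own party and the opposing party.

```python
def affective_polarization(in_party_rating: float, out_party_rating: float) -> float:
    """
    A common survey-based measure (assumed here, not confirmed by the study):
    the gap between how warmly a respondent rates their own party and the
    opposing party, each on a 0-100 "feeling thermometer". A larger gap
    indicates stronger dislike or distrust of the other side.
    """
    return in_party_rating - out_party_rating

# Example: a respondent rates their own party 85 and the other party 20.
print(affective_polarization(85, 20))  # 65
```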

In addition to the surveys, the scientists collected data on the exact accounts the participants chose to follow during the study. A smaller group of users also installed a browser extension on their computers.

This extension allowed the researchers to safely record the exact posts shown in those users' feeds without relying on the platform to share internal data.

Switching users from a chronological feed to an algorithmic feed increased their overall engagement with the platform. For example, posts shown in the algorithmic feed received substantially more likes, reposts, and comments than those in the chronological feed.

Exposure to this highly engaging feed shifted users’ political opinions to be measurably more conservative. Users who switched to the algorithmic feed began to prioritize issues typically favored by Republicans, such as immigration and inflation, over issues favored by Democrats.

These users also grew more critical of the criminal investigations into Donald Trump, which were prominently discussed in the news at the time. They tended to view the investigations as unacceptable or contrary to the rule of law. Their views on the war in Ukraine also shifted toward relatively more pro-Kremlin positions.

“The main takeaway is that social media feed algorithms are not politically neutral,” Widmer told PsyPost. “In our experiment with U.S.-based X users in summer 2023, switching on X’s algorithmic feed shifted political opinions to the right.”

“In terms of standardized effect sizes, our estimates for political opinions range from about 0.08 to 0.12 standard deviations, i.e., small effects by Cohen’s conventions. However, it is worth noting that these effects emerged after just seven weeks. People have been exposed to algorithmic feeds for years. As researchers, for practical and funding reasons, we rarely get the chance to observe people for more than a couple of weeks. Over this relatively short time frame, we found shifts in opinions on current political issues.”
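
To unpack the quoted effect sizes: a standardized effect of 0.10 means the treatment and control groups differ by one tenth of a (pooled) standard deviation. A minimal sketch with hypothetical numbers:

```python
import math

def cohens_d(mean_treat: float, mean_ctrl: float,
             sd_treat: float, sd_ctrl: float,
             n_treat: int, n_ctrl: int) -> float:
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(
        ((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2)
        / (n_treat + n_ctrl - 2)
    )
    return (mean_treat - mean_ctrl) / pooled_sd

# Hypothetical numbers: a 0.1-point mean shift on an opinion scale with
# SD of about 1 gives d = 0.10, within the 0.08-0.12 range the authors
# report ("small" by Cohen's conventions).
print(round(cohens_d(0.1, 0.0, 1.0, 1.0, 2500, 2465), 2))  # 0.1
```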

The browser data helped explain the mechanics behind this shift. Using language-processing tools to analyze the collected posts, the researchers found that X's algorithm actively promoted conservative content and posts by political activists. Specifically, posts classified as conservative were more likely to appear in the algorithmic feed than liberal posts.

At the same time, the algorithm demoted posts from traditional news media outlets: posts from news organizations appeared significantly less often in the algorithmic feed than in the chronological feed.
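
The paper's analysis aside, a minimal sketch can illustrate this kind of comparison. The category labels and numbers below are hypothetical; the study used language-processing tools to classify the logged posts.

```python
from collections import Counter

def category_shares(labels: list[str]) -> dict[str, float]:
    """Fraction of posts in each category among the posts actually shown."""
    counts = Counter(labels)
    total = sum(counts.values())
    return {cat: n / total for cat, n in counts.items()}

# Hypothetical logged feeds, one category label per post shown.
algorithmic = ["conservative", "conservative", "activist", "news", "liberal"]
chronological = ["news", "news", "liberal", "conservative", "news"]

for cat in ("conservative", "news"):
    algo = category_shares(algorithmic).get(cat, 0.0)
    chrono = category_shares(chronological).get(cat, 0.0)
    print(f"{cat}: {algo:.0%} in algorithmic vs {chrono:.0%} in chronological")
```

Comparing these shares across the two randomized feed conditions is what lets researchers attribute differences in exposure to the algorithm itself rather than to users' follow choices.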

Surprisingly, switching from the algorithmic feed back to the chronological feed had almost zero impact on political opinions.

“The most striking finding for us was the asymmetry,” Widmer said. “We expected the algorithm to have some effect on political attitudes, but we did not expect the effects to be so clearly one-directional: switching the algorithm on shifted opinions, but switching it off did not reverse them. This asymmetry had not been documented before.”

The researchers suggest this one-sided effect happens because of newly followed accounts. The algorithm placed highly engaging right-wing activists directly in front of users, and people began to follow those specific profiles. When participants switched back to the chronological setting, their feeds still displayed posts from the accounts they actively followed.

Because the users had already followed new conservative activists while the algorithm was active, their chronological feed remained filled with that perspective. The changes to their daily information diet stuck, leaving a lasting impact on what they saw each day.

“Our proposed mechanism is, thus, that users continue to follow the accounts the algorithm exposed them to, even when the algorithm is turned off,” Widmer told PsyPost. “Hence, the algorithm can leave a lasting footprint on users’ information environment.”

“For readers, this means that what you see on social media is not a neutral reflection of the world or even of the accounts you choose to follow. Algorithms actively shape your information diet, and those changes can stick, i.e., they are not easily reversible.”

As with all research, there are some limitations to consider. The study was conducted specifically on X in the summer of 2023 with users based in the United States. “We would not want readers to assume that every algorithm on every platform produces the same effects, or that the same effects would necessarily be found in different periods or in different contexts,” Widmer said. Different algorithms are built with different goals, and the effects might differ during other time periods or under different corporate ownership.

The study also focused on active users who log in regularly; the impact is likely smaller for people who use the platform only occasionally and so see less of the content. Ideally, future research would follow people for several months or years to see whether prolonged exposure eventually alters deeper political identities, such as party loyalty.

“This study was conducted entirely independently of X, funded by public research funds (the Swiss National Science Foundation),” Widmer noted. “We had no cooperation with the platform and no access to internal data.”

The study, “The political effects of X’s feed algorithm,” was authored by Germain Gauthier, Roland Hodler, Philine Widmer, and Ekaterina Zhuravskaya.
