How Social Media Algorithms Shape Our News Consumption and Worldview

A recent study finds that X/Twitter’s algorithm significantly shapes the news content in our feeds, with potential consequences for our worldviews and political decisions.

A new study has unveiled the profound impact that social media algorithms have on the type of news we consume and how we perceive it, especially on platforms like X/Twitter. This research sheds light on the intricate ways these algorithms shape our information environment, potentially influencing political polarization and voter behavior.

To investigate the effects of algorithmically curated news feeds on political polarization, a team of researchers from the University of Pennsylvania and the University of Minnesota Twin Cities conducted a study involving 243 X/Twitter users and more than 800,000 tweets over a three-week period at the end of 2023. The researchers set out to compare the quality and quantity of news in algorithmic feeds with that in chronological feeds composed of posts from accounts users directly follow.

The study, which will be presented at the 27th ACM Conference on Computer-Supported Cooperative Work and Social Computing, used a sociotechnical audit. This method combines traditional analysis of content and browsing history with user-experience surveys, offering a comprehensive view of how users perceive news over time.
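The paper’s own analysis pipeline is not reproduced here, but the core comparison is easy to picture. As a rough, hypothetical sketch of one step such an audit might involve, the Python snippet below estimates what share of each feed links to news outlets. The `Tweet` structure and the `NEWS_DOMAINS` list are assumptions made for illustration, not the researchers’ actual data model or source ratings.

```python
from dataclasses import dataclass
from urllib.parse import urlparse

# Hypothetical, simplified list of news domains; a real audit would rely on a
# vetted source list with credibility ratings rather than a handful of names.
NEWS_DOMAINS = {"nytimes.com", "reuters.com", "apnews.com", "bbc.com", "foxnews.com"}

@dataclass
class Tweet:
    """Minimal stand-in for a collected tweet; assumed fields, not the study's schema."""
    feed: str          # "algorithmic" or "chronological"
    urls: list[str]    # external links embedded in the tweet

def is_news(url: str) -> bool:
    """Treat a link as news if its host is on the (assumed) news-domain list."""
    host = urlparse(url).netloc.lower().removeprefix("www.")
    return host in NEWS_DOMAINS

def news_share(tweets: list[Tweet], feed: str) -> float:
    """Fraction of tweets in the given feed that link to at least one news domain."""
    subset = [t for t in tweets if t.feed == feed]
    if not subset:
        return 0.0
    with_news = sum(1 for t in subset if any(is_news(u) for u in t.urls))
    return with_news / len(subset)

# Example: compare the two feeds for one user's collected tweets.
collected = [
    Tweet("algorithmic", []),
    Tweet("algorithmic", ["https://www.reuters.com/world/some-story"]),
    Tweet("chronological", ["https://apnews.com/article/example"]),
    Tweet("chronological", ["https://example.com/meme"]),
]
print("algorithmic news share:", news_share(collected, "algorithmic"))
print("chronological news share:", news_share(collected, "chronological"))
```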

At a high level, the researchers found that X/Twitter’s algorithm significantly shapes what users see, surfacing content that differs markedly from what they would get from the accounts they chose to follow.

“As we consume information, we form opinions and take actions based on those opinions,” Shengchun Huang, a doctoral student at Penn’s Annenberg School for Communication, said in a news release. “When you only consume news aligned with what you believe, you miss a lot of what is happening in the world and are less able to see things from other perspectives.”

Interestingly, and contrary to the hypothesis that algorithms push users toward more extreme content, the study found that the algorithmic feed presented milder content than the chronological timeline did.

“It turned out that, during the time we performed this audit, X/Twitter’s algorithmic feeds were presenting users with information that was milder and overall less polarizing than the chronological timeline,” lead author Stephanie Wang, a doctoral student in the Department of Computer and Information Science at Penn, said in the news release.

Moreover, the algorithmic feed displayed fewer news articles and a more diverse mix of content types, reducing users’ exposure to traditional news.
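One way to make “a more diverse mix of content types” concrete, purely as an illustration rather than the study’s actual metric, is to label each collected tweet with a coarse content type and compare the Shannon entropy of the type distribution in each feed; a higher value means the feed spreads more evenly across different kinds of content. The category labels and sample data below are invented for this example.

```python
import math
from collections import Counter

def type_entropy(content_types: list[str]) -> float:
    """Shannon entropy (in bits) of a feed's content-type mix.
    Higher values mean a more even spread across types."""
    counts = Counter(content_types)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical labels for tweets sampled from each feed (assumed categories).
algorithmic = ["news", "sports", "memes", "personal", "ads", "sports", "memes"]
chronological = ["news", "news", "news", "politics", "news", "politics", "news"]

print("algorithmic feed entropy:", round(type_entropy(algorithmic), 2))      # more varied mix
print("chronological feed entropy:", round(type_entropy(chronological), 2))  # more concentrated
```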

“During that particular moment, the information may not have been very extreme or disruptive, but this doesn’t mean we can rely on these algorithms to continue to operate in that way,” added Danaë Metaxa, an assistant professor at Penn. “What concerns me is that users of these platforms have very little control over the algorithms. The lack of transparency, restricted APIs and the current controversies surrounding the direction and ownership of X/Twitter make it a challenging space for people to find and trust quality news.”

Another significant finding involved users’ perception of news credibility. Even when users encountered news from reputable sources on social media, they were more likely to doubt its credibility, especially if it presented opposing views.

“Users reported that just the fact that they read the news on social media made it less credible,” added Alvin Zhou, an assistant professor at the University of Minnesota Twin Cities.

This sociotechnical audit is the first of its kind applied to social media news consumption, and the research team plans to keep using the method to examine other social media platforms.

“These types of audit studies are very important for any system that aims to instigate human action or behavior change,” said Metaxa. “We need to incorporate users’ experiences into our audits to evaluate how well these systems work.”

The study underscores the responsibility of platforms like X/Twitter to provide a safe, reliable and informative media environment. With misinformation an ongoing challenge, the researchers advocate institutional-level solutions rather than placing the burden on individual users.

This study suggests that while the algorithms of social media platforms like X/Twitter might not always push polarizing content, their significant influence and the lack of transparency highlight ongoing challenges in how we consume and perceive news.